B.F. Skinner Timeline
| Full Name | Burrhus Frederic Skinner (B.F. Skinner) |
| --- | --- |
| Place and Date of Birth | March 20, 1904; Susquehanna, a small town in the hills of Pennsylvania |
| Place and Date of Death | August 18, 1990; Cambridge, Massachusetts |
| B.A. Degree (Graduation) | Hamilton College, Clinton, New York, 1926 |
| M.A. Degree | Harvard University, 1930 |
| Ph.D. | Harvard University, 1931 |
| National Research Council Fellow | 1931-1933 |
| Junior Fellow, Society of Fellows at Harvard University | 1933-1936 |
| Psychology Department, University of Minnesota | Joined in 1936 and remained there until 1945 |
| War Research Sponsored by General Mills Inc. | 1942-1943 |
| Chairman, Department of Psychology, Indiana University | 1945 |
| William James Lecturer, Harvard University | Fall of 1947 |
| Professor, Harvard Department of Psychology | 1948 |
| Professor Emeritus | 1964 |
| D.Soc.Stud. | University of Louisville, 1977 |
| Awards and Medals | |
Burrhus Frederic Skinner (B.F. Skinner), a behaviorist, was one of the most influential psychologists of the twentieth century. He believed that conditioning shaped all human behavior; here, conditioning refers to arranging specific conditions and consequences that lead to a desired behavior.
Skinner thus practiced a relatively different style of research into human behavior.
In his research, participants were exposed to certain conditions and then observed for changes in their behavior.
Accordingly, Skinner’s behavior analysis came to be known as radical behaviorism.
This is because Skinner’s science of behavior emphasized that overt behavior, and not internal mental states, should be the focus of the study of psychology.
B.F. Skinner’s Significant Contributions to Psychology
As mentioned above, Skinner developed behavior analysis and proposed that human behavior was the outcome of conditioning.
B.F. Skinner made several important contributions to the field of psychology. These include:
- Operant Conditioning or Instrumental Conditioning
- Skinner Box
- Cumulative Recorder
- Radical Behaviorism
- Schedules of Reinforcement
He developed operant conditioning and hence came to be known as the father of operant conditioning.
Skinner’s theory of learning holds that a particular behavior is repeated if it is reinforced (that is, strengthened) and is not repeated if it is not reinforced (that is, weakened or extinguished).
He considered the rate of response to be an effective measure of response strength.
He also developed an operant conditioning chamber, known as the Skinner box, to study operant conditioning, and a cumulative recorder to measure the rate of response.
He suggested that in some cases, a behavior followed by rewards or reinforcement was governed by rules, known as schedules of reinforcement.
Skinner also established a research laboratory to study operant behavior and summarized this work, with C.B. Ferster, in the book ‘Schedules of Reinforcement’.
B.F. Skinner Biography: Early Life
Skinner was born in Susquehanna, a small town in the hills of Pennsylvania.
Furthermore, his mother, Grace Madge Burrhus, was a stenographer, secretary, and notary public first in a law office and then in a railroad chief executive’s office.
Skinner’s father, William Arthur Skinner, was a successful attorney who studied law with a local attorney and at New York Law School.
His parents were both good students, cherished learning, and had high expectations both for themselves and for Skinner.
His father also bought many books, so the inquisitive Skinner had plenty to read as a child.
B.F. Skinner was adventurous, and much of his boyhood was spent building things for he loved inventing.
For instance, he started a venture selling elderberries door to door with a friend.
For this business, he invented a flotation system that separated the ripe berries from the green, unripe ones.
Skinner later wrote his autobiography in three volumes and much of what is known about his childhood is from Skinner’s recollection.
Both Skinner and his younger brother were required to follow proper rules as they grew up.
Skinner’s mother would reprimand him when he faltered on the code of conduct expected of him.
His father, as per Skinner’s recollection, was a gentle parent who never punished him physically.
Rather, he disciplined him through verbal disapproval and reminders of the punishments that awaited if he misbehaved.
Skinner was always eager for praise from his parents, although it was not given frequently.
It is interesting to note that his later theory of operant conditioning emphasized the impact of positive reinforcement on an individual’s behavior.
B.F. Skinner’s Education
Hamilton College, Clinton, New York 1922-1926
B.F. Skinner joined Hamilton College in 1922 in New York and was interested in writing.
He met the poet Robert Frost at a summer school in Vermont, who asked Skinner to send him some of his writings.
On being praised by Frost for his work, Skinner decided to become a writer.
So, upon graduating from Hamilton College in 1926, Skinner moved back home to become a writer.
During this time, Skinner wrote a few newspaper articles and also built a few model sailing ships.
He later described this period as his ‘dark year’: he was not very successful as a writer and later said that “I had nothing important to say”.
Skinner then escaped to New York City for a few months and worked as a clerk in a bookstore.
There, he chanced upon the books by Ivan Pavlov and John B. Watson, the founder of behaviorism.
Reading these books helped him collect his thoughts and visualize the work that would help explain human behavior.
Skinner was not interested in the conventional theories of psychology, which emphasized Freudian notions of the inner self.
He was more interested in outward behavior and believed that overt behavior should be the focus of the study of psychology.
Joined Harvard University’s Graduate Program in Psychology in 1928
Skinner enrolled in the Psychology Department of Harvard University when he was 24 years old.
As mentioned earlier, B.F. Skinner was captivated by outward behavior rather than an individual’s internal mental states.
He therefore resisted the prevailing mentalistic ideas and found a mentor in William Crozier, who chaired the new Department of Physiology.
Crozier, too, was intensely interested in studying the behavior of animals rather than the processes taking place inside them.
Skinner’s behavioral approach was also supported by his fellow graduate students.
Skinner received his M.A. degree in 1930 and a Ph.D. degree in 1931 from Harvard University.
Upon receiving the Ph.D., Skinner got a two-year National Research Council Fellowship (1931-1933) to work in W.J. Crozier’s central nervous system lab.
Crozier allowed Skinner to spend half of his time working in the lab so that he could advance his own research.
After this, for the next three years, Skinner was supported as a Junior Fellow of the Harvard Society of Fellows (1933-1936), which let him continue his research.
B.F. Skinner’s Theory Of Operant Conditioning
Given the five-year fellowship, Skinner was able to research behavior and relate it to the experimental conditions.
During his experiments with rats, he discovered that responses, or operants (as he called them), depended not only on the preceding stimulus but also on what followed them.
That is, a behavior is repeated if it is reinforced and is not repeated if it is punished. He called this type of learning operant conditioning.
Accordingly, he identified two types of reinforcements:
- Positive Reinforcement – consequences that increase the rate of the behavior that precedes them. For example, praise for your effort, getting to play your favorite game, etc.
- Negative Reinforcement – consequences that increase the rate of behaviors that allow an individual to escape or avoid an aversive stimulus. For example, giving in to a child’s demands (the behavior) to avoid their crankiness.
Likewise, punishment is also an important part of the operant conditioning process:
- Positive Punishment – consequences that weaken the rate of the behavior that precedes them. For example, prison, a scolding from the teacher, etc.
- Negative Punishment – consequences that weaken the behavior that precedes them by removing positive reinforcers. For example, not being allowed to watch your favorite TV show.
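The four consequence types above vary along just two dimensions: whether a stimulus is added or removed, and whether the preceding behavior becomes more or less frequent. A minimal sketch of that classification (the function name and comments are illustrative, not Skinner's terminology for any apparatus):

```python
def classify_consequence(stimulus_added: bool, behavior_increases: bool) -> str:
    """Classify an operant consequence along the two dimensions above:
    is a stimulus added or removed, and does the preceding behavior
    become more frequent (reinforcement) or less frequent (punishment)?"""
    kind = "reinforcement" if behavior_increases else "punishment"
    sign = "positive" if stimulus_added else "negative"
    return f"{sign} {kind}"

# Praise (stimulus added) that makes the behavior more frequent:
print(classify_consequence(True, True))    # positive reinforcement
# Giving in to avoid crankiness (aversive stimulus removed, behavior up):
print(classify_consequence(False, True))   # negative reinforcement
# Scolding (stimulus added, behavior down):
print(classify_consequence(True, False))   # positive punishment
# Taking away TV time (stimulus removed, behavior down):
print(classify_consequence(False, False))  # negative punishment
```

The point of the sketch is that "positive" and "negative" describe only whether a stimulus is presented or withdrawn, never whether the consequence is pleasant.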
Thus, B.F. Skinner distinguished between two types of behavior:
Respondents
These are behaviors that are involuntary, automatic, or reflexive, and are triggered by stimuli.
That is, such behaviors occur automatically in the presence of the stimulus.
For instance, salivating when one smells lunch being cooked, feeling scared while watching a horror film, or pulling one’s hand off a hot stove.
Operants
Operants are responses that act on the environment and generate consequences that determine whether such responses occur in the future.
These are voluntary behaviors, generated under conscious control.
Thus, operants are controlled by their consequences, and future responses in the operant class are likely to occur only if such responses are reinforced.
The Skinner box, also called an operant conditioning chamber, was an apparatus invented by B.F. Skinner.
It is the foundation of B.F. Skinner’s operant conditioning theory.
The Skinner box is a soundproof, light-resistant box or chamber that Skinner used to study the learning process in small animals.
He typically used rats and pigeons, which were isolated in the chamber during his experiments.
The operant chamber contains a bar-press lever or a key attached to a wall next to a food cup or dish.
The animal can press or manipulate this lever or key to obtain food pellets or water as reinforcement; a punishment, such as a mild electric shock, can also be delivered through the floor of the chamber.
In addition to this, various kinds of stimuli like lights, sounds, or images, can also be presented.
It also contains a device called an operandum that automatically detects the subject’s actions.
This device is typically connected to a monitoring device on the other side that records the responses generated by the subject.
B.F. Skinner established that consequences control active behavior in operant conditioning and that such behavior can be modified by the stimuli that signal when consequences will occur.
He established that effective stimuli and reinforcing consequences can be identified by their effects on behavior.
Thus, he termed this conceptual unit of stimulus, response, and consequence the discriminated operant.
Further, he claimed that the true measure of the effect of reinforcement is the rate of response in the presence of the discriminative stimulus.
This led him to invent the cumulative recorder, a device that records the rate of response so that it can be directly observed by the experimenter.
In these records, the rate of response is given by the slope of the line generated by successive responses.
Changes in the response rate therefore show up as changes in the slope of the line.
This helps to track the interactions between an individual’s behavior and its consequences.
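The idea that response rate is the slope of the cumulative record can be sketched numerically. The data below are made up for illustration; a real cumulative recorder traced this line mechanically on moving paper:

```python
def response_rate(cumulative_counts, times):
    """Estimate response rate (responses per unit time) as the slope of
    the cumulative record between the first and last observation."""
    delta_responses = cumulative_counts[-1] - cumulative_counts[0]
    delta_time = times[-1] - times[0]
    return delta_responses / delta_time

# Hypothetical cumulative record: total responses observed at each minute.
times = [0, 1, 2, 3, 4]    # minutes
counts = [0, 3, 6, 9, 12]  # steady responding -> a constant slope
print(response_rate(counts, times))  # 3.0 responses per minute
```

A steeper slope means faster responding; a flat stretch means the subject has stopped responding, which is exactly what the experimenter reads off the record.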
Schedules of Reinforcement
Schedules of reinforcement are an important part of operant conditioning.
Reinforcement Schedules are specific rules that determine when to reinforce a behavior, that is, after a specified time or number of responses.
These rules can be divided into two categories:
Continuous Reinforcement
This is a kind of reinforcement schedule in which reinforcement is provided each time the behavior is shown.
Partial or Intermittent Reinforcement
In the case of partial reinforcement, the occurrence of the reinforcement depends upon the time elapsed or the number of responses. These include:
- Fixed Interval Schedule
- Variable Interval Schedule
- Fixed Ratio Schedule
- Variable Ratio Schedule
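The four partial schedules can be read as simple rules for deciding whether a given response earns reinforcement. A minimal sketch, with illustrative parameter values that are not from Skinner's experiments:

```python
import random

def fixed_ratio(n, responses_since_reinforcement):
    """FR-n: reinforce every n-th response."""
    return responses_since_reinforcement >= n

def fixed_interval(t, elapsed_since_reinforcement):
    """FI-t: reinforce the first response after t time units have elapsed."""
    return elapsed_since_reinforcement >= t

def variable_ratio(mean_n, rng):
    """VR-mean_n: each response has a 1/mean_n chance of reinforcement,
    so reward arrives after an unpredictable number of responses."""
    return rng.random() < 1.0 / mean_n

def variable_interval(elapsed, required):
    """VI: reinforce the first response after a randomly drawn interval
    ('required' is redrawn around some mean after each reinforcement)."""
    return elapsed >= required

# FR-5: the 5th response since the last reward is reinforced, earlier ones not.
print(fixed_ratio(5, 5))   # True
print(fixed_ratio(5, 3))   # False
```

The ratio schedules count responses while the interval schedules count elapsed time, and the variable versions make the requirement unpredictable, which is why they tend to sustain steadier responding.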
In these five years, B.F. Skinner studied the impact of consequences and the schedules on which such consequences were delivered.
He also studied how prior stimuli affected the behavior-consequence relationships with which they were paired.
This work was later summarized in his 1938 book ‘The Behavior of Organisms’.
In the year 1936, B.F. Skinner, then 32 years old, joined the Psychology Department at the University of Minnesota and married Yvonne Blue.
He stayed at the University of Minnesota until 1945. Skinner was busy with his family life as his elder daughter Julie was born in 1938.
As a result, he could not make much progress on the study of operant behavior that he had begun.
During World War II, one of the most troublesome aspects of the war was aerial bombing, a dreadful new weapon against which no defense was available.
While on a train bound for the Midwestern Psychological Association meeting in Chicago in 1940, Skinner hit upon a solution.
He wondered whether the bombers could be attacked with guided missiles dropped from higher altitudes.
At the time, however, the United States had no expertise in guided missiles.
He described his first thoughts: “I saw a flock of birds lifting and wheeling in formation as they flew alongside the train. Suddenly I saw them as devices with excellent vision and extraordinary maneuverability. Could they not guide a missile?”
Training Pigeons Based on Operant Conditioning Principles
Skinner succeeded in obtaining funding for his project from the General Mills Company in Minneapolis.
He started working on an apparatus that would transform the pigeons’ pecking behavior into signals that could control a gliding bomb.
Skinner shaped their pecking behavior so that the pigeons would respond consistently and accurately to a sample visual target.
The visual target in this case was the aerial photograph of a particular street intersection in the city of Stalingrad.
Skinner then presented his plan, which he called the ‘Bird’s Eye Bomb’.
This was based on the principles of Operant Conditioning which could be easily applied to the pigeons.
Pigeons made good subjects as they had sharp vision and were not disturbed by the noise and speed.
Steering Mechanism Used in the Bird’s Eye Bomb
Skinner placed a plastic screen behind a lens in the nose cone of the missile.
As the bomb pointed downward towards the target, the image of the target flashed onto the screen.
This activated the pecking behavior of the pigeons, which had been trained to peck the moment they saw the target image.
The pigeons’ pecking on the image appearing on the screen activated electrical contacts that produced signals to operate the steering controls.
In 1943, Skinner’s wife was pregnant again. At this time, Skinner and Yvonne thought of solving the problems of the nursery by inventing an inexpensive apparatus.
This was a crib for his younger daughter, Deborah, designed to reduce the intensive labor that went into the care of babies.
Skinner thus came out with a new invention, an enclosed and heated crib, and sent an article about it to the popular magazine Ladies’ Home Journal.
The article came out under the title ‘Baby in a Box’. Skinner called the crib the Baby Tender, which he designed after studying the hectic schedule of a young mother.
He studied that demanding schedule to understand which practices were important for the physical and psychological health of the baby.
Challenges that Baby Tender Dealt With
Skinner’s Baby Tender dealt with various problems that young mothers typically faced. These included:
- issues of warmth: temperature control maintained the temperature inside the crib, doing away with the problem of the baby sweating or lying cold and uncovered.
- lack of exercise and growth: the baby’s clothes acted as an obstacle, so Skinner did away with clothing and bedding in the crib, except for the diaper.
- crying and fussing, which could be stopped by lowering the temperature in the crib.
- time and labor, a great deal of which could now be saved on the part of the mother.
- infections and allergies, which could be kept at bay as the crib was covered with glass.
The Baby Tender was objected to on various grounds. These included:
- raising a ‘softie’ instead of a baby prepared to take on the challenges of life.
- critics themselves being hesitant to live in such an enclosed compartment, as they would feel claustrophobic; hence, they would not consider putting their babies in the crib.
- depriving the baby of social life, love, and affection that he/she needs
- encouraging neglect over better care