Skinner’s Box Experiment (Behaviorism Study)


We receive rewards and punishments for many behaviors. More importantly, once we experience that reward or punishment, we are likely to perform (or not perform) that behavior again in anticipation of the result. 

Psychologists in the late 1800s and early 1900s believed that rewards and punishments were crucial to shaping and encouraging voluntary behavior. But they needed a way to test it, and they needed a name for how rewards and punishments shaped voluntary behaviors. Along came Burrhus Frederic Skinner, the creator of Skinner's Box, and the rest is history.


What Is Skinner's Box?

The "Skinner box" is a setup used in animal experiments. An animal is isolated in a box equipped with a lever or other devices. The animal learns that pressing the lever or performing specific behaviors can lead to rewards or punishments.

This setup was crucial in helping behavioral psychologist B.F. Skinner develop his theories on operant conditioning. It also aided in understanding the concept of reinforcement schedules.

Here, "schedules" refer to the timing and frequency of rewards or punishments, which play a key role in shaping behavior. Skinner's research showed how different schedules impact how animals learn and respond to stimuli.

Who is B.F. Skinner?

Burrhus Frederic Skinner, also known as B.F. Skinner, is considered the “father of Operant Conditioning.” His experiments, conducted in what is known as “Skinner’s box,” are some of the most well-known experiments in psychology. They helped shape the ideas of operant conditioning in behaviorism.

Law of Effect (Thorndike vs. Skinner) 

At the time, classical conditioning was the top theory in behaviorism. However, Skinner knew that research showed that voluntary behaviors could be part of the conditioning process. In the late 1800s, a psychologist named Edward Thorndike wrote about “The Law of Effect.” He said, “Responses that produce a satisfying effect in a particular situation become more likely to occur again in that situation, and responses that produce a discomforting effect become less likely to occur again in that situation.”

Thorndike tested out The Law of Effect with a puzzle box of his own. The box contained a lever, and he placed a cat inside the box and a fish outside it. He then recorded how the cats got out of the box and ate the fish.

Thorndike noticed that the cats would explore the box and eventually find the lever. The lever would let them out of the box, leading them to the fish faster. Once they discovered this, the cats were more likely to use the lever when they wanted to get the fish.

Skinner took this idea and ran with it, which is why the box used in these animal experiments is now called "Skinner's box."

Why Do We Call This Box the "Skinner Box"?

Edward Thorndike used a box to train animals to perform behaviors for rewards, and later psychologists like Martin Seligman used a similar apparatus to observe "learned helplessness." So why is this setup called a "Skinner box"? Skinner not only used the box to demonstrate the existence of operant conditioning; he also identified the schedules under which operant conditioning was more or less effective, depending on your goals. That is why he is called the father of operant conditioning.

Skinner's Box Example

How Skinner's Box Worked

Inspired by Thorndike, Skinner created a box to test his theory of Operant Conditioning. (This box is also known as an “operant conditioning chamber.”)

The box was typically very simple. Skinner would place a rat in the box with neutral stimuli (ones that produced neither reinforcement nor punishment) and a lever that would dispense food. As the rats started to explore the box, they would stumble upon the lever, activate it, and get food. Skinner observed that they were likely to engage in this behavior again, anticipating food.

In some boxes, punishments would also be administered. Martin Seligman's learned helplessness experiments are a great example of using punishments to observe or shape an animal's behavior.

Skinner usually worked with animals like rats or pigeons, and he took his research beyond what Thorndike did. He looked at how reinforcements and schedules of reinforcement would influence behavior.

About Reinforcements

Reinforcements are the rewards that satisfy your needs. The fish that cats received outside of Thorndike’s box was positive reinforcement. In Skinner box experiments, pigeons or rats also received food. But positive reinforcements can be anything added after a behavior is performed: money, praise, candy, you name it. Operant conditioning certainly becomes more complicated when it comes to human reinforcements.

Positive vs. Negative Reinforcements 

Skinner also looked at negative reinforcements. Whereas positive reinforcement adds something to the subject's environment, negative reinforcement is a reward in the form of something taken away. In some Skinner box experiments, he would send an electric current through the box that would shock the rats. If the rats pushed the lever, the shocks would stop. The removal of that pain was a negative reinforcement: the rats gained nothing new when the shocks ended; the reward was the removal of the aversive stimulus. Skinner saw that the rats quickly learned to turn off the shocks by pushing the lever.

About Punishments

Skinner's box also allowed for positive and negative punishments, in which an unpleasant stimulus was added (positive punishment) or a desirable one taken away (negative punishment) following "bad behavior." For now, let's focus on the schedules of reinforcement.

Schedules of Reinforcement 

Operant Conditioning Example

We know that not every behavior receives the same reinforcement every single time. Think about earning tips as a rideshare driver or a barista at a coffee shop. You may have a string of customers who tip you generously after conversing with them, so you're likely to converse with your next customer. But what happens if they don't tip you after you have a conversation with them? What happens if you stay silent for one ride and get a big tip?

Psychologists like Skinner wanted to know how quickly someone makes a behavior a habit after receiving reinforcement. In other words, how many trips will it take before you converse with passengers every time? They also wanted to know how fast the behavior would stop once the tips stopped coming. If the rat pulls the lever and doesn't get food, will it stop pulling the lever altogether?

Skinner attempted to answer these questions by looking at different schedules of reinforcement. He would offer positive reinforcement on different schedules, such as every time the behavior was performed (continuous reinforcement) or at random (variable-ratio reinforcement). Based on his experiments, he would measure the following:

  • Response rate (how quickly the behavior was performed)
  • Extinction rate (how quickly the behavior would stop) 
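These two measurements can be pictured with a toy simulation (a sketch, not Skinner's actual procedure; the function name, step size, and trial counts are invented for illustration). A simulated animal's probability of pressing a lever rises after a rewarded press and falls after an unrewarded one:

```python
import random

def simulate(press_prob, reinforced, trials=200, step=0.05, seed=0):
    """Toy lever-press model: pressing becomes more likely after a rewarded
    press and less likely after an unrewarded one."""
    rng = random.Random(seed)
    presses = 0
    for _ in range(trials):
        if rng.random() < press_prob:
            presses += 1
            if reinforced:
                press_prob = min(1.0, press_prob + step)  # reward strengthens
            else:
                press_prob = max(0.0, press_prob - step)  # no reward weakens
    return presses, press_prob

# Response rate: how much pressing happens while every press is rewarded.
presses, p = simulate(press_prob=0.2, reinforced=True)
print(f"acquisition: {presses} presses, final press probability {p:.2f}")

# Extinction rate: how quickly a learned habit dies once rewards stop.
presses, p = simulate(press_prob=1.0, reinforced=False)
print(f"extinction: {presses} presses, final press probability {p:.2f}")
```

In this sketch, the extinction rate shows up as the press probability collapsing toward zero once food stops coming, loosely mirroring what Skinner measured with his recording apparatus.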

He found that there are multiple schedules of reinforcement, and they all yield different results. These schedules explain why your dog may not respond to the treats you sometimes give him, or why gambling can be so addictive.

Continuous Reinforcement

If you reinforce a behavior every single time it is performed, the response rate is slow and the extinction rate is fast. The behavior is performed only for as long as the reinforcement keeps coming; as soon as you stop reinforcing a behavior on this schedule, the behavior stops being performed.

Fixed-Ratio Reinforcement

Let’s say you reinforce the behavior every fourth or fifth time. The response rate is fast, and the extinction rate is medium. The behavior will be performed quickly to reach the reinforcement. 

Fixed-Interval Reinforcement

In the above cases, the reinforcement was given immediately after the behavior was performed. But what if the reinforcement was given at a fixed interval, provided that the behavior was performed at some point? Skinner found that the response rate is medium, and the extinction rate is medium. 

Variable-Ratio Reinforcement

This is the schedule that makes gambling so unpredictable and addictive. In gambling, you experience occasional wins but frequent losses, and you never know when the next big win (and dopamine hit) will come. The behavior is reinforced at random, so your response is quick, but it takes a long time to stop wanting to gamble. This randomness is a key reason why gambling is highly addictive.

Variable-Interval Reinforcement

Lastly, in variable-interval reinforcement, the reward is given out at random intervals, provided that the behavior is performed. Health inspectors and secret shoppers are commonly used examples of variable-interval reinforcement. The reinforcement could come five minutes after the behavior is performed or seven hours after. Skinner found that the response rate for this schedule is fast, and the extinction rate is slow.

Skinner's Box and Pigeon Pilots in World War II

Yes, you read that right. Skinner's work with pigeons and other animals in the Skinner box had real-life effects. After some time training pigeons in his boxes, B.F. Skinner got an idea. Pigeons were easy to train, could see very well in flight, and were calm creatures that didn't panic in intense situations. Their skills could be applied to the war raging around him.

B.F. Skinner decided to create a missile that pigeons would operate. That's right. The U.S. military was having trouble accurately targeting missiles, and B.F. Skinner believed pigeons could help. He believed he could train the pigeons to recognize a target and peck when they saw it. As the pigeons pecked, Skinner's specially designed cockpit would navigate appropriately. Pigeons could be pilots in World War II missions, fighting Nazi Germany.

When Skinner proposed this idea to the military, he was met with skepticism, yet he received $25,000 to start his work on "Project Pigeon." The device worked: operant conditioning trained pigeons to guide missiles toward their targets. Unfortunately, there was one problem. Each mission would kill the pigeons guiding the missile, so it would have required a lot of pigeons. The military eventually passed on the project, but cockpit prototypes are on display at the American History Museum. Pretty cool, huh?

Examples of Operant Conditioning in Everyday Life

Not every example of operant conditioning has to end in dropping missiles. Nor does it have to happen in a box in a laboratory! You might find that you have used operant conditioning on yourself, a pet, or a child whose behavior changes with rewards and punishments. These operant conditioning examples will look into what this process can do for behavior and personality.

Hot Stove: If you put your hand on a hot stove, you will get burned. More importantly, you are very unlikely to put your hand on that hot stove again. Even though no one has made that stove hot as a punishment, the process still works.

Tips: If you converse with a passenger while driving for Uber, you might get an extra tip at the end of your ride. That's certainly a great reward! You will likely keep conversing with passengers as you drive for Uber. The same type of behavior applies to any service worker who gets tips!

Training a Dog: If your dog sits when you say “sit,” you might give him a treat. More importantly, he is then likely to sit the next time you say “sit.” (This is a form of variable-ratio reinforcement, since you likely only treat your dog 50-90% of the time he sits. If you gave a dog a treat every time he sat, he probably wouldn't have room for breakfast or dinner!)

Operant Conditioning Is Everywhere!

We see operant conditioning training us everywhere, intentionally or unintentionally. Game makers and app developers design their products around the "rewards" our brains feel when we see notifications or check into an app. Schoolteachers use rewards to manage unruly classes, and dog training doesn't always look so different from training your child to do chores. We know why this happens thanks to experiments like the ones performed in Skinner's box.



© PracticalPsychology. All rights reserved


Operant Conditioning: What It Is, How It Works, and Examples

Saul Mcleod, PhD

Editor-in-Chief for Simply Psychology

BSc (Hons) Psychology, MRes, PhD, University of Manchester

Saul Mcleod, PhD., is a qualified psychology teacher with over 18 years of experience in further and higher education. He has been published in peer-reviewed journals, including the Journal of Clinical Psychology.


Olivia Guy-Evans, MSc

Associate Editor for Simply Psychology

BSc (Hons) Psychology, MSc Psychology of Education

Olivia Guy-Evans is a writer and associate editor for Simply Psychology. She has previously worked in healthcare and educational sectors.


Operant conditioning, or instrumental conditioning, is a theory of learning where behavior is influenced by its consequences. Behavior that is reinforced (rewarded) will likely be repeated, and behavior that is punished will occur less frequently.

By the 1920s, John B. Watson had left academic psychology, and other behaviorists were becoming influential, proposing new forms of learning other than classical conditioning. Perhaps the most important of these was Burrhus Frederic Skinner, more commonly known (for obvious reasons) as B.F. Skinner.

Skinner’s views were slightly less extreme than Watson’s (1913). Skinner believed that we do have such a thing as a mind, but that it is simply more productive to study observable behavior rather than internal mental events.

Skinner’s work was rooted in the view that classical conditioning was far too simplistic to fully explain complex human behavior. He believed that the best way to understand behavior is to examine its causes and consequences. He called this approach operant conditioning.


How It Works

Skinner is regarded as the father of Operant Conditioning, but his work was based on Thorndike’s (1898) Law of Effect. According to this principle, behavior that is followed by pleasant consequences is likely to be repeated, and behavior followed by unpleasant consequences is less likely to be repeated.

Skinner introduced a new term into the Law of Effect – Reinforcement. Behavior that is reinforced tends to be repeated (i.e., strengthened); behavior that is not reinforced tends to die out or be extinguished (i.e., weakened).

Skinner (1948) studied operant conditioning by conducting experiments using animals, which he placed in a “Skinner box,” which was similar to Thorndike’s puzzle box.

[Figure: labeled diagram of a Skinner box (operant conditioning chamber) used to study animal behavior.]

A Skinner box, also known as an operant conditioning chamber, is a device used to objectively record an animal’s behavior in a compressed time frame. An animal can be rewarded or punished for engaging in certain behaviors, such as lever pressing (for rats) or key pecking (for pigeons).

Skinner identified three types of responses, or operants, that can follow behavior.

  • Neutral operants: Responses from the environment that neither increase nor decrease the probability of a behavior being repeated.
  • Reinforcers: Responses from the environment that increase the probability of a behavior being repeated. Reinforcers can be either positive or negative.
  • Punishers: Responses from the environment that decrease the likelihood of a behavior being repeated. Punishment weakens behavior.
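One way to picture these three consequence types is as update rules on the probability that a behavior is repeated. This is only an illustrative model (the function name and step size are made up here), not Skinner's own formalism:

```python
def update_probability(p, consequence, step=0.1):
    """Adjust the chance that a behavior is repeated, given the type of
    consequence that followed it (illustrative model only)."""
    if consequence == "reinforcer":
        return min(1.0, p + step)  # reinforcement strengthens the behavior
    if consequence == "punisher":
        return max(0.0, p - step)  # punishment weakens the behavior
    return p                       # a neutral operant leaves it unchanged

p = 0.5
p = update_probability(p, "reinforcer")  # strengthened
p = update_probability(p, "neutral")     # unchanged
p = update_probability(p, "punisher")    # weakened back down
print(round(p, 1))  # → 0.5
```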

We can all think of examples of how reinforcers and punishers have affected our behavior. As a child, you probably tried out a number of behaviors and learned from their consequences.

For example, when you were younger, if you tried smoking at school, and the chief consequence was that you got in with the crowd you always wanted to hang out with, you would have been positively reinforced (i.e., rewarded) and would be likely to repeat the behavior.

If, however, the main consequence was that you were caught, caned, suspended from school, and your parents became involved, you would most certainly have been punished, and you would consequently be much less likely to smoke now.

Positive Reinforcement

B. F. Skinner’s theory of operant conditioning describes positive reinforcement. In positive reinforcement, a response or behavior is strengthened by rewards, leading to the repetition of the desired behavior. The reward is a reinforcing stimulus.

Primary reinforcers are stimuli that are naturally reinforcing because they are not learned and directly satisfy a need, such as food or water.

Secondary reinforcers are stimuli that become reinforcing through their association with a primary reinforcer, such as money or school grades. They do not directly satisfy an innate need, but they are the means to obtain things that do, so a secondary reinforcer can be just as powerful a motivator as a primary reinforcer.

Skinner showed how positive reinforcement worked by placing a hungry rat in his Skinner box. The box contained a lever on the side, and as the rat moved about the box, it would accidentally knock the lever. As soon as it did so, a food pellet would drop into a container next to the lever.

After being put in the box a few times, the rats quickly learned to go straight to the lever. The consequence of receiving food if they pressed the lever ensured that they would repeat the action again and again.

Positive reinforcement strengthens a behavior by providing a consequence an individual finds rewarding. For example, if your teacher gives you £5 each time you complete your homework (i.e., a reward), you will be more likely to repeat this behavior in the future, thus strengthening the behavior of completing your homework.

The Premack principle is a form of positive reinforcement in operant conditioning. It suggests using a preferred activity (high-probability behavior) as a reward for completing a less preferred one (low-probability behavior).

This method incentivizes the less desirable behavior by associating it with a desirable outcome, thus strengthening the less favored behavior.


Negative Reinforcement

Negative reinforcement is the termination of an unpleasant state following a response.

This is known as negative reinforcement because it is the removal of an adverse stimulus which is ‘rewarding’ to the animal or person. Negative reinforcement strengthens behavior because it stops or removes an unpleasant experience.

For example, if you do not complete your homework, you give your teacher £5. You will complete your homework to avoid paying £5, thus strengthening the behavior of completing your homework.

Skinner showed how negative reinforcement worked by placing a rat in his Skinner box and then subjecting it to an unpleasant electric current, which caused it some discomfort. As the rat moved about the box, it would accidentally knock the lever.

As soon as it did so, the electric current was switched off. The rats quickly learned to go straight to the lever after being put in the box a few times. The consequence of escaping the electric current ensured that they would repeat the action again and again.

In fact, Skinner even taught the rats to avoid the electric current by turning on a light just before the electric current came on. The rats soon learned to press the lever when the light came on because they knew that this would stop the electric current from being switched on.

These two learned responses are known as Escape Learning and Avoidance Learning.

Punishment

Punishment is the opposite of reinforcement, since it is designed to weaken or eliminate a response rather than increase it. It is an aversive event that decreases the behavior that it follows.

Like reinforcement, punishment can work either by directly applying an unpleasant stimulus like a shock after a response or by removing a potentially rewarding stimulus, for instance, deducting someone’s pocket money to punish undesirable behavior.

Note : It is not always easy to distinguish between punishment and negative reinforcement.

Positive and negative punishment are two distinct methods of decreasing the likelihood of a specific behavior occurring again, but they involve different types of consequences:

Positive Punishment:

  • Positive punishment involves adding an aversive stimulus or something unpleasant immediately following a behavior to decrease the likelihood of that behavior happening in the future.
  • It aims to weaken the target behavior by associating it with an undesirable consequence.
  • Example: A child receives a scolding (an aversive stimulus) from their parent immediately after hitting their sibling. This is intended to decrease the likelihood of the child hitting their sibling again.

Negative Punishment:

  • Negative punishment involves removing a desirable stimulus or something rewarding immediately following a behavior to decrease the likelihood of that behavior happening in the future.
  • It aims to weaken the target behavior by taking away something the individual values or enjoys.
  • Example: A teenager loses their video game privileges (a desirable stimulus) for not completing their chores. This is intended to decrease the likelihood of the teenager neglecting their chores in the future.

There are many problems with using punishment, such as:
  • Punished behavior is not forgotten, it’s suppressed – behavior returns when punishment is no longer present.
  • Causes increased aggression – shows that aggression is a way to cope with problems.
  • Creates fear that can generalize to undesirable behaviors, e.g., fear of school.
  • Does not necessarily guide you toward desired behavior – reinforcement tells you what to do, and punishment only tells you what not to do.

Examples of Operant Conditioning

Positive Reinforcement : Suppose you are a coach and want your team to improve their passing accuracy in soccer. When the players execute accurate passes during training, you praise their technique. This positive feedback encourages them to repeat the correct passing behavior.

Negative Reinforcement : If you notice your team working together effectively and exhibiting excellent team spirit during a tough training session, you might end the training session earlier than planned, which the team perceives as a relief. They understand that teamwork leads to positive outcomes, reinforcing team behavior.

Negative Punishment : If an office worker continually arrives late, their manager might revoke the privilege of flexible working hours. This removal of a positive stimulus encourages the employee to be punctual.

Positive Reinforcement : Training a cat to use a litter box can be achieved by giving it a treat each time it uses it correctly. The cat will associate the behavior with the reward and will likely repeat it.

Negative Punishment : If teenagers stay out past their curfew, their parents might take away their gaming console for a week. This makes the teenager more likely to respect their curfew in the future to avoid losing something they value.

Ineffective Punishment : Your child refuses to finish their vegetables at dinner. You punish them by not allowing dessert, but the child still refuses to eat vegetables next time. The punishment seems ineffective.

Premack Principle Application : You could motivate your child to eat vegetables by offering an activity they love after they finish their meal. For instance, for every vegetable eaten, they get an extra five minutes of video game time. They value video game time, which might encourage them to eat vegetables.

Other Premack Principle Examples :

  • A student who dislikes history but loves art might earn extra time in the art studio for each history chapter reviewed.
  • For every 10 minutes a person spends on household chores, they can spend 5 minutes on a favorite hobby.
  • For each successful day of healthy eating, an individual allows themselves a small piece of dark chocolate at the end of the day.
  • A child can choose between taking out the trash or washing the dishes. Giving them the choice makes them more likely to complete the chore willingly.

Skinner’s Pigeon Experiment

B.F. Skinner conducted several experiments with pigeons to demonstrate the principles of operant conditioning.

One of the most famous of these experiments is often colloquially referred to as “Superstition in the Pigeon.”

This experiment was conducted to explore the effects of non-contingent reinforcement on pigeons, leading to some fascinating observations that can be likened to human superstitions.

Non-contingent reinforcement (NCR) refers to a method in which rewards (or reinforcements) are delivered independently of the individual’s behavior. In other words, the reinforcement is given at set times or intervals, regardless of what the individual is doing.

The Experiment:

  • Pigeons were brought to a state of hunger, reduced to 75% of their well-fed weight.
  • They were placed in a cage with a food hopper that could be presented for five seconds at a time.
  • Instead of the food being given as a result of any specific action by the pigeon, it was presented at regular intervals, regardless of the pigeon’s behavior.
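The key feature of this procedure is that food arrives on the clock, not in response to anything the bird does. A toy simulation (behavior names and timings invented for illustration) makes the coincidence mechanism clear:

```python
import random

def superstition_trial(actions, interval=15, seconds=300, seed=1):
    """Fixed-time food delivery: every `interval` seconds food appears,
    regardless of behavior, and we record what the bird happened to be
    doing at that moment."""
    rng = random.Random(seed)
    paired = []
    for t in range(1, seconds + 1):
        current = rng.choice(actions)  # the bird is always doing *something*
        if t % interval == 0:          # food arrives on schedule, not on merit
            paired.append(current)
    return paired

acts = ["turn counter-clockwise", "peck", "head-thrust", "hop"]
coincidences = superstition_trial(acts)
print(len(coincidences), "deliveries; behaviors accidentally paired with food:",
      sorted(set(coincidences)))
```

Whichever action happens to be over-represented in `coincidences` is the one the bird would come to repeat, even though it never caused the food.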

Observation:

  • Over time, Skinner observed that the pigeons began to associate whatever random action they were doing when food was delivered with the delivery of the food itself.
  • This led the pigeons to repeat these actions, believing (in anthropomorphic terms) that their behavior was causing the food to appear.
  • In most cases, pigeons developed different “superstitious” behaviors or rituals. For instance, one pigeon would turn counter-clockwise between food presentations, while another would thrust its head into a cage corner.
  • These behaviors did not appear until the food hopper was introduced and presented periodically.
  • These behaviors were not initially related to the food delivery but became linked in the pigeon’s mind due to the coincidental timing of the food dispensing.
  • The behaviors seemed to be associated with the environment, suggesting the pigeons were responding to certain aspects of their surroundings.
  • The rate of reinforcement (how often the food was presented) played a significant role. Shorter intervals between food presentations led to more rapid and defined conditioning.
  • Once a behavior was established, the interval between reinforcements could be increased without diminishing the behavior.

Superstitious Behavior:

The pigeons began to act as if their behaviors had a direct effect on the presentation of food, even though there was no such connection. This is likened to human superstitions, where rituals are believed to change outcomes, even if they have no real effect.

For example, a card player might have rituals to change their luck, or a bowler might make gestures believing they can influence a ball already in motion.

Conclusion:

This experiment demonstrates that behaviors can be conditioned even without a direct cause-and-effect relationship. Just like humans, pigeons can develop “superstitious” behaviors based on coincidental occurrences.

This study not only illuminates the intricacies of operant conditioning but also draws parallels between animal and human behaviors in the face of random reinforcements.

Schedules of Reinforcement

Imagine a rat in a “Skinner box.” In operant conditioning, if no food pellet is delivered immediately after the lever is pressed, then after several attempts, the rat stops pressing the lever (how long would someone continue to go to work if their employer stopped paying them?). The behavior has been extinguished.

Behaviorists discovered that different patterns (or schedules) of reinforcement had different effects on the speed of learning and extinction. Ferster and Skinner (1957) devised different ways of delivering reinforcement and found that this had effects on:

1. The Response Rate – The rate at which the rat pressed the lever (i.e., how hard the rat worked).

2. The Extinction Rate – The rate at which lever pressing dies out (i.e., how soon the rat gave up).

How Reinforcement Schedules Work

Skinner found that variable-ratio reinforcement produces the slowest rate of extinction (i.e., people will continue repeating the behavior for the longest time without reinforcement). The type of reinforcement with the quickest rate of extinction is continuous reinforcement.

(A) Continuous Reinforcement

An animal or human is positively reinforced every time a specific behavior occurs, e.g., every time a lever is pressed, a pellet is delivered, and then food delivery is shut off.

  • Response rate is SLOW
  • Extinction rate is FAST

(B) Fixed Ratio Reinforcement

Behavior is reinforced only after it occurs a specified number of times, e.g., one reinforcement is given after every 5th response. For example, a child receives a star for every five words spelled correctly.

  • Response rate is FAST
  • Extinction rate is MEDIUM

(C) Fixed Interval Reinforcement

One reinforcement is given after a fixed time interval, providing at least one correct response has been made. An example is being paid by the hour. Another example would be every 15 minutes (half hour, hour, etc.) a pellet is delivered (providing at least one lever press has been made), and then food delivery is shut off.

  • Response rate is MEDIUM
  • Extinction rate is MEDIUM

(D) Variable Ratio Reinforcement

Behavior is reinforced after an unpredictable number of times, e.g., gambling or fishing.

  • Response rate is FAST
  • Extinction rate is SLOW (very hard to extinguish because of unpredictability)

(E) Variable Interval Reinforcement

Providing one correct response has been made, reinforcement is given after an unpredictable amount of time has passed, e.g., on average every 5 minutes. An example is a self-employed person being paid at unpredictable times.

  • Response rate is FAST
  • Extinction rate is SLOW
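The five schedules differ only in the rule that decides whether a given response earns reinforcement. A minimal sketch of those rules (the parameter values are illustrative, not taken from Ferster and Skinner):

```python
import random

rng = random.Random(42)

# Each rule answers: "does this response, made at time t, earn reinforcement?"
# Ratio schedules count responses; interval schedules watch the clock.

def continuous():
    return lambda n, t: True                   # every response is reinforced

def fixed_ratio(k=5):
    return lambda n, t: n % k == 0             # every k-th response

def variable_ratio(k=5):
    return lambda n, t: rng.random() < 1 / k   # every k-th response on average

def fixed_interval(period=15):
    last = [0]
    def rule(n, t):
        if t - last[0] >= period:              # first response after the interval
            last[0] = t
            return True
        return False
    return rule

def variable_interval(mean=15):
    due = [rng.expovariate(1 / mean)]
    def rule(n, t):
        if t >= due[0]:                        # first response after a random wait
            due[0] = t + rng.expovariate(1 / mean)
            return True
        return False
    return rule

# One response per second for a minute under FR-5 and FI-15:
fr = fixed_ratio(5)
fi = fixed_interval(15)
print("FR-5 rewards:", sum(fr(n, n) for n in range(1, 61)))   # → 12
print("FI-15 rewards:", sum(fi(n, n) for n in range(1, 61)))  # → 4
```

Note how the two variable schedules are unpredictable from the subject's point of view, which is exactly what makes the behavior so slow to extinguish.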

Applications In Psychology

1. Behavior Modification Therapy

Behavior modification is a set of therapeutic techniques based on operant conditioning (Skinner, 1938, 1953). The main principle comprises changing environmental events that are related to a person’s behavior. For example, the reinforcement of desired behaviors and ignoring or punishing undesired ones.

This is not as simple as it sounds — always reinforcing desired behavior, for example, is basically bribery.

There are different types of positive reinforcement. Primary reinforcement is when a reward strengthens a behavior by itself. Secondary reinforcement is when something strengthens a behavior because it leads to a primary reinforcer.

Examples of behavior modification therapy include token economy and behavior shaping.

Token Economy

Token economy is a system in which targeted behaviors are reinforced with tokens (secondary reinforcers) and later exchanged for rewards (primary reinforcers).

Tokens can be in the form of fake money, buttons, poker chips, stickers, etc., while the rewards can range anywhere from snacks to privileges or activities. For example, teachers use a token economy at primary school, giving young children stickers to reward good behavior.

Token economy has been found to be very effective in managing psychiatric patients. However, patients can become over-reliant on the tokens, making it difficult for them to adjust to society once they leave the institution (prison, hospital, etc.).

Staff implementing a token economy program have a lot of power. It is important that staff do not favor or ignore certain individuals if the program is to work. Therefore, staff need to be trained to give tokens fairly and consistently even when there are shift changes such as in prisons or in a psychiatric hospital.
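The mechanics of a token economy reduce to a small ledger: pre-agreed target behaviors earn tokens, and tokens are later exchanged for backup rewards. A sketch under invented assumptions (the behaviors and the exchange menu are hypothetical):

```python
class TokenEconomy:
    """Tokens (secondary reinforcers) earned for target behaviors,
    exchanged later for backup rewards (primary reinforcers)."""

    def __init__(self, target_behaviors, menu):
        self.targets = set(target_behaviors)
        self.menu = menu          # reward name -> token price
        self.tokens = 0

    def record(self, behavior):
        # Only pre-agreed target behaviors earn a token; everything
        # else is ignored (not punished), per the therapy design.
        if behavior in self.targets:
            self.tokens += 1

    def exchange(self, reward):
        price = self.menu[reward]
        if self.tokens >= price:
            self.tokens -= price
            return reward
        return None

# Hypothetical classroom example: stickers traded for extra play time.
economy = TokenEconomy({"tidy desk", "homework done"}, {"extra play": 3})
for b in ["tidy desk", "shouting", "homework done", "tidy desk"]:
    economy.record(b)
print(economy.exchange("extra play"))  # "extra play" (3 tokens spent)
```

The point of the explicit rules is consistency: because `record` applies the same criterion every time, the "staff" of this toy system cannot favor or ignore individuals, which is exactly the fairness problem noted above.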

Behavior Shaping

A further important contribution made by Skinner (1951) is the notion of behavior shaping through successive approximation.

Skinner argues that the principles of operant conditioning can be used to produce extremely complex behavior if rewards and punishments are delivered in such a way as to move an organism closer and closer to the desired behavior each time.

In shaping, the form of an existing response is gradually changed across successive trials toward a desired target behavior by rewarding successively closer approximations of it.

To do this, the conditions (or contingencies) required to receive the reward should shift each time the organism moves a step closer to the desired behavior.

According to Skinner, most animal and human behavior (including language) can be explained as a product of this type of successive approximation.
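Shaping amounts to a moving criterion: reinforce anything that meets the current requirement, then tighten the requirement one step toward the target. A toy sketch, with the one-dimensional "behavior" score and all the numbers invented for illustration:

```python
import random

def shape(target=1.0, step=0.1, trials=500, seed=0):
    """Reinforce responses meeting a criterion that creeps toward the
    target each time it is met (successive approximation)."""
    rng = random.Random(seed)
    criterion = step               # start with an easy requirement
    tendency = 0.0                 # the organism's current typical response
    for _ in range(trials):
        response = tendency + rng.gauss(0, 0.1)   # behavior varies a bit
        if response >= criterion:                 # close enough: reinforce
            tendency = max(tendency, response)    # reinforced form persists
            criterion = min(target, criterion + step)  # then ask for more
    return tendency

print(round(shape(), 2))  # the tendency ends near the 1.0 target
```

Note the key design choice from the text: the contingency (`criterion`) shifts only after each success, so the organism is never asked for the full target behavior outright, only for a small improvement on what it already does.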

2. Educational Applications

In the conventional learning situation, operant conditioning applies largely to issues of class and student management, rather than to learning content. It is very relevant to shaping skill performance.

A simple way to shape behavior is to provide feedback on learner performance, e.g., compliments, approval, encouragement, and affirmation.

A variable-ratio schedule produces the highest response rate for students learning a new task: initial reinforcement (e.g., praise) occurs at frequent intervals, and as performance improves reinforcement occurs less frequently, until eventually only exceptional outcomes are reinforced.

For example, if a teacher wanted to encourage students to answer questions in class they should praise them for every attempt (regardless of whether their answer is correct). Gradually the teacher will only praise the students when their answer is correct, and over time only exceptional answers will be praised.

Unwanted behaviors, such as tardiness and dominating class discussion, can be extinguished by being ignored by the teacher (rather than reinforced by having attention drawn to them). This is not an easy task, as the teacher may appear insincere if he/she thinks too much about the way to behave.

Knowledge of success is also important as it motivates future learning. However, it is important to vary the type of reinforcement given so that the behavior is maintained.


Operant Conditioning vs. Classical Conditioning

Learning Type

While both types of conditioning involve learning, classical conditioning is passive (automatic response to stimuli), while operant conditioning is active (behavior is influenced by consequences).

  • Classical conditioning links an involuntary response with a stimulus. It happens passively on the part of the learner, without rewards or punishments. An example is a dog salivating at the sound of a bell associated with food.
  • Operant conditioning connects voluntary behavior with a consequence. Operant conditioning requires the learner to actively participate and perform some type of action to be rewarded or punished. It’s active, with the learner’s behavior influenced by rewards or punishments. An example is a dog sitting on command to get a treat.

Learning Process

Classical conditioning involves learning through associating stimuli resulting in involuntary responses, while operant conditioning focuses on learning through consequences, shaping voluntary behaviors.

In classical conditioning, a neutral stimulus is repeatedly paired with an unconditioned stimulus. Over time, the person responds to the neutral stimulus as if it were the unconditioned stimulus, even when the neutral stimulus is presented alone. The response is involuntary and automatic.

An example is a dog salivating (response) at the sound of a bell (neutral stimulus) after it has been repeatedly paired with food (unconditioned stimulus).

In operant conditioning, behavior followed by pleasant consequences (rewards) is more likely to be repeated, while behavior followed by unpleasant consequences (punishments) is less likely to be repeated.

For instance, if a child gets praised (pleasant consequence) for cleaning their room (behavior), they’re more likely to clean their room in the future.

Conversely, if they get scolded (unpleasant consequence) for not doing their homework, they’re more likely to complete it next time to avoid the scolding.

Timing of Stimulus & Response

The timing of the response relative to the stimulus differs between classical and operant conditioning:

Classical Conditioning (response after the stimulus): In this form of conditioning, the response occurs after the stimulus. The behavior (response) is determined by what precedes it (stimulus).

For example, in Pavlov’s classic experiment, the dogs started to salivate (response) after they heard the bell (stimulus) because they associated it with food.

Operant Conditioning (behavior before the consequence): Here the behavior comes first, and the anticipated consequence influences it. This is a more active form of learning, where behaviors are reinforced or punished, thus influencing their likelihood of repetition.

For example, a child might behave well (behavior) in anticipation of a reward (consequence), or avoid a certain behavior to prevent a potential punishment.

Looking at Skinner’s classic studies on pigeons’ and rats’ behavior, we can identify some of the major assumptions of the behaviorist approach.

  • Psychology should be seen as a science, to be studied in a scientific manner. Skinner’s study of behavior in rats was conducted under carefully controlled laboratory conditions.
  • Behaviorism is primarily concerned with observable behavior, as opposed to internal events like thinking and emotion. Note that Skinner did not say that the rats learned to press a lever because they wanted food. He instead concentrated on describing the easily observed behavior that the rats acquired.
  • The major influence on human behavior is learning from our environment. In the Skinner study, because food followed a particular behavior, the rats learned to repeat that behavior, e.g., operant conditioning.
  • There is little difference between the learning that takes place in humans and that in other animals. Therefore research (e.g., operant conditioning) can be carried out on animals (rats/pigeons) as well as on humans. Skinner proposed that the way humans learn behavior is much the same as the way the rats learned to press a lever.

So, if your layperson’s idea of psychology has always been of people in laboratories wearing white coats and watching hapless rats try to negotiate mazes to get to their dinner, then you are probably thinking of behavioral psychology.

Behaviorism and its offshoots tend to be among the most scientific of the psychological perspectives . The emphasis of behavioral psychology is on how we learn to behave in certain ways.

We are all constantly learning new behaviors and how to modify our existing behavior. Behavioral psychology is the psychological approach that focuses on how this learning takes place.

Critical Evaluation

Operant conditioning can explain a wide variety of behaviors, from the learning process to addiction and language acquisition. It also has practical applications (such as token economy) that can be used in classrooms, prisons, and psychiatric hospitals.

Researchers have found innovative ways to apply operant conditioning principles to promote health and habit change in humans.

In a recent study, operant conditioning using virtual reality (VR) helped stroke patients use their weakened limb more often during rehabilitation. Patients shifted their weight in VR games by maneuvering a virtual object. When they increased weight on their weakened side, they received rewards like stars. This positive reinforcement conditioned greater paretic limb use (Kumar et al., 2019).

Another study utilized operant conditioning to assist smoking cessation. Participants earned vouchers exchangeable for goods and services for reducing smoking. This reward system reinforced decreasing cigarette use. Many participants achieved long-term abstinence (Dallery et al., 2017).

Through repeated reinforcement, operant conditioning can facilitate forming exercise and eating habits. A person trying to exercise more might earn TV time for every 10 minutes spent working out. An individual aiming to eat healthier may allow themselves a daily dark chocolate square for sticking to nutritious meals. Providing consistent rewards for desired actions can instill new habits (Michie et al., 2009).

Apps like Habitica apply operant conditioning by gamifying habit tracking. Users earn points and collect rewards in a fantasy game for completing real-life habits. This virtual reinforcement helps ingrain positive behaviors (Eckerstorfer et al., 2019).

Operant conditioning also shows promise for managing ADHD and OCD. Rewarding concentration and focus in ADHD children, for example, can strengthen their attention skills (Rosén et al., 2018). Similarly, reinforcing OCD patients for resisting compulsions may diminish obsessive behaviors (Twohig et al., 2018).

However, operant conditioning fails to take into account the role of inherited and cognitive factors in learning, and thus is an incomplete explanation of the learning process in humans and animals.

For example, Kohler (1924) found that primates often seem to solve problems in a flash of insight rather than by trial-and-error learning. Also, social learning theory (Bandura, 1977) suggests that humans can learn through observation rather than through personal experience.

The use of animal research in operant conditioning studies also raises the issue of extrapolation. Some psychologists argue we cannot generalize from studies on animals to humans as their anatomy and physiology are different from humans, and they cannot think about their experiences and invoke reason, patience, memory or self-comfort.

Frequently Asked Questions

Who discovered operant conditioning?

Operant conditioning was discovered by B.F. Skinner, an American psychologist, in the mid-20th century. Skinner is often regarded as the father of operant conditioning, and his work extensively dealt with the mechanism of reward and punishment for behaviors, with the concept being that behaviors followed by positive outcomes are reinforced, while those followed by negative outcomes are discouraged.

How does operant conditioning differ from classical conditioning?

Operant conditioning differs from classical conditioning by focusing on how voluntary behavior is shaped and maintained by consequences, such as rewards and punishments.

In operant conditioning, a behavior is strengthened or weakened based on the consequences that follow it. In contrast, classical conditioning involves pairing a neutral stimulus with one that naturally elicits a response, so the neutral stimulus comes to evoke a learned response on its own.

While both types of conditioning involve learning and behavior modification, operant conditioning emphasizes the role of reinforcement and punishment in shaping voluntary behavior.

How does operant conditioning relate to social learning theory?

Operant conditioning is a core component of social learning theory , which emphasizes the importance of observational learning and modeling in acquiring and modifying behavior.

Social learning theory suggests that individuals can learn new behaviors by observing others and the consequences of their actions, which is similar to the reinforcement and punishment processes in operant conditioning.

By observing and imitating models, individuals can acquire new skills and behaviors and modify their own behavior based on the outcomes they observe in others.

Overall, both operant conditioning and social learning theory highlight the importance of environmental factors in shaping behavior and learning.

What are the downsides of operant conditioning?

The downsides of using operant conditioning on individuals include the potential for unintended negative consequences, particularly with the use of punishment. Punishment may lead to increased aggression or avoidance behaviors.

Additionally, some behaviors may be difficult to shape or modify using operant conditioning techniques, particularly when they are highly ingrained or tied to complex internal states.

Furthermore, individuals may resist changing their behaviors to meet the expectations of others, particularly if they perceive the demands or consequences of the reinforcement or punishment to be undesirable or unjust.

What is an application of B.F. Skinner’s operant conditioning theory?

An application of B.F. Skinner’s operant conditioning theory is seen in education and classroom management. Teachers use positive reinforcement (rewards) to encourage good behavior and academic achievement, and negative reinforcement or punishment to discourage disruptive behavior.

For example, a student may earn extra recess time (positive reinforcement) for completing homework on time, or lose the privilege to use class computers (negative punishment) for misbehavior.

Further Reading

  • Ayllon, T., & Michael, J. (1959). The psychiatric nurse as a behavioral engineer. Journal of the Experimental Analysis of Behavior, 2(4), 323-334.
  • Bandura, A. (1977). Social learning theory. Englewood Cliffs, NJ: Prentice Hall.
  • Dallery, J., Meredith, S., & Glenn, I. M. (2017). A deposit contract method to deliver abstinence reinforcement for cigarette smoking. Journal of Applied Behavior Analysis, 50(2), 234-248.
  • Eckerstorfer, L., Tanzer, N. K., Vogrincic-Haselbacher, C., Kedia, G., Brohmer, H., Dinslaken, I., & Corbasson, R. (2019). Key elements of mHealth interventions to successfully increase physical activity: Meta-regression. JMIR mHealth and uHealth, 7(11), e12100.
  • Ferster, C. B., & Skinner, B. F. (1957). Schedules of reinforcement. New York: Appleton-Century-Crofts.
  • Kohler, W. (1924). The mentality of apes. London: Routledge & Kegan Paul.
  • Kumar, D., Sinha, N., Dutta, A., & Lahiri, U. (2019). Virtual reality-based balance training system augmented with operant conditioning paradigm. Biomedical Engineering Online, 18(1), 1-23.
  • Michie, S., Abraham, C., Whittington, C., McAteer, J., & Gupta, S. (2009). Effective techniques in healthy eating and physical activity interventions: A meta-regression. Health Psychology, 28(6), 690-701.
  • Rosén, E., Westerlund, J., Rolseth, V., Johnson, R. M., Viken Fusen, A., Årmann, E., Ommundsen, R., Lunde, L.-K., Ulleberg, P., Daae Zachrisson, H., & Jahnsen, H. (2018). Effects of QbTest-guided ADHD treatment: A randomized controlled trial. European Child & Adolescent Psychiatry, 27(4), 447-459.
  • Schunk, D. (2016). Learning theories: An educational perspective. Pearson.
  • Skinner, B. F. (1938). The behavior of organisms: An experimental analysis. New York: Appleton-Century.
  • Skinner, B. F. (1948). ‘Superstition’ in the pigeon. Journal of Experimental Psychology, 38(2), 168-172.
  • Skinner, B. F. (1951). How to teach animals. Freeman.
  • Skinner, B. F. (1953). Science and human behavior. Macmillan.
  • Thorndike, E. L. (1898). Animal intelligence: An experimental study of the associative processes in animals. Psychological Monographs: General and Applied, 2(4), i-109.
  • Twohig, M. P., Whittal, M. L., Cox, J. M., & Gunter, R. (2010). An initial investigation into the processes of change in ACT, CT, and ERP for OCD. International Journal of Behavioral Consultation and Therapy, 6(2), 67-83.
  • Watson, J. B. (1913). Psychology as the behaviorist views it. Psychological Review, 20, 158-177.


B. F. Skinner Foundation: Project Pigeon (Orcon)

B. F. Skinner wrote about his work during World War Two and about the Orcon Project. Download the article here: Pigeons in a Pelican.pdf.


CogniFit Blog: Brain Health News


Operant Conditioning – 4 Interesting Experiments by B.F. Skinner


Operant conditioning might sound like something out of a dystopian novel. But it’s not. It’s a very real thing that was forged by a brilliant, yet quirky, psychologist. Today, we will take a quick look at his work as well as a few odd experiments that went with it…

There are few names in psychology more well-known than B. F. Skinner. First-year psychology students scribble endless lecture notes on him. Doctoral candidates cite his work in their dissertations as they test whether a rat’s behavior can be used to predict behavior in humans.

Skinner is one of the most well-known psychologists of our time, famous for his experiments on operant conditioning. But how did he become such a central figure in these Intro to Psych courses? And how did he develop the theories and methodologies cited by those sleep-deprived Ph.D. students?

THE FATHER OF OPERANT CONDITIONING

Skinner spent his life studying the way we behave and act. But, more importantly, how this behavior can be modified.

He viewed Ivan Pavlov’s classical model of behavioral conditioning as being “too simplistic a solution” to fully explain the complexities of human (and animal) behavior and learning. It was because of this, that Skinner started to look for a better way to explain why we do things.

His early work was based on Edward Thorndike’s 1898 Law of Effect. Skinner went on to expand on the idea that most of our behavior is directly related to the consequences of said behavior. His expanded model of behavioral learning would be called operant conditioning. This centered around two things…

  • The concepts of behaviors – the actions an organism or test subject exhibits
  • The operants – the environmental response/consequences directly following the behavior

But, it’s important to note that the term “consequences” can be misleading. This is because there doesn’t need to be a causal relationship between the behavior and the operant. Skinner broke these responses down into three parts.

1. REINFORCERS – These increase the frequency of the behavior, either by presenting a desirable stimulus (positive reinforcement) or by removing an unpleasant one (negative reinforcement).

2. PUNISHERS – These are environmental responses that present an undesirable stimulus and serve to reduce the frequency of the behavior.

3. NEUTRAL OPERANTS – As the name suggests, these present stimuli that neither increase nor decrease the tested behavior.

Throughout his long and storied career, Skinner performed a number of strange experiments trying to test the limits of how punishment and reinforcement affect behavior.

4 INTERESTING OPERANT EXPERIMENTS

Though Skinner was a professional through and through, he was also quite a quirky person. And, his unique ways of thinking are very clear in the strange and interesting experiments he performed while researching the properties of operant conditioning.

Experiment #1: The Operant Conditioning Chamber

The Operant Conditioning Chamber, better known as the Skinner Box , is a device that B.F. Skinner used in many of his experiments. At its most basic, the Skinner Box is a chamber where a test subject, such as a rat or a pigeon, must ‘learn’ the desired behavior through trial and error.

B.F. Skinner used this device for several different experiments. One such experiment involves placing a hungry rat into a chamber with a lever and a slot where food is dispensed when the lever is pressed. Another variation involves placing a rat into an enclosure that is wired with a slight electric current on the floor. When the current is turned on, the rat must turn a wheel in order to turn off the current.  

Though this is the most basic experiment in operant conditioning research, there is an infinite number of variations that can be created based on this simple idea.
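The trial-and-error learning in the chamber can be sketched as a simple rich-get-richer loop, assuming a made-up subject that samples actions at random and repeats whichever has paid off:

```python
import random

def skinner_box(trials=200, seed=1):
    """A 'rat' samples actions; only pressing the lever yields food.
    Rewarded actions become proportionally more likely (law of effect)."""
    rng = random.Random(seed)
    weights = {"groom": 1.0, "sniff": 1.0, "press_lever": 1.0}
    for _ in range(trials):
        actions = list(weights)
        action = rng.choices(actions, [weights[a] for a in actions])[0]
        if action == "press_lever":   # the box dispenses a food pellet
            weights[action] += 1.0    # reinforcement strengthens the act
    total = sum(weights.values())
    return weights["press_lever"] / total

print(skinner_box() > 0.5)  # lever-pressing comes to dominate -> True
```

The action names and the +1 weight update are invented for illustration; the point is only that a behavior followed by food crowds out the alternatives, which is what the chamber makes observable.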

Experiment #2: A Pigeon That Can Read

Building on the basic ideas from his work with the Operant Conditioning Chamber, B. F. Skinner eventually began designing more and more complex experiments.

One of these experiments involved teaching a pigeon to read words presented to it in order to receive food. Skinner began by teaching the pigeon a simple task, namely, pecking a colored disk, in order to receive a reward. He then began adding additional environmental cues (in this case, they were words), which were paired with a specific behavior that was required in order to receive the reward.

Through this evolving process, Skinner was able to teach the pigeon to ‘read’ and respond to several unique commands.

Though the pigeon can’t actually read English, Skinner was able to teach a bird multiple behaviors, each one linked to a specific stimulus, using operant conditioning. This shows that this form of behavioral learning can be a powerful tool for teaching both animals and humans complex behaviors based on environmental cues.

Experiment #3: Pigeon Ping-Pong

But Skinner wasn’t only concerned with teaching pigeons how to read. It seems he also made sure they had time to play games as well. In one of his more whimsical experiments , B. F. Skinner taught a pair of common pigeons how to play a simplified version of table tennis.

The pigeons in this experiment were placed on either side of a box and were taught to peck the ball to the other bird’s side. If a pigeon was able to peck the ball across the table and past their opponent, they were rewarded with a small amount of food. This reward served to reinforce the behavior of pecking the ball past their opponent.

Though this may seem like a silly task to teach a bird, the ping-pong experiment shows that operant conditioning can be used not only for a specific, robot-like action but also to teach dynamic, goal-based behaviors.

Experiment #4: Pigeon-Guided Missiles

Thought pigeons playing ping-pong was as strange as things could get? Skinner pushed the envelope even further with his work on pigeon-guided missiles.

While this may sound like the crazy experiment of a deluded mad scientist, B. F. Skinner did actually work on training pigeons to control the flight paths of missiles for the U.S. Army during the Second World War.

Skinner began by training the pigeons to peck at shapes on a screen. Once the pigeons reliably tracked these shapes, Skinner was able to use sensors to track whether the pigeon’s beak was in the center of the screen, to one side or the other, or towards the top or bottom of the screen. Based on the relative location of the pigeon’s beak, the tracking system could direct the missile towards the target location.
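The guidance scheme described here is essentially a feedback loop: the beak’s offset from screen center is the error signal, and the fins deflect to cancel it. A toy sketch; the geometry, units, and gain are all invented for illustration:

```python
def steering_correction(peck_x, peck_y, gain=0.5):
    """Map the pecked screen position (offsets from center, in arbitrary
    units) to tail-fin deflections that steer back toward the target."""
    # Peck right of center -> target is right of course -> yaw right, etc.
    return gain * peck_x, gain * peck_y

# Closed loop: the offset shrinks each step as the missile re-centers.
offset_x, offset_y = 8.0, -4.0
for _ in range(10):
    yaw, pitch = steering_correction(offset_x, offset_y)
    offset_x -= yaw      # turning toward the target reduces the offset
    offset_y -= pitch
print(round(offset_x, 3), round(offset_y, 3))  # both shrink toward 0
```

With a gain between 0 and 1 the error halves (here) on each step, so the pigeon only needs to keep pecking at the target for the loop to converge; the bird supplies the sensor, not the control law.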

Though the system was never used in the field due in part to advances in other scientific areas, it highlights the unique applications that can be created using operant training for animal behaviors.

THE CONTINUED IMPACT OF OPERANT CONDITIONING

B. F. Skinner is one of the most recognizable names in modern psychology, and with good reason. Though many of his experiments seem outlandish, the science behind them continues to impact us in ways we rarely think about.

The most prominent example is in the way we train animals for tasks such as search and rescue, companion services for the blind and disabled, and even how we train our furry friends at home—but the benefits of his research go far beyond teaching Fido how to roll over.

Operant conditioning research has found its way into the way schools motivate and discipline students, how prisons rehabilitate inmates, and even in how governments handle geopolitical relationships .


Indiana Public Media | WFIU - NPR | WTIU - PBS


B.F. Skinner Created Superstitious Pigeons!

By William Orem. Posted March 31, 2004.


Okay, okay--you yourself aren't superstitious; you have a cool head and always think rationally. It's all those other dummies. Would you be surprised to learn, though, that human beings aren't the only animals to behave superstitiously? The psychologist B.F. Skinner showed that you can even form a superstitious belief...in a pigeon.

Yes, It's True...

Skinner describes the process in his paper "'Superstition' in the Pigeon," printed in the Journal of Experimental Psychology back in 1948. He was interested in how animals respond to positive reinforcement--that is, getting something good when you behave a certain way.

Peck at the bar and food appears; soon you learn to peck for food. What happens though, if the food comes totally at random?

Pecking Pigeons

Imagine the pigeons at some average part of their day--pecking, scratching, looking for food. Every now and again, boom! A tasty pigeon snack appears. Hey, the pigeon thinks, that's great. Was it caused by the fact that I was foraging around over here?

If the pigeon tries foraging over there more often now, it's more likely to be doing just that the next time reinforcement occurs, even though the reinforcement itself is random. Gradually the particular type of foraging becomes associated with a reward.
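This adventitious reinforcement is easy to simulate: deliver food on a fixed timer regardless of behavior, and credit whatever the bird happened to be doing. A minimal sketch with invented behavior names:

```python
import random

def superstitious_pigeon(steps=1000, feed_every=20, seed=2):
    """Food arrives on a fixed timer, unrelated to behavior; whatever act
    coincides with food is strengthened, so one arbitrary act comes to
    dominate -- Skinner's adventitious ('superstitious') reinforcement."""
    rng = random.Random(seed)
    weights = {"turn_left": 1.0, "bob_head": 1.0, "flap": 1.0, "peck_floor": 1.0}
    for t in range(1, steps + 1):
        acts = list(weights)
        act = rng.choices(acts, [weights[a] for a in acts])[0]
        if t % feed_every == 0:       # pure timer: random w.r.t. behavior
            weights[act] += 1.0       # the coinciding act gets the 'credit'
    return max(weights, key=weights.get)

print(superstitious_pigeon())  # some arbitrary behavior ends up dominant
```

Which behavior wins depends only on chance coincidences early in the run, which is exactly Skinner's point: the reinforcement is real, but the causal link the animal acts on is not.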

Random Reinforcement

This random reinforcement, Skinner argued, is the animal version of superstition. Modern researchers think superstitious behaviors aren't totally random themselves, but Skinner's insight is still a powerful one.

If I wear this lucky shirt, you say, I'll have a good day! The more you wear that lucky shirt, the more likely it is you'll have it on when something good happens. You're on your way to a full-blown superstition.


About A Moment of Science

A Moment of Science is a daily audio podcast, public radio program, and video series providing the scientific story behind some of life's most perplexing mysteries.


Pigeons, Operant Conditioning, and Social Control

Audrey Watters

This is the transcript of the talk I gave at the Tech4Good event I'm at this weekend in Albuquerque, New Mexico. The complete slide deck is here .


I want to talk a little bit about a problem I see – or rather, a problem I see in the “solutions” that some scientists and technologists and engineers seem to gravitate towards. So I want to talk to you about pigeons, operant conditioning, and social control, which I recognize is a bit of a strange and academic title. I toyed with some others:


I spent last week at the Harvard University archives, going through the papers of Professor B. F. Skinner, arguably one of the most important psychologists of the twentieth century. (The other, of course, being Sigmund Freud.)


I don’t know how familiar this group is with Skinner – he’s certainly a name that those working in educational psychology have heard of. I’d make a joke here about software engineers having no background in the humanities or social sciences but I hear Mark Zuckerberg was actually a psych major at Harvard. (So that’s the joke.)

I actually want to make the case this morning that Skinner’s work – behavioral psychology in particular – has had profound influence on the development of computer science, particularly when it comes to the ways in which “programming” has become a kind of social engineering. I’m not sure this lineage is always explicitly considered – like I said, there’s that limited background in or appreciation for history thing your field seems to have got going on.

B. F. Skinner was a behaviorist. Indeed, almost all the American psychologists in the early twentieth century were. Unlike Freud, who was concerned with the subconscious mind, behaviorists like Skinner were interested in – well, as the name suggests – behaviors. Observable behaviors. Behaviors that could be conditioned or controlled.


Skinner’s early work was with animals. As a graduate student at Harvard, he devised the operant conditioning chamber – better known as the Skinner box – that was used to study animal behavior. The chamber provided some sort of response mechanism that the animal would be trained to use, typically by rewarding the animal with food.


During World War II, Skinner worked on a program called Project Pigeon – also known as Project Orcon, short for Organic Control – an experimental project to create pigeon-guided missiles.

The pigeons were trained by Skinner to peck at a target, and they were rewarded with food when they completed the task correctly. Skinner designed a missile that carried pigeons which could see the target through the windows. The pigeons would peck at the target; a metal conductor connected to each bird’s beak transmitted the force of the pecking to the missile’s guidance system, which moved the tail fins to keep the missile on course. The pigeons’ accuracy, according to Skinner’s preliminary tests: nearly perfect.

As part of their training, Skinner also tested the tenacity of the pigeons – testing their psychological fitness, if you will, for battle. He fired a pistol next to their heads to see if loud noise would disrupt their pecking. He put the pigeons in a pressure chamber, setting the altitude at 10,000 feet. The pigeons were whirled around in a centrifuge meant to simulate massive G forces; they were exposed to bright flashes meant to simulate shell bursts. The pigeons kept pecking. They had been trained, conditioned to do so.

The military canceled and revived Project Pigeon a couple of times, but Skinner’s ideas were never used in combat. “Our problem,” Skinner admitted, “was no one would take us seriously.” And by 1953, the military had devised an electronic system for missile guidance, so animal-guided systems were no longer necessary (if they ever were).

This research was all classified, and when the American public was introduced to Skinner’s well-trained pigeons in the 1950s, there was no reference to their proposed war-time duties. Rather, the media talked about his pigeons that could play ping-pong and piano.

Admittedly, part of my interest in Skinner’s papers at Harvard involved finding out more about his research on pigeons. I use the pigeons as a visual metaphor throughout my work. And I could talk to you for an hour, easily, about the birds – indeed, I have given a keynote like that before. But I’m writing a book on the history of education technology, and B. F. Skinner is probably the name most closely associated with “teaching machines” – that is, programmed instruction (pre-computer).

Skinner’s work on educational technology – on teaching and learning with machines – is connected directly, explicitly to his work with animals. Hence my usage of the pigeon imagery. Skinner believed that there was not enough (if any) of the right kind of behavior modification undertaken in schools. He pointed out that students are punished when they do something wrong – that’s the behavioral reinforcement that they receive: aversion. But students are rarely rewarded when they do something right. And again, this isn’t simply about “classroom behavior” – the kind of thing you get a grade for “good citizenship” on (not talking in class or cutting in the lunch line). Learning, to Skinner, was a behavior – and a behavior that needed what he called “contingencies of reinforcement.” These should be positive. They should minimize the chances of doing something wrong – getting the wrong answer, for example. (That’s why Skinner didn’t like multiple choice tests.) The reinforcement should be immediate.


Skinner designed a teaching machine that he said would do all these things – allow the student to move at her own pace through the material. The student would know instantaneously if she had the answer right. (The reward was getting to move on to the next exciting question or concept.) And you can hear all this echoed in today’s education technology designers and developers and school reformers – from Sal Khan and Khan Academy to US Secretary of Education Betsy DeVos. It’s called “personalized learning.” But it’s essentially pigeon training with a snazzier interface.

“Once we have arranged the particular type of consequence called a reinforcement,” Skinner wrote in 1954 in “The Science of Learning and the Art of Teaching,” “our techniques permit us to shape the behavior of an organism almost at will. It has become a routine exercise to demonstrate this in classes in elementary psychology by conditioning such an organism as a pigeon.”

“…Such an organism as a pigeon.” We often speak of “lab rats” as shorthand for the animals used in scientific experiments. We use the phrase too to describe people who work in labs, who are completely absorbed in performing their tasks again and again and again. In education and in education technology, students are also the subjects of experimentation and conditioning. In Skinner’s framework, they are not “lab rats”; they are pigeons. As he wrote,

…Comparable results have been obtained with pigeons, rats, dogs, monkeys, human children… and psychotic subjects. In spite of great phylogenetic differences, all these organisms show amazingly similar properties of the learning process. It should be emphasized that this has been achieved by analyzing the effects of reinforcement and by designing techniques that manipulate reinforcement with considerable precision. Only in this way can the behavior of the individual be brought under such precise control.

If we do not bring students’ behavior under control, Skinner cautioned, we will find ourselves “losing our pigeon.” The animal will be beyond our control.

Like I said, I’m writing a book. So I can talk at great length about Skinner and teaching machines. But I want folks to consider how behaviorism hasn’t just found its way into education reform or education technology. Indeed, Skinner and many others envisioned the application of operant conditioning outside of the laboratory, outside of the classroom – the usage (past and present) of behavior modification for social engineering is at the heart of a lot of “fixes” that people think they’re doing “for the sake of the children,” or “for the good of the country,” or “to make the world a better place.”


Among the discoveries I made – new to me, not new to the world, to be clear: in the mid-1960s, B. F. Skinner was contacted by the Joseph P. Kennedy Jr. Foundation, a non-profit that funded various institutions and research projects that dealt with mental disabilities. Eunice Kennedy Shriver was apparently interested in his work on operant behavior and child-rearing, and her husband Sargent Shriver, who’d been appointed by President Johnson to head the newly formed Office of Economic Opportunity, was also keen to find ways to use operant conditioning as part of the War on Poverty.

There was a meeting. Skinner filed a report. But as he wrote in his autobiography, nothing came of it. “A year later,” he added, “one of Shriver’s aides came to see me about motivating the peasants in Venezuela.”

Motivating pigeons or poor people or peasants (or motivating peasants and poor people as pigeons) – it’s all offered, quite earnestly no doubt, as the ways in which science and scientific management will make the world better.

But if nothing else, the application of behavior modification to poverty implies that this is a psychological problem and not a structural one. Focus on the individual and their “mindset” – to use the language that education technology and educational psychology folks invoke these days – not on the larger, societal problems.

I recognize, of course, that you can say “it’s for their own good” – but it involves a great deal of hubris (and often historical and cultural ignorance, quite frankly) to assume that you know what “their own good” actually entails.


You’ll sometimes hear that B. F. Skinner’s theories are no longer in fashion – the behaviorist elements of psychology have given way to the cognitive turn. And with or without developments in cognitive and neuroscience, Skinner’s star had certainly lost some of its luster towards the end of his career, particularly, as many like to tell the story, after Noam Chomsky penned a brutal review of his book Beyond Freedom and Dignity in the December 1971 issue of The New York Review of Books . In the book, Skinner argues that our ideas of freedom and free will and human dignity stand in the way of a behavioral science that can better organize and optimize society.

“Skinner’s science of human behavior, being quite vacuous, is as congenial to the libertarian as to the fascist,” writes Chomsky, adding that “there is nothing in Skinner’s approach that is incompatible with a police state in which rigid laws are enforced by people who are themselves subject to them and the threat of dire punishment hangs over all.”

Skinner argues in Beyond Freedom and Dignity that the goal of behavioral technologies should be to “design a world in which behavior likely to be punished seldom or never occurs” – a world of “automatic goodness.” We should not be concerned with freedom, Skinner argues – that’s simply mysticism. We should pursue “effectiveness of techniques of control” which will “make the world safer.” Or make the world totalitarian, as Chomsky points out.


Building behavioral technologies is, of course, what many computer scientists now do (perhaps what some of you do cough FitBit) – most, I’d say, firmly believing that they’re also building a world of “automatic goodness.” “Persuasive technologies,” as Stanford professor B. J. Fogg calls them. And in true Silicon Valley fashion, Fogg erases the long history of behavioral psychology in doing so: “the earliest signs of persuasive technology appeared in the 1970s and 1980s when a few computing systems were designed to promote health and increase workplace productivity,” he writes in his textbook. His students at his Behavioral Design Lab at Stanford have included Mike Krieger, the co-founder of Instagram, and Tristan Harris, a former Googler, founder of the Center for Humane Technology, and the best-known figure in what I call the “tech regrets industry” – he’s into “ethical” persuasive technologies now, you see.

Behavior modification. Behavioral conditioning. Behavioral design. Gamification. Operant conditioning. All practices and products and machines that are perhaps so ubiquitous in technology that we don’t see them – we just feel the hook and the urge for the features that reward us for behaving like those Project Pigeon birds pecking away at their target – not really aware of why there’s a war or what’s at stake or that we’re going to suffer and die if this missile runs its course. But nobody asked the pigeons. And even with the best of intentions for pigeons – promising pigeons an end to poverty and illiteracy – nobody asked the pigeons. Folks just assumed that because the smart men at Harvard (or Stanford or Silicon Valley or the US government) were on it, it was surely the right “fix.”

Published 15 Jun 2018

Hack Education

The history of the future of education technology.

The Psych Files


He Taught a Pigeon to Peck a Ping Pong Ball. What Happened Next Will Shock You.

  • May 29, 2015


Scientists discovered a way to get the lowly pigeon to play ping-pong.

Psychologist Burrhus Skinner (called “B.F.” by his BFFs) says the remarkable feat was accomplished in just 3 steps:

  • He rewarded the pigeons when they were near the ball
  • He rewarded them when they pecked the ball
  • He rewarded them when they pecked the ball to the other side of the table

Skinner says he just put two such pigeons together in the same cage.

You won’t believe what happened next:

Skinner called his revolutionary technique “Shaping” and announced his findings on his Twitter channel.

Skinner and Pigeons Playing Ping Pong

Not everyone was happy with what they saw


If you think you can handle it, the full video of Skinner’s work is shown below (NSFW):



AT THE SMITHSONIAN

B.F. Skinner’s Pigeon-Guided Rocket

On this date 21 years ago, noted psychologist and inventor B.F. Skinner died; the American History Museum is home to one of his more unusual inventions

Joseph Stromberg


Nose Cone from B.F. Skinner's Pigeon-Guided Missile, on display in "Science in American Life."

It’s 1943, and America desperately needs a way to reliably bomb targets in Nazi Germany. What do we do? For B.F. Skinner, noted psychologist and inventor, the answer was obvious: pigeons.

“During World War II, there was a grave concern about aiming missiles,” says Peggy Kidwell, a curator of Medicine and Science at the American History Museum. “Military officials really wanted to figure out how to aim them accurately.” Skinner approached the National Defense Research Committee with his plan, code-named “Project Pigeon.” Members of the committee were doubtful, but granted Skinner $25,000 to get started.

Skinner had already used pigeons in his psychological research, training them to press levers for food. An obsessive inventor, he had been pondering weapons targeting systems one day when he saw a flock of birds maneuvering in formation in the sky. “Suddenly I saw them as ‘devices’ with excellent vision and extraordinary maneuverability,” he said. “Could they not guide a missile? Was the answer to the problem waiting for me in my own back yard?”

Getting to work, Skinner decided on pigeons because of both their vision and unflappable behavior in chaotic conditions. He built a nose cone for a missile fitted with three small electronic screens and three tiny pigeon cockpits. Onto the screens was projected an image of the ground in front of the rocket.

“He would train street pigeons to recognize the pattern of the target, and to peck when they saw this target,” says Kidwell. “And then when all three of them pecked, it was thought you could actually aim the missile in that direction.” As the pigeons pecked, cables harnessed to each one’s head would mechanically steer the missile until it finally reached its mark. Alas, without an escape hatch, the birds would perish along with their target, making it a kamikaze mission.

Despite a successful demonstration of the trained pigeons, officials remained skeptical and eventually decided to terminate the project. Skinner, of course, would go on to become one of the country’s most influential psychologists, popularizing behaviorism, a conception of psychology that views behavior as a reaction to one’s environment.

He also kept inventing. As part of his research, Skinner designed a number of devices that used feedback processes to encourage learning. “After the war, he became very interested in machines for teaching people to do things,” says Kidwell. “In 1954, he had this machine for teaching arithmetic to young people, and in 1957 he designed a machine for teaching Harvard students basic natural sciences.”

Although Skinner’s machines were purely mechanical, the ideas he developed have been incorporated into many educational software programs in recent years, including some used in distance learning settings. “Many of his ideas are now most frequently seen by people as they have been incorporated in electronic testing. That programmed learning, where you have a series of questions, and responses, and based on the response you gave you are directed to the next question, is very much in a Skinnerian framework,” Kidwell says.

Skinner’s missile prototype, along with other teaching machines, came to the Smithsonian at the end of his career. “Skinner was a teacher of Uta C. Merzbach, who was a curator in this museum,” says Kidwell. “They had a very good relationship, so when he was writing his autobiography, when he had finished writing about a particular machine, he would give it to the museum.” The American History Museum is home to several Skinner teaching machines , as well as the missile, which is on display in the “ Science in American Life ” exhibition.

As for the pigeons? Skinner held on to them, and just out of curiosity, occasionally tested them  to see if their skills were still sharp enough for battle. One, two, four, and even six years later, the pigeons were still pecking strong.


Joseph Stromberg was previously a digital reporter for Smithsonian.

Chitra Ragavan

Decision-Making

How Sunk Cost Fallacy Impacts 2024 Political Decision-Making

Politicians must recognize when smart thinking turns into poor judgment.

Posted July 26, 2024 | Reviewed by Davia Sills

  • Politicians struggle with “sunk cost fallacy,” as Republicans regret Trump’s VP pick and Biden exits race.
  • The political sunk cost fallacy has haunted politicians in both parties for decades.
  • President Biden caved to pressure, ditched his sunk cost fallacy, and agreed to withdraw.
  • The assassination attempt on Trump coalesced Republicans despite grave reservations.


As some Republicans express buyer's remorse about Donald Trump's selection of J.D. Vance as the “worst choice” for vice president and Joe Biden ditches his stubbornness about staying in the 2024 presidential race, we're reminded that politicians, like business people, struggle with "sunk cost fallacy" and the poor cognitive decision-making that comes with it.

The term sunk cost fallacy was coined in 1980 by Richard Thaler, who received a Nobel Prize in 2017 for his pioneering work in behavioral economics. It describes how cognitive biases can lead people to keep investing in unsuccessful businesses, projects, or decisions—even something as simple as continuing to watch a boring movie—because of what they have already invested in these ventures.

Thaler suggested that “paying for the right to use a good or service will increase the rate at which the good will be utilized.”

One of Thaler’s classic examples was someone who bought a $40 ticket to a basketball game and then drove for miles through a snowstorm because of the sunk cost of buying the ticket in the first place.

In other words, the sunk cost fallacy results in people throwing good money after bad.

Psychologists Hal Arkes and Catherine Blumer took Thaler’s economic construct beyond money in numerous real-world experiments that demonstrated “a greater tendency to continue an endeavor once an investment in money, effort, or time has been made.”

In a paper in Psychology Research and Behavior Management , researchers Shao Hsi-Chung and Kuo Chi-Cheng explain how psychological discomfort influences the sunk cost effect. “... Cognitive dissonance is aroused by the amount of sunk costs in [the] prior decision,” the authors say, “...and the higher the fear for large loss results from immediate dropping out, the larger the cognitive dissonance that would emerge.”

While the authors weren’t talking about a political “dropping out,” like Biden ended up doing, or a political ditching, as some people are recommending that Trump do, their point is apt in these cases. Biden knew that pulling out of the race would come with a big personal and political price tag and a risk to his reputation and legacy. Trump also knows that dumping Vance could hurt his brand.

The “Concorde Fallacy”

The sunk cost fallacy is also known as the “Concorde Fallacy,” named after the supersonic jet that the British and French governments sunk billions of dollars into. In an op-ed in The Hill, before Biden withdrew, I argued that “Joe Biden’s self-inflicted electoral crisis is a classic case study in the ‘sunk cost fallacy’” and that he risked becoming the Concorde of politics.

In a Forbes article on the Concorde fallacy, the author Jim Blasingame said a mentor once asked him the “Concorde Question,” namely, “Do you have a fighting chance or just a chance to fight?”

That question was central to Biden’s dilemma. He felt that he was the best and only person to lead the nation and complete his mandate.

For Trump, the violent incident and riveting images from his attempted assassination, occurring on the eve of the Republican Convention in Milwaukee, where he was formally anointed as the Republican presidential candidate, gave the former president a massive boost within his own party.

Shunted aside were the deep concerns among many Republican voters and leaders, especially traditional conservatives, about Trump, a convicted felon confronting numerous legal battles, including allegedly mishandling classified documents and inciting the Jan 6, 2021, insurrection on the U.S. Capitol, charges that he has denied. However, the sunk cost fallacy, his near-brush with death, and resulting mythic status ensured that they would have to either hold their nose and vote for Trump or sit out the race.


"Childless cat ladies" vs. a vice presidential pick

Now, a new conundrum has emerged for Trump with his vice presidential pick, Vance, alienating wide swaths of Americans, including actress Jennifer Aniston, as he took an indelicate swipe at “childless cat ladies” in a 2021 interview.

The political sunk cost fallacy has haunted politicians in both parties for decades as they have wrestled with decisions on whether to run for office, continue boondoggle projects, or send their soldiers to war.

Economist Abigail R. Hall Blanco frames her American Institute for Economic Research piece, “Sunk Cost Fallacy in the War on Terror,” around the question frequently used to rationalize continued U.S. military presence in foreign countries: “Do you want their deaths to mean nothing?!”

In a 2021 article in the National Review, writer Sean-Michael Pigeon gives numerous examples to support his belief that the sunk cost fallacy exacerbates government deficits. “No member of Congress wants to be publicly responsible for a half-built bridge,” Pigeon says, “especially when they have to tell taxpayers they still have to foot the bill for it.”

In a Washington Post analysis, behavioral economists Lior Sheffer and Peter John Loewen attribute the sunk cost fallacy to decisions by Republican leaders in Congress in 2017 when they embraced a controversial policy agenda—including the failed repeal of the Affordable Care Act and the passage of the deeply unpopular Tax Reform bill under then-President Trump.

In their study, published in the American Political Science Review, the authors interviewed several hundred incumbent lawmakers in Belgium, Canada, and Israel. They gave them a scenario in which an underperforming government-run small business loan program was due for an extension. To up the ante, the researchers also increased or decreased the budget shortfall (the decision’s sunk cost) and told some participants but not others that the media had come calling about the boondoggle. The participants were then asked whether they would vote for extending the program and, if so, by how much. They did, and by a lot.

The authors concluded that “politicians are just as susceptible to the sunk-cost fallacy and other decision-making biases as are regular folks—and sometimes even more so.”

In their related Washington Post article, Sheffer and Loewen recommend that “perhaps when citizens elect representatives, we should ask ourselves what reasoning skills those candidates bring with them to the table and what kind of outcomes they are likely to produce as a result.”

Arkes, H. R., & Blumer, C. (1985). The psychology of sunk cost. Organizational Behavior and Human Decision Processes, 35(1), 124–140. https://doi.org/10.1016/0749-5978(85)90049-4

Blanco, A. (2021, December 17). The sunk cost fallacy in the War on Terror. AIER. https://www.aier.org/article/the-sunk-cost-fallacy-in-the-war-on-terror/

Butler, D. M., & Dynes, A. M. (2015). How politicians discount the opinions of constituents with whom they disagree. American Journal of Political Science, 60(4), 975–989. https://doi.org/10.1111/ajps.12206

Fernando, J. (2024, June 27). Opportunity cost: Definition, formula, and examples. Investopedia. https://www.investopedia.com/terms/o/opportunitycost.asp

Haita-Falah, C. (2017). Sunk-cost fallacy and cognitive ability in individual decision-making. Journal of Economic Psychology, 58, 44–59. https://doi.org/10.1016/j.joep.2016.12.001

Ronayne, D., Sgroi, D., & Tuckwell, A. (2021). Evaluating the sunk cost effect. Journal of Economic Behavior & Organization, 186, 318–327. https://doi.org/10.1016/j.jebo.2021.03.029

Sheffer, L., Loewen, P. J., Soroka, S., Walgrave, S., & Sheafer, T. (2018). Nonrepresentative representatives: An experimental study of the decision making of elected politicians. American Political Science Review, 112(2), 302–321. https://doi.org/10.1017/S0003055417000569

Sweis, B. M., Abram, S. V., Schmidt, B. J., Seeland, K. D., MacDonald, A. W., Thomas, M. J., & Redish, A. D. (2018). Sensitivity to “sunk costs” in mice, rats, and humans. Science, 361(6398), 178–181. https://doi.org/10.1126/science.aar8644

The Decision Lab. (n.d.). The sunk cost fallacy. https://thedecisionlab.com/biases/the-sunk-cost-fallacy


Chitra Ragavan is an executive coach, communications strategist, and a former journalist at NPR and U.S. News & World Report.


IMAGES

  1. Skinner Shaping Pigeon Turn Clip

    psychology pigeon experiment

  2. skinner pigeon experiment

    psychology pigeon experiment

  3. Way of Intervening: B.F. Skinner's Pigeon Experiment

    psychology pigeon experiment

  4. skinner pigeon experiment

    psychology pigeon experiment

  5. O In 1947, psychologist B. F. Skinner published an experiment where

    psychology pigeon experiment

  6. B F Skinner's Project Pigeon

    psychology pigeon experiment

VIDEO

  1. Pigeon Poti Under The Microscope

  2. pigeon experiment #facts #scinece

  3. DNA chicken & Pigeon Fusion #animal #satire #science #hewan #diluarnalar #wildanimalexperiment

  4. pigeon experiment

  5. Pigeon mating process

  6. NEXT LEVEL pigeon shooting

COMMENTS

  1. B.F. Skinner: The Man Who Taught Pigeons to Play Ping-Pong and Rats to

    March 20, 2013. Psychologist B.F. Skinner taught these pigeons to play ping-pong in 1950. Photo via Psychology Pictures. B.F Skinner, a leading 20th century psychologist who hypothesized that ...

  2. Skinner's Box Experiment (Behaviorism Study)

    His experiments, conducted in what is known as "Skinner's box," are some of the most well-known experiments in psychology. They helped shape the ideas of operant conditioning in behaviorism. Law of Effect (Thorndike vs. Skinner) ... Skinner's Box and Pigeon Pilots in World War II . Yes, you read that right. Skinner's work with pigeons and ...

  3. PDF 'Superstition' in the Pigeon

    presentation. A simple experiment demonstrates this to be the case. A pigeon is brought to a stable state of hunger by reducing it to 75 percent of its weight when well fed. It is put into an experimental cage for a few minutes each day. A food hopper attached to the cage may be swung into place so that the pigeon can eat from it. A solenoid

  4. B. F. Skinner

    Burrhus Frederic Skinner (March 20, 1904 - August 18, 1990) was an American psychologist, behaviorist, inventor, and social philosopher. He was the Edgar Pierce Professor of Psychology at Harvard University from 1958 until his retirement in 1974.. Considering free will to be an illusion, Skinner saw human action as dependent on consequences of previous actions, a theory he would articulate ...

  5. Operant Conditioning In Psychology: B.F. Skinner Theory

    B.F Skinner is regarded as the father of operant conditioning and introduced a new term to behavioral psychology, reinforcement. ... One of the most famous of these experiments is often colloquially referred to as "Superstition in the Pigeon." This experiment was conducted to explore the effects of non-contingent reinforcement on pigeons ...

  6. B.F. Skinner, the man who taught pigeons to play ping-pong

    Skinner carried out experiments with pigeons and rats to establish the concept of positive and negative reinforcements. B.F Skinner holding a pigeon used for his experiments during World War II. Over the years, many psychologists have attempted to decode the human mind and its labyrinths. You may have heard of Sigmund Freud and Carl Jung, two ...

  7. Skinner's Project Pigeon: One Of The First Teaching Machines

    The U.S. Navy gave Skinner a small grant to pursue the project where he constructed a set of optical lenses at the tip of the missile's nose cone, which focused the missile's forward view onto a screen placed in front of each of the three pigeons strapped inside. A pneumatic mechanism then steered the missile toward its target based on a ...

  8. Project Pigeon

    This silent video shows the project Skinner worked on during World War Two. The problem was that before radar, pilots trying to hit enemy ships flew so close that they were often shot down. Skinner realized he could teach pigeons to guide missiles. Pigeons were trained to peck an image that would look like a ship as a missile approached.

  9. 4 Interesting Experiments by B.F. Skinner

    Experiment #3: Pigeon Ping-Pong. But Skinner wasn't only concerned with teaching pigeons how to read. It seems he also made sure they had time to play games as well. In one of his more whimsical experiments, B. F. Skinner taught a pair of common pigeons how to play a simplified version of table tennis.. The pigeons in this experiment were placed on either side of a box and were taught to ...

  10. Operant Conditioning: Real Pigeon Experiment

    B. F. Skinner was a psychologist and behaviorist, among the first to study operant conditioning. He conducted many experiments on animals, which he would place in controlled experimental boxes.

  11. 'Superstition' in the pigeon.

    "A pigeon is brought to a stable state of hunger by reducing it to 75 percent of its weight when well fed. It is put into an experimental cage for a few minutes each day. A food hopper attached to the cage may be swung into place so that the pigeon can eat from it. A solenoid and a timing relay hold the hopper in place for five sec. at each reinforcement. If a clock is now arranged to present ..."

  12. B.F. Skinner Created Superstitious Pigeons!

    Skinner describes the process in his paper "'Superstition' in the Pigeon," printed in the Journal of Experimental Psychology in 1948. He was interested in how animals respond to positive reinforcement.

  13. Superstition

    The Superstition Experiment. In the summer of 1947, renowned behavioral psychologist B.F. Skinner published his study on a group of pigeons, showing that even animals are susceptible to the very human condition of superstition. Skinner conducted his research on a group of hungry pigeons whose body weights had been reduced to 75% of their normal well-fed weight.

  14. Pigeons, Operant Conditioning, and Social Control

    The pigeons would peck at the target; the pecking in turn would control the missile's tail fins, keeping it on course, via a metal conductor connected to the birds' beak, transmitting the force of the pecking to the missile's guidance system. The pigeons' accuracy, according to Skinner's preliminary tests: nearly perfect.

  15. He Taught a Pigeon to Peck a Ping Pong Ball. What Happened Next Will

    Scientists discovered a way to get the lowly pigeon to play ping-pong. Psychologist Burrhus Skinner (called "B.F." by his BFFs) says the remarkable feat was accomplished in just 3 steps:

    1. He rewarded the pigeons when they were near the ball.
    2. He rewarded them when they pecked the ball.
    3. He rewarded them when they pecked the ball to the other side.

  16. B.F. Skinner's Pigeon-Guided Rocket

    Members of the committee were doubtful, but granted Skinner $25,000 to get started. Skinner had already used pigeons in his psychological research, training them to press levers for food.

  17. The Surprising Neuroscience of Pigeon Intelligence

    Another unusual experiment on the amazing visual abilities of pigeons was conducted in 2015: here, the authors tested whether pigeons were able to discriminate benign from malignant human breast tissue.

  19. Pigeon intelligence

    Pigeons have featured in numerous experiments in comparative psychology, including experiments concerned with animal cognition, and as a result there is considerable knowledge of pigeon intelligence. Available data show, for example, that pigeons have the capacity to share attention between different dimensions of a stimulus.

  20. How pigeons get to be superstitious

    The experiment does, however, show that pigeons have a compulsion to search for patterns in the events around them, the same way we do. Via Psychologist World.
