Stanley Milgram Shock Experiment

Saul McLeod, PhD

Editor-in-Chief for Simply Psychology

BSc (Hons) Psychology, MRes, PhD, University of Manchester

Saul McLeod, PhD, is a qualified psychology teacher with over 18 years of experience in further and higher education. He has been published in peer-reviewed journals, including the Journal of Clinical Psychology.


Olivia Guy-Evans, MSc

Associate Editor for Simply Psychology

BSc (Hons) Psychology, MSc Psychology of Education

Olivia Guy-Evans is a writer and associate editor for Simply Psychology. She has previously worked in healthcare and educational sectors.


Stanley Milgram, a psychologist at Yale University, carried out one of the most famous studies of obedience in psychology.

He conducted an experiment focusing on the conflict between obedience to authority and personal conscience.

Milgram (1963) examined justifications for acts of genocide offered by those accused at the Nuremberg war crimes trials after World War II. Their defense was often based on obedience: that they were just following orders from their superiors.

The experiments began in July 1961, a year after the trial of Adolf Eichmann in Jerusalem. Milgram devised the experiment to answer the question:

“Could it be that Eichmann and his million accomplices in the Holocaust were just following orders? Could we call them all accomplices?” (Milgram, 1974).

Milgram (1963) wanted to investigate whether Germans were particularly obedient to authority figures, as this was a common explanation for the Nazi killings in World War II.

Milgram recruited participants for his experiment by advertising in newspapers for male volunteers to take part in a study of learning at Yale University.

The procedure was that the participant was paired with another person and they drew lots to find out who would be the ‘learner’ and who would be the ‘teacher.’  The draw was fixed so that the participant was always the teacher, and the learner was one of Milgram’s confederates (pretending to be a real participant).


The learner (a confederate called Mr. Wallace) was taken into a room and had electrodes attached to his arms, and the teacher and researcher went into a room next door that contained an electric shock generator and a row of switches marked from 15 volts (Slight Shock) to 375 volts (Danger: Severe Shock) to 450 volts (XXX).

The shocks in Stanley Milgram’s obedience experiments were not real. The “learners” were actors who were part of the experiment and did not actually receive any shocks.

However, the “teachers” (the real participants of the study) believed the shocks were real, which was crucial for the experiment to measure obedience to authority figures even when it involved causing harm to others.

Milgram’s Experiment (1963)

Milgram (1963) was interested in researching how far people would go in obeying an instruction if it involved harming another person.

Stanley Milgram was interested in how easily ordinary people could be influenced into committing atrocities, for example, Germans in WWII.

Volunteers were recruited for what they were told was a controlled experiment investigating “learning” (an ethical issue discussed below: deception).

Participants were 40 males, aged between 20 and 50, whose jobs ranged from unskilled to professional, from the New Haven area. They were paid $4.50 for just turning up.


At the beginning of the experiment, they were introduced to another participant, a confederate of the experimenter (Milgram).

They drew straws to determine their roles – learner or teacher – although this was fixed, and the confederate was always the learner. There was also an “experimenter” dressed in a gray lab coat, played by an actor (not Milgram).

Two rooms in the Yale Interaction Laboratory were used – one for the learner (with an electric chair) and another for the teacher and experimenter with an electric shock generator.


The “learner” (Mr. Wallace) was strapped to a chair with electrodes.

After the learner has studied a list of word pairs, the “teacher” tests him by naming a word and asking the learner to recall its partner from a list of four possible choices.

The teacher is told to administer an electric shock every time the learner makes a mistake, increasing the level of shock each time. There were 30 switches on the shock generator, marked from 15 volts (Slight Shock) up to 450 volts (XXX).
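
The arithmetic of the scale is straightforward: thirty switches in 15-volt increments run from 15 volts up to 450 volts. Below is a minimal Python sketch, purely for illustration of that spacing (the apparatus itself, of course, involved no code):

```python
# Illustrative only: reproduce the voltage scale of the shock generator described above.
# Thirty switches in 15-volt increments span 15 V to 450 V.
voltages = list(range(15, 451, 15))

assert len(voltages) == 30      # 30 switches on the panel
assert voltages[0] == 15        # first switch ("Slight Shock")
assert voltages[-1] == 450      # final switch ("XXX")

print(voltages)                 # [15, 30, 45, ..., 435, 450]
```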


The learner gave mainly wrong answers (on purpose), and for each of these, the teacher gave him an electric shock. When the teacher refused to administer a shock, the experimenter was to give a series of orders/prods to ensure they continued.

There were four prods, and if one was not obeyed, then the experimenter (Mr. Williams) read out the next prod, and so on.

  • Prod 1: Please continue.
  • Prod 2: The experiment requires you to continue.
  • Prod 3: It is absolutely essential that you continue.
  • Prod 4: You have no other choice but to continue.

These prods were to be used in order, and begun afresh for each new attempt at defiance (Milgram, 1974, p. 21). The experimenter also had two special prods available. These could be used as required by the situation:

  • ‘Although the shocks may be painful, there is no permanent tissue damage, so please go on’ (ibid.)
  • ‘Whether the learner likes it or not, you must go on until he has learned all the word pairs correctly. So please go on’ (ibid., p. 22).
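
Described procedurally, the prods form a simple escalation rule: each new episode of defiance starts again at Prod 1, each repeated refusal within that episode moves one prod up the list, and the session ends if the participant still refuses after Prod 4. The sketch below is a hypothetical illustration of that rule (the function and variable names are ours, not part of Milgram’s materials):

```python
# Hypothetical sketch of the prod-escalation rule described above.
# Prods are delivered in order; the sequence restarts for each new episode of
# defiance, and the session ends if the participant refuses after the fourth prod.

PRODS = [
    "Please continue.",
    "The experiment requires you to continue.",
    "It is absolutely essential that you continue.",
    "You have no other choice but to continue.",
]

def respond_to_defiance(refusal_count: int) -> str:
    """Return the prod for the n-th consecutive refusal (1-based) within one episode."""
    if refusal_count > len(PRODS):
        return "END SESSION (participant counted as defiant)"
    return PRODS[refusal_count - 1]

# Example: a participant objects three times in a row within a single episode.
for n in (1, 2, 3):
    print(respond_to_defiance(n))
```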

65% (two-thirds) of participants (i.e., teachers) continued to the highest level of 450 volts. All the participants continued to 300 volts.

Milgram did more than one experiment – he carried out 18 variations of his study.  All he did was alter the situation (IV) to see how this affected obedience (DV).

Conclusion 

The individual explanation for the participants’ behavior would be that something about them as people caused them to obey. A more realistic explanation, however, is that the situation they were in influenced them and caused them to behave in the way that they did.

Some aspects of the situation that may have influenced their behavior include the formality of the location, the behavior of the experimenter, and the fact that it was an experiment for which they had volunteered and been paid.

Ordinary people are likely to follow orders given by an authority figure, even to the extent of killing an innocent human being.  Obedience to authority is ingrained in us all from the way we are brought up.

People tend to obey orders from other people if they recognize their authority as morally right and/or legally based. This response to legitimate authority is learned in a variety of situations, for example in the family, school, and workplace.

Milgram summed up his findings in the article “The Perils of Obedience” (Milgram, 1974), writing:

“The legal and philosophic aspects of obedience are of enormous import, but they say very little about how most people behave in concrete situations. I set up a simple experiment at Yale University to test how much pain an ordinary citizen would inflict on another person simply because he was ordered to by an experimental scientist. Stark authority was pitted against the subjects’ [participants’] strongest moral imperatives against hurting others, and, with the subjects’ [participants’] ears ringing with the screams of the victims, authority won more often than not. The extreme willingness of adults to go to almost any lengths on the command of an authority constitutes the chief finding of the study and the fact most urgently demanding explanation.”

Milgram’s Agency Theory

Milgram (1974) explained the behavior of his participants by suggesting that people have two states of behavior when they are in a social situation:

  • The autonomous state – people direct their own actions, and they take responsibility for the results of those actions.
  • The agentic state – people allow others to direct their actions and then pass off the responsibility for the consequences to the person giving the orders. In other words, they act as agents for another person’s will.

Milgram suggested that two things must be in place for a person to enter the agentic state:

  • The person giving the orders is perceived as being qualified to direct other people’s behavior. That is, they are seen as legitimate.
  • The person being ordered about is able to believe that the authority will accept responsibility for what happens.
According to Milgram, when in this agentic state, the participant in the obedience studies “defines himself in a social situation in a manner that renders him open to regulation by a person of higher status. In this condition the individual no longer views himself as responsible for his own actions but defines himself as an instrument for carrying out the wishes of others” (Milgram, 1974, p. 134).

Agency theory says that people will obey an authority when they believe that the authority will take responsibility for the consequences of their actions. This is supported by some aspects of Milgram’s evidence.

For example, when participants were reminded that they had responsibility for their own actions, almost none of them were prepared to obey.

In contrast, many participants who had been refusing to go on continued when the experimenter said that he would take responsibility.

According to Milgram (1974, p. 188):

“The behavior revealed in the experiments reported here is normal human behavior but revealed under conditions that show with particular clarity the danger to human survival inherent in our make-up.

And what is it we have seen? Not aggression, for there is no anger, vindictiveness, or hatred in those who shocked the victim….

Something far more dangerous is revealed: the capacity for man to abandon his humanity, indeed, the inevitability that he does so, as he merges his unique personality into larger institutional structures.”

Milgram Experiment Variations

Milgram (1965) carried out the experiment many times, varying the basic procedure (changing the IV). By doing this, he could identify which factors affected obedience (the DV).

Obedience was measured by how many participants shocked to the maximum 450 volts (65% in the original study). Stanley Milgram conducted a total of 23 variations (also called conditions or experiments) of his original obedience study.

In total, 636 participants were tested in 18 variation studies conducted between 1961 and 1962 at Yale University.

In the original baseline study, the experimenter wore a gray lab coat to symbolize his authority (a kind of uniform).

The lab coat worn by the experimenter in the original study served as a crucial symbol of scientific authority that increased obedience. The lab coat conveyed expertise and legitimacy, making participants see the experimenter as more credible and trustworthy.

Milgram carried out a variation in which the experimenter was called away because of a phone call right at the start of the procedure.

The role of the experimenter was then taken over by an ‘ordinary member of the public’ (a confederate) in everyday clothes rather than a lab coat. The obedience level dropped to 20%.

Change of Location:  The Mountain View Facility Study (1963, unpublished)

Milgram conducted this variation in a set of offices in a rundown building, claiming it was associated with “Research Associates of Bridgeport” rather than Yale.

The lab’s ordinary appearance was designed to test whether Yale’s prestige encouraged obedience. Participants were led to believe that the experiment was being run by a private research firm.

In this non-university setting, obedience rates dropped to 47.5% compared to 65% in the original Yale experiments. This suggests that the status of location affects obedience.

Private research firms are viewed as less prestigious than certain universities, which affects behavior. It is easier under these conditions to abandon the belief in the experimenter’s essential decency.

The impressive university setting reinforced the experimenter’s authority and conveyed an implicit approval of the research.

Milgram filmed this variation for his documentary Obedience, but did not publish the results in his academic papers. The study only came to wider light when archival materials, including his notes, films, and data, were studied by later researchers such as Perry (2013) in the decades after Milgram’s death.

Two Teacher Condition

When participants could instruct an assistant (confederate) to press the switches, 92.5% shocked to the maximum of 450 volts.

Allowing the participant to instruct an assistant to press the shock switches diffused personal responsibility and likely reduced perceptions of causing direct harm.

By attributing the actions to the assistant rather than themselves, participants could more easily justify shocking to the maximum 450 volts, reflected in the 92.5% obedience rate.

When there is less personal responsibility, obedience increases. This relates to Milgram’s Agency Theory.

Touch Proximity Condition

The teacher had to force the learner’s hand down onto a shock plate when the learner refused to participate after 150 volts. Obedience fell to 30%.

Forcing the learner’s hand onto the shock plate after 150 volts physically connected the teacher to the consequences of their actions. This direct tactile feedback increased the teacher’s personal responsibility.

No longer shielded from the learner’s reactions, the proximity enabled participants to more clearly perceive the harm they were causing, reducing obedience to 30%. Physical distance and indirect actions in the original setup made it easier to rationalize obeying the experimenter.

The participant is no longer buffered/protected from seeing the consequences of their actions.

Social Support Condition

Two other participants (confederates) were also teachers but refused to obey. Confederate 1 stopped at 150 volts, and Confederate 2 stopped at 210 volts.

When these two confederates set an example of defiance by refusing to continue the shocks, especially early on at 150 volts, it permitted the real participant also to resist authority.

Their disobedience provided social proof that it was acceptable to disobey. This modeling of defiance lowered obedience to only 10% compared to 65% without such social support. It demonstrated that social modeling can validate challenging authority.

The presence of others who are seen to disobey the authority figure reduces the level of obedience to 10%.

Absent Experimenter Condition 

It is easier to resist the orders from an authority figure if they are not close by. When the experimenter instructed and prompted the teacher by telephone from another room, obedience fell to 20.5%.

Many participants cheated, skipping shocks or giving a lower voltage than ordered by the experimenter. The proximity of authority figures affects obedience.

The physical absence of the authority figure enabled participants to act more freely on their own moral inclinations rather than the experimenter’s commands. This highlighted the role of an authority’s direct presence in influencing behavior.


Critical Evaluation

Inaccurate description of the prod methodology:

A key reason the obedience studies fascinate people is Milgram (1974) presented them as a scientific experiment, contrasting himself as an “empirically grounded scientist” compared to philosophers. He claimed he systematically varied factors to alter obedience rates.

However, recent scholarship using archival records shows Milgram’s account of standardizing the procedure was misleading. For example, he published a list of standardized prods the experimenter used when participants questioned continuing. Milgram said these were delivered uniformly in a firm but polite tone (Gibson, 2013; Perry, 2013; Russell, 2010).

Perry’s (2013) archival research revealed another discrepancy between Milgram’s published account and the actual events. Milgram claimed standardized prods were used when participants resisted, but Perry’s audiotape analysis showed the experimenter often improvised more coercive prods beyond the supposed script.

This off-script prodding varied between experiments and participants, and was especially prevalent with female participants where no gender obedience difference was found – suggesting the improvisation influenced results. Gibson (2013) and Russell (2009) corroborated the experimenter’s departures from the supposed fixed prods. 

Prods were often combined or modified rather than used verbatim as published.

Russell speculated the improvisation aimed to achieve outcomes the experimenter believed Milgram wanted. Milgram seemed to tacitly approve of the deviations by not correcting them when observing.

This raises significant issues around experimenter bias influencing results, lack of standardization compromising validity, and ethical problems with Milgram misrepresenting procedures.

Milgram’s experiment lacked external validity:

The Milgram studies were conducted in laboratory-type conditions, and we must ask if this tells us much about real-life situations.

We obey in a variety of real-life situations that are far more subtle than instructions to give people electric shocks, and it would be interesting to see what factors operate in everyday obedience. The sort of situation Milgram investigated would be more suited to a military context.

Orne and Holland (1968) accused Milgram’s study of lacking ‘experimental realism,’ i.e., participants might not have believed the experimental set-up they found themselves in and knew the learner wasn’t receiving electric shocks.

“It’s more truthful to say that only half of the people who undertook the experiment fully believed it was real, and of those two-thirds disobeyed the experimenter,” observes Perry (2013, p. 139).

Milgram’s sample was biased:

  • The participants in Milgram’s study were all male. Do the findings transfer to females?
  • Milgram’s study cannot be seen as representative of the American population as his sample was self-selected. This is because they became participants only by electing to respond to a newspaper advertisement (selecting themselves).
  • They may also have a typical “volunteer personality” – not all the newspaper readers responded so perhaps it takes this personality type to do so.

Yet a total of 636 participants were tested in 18 separate experiments across the New Haven area, which was seen as being reasonably representative of a typical American town.

Milgram’s findings have been replicated in a variety of cultures; most replications lead to the same conclusions as Milgram’s original study, and some report even higher obedience rates.

However, Smith and Bond (1998) point out that with the exception of Jordan (Shanab & Yahya, 1978), the majority of these studies have been conducted in industrialized Western cultures, and we should be cautious before we conclude that a universal trait of social behavior has been identified.

Selective reporting of experimental findings:

Perry (2013) found Milgram omitted findings from some obedience experiments he conducted, reporting only results supporting his conclusions. A key omission was the Relationship condition (conducted in 1962 but unpublished), where participant pairs were relatives or close acquaintances.

When the learner protested being shocked, most teachers disobeyed, contradicting Milgram’s emphasis on obedience to authority.

Perry argued Milgram likely did not publish this 85% disobedience rate because it undermined his narrative and would be difficult to defend ethically since the teacher and learner knew each other closely.

Milgram’s selective reporting biased interpretations of his findings. His failure to publish all his experiments raises issues around researchers’ ethical obligation to completely and responsibly report their results, not just those fitting their expectations.

Unreported analysis of participants’ skepticism and its impact on their behavior:

Perry (2013) found archival evidence that many participants expressed doubt about the experiment’s setup, impacting their behavior. This supports Orne and Holland’s (1968) criticism that Milgram overlooked participants’ perceptions.

Incongruities, such as the apparent danger paired with an unconcerned experimenter, likely cued participants that no real harm would occur. Trust in Yale’s ethics reinforced this. Yet Milgram did not publish his assistant’s analysis showing that participants’ skepticism was related to their obedience and varied by condition.

Obedient participants were more skeptical that the learner was harmed. This selective reporting biased interpretations. Additional unreported findings further challenge Milgram’s conclusions.

This highlights issues around thoroughly and responsibly reporting all results, not just those fitting expectations. It shows how archival evidence makes Milgram’s study a contentious classic with questionable methods and conclusions.

Ethical Issues

What are the potential ethical concerns associated with Milgram’s research on obedience?

While not a “contribution to psychology” in the traditional sense, Milgram’s obedience experiments sparked significant debate about the ethics of psychological research.

Baumrind (1964) criticized the ethics of Milgram’s research as participants were prevented from giving their informed consent to take part in the study. 

Participants assumed the experiment was benign and expected to be treated with dignity.

As a result of studies like Milgram’s, the APA and BPS now require researchers to give participants more information before they agree to take part in a study.

The participants actually believed they were shocking a real person and were unaware the learner was a confederate of Milgram’s.

However, Milgram argued that “illusion is used when necessary in order to set the stage for the revelation of certain difficult-to-get-at-truths.”

Milgram also interviewed participants afterward to find out the effect of the deception. Apparently, 83.7% said that they were “glad to be in the experiment,” and 1.3% said that they wished they had not been involved.

Protection of participants 

Participants were exposed to extremely stressful situations that may have the potential to cause psychological harm. Many of the participants were visibly distressed (Baumrind, 1964).

Signs of tension included trembling, sweating, stuttering, laughing nervously, biting lips and digging fingernails into palms of hands. Three participants had uncontrollable seizures, and many pleaded to be allowed to stop the experiment.

Milgram described a businessman reduced to a “twitching stuttering wreck” (1963, p. 377).

In his defense, Milgram argued that these effects were only short-term. Once the participants were debriefed (and could see the confederate was OK), their stress levels decreased.

“At no point,” Milgram (1964) stated, “were subjects exposed to danger and at no point did they run the risk of injurious effects resulting from participation” (p. 849).

To defend himself against criticisms about the ethics of his obedience research, Milgram cited follow-up survey data showing that 84% of participants said they were glad they had taken part in the study.

Milgram used this to claim that the study caused no serious or lasting harm, since most participants retrospectively did not regret their involvement.

Yet archival accounts show many participants endured lasting distress, even trauma, refuting Milgram’s insistence the study caused only fleeting “excitement.” By not debriefing all, Milgram misled participants about the true risks involved (Perry, 2013).

Milgram, however, maintained that he debriefed all his participants straight after the experiment, disclosed the true nature of the study, and followed the sample up after a period of time to ensure that they came to no harm.

Participants were assured that their behavior was common, and Milgram reported that a follow-up a year later found no signs of any long-term psychological harm.

The majority of the participants (83.7%) said that they were pleased that they had participated, and 74% had learned something of personal importance.

Perry’s (2013) archival research found Milgram misrepresented debriefing – around 600 participants were not properly debriefed soon after the study, contrary to his claims. Many only learned no real shocks occurred when reading a mailed study report months later, which some may have not received.

Milgram likely misreported debriefing details to protect his credibility and enable future obedience research. This raises issues around properly informing and debriefing participants that connect to APA ethics codes developed partly in response to Milgram’s study.

Right to Withdraw

The BPS states that researchers should make it plain to participants that they are free to withdraw at any time (regardless of payment).

When participants expressed doubts, the experimenter assured them that all was well. Trusting Yale scientists, many took the experimenter at his word that “no permanent tissue damage” would occur, and continued administering shocks despite reservations.

Did Milgram give participants an opportunity to withdraw? The experimenter gave four verbal prods which mostly discouraged withdrawal from the experiment:

  • Please continue.
  • The experiment requires that you continue.
  • It is absolutely essential that you continue.
  • You have no other choice, you must go on.

Milgram argued that they were justified as the study was about obedience, so orders were necessary.

Milgram pointed out that, although the right to withdraw was made difficult, withdrawal was possible, since 35% of participants chose to withdraw.

Replications

Direct replications have not been possible due to current ethical standards. However, several researchers have conducted partial replications and variations that aim to reproduce some aspects of Milgram’s methods ethically.

One important replication was conducted by Jerry Burger in 2009. Burger’s partial replication included several safeguards to protect participant welfare, such as screening out high-risk individuals, repeatedly reminding participants they could withdraw, and stopping at the 150-volt shock level. This was the point where Milgram’s participants first heard the learner’s protests.

As 79% of Milgram’s participants who went past 150 volts continued to the maximum 450 volts, Burger (2009) argued that 150 volts provided a reasonable estimate for obedience levels. He found 70% of participants continued to 150 volts, compared to 82.5% in Milgram’s comparable condition.
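
As a back-of-the-envelope illustration only (our own arithmetic, assuming the 79% continuation ratio from Milgram’s data carried over to Burger’s sample; this is not a figure Burger reports), the 70% who reached 150 volts would project to roughly

$$0.70 \times 0.79 \approx 0.55,$$

that is, on the order of 55% continuing to the full 450 volts.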

Another replication by Thomas Blass (1999) examined whether obedience rates had declined over time due to greater public awareness of the experiments. Blass correlated obedience rates from replication studies between 1963 and 1985 and found no relationship between year and obedience level. He concluded that obedience rates have not systematically changed, providing evidence against the idea of “enlightenment effects”.
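
Blass’s test is essentially a correlation between study year and obedience rate across replications. Here is a minimal sketch of that kind of analysis (illustrative only; the function is ours and no real data points are included):

```python
# Illustrative sketch, in the spirit of Blass (1999): correlate publication year
# with obedience rate across replication studies. No real data are included;
# callers supply their own (year, obedience_rate) pairs.
from scipy.stats import pearsonr

def year_obedience_correlation(studies):
    """studies: iterable of (year, obedience_rate) pairs, rate as a proportion 0-1."""
    years = [year for year, _ in studies]
    rates = [rate for _, rate in studies]
    r, p_value = pearsonr(years, rates)
    return r, p_value

# A correlation near zero (as Blass reported) would indicate no systematic
# change in obedience rates over time.
```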

Some variations have explored the role of gender. Milgram found equal rates of obedience for male and female participants. Reviews have found most replications also show no gender difference, with a couple of exceptions (Blass, 1999). For example, Kilham and Mann (1974) found lower obedience in female participants.

Partial replications have also examined situational factors. Having another person model defiance reduced obedience compared to a solo participant in one study, but did not eliminate it (Burger, 2009). The authority figure’s perceived expertise seems to be an influential factor (Blass, 1999). Replications have supported Milgram’s observation that stepwise increases in demands promote obedience.

Personality factors have been studied as well. Traits like high empathy and desire for control correlate with some minor early hesitation, but do not greatly impact eventual obedience levels (Burger, 2009). Authoritarian tendencies may contribute to obedience (Elms, 2009).

In sum, the partial replications confirm the degree of obedience Milgram observed. Though ethical constraints prevent full reproductions, the key elements of his procedure seem to consistently elicit high levels of compliance across studies, samples, and eras. The replications continue to highlight the power of situational pressures to yield obedience.


Why was the Milgram experiment so controversial?

The Milgram experiment was controversial because it revealed people’s willingness to obey authority figures even when causing harm to others, raising ethical concerns about the psychological distress inflicted upon participants and the deception involved in the study.

Would Milgram’s experiment be allowed today?

Milgram’s experiment would likely not be allowed today in its original form, as it violates modern ethical guidelines for research involving human participants, particularly regarding informed consent, deception, and protection from psychological harm.

Did anyone refuse the Milgram experiment?

Yes, in the Milgram experiment, some participants refused to continue administering shocks, demonstrating individual variation in obedience to authority figures. In the original Milgram experiment, approximately 35% of participants refused to administer the highest shock level of 450 volts, while 65% obeyed and delivered the 450-volt shock.

How can Milgram’s study be applied to real life?

Milgram’s study can be applied to real life by demonstrating the potential for ordinary individuals to obey authority figures even when it involves causing harm, emphasizing the importance of questioning authority, ethical decision-making, and fostering critical thinking in societal contexts.

Were all participants in Milgram’s experiments male?

Yes, in the original Milgram experiment conducted in 1961, all participants were male, limiting the generalizability of the findings to women and diverse populations.

Why was the Milgram experiment unethical?

The Milgram experiment was considered unethical because participants were deceived about the true nature of the study and subjected to severe emotional distress. They believed they were causing harm to another person under the instruction of authority.

Additionally, participants were not given the right to withdraw freely and were subjected to intense pressure to continue. The psychological harm and lack of informed consent violate modern ethical guidelines for research.

Baumrind, D. (1964). Some thoughts on ethics of research: After reading Milgram’s “Behavioral study of obedience.” American Psychologist, 19(6), 421.

Blass, T. (1999). The Milgram paradigm after 35 years: Some things we now know about obedience to authority 1.  Journal of Applied Social Psychology ,  29 (5), 955-978.

Brannigan, A., Nicholson, I., & Cherry, F. (2015). Introduction to the special issue: Unplugging the Milgram machine.  Theory & Psychology ,  25 (5), 551-563.

Burger, J. M. (2009). Replicating Milgram: Would people still obey today? American Psychologist, 64 , 1–11.

Elms, A. C. (2009). Obedience lite. American Psychologist, 64 (1), 32–36.

Gibson, S. (2013). Milgram’s obedience experiments: A rhetorical analysis. British Journal of Social Psychology, 52, 290–309.

Gibson, S. (2017). Developing psychology’s archival sensibilities: Revisiting Milgram’s ‘obedience’ experiments. Qualitative Psychology, 4(1), 73.

Griggs, R. A., Blyler, J., & Jackson, S. L. (2020). Using research ethics as a springboard for teaching Milgram’s obedience study as a contentious classic.  Scholarship of Teaching and Learning in Psychology ,  6 (4), 350.

Haslam, S. A., & Reicher, S. D. (2018). A truth that does not always speak its name: How Hollander and Turowetz’s findings confirm and extend the engaged followership analysis of harm-doing in the Milgram paradigm. British Journal of Social Psychology, 57, 292–300.

Haslam, S. A., Reicher, S. D., & Birney, M. E. (2016). Questioning authority: New perspectives on Milgram’s ‘obedience’ research and its implications for intergroup relations. Current Opinion in Psychology, 11 , 6–9.

Haslam, S. A., Reicher, S. D., Birney, M. E., Millard, K., & McDonald, R. (2015). ‘Happy to have been of service’: The Yale archive as a window into the engaged followership of participants in Milgram’s ‘obedience’ experiment. British Journal of Social Psychology, 54 , 55–83.

Kaplan, D. E. (1996). The Stanley Milgram papers: A case study on appraisal of and access to confidential data files. American Archivist, 59 , 288–297.

Kaposi, D. (2022). The second wave of critical engagement with Stanley Milgram’s ‘obedience to authority’ experiments: What did we learn? Social and Personality Psychology Compass, 16(6), e12667.

Kilham, W., & Mann, L. (1974). Level of destructive obedience as a function of transmitter and executant roles in the Milgram obedience paradigm. Journal of Personality and Social Psychology, 29 (5), 696–702.

Milgram, S. (1963). Behavioral study of obedience . Journal of Abnormal and Social Psychology , 67, 371-378.

Milgram, S. (1964). Issues in the study of obedience: A reply to Baumrind. American Psychologist, 19 , 848–852.

Milgram, S. (1965). Some conditions of obedience and disobedience to authority . Human Relations, 18(1) , 57-76.

Milgram, S. (1974). Obedience to authority: An experimental view . Harpercollins.

Miller, A. G. (2009). Reflections on “Replicating Milgram” (Burger, 2009). American Psychologist, 64(1), 20–27.

Nicholson, I. (2011). “Torture at Yale”: Experimental subjects, laboratory torment and the “rehabilitation” of Milgram’s “obedience to authority”. Theory & Psychology, 21 , 737–761.

Nicholson, I. (2015). The normalization of torment: Producing and managing anguish in Milgram’s “obedience” laboratory. Theory & Psychology, 25 , 639–656.

Orne, M. T., & Holland, C. H. (1968). On the ecological validity of laboratory deceptions. International Journal of Psychiatry, 6 (4), 282-293.


Perry, G. (2013). Behind the shock machine: The untold story of the notorious Milgram psychology experiments . New York, NY: The New Press.

Reicher, S., Haslam, A., & Miller, A. (Eds.). (2014). Milgram at 50: Exploring the enduring relevance of psychology’s most famous studies [Special issue]. Journal of Social Issues, 70 (3), 393–602

Russell, N. (2014). Stanley Milgram’s obedience to authority “relationship condition”: Some methodological and theoretical implications. Social Sciences, 3, 194–214

Shanab, M. E., & Yahya, K. A. (1978). A cross-cultural study of obedience. Bulletin of the Psychonomic Society .

Smith, P. B., & Bond, M. H. (1998). Social psychology across cultures (2nd Edition) . Prentice Hall.

Further Reading

  • The power of the situation: The impact of Milgram’s obedience studies on personality and social psychology
  • Seeing is believing: The role of the film Obedience in shaping perceptions of Milgram’s Obedience to Authority Experiments
  • Replicating Milgram: Would people still obey today?

Learning Check

Which is true regarding the Milgram obedience study?
  • The aim was to see how obedient people would be in a situation where following orders would mean causing harm to another person.
  • Participants were under the impression they were part of a learning and memory experiment.
  • The “learners” in the study were actual participants who volunteered to be shocked as part of the experiment.
  • The “learner” was an actor who was in on the experiment and never actually received any real shocks.
  • Although the participant could not see the “learner”, he was able to hear him clearly through the wall
  • The study was directly influenced by Milgram’s observations of obedience patterns in post-war Europe.
  • The experiment was designed to understand the psychological mechanisms behind war crimes committed during World War II.
  • The Milgram study was universally accepted in the psychological community, and no ethical concerns were raised about its methodology.
  • When Milgram’s experiment was repeated in a rundown office building in Bridgeport, the percentage of the participants who fully complied with the commands of the experimenter remained unchanged.
  • The experimenter (authority figure) delivered verbal prods to encourage the teacher to continue, such as ‘Please continue’ or ‘Please go on’.
  • Over 80% of participants went on to deliver the maximum level of shock.
  • Milgram sent participants questionnaires after the study to assess the effects and found that most felt no remorse or guilt, so it was ethical.
  • The aftermath of the study led to stricter ethical guidelines in psychological research.
  • The study emphasized the role of situational factors over personality traits in determining obedience.

Answers : Items 3, 8, 9, and 11 are the false statements.

Short Answer Questions
  • Briefly explain the results of the original Milgram experiments. What did these results prove?
  • List one scenario on how an authority figure can abuse obedience principles.
  • List one scenario on how an individual could use these principles to defend their fellow peers.
  • In a hospital, you are very likely to obey a nurse. However, if you meet her outside the hospital, for example in a shop, you are much less likely to obey. Using your knowledge of how people resist pressure to obey, explain why you are less likely to obey the nurse outside the hospital.
  • Describe the shock instructions the participant (teacher) was told to follow when the victim (learner) gave an incorrect answer.
  • State the lowest voltage shock that was labeled on the shock generator.
  • What would likely happen if Milgram’s experiment included a condition in which the participant (teacher) had to give a high-level electric shock for the first wrong answer?
Group Activity

Gather in groups of three or four to discuss answers to the short answer questions above.

For question 2, review the different scenarios you each came up with. Then brainstorm on how these situations could be flipped.

For question 2, discuss how an authority figure could instead empower those below them in the examples your groupmates provide.

For question 3, discuss how a peer could do harm by using the obedience principles in the scenarios your groupmates provide.

Essay Topic
  • What’s the most important lesson of Milgram’s Obedience Experiments? Fully explain and defend your answer.
  • Milgram selectively edited his film of the obedience experiments to emphasize obedient behavior and minimize footage of disobedience. What are the ethical implications of a researcher selectively presenting findings in a way that fits their expected conclusions?

Understanding the Milgram Experiment in Psychology

A closer look at Milgram's controversial studies of obedience



How far do you think people would go to obey an authority figure? Would they refuse to obey if the order went against their values or social expectations? Those questions were at the heart of an infamous and controversial study known as the Milgram obedience experiments.

Yale University psychologist Stanley Milgram conducted these experiments during the 1960s. They explored the effects of authority on obedience. In the experiments, an authority figure ordered participants to deliver what they believed were dangerous electrical shocks to another person. The results suggested that people are highly influenced by authority and highly obedient. More recent investigations cast doubt on some of the implications of Milgram's findings and even on the results and procedures themselves. Despite its problems, the study has, without question, made a significant impact on psychology.

At a Glance

Milgram's experiments posed the question: Would people obey orders, even if they believed doing so would harm another person? Milgram's findings suggested the answer was yes, they would. The experiments have long been controversial, both because of the startling findings and the ethical problems with the research. More recently, experts have re-examined the studies, suggesting that participants were often coerced into obeying and that at least some participants recognized that the other person was just pretending to be shocked. Such findings call into question the study's validity and authenticity, but some replications suggest that people are surprisingly prone to obeying authority.

History of the Milgram Experiments

Milgram started his experiments in 1961, shortly after the trial of the World War II criminal Adolf Eichmann had begun. Eichmann’s defense that he was merely following instructions when he ordered the deaths of millions of Jews roused Milgram’s interest.

In his 1974 book "Obedience to Authority," Milgram posed the question, "Could it be that Eichmann and his million accomplices in the Holocaust were just following orders? Could we call them all accomplices?"

Procedure in the Milgram Experiment

The participants in the most famous variation of the Milgram experiment were 40 men recruited using newspaper ads. In exchange for their participation, each person was paid $4.50.

Milgram developed an intimidating shock generator, with shock levels starting at 15 volts and increasing in 15-volt increments all the way up to 450 volts. The many switches were labeled with terms including "slight shock," "moderate shock," and "danger: severe shock." The final three switches were labeled simply with an ominous "XXX."

Each participant took the role of a "teacher" who would then deliver a shock to the "student" in a neighboring room whenever an incorrect answer was given. While participants believed that they were delivering real shocks to the student, the “student” was a confederate in the experiment who was only pretending to be shocked.

As the experiment progressed, the participant would hear the learner plead to be released or even complain about a heart condition. Once they reached the 300-volt level, the learner would bang on the wall and demand to be released.

Beyond this point, the learner became completely silent and refused to answer any more questions. The experimenter then instructed the participant to treat this silence as an incorrect response and deliver a further shock.

Most participants asked the experimenter whether they should continue. The experimenter then responded with a series of commands to prod the participant along:

  • "Please continue."
  • "The experiment requires that you continue."
  • "It is absolutely essential that you continue."
  • "You have no other choice; you must go on."

Results of the Milgram Experiment

In the Milgram experiment, obedience was measured by the level of shock that the participant was willing to deliver. While many of the subjects became extremely agitated, distraught, and angry at the experimenter, they nevertheless continued to follow orders all the way to the end.

Milgram's results showed that 65% of the participants in the study delivered the maximum shocks. Of the 40 participants in the study, 26 delivered the maximum shocks, while 14 stopped before reaching the highest levels.
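
The headline figure is simply the proportion of the 40 participants who went to the maximum:

$$\frac{26}{40} = 0.65 = 65\%.$$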

Why did so many of the participants in this experiment perform a seemingly brutal act when instructed by an authority figure? According to Milgram, there are some situational factors that can explain such high levels of obedience:

  • The physical presence of an authority figure dramatically increased compliance .
  • The fact that Yale (a trusted and authoritative academic institution) sponsored the study led many participants to believe that the experiment must be safe.
  • The selection of teacher and learner status seemed random.
  • Participants assumed that the experimenter was a competent expert.
  • The shocks were said to be painful, not dangerous.

Later experiments conducted by Milgram indicated that the presence of rebellious peers dramatically reduced obedience levels. When other people refused to go along with the experimenter's orders, 36 out of 40 participants refused to deliver the maximum shocks.

More recent work by researchers suggests that while people do tend to obey authority figures, the process is not necessarily as cut-and-dried as Milgram depicted it.

In a 2012 essay published in PLoS Biology , researchers suggested that the degree to which people are willing to obey the questionable orders of an authority figure depends largely on two key factors:

  • How much the individual agrees with the orders
  • How much they identify with the person giving the orders

While it is clear that people are often far more susceptible to influence, persuasion , and obedience than they would often like to be, they are far from mindless machines just taking orders. 

Another study that analyzed Milgram's results concluded that eight factors influenced the likelihood that people would progress up to the 450-volt shock, including:

  • The experimenter's directiveness
  • Legitimacy and consistency
  • Group pressure to disobey
  • Indirectness of proximity
  • Intimacy of the relation between the teacher and learner
  • Distance between the teacher and learner

Ethical Concerns in the Milgram Experiment

Milgram's experiments have long been the source of considerable criticism and controversy. From the get-go, the ethics of his experiments were highly dubious. Participants were subjected to significant psychological and emotional distress.

Some of the major ethical issues in the experiment were related to:

  • The use of deception
  • The lack of protection for the participants who were involved
  • Pressure from the experimenter to continue even after asking to stop, interfering with participants' right to withdraw

Due to concerns about the amount of anxiety experienced by many of the participants, everyone was supposedly debriefed at the end of the experiment. The researchers reported that they explained the procedures and the use of deception.

Critics of the study have argued that many of the participants were still confused about the exact nature of the experiment, and recent findings suggest that many participants were not debriefed at all.

Replications of the Milgram Experiment

While Milgram’s research raised serious ethical questions about the use of human subjects in psychology experiments, his results have also been consistently replicated in further experiments. One review of subsequent research on obedience found that Milgram’s findings hold true in other experiments. In one study, researchers set out to replicate Milgram's classic obedience experiment, making several alterations to the original design:

  • The maximum shock level was 150 volts as opposed to the original 450 volts.
  • Participants were also carefully screened to eliminate those who might experience adverse reactions to the experiment.

The results of the new experiment revealed that participants obeyed at roughly the same rate that they did when Milgram conducted his original study more than 40 years ago.

Some psychologists suggested that in spite of the changes made in the replication, the study still had merit and could be used to further explore some of the situational factors that also influenced the results of Milgram's study. But other psychologists suggested that the replication was too dissimilar to Milgram's original study to draw any meaningful comparisons.

One study examined people's beliefs about how they would do compared to the participants in Milgram's experiments. They found that most people believed they would stop sooner than the average participants. These findings applied to both those who had never heard of Milgram's experiments and those who were familiar with them. In fact, those who knew about Milgram's experiments actually believed that they would stop even sooner than other people.

Another novel replication involved recruiting participants in pairs and having them take turns acting as either an 'agent' or 'victim.' Agents then received orders to shock the victim. The results suggest that only around 3.3% disobeyed the experimenter's orders.

Recent Criticisms and New Findings

Psychologist Gina Perry suggests that much of what we think we know about Milgram's famous experiments is only part of the story. While researching an article on the topic, she stumbled across hundreds of audiotapes found in Yale archives that documented numerous variations of Milgram's shock experiments.

Participants Were Often Coerced

While Milgram's published reports describe methodical and uniform procedures, the audiotapes reveal something different. During the experimental sessions, the experimenters often went off-script and coerced the subjects into continuing the shocks.

"The slavish obedience to authority we have come to associate with Milgram’s experiments comes to sound much more like bullying and coercion when you listen to these recordings," Perry suggested in an article for Discover Magazine .

Few Participants Were Really Debriefed

Milgram suggested that the subjects were "de-hoaxed" after the experiments. He claimed he later surveyed the participants and found that 84% were glad to have participated, while only 1% regretted their involvement.

However, Perry's findings revealed that of the 700 or so people who took part in different variations of his studies between 1961 and 1962, very few were truly debriefed.

A true debriefing would have involved explaining that the shocks weren't real and that the other person was not injured. Instead, Milgram's sessions were mainly focused on calming the subjects down before sending them on their way.

Many participants left the experiment in a state of considerable distress. While the truth was revealed to some months or even years later, many were simply never told a thing.

Variations Led to Differing Results

Another problem is that the version of the study presented by Milgram and the one that's most often retold does not tell the whole story. The statistic that 65% of people obeyed orders applied only to one variation of the experiment, in which 26 out of 40 subjects obeyed.

In other variations, far fewer people were willing to follow the experimenters' orders, and in some versions of the study, not a single participant obeyed.

Participants Guessed the Learner Was Faking

Perry even tracked down some of the people who took part in the experiments, as well as Milgram's research assistants. What she discovered is that many of his subjects had deduced what Milgram's intent was and knew that the "learner" was merely pretending.

Such findings cast Milgram's results in a new light. It suggests that not only did Milgram intentionally engage in some hefty misdirection to obtain the results he wanted but that many of his participants were simply playing along.

An analysis of an unpublished study by Milgram's assistant, Taketo Murata, found that participants who believed they were really delivering a shock were less likely to obey, while those who did not believe they were actually inflicting pain were more willing to obey. In other words, the perception of pain increased defiance, while skepticism of pain increased obedience.

A review of Milgram's research materials suggests that the experiments exerted more pressure to obey than the original results suggested. Other variations of the experiment revealed much lower rates of obedience, and many of the participants actually altered their behavior when they guessed the true nature of the experiment.

Impact of the Milgram Experiment

Since there is no way to truly replicate the experiment due to its serious ethical and moral problems, it is impossible to determine whether Milgram's experiment really tells us anything about the power of obedience.

So why does Milgram's experiment maintain such a powerful hold on our imaginations, even decades after the fact? Perry believes that despite all its ethical issues and the problem of never truly being able to replicate Milgram's procedures, the study has taken on the role of what she calls a "powerful parable."

Milgram's work might not hold the answers to what makes people obey or even the degree to which they truly obey. It has, however, inspired other researchers to explore what makes people follow orders and, perhaps more importantly, what leads them to question authority.

Recent findings undermine the scientific validity of the study. Milgram's work is also not truly replicable due to its ethical problems. However, the study has led to additional research on how situational factors can affect obedience to authority.

Milgram’s experiment has become a classic in psychology , demonstrating the dangers of obedience. The research suggests that situational variables have a stronger sway than personality factors in determining whether people will obey an authority figure. However, other psychologists argue that both external and internal factors heavily influence obedience, such as personal beliefs and overall temperament.

Milgram S. Obedience to Authority: An Experimental View. Harper & Row; 1974.

Russell N, Gregory R. The Milgram-Holocaust linkage: challenging the present consensus. State Crim J. 2015;4(2):128-153.

Russell NJC. Milgram's obedience to authority experiments: origins and early evolution. Br J Soc Psychol. 2011;50:140-162. doi:10.1348/014466610X492205

Haslam SA, Reicher SD. Contesting the "nature" of conformity: What Milgram and Zimbardo's studies really show. PLoS Biol. 2012;10(11):e1001426. doi:10.1371/journal.pbio.1001426

Milgram S. Liberating effects of group pressure. J Pers Soc Psychol. 1965;1(2):127-134. doi:10.1037/h0021650

Haslam N, Loughnan S, Perry G. Meta-Milgram: an empirical synthesis of the obedience experiments. PLoS One. 2014;9(4):e93927. doi:10.1371/journal.pone.0093927

Perry G. Deception and illusion in Milgram's accounts of the obedience experiments. Theory Appl Ethics. 2013;2(2):79-92.

Blass T. The Milgram paradigm after 35 years: some things we now know about obedience to authority. J Appl Soc Psychol. 1999;29(5):955-978. doi:10.1111/j.1559-1816.1999.tb00134.x

Burger J. Replicating Milgram: Would people still obey today? Am Psychol. 2009;64(1):1-11. doi:10.1037/a0010932

Elms AC. Obedience lite. Am Psychol. 2009;64(1):32-36. doi:10.1037/a0014473

Miller AG. Reflections on "replicating Milgram" (Burger, 2009). Am Psychol. 2009;64(1):20-27. doi:10.1037/a0014407

Grzyb T, Dolinski D. Beliefs about obedience levels in studies conducted within the Milgram paradigm: Better than average effect and comparisons of typical behaviors by residents of various nations. Front Psychol. 2017;8:1632. doi:10.3389/fpsyg.2017.01632

Caspar EA. A novel experimental approach to study disobedience to authority. Sci Rep. 2021;11(1):22927. doi:10.1038/s41598-021-02334-8

Haslam SA, Reicher SD, Millard K, McDonald R. 'Happy to have been of service': The Yale archive as a window into the engaged followership of participants in Milgram's 'obedience' experiments. Br J Soc Psychol. 2015;54:55-83. doi:10.1111/bjso.12074

Perry G, Brannigan A, Wanner RA, Stam H. Credibility and incredulity in Milgram's obedience experiments: A reanalysis of an unpublished test. Soc Psychol Q. 2020;83(1):88-106. doi:10.1177/0190272519861952

Kendra Cherry, MSEd, is a psychosocial rehabilitation specialist, psychology educator, and author of the "Everything Psychology Book."

The Milgram Experiment: How Far Will You Go to Obey an Order?

Understand the infamous study and its conclusions about human nature


A brief Milgram experiment summary is as follows: In the 1960s, psychologist Stanley Milgram conducted a series of studies on the concepts of obedience and authority. His experiments involved instructing study participants to deliver increasingly high-voltage shocks to an actor in another room, who would scream and eventually go silent as the shocks became stronger. The shocks weren't real, but study participants were made to believe that they were.

Today, the Milgram experiment is widely criticized on both ethical and scientific grounds. However, Milgram's conclusions about humanity's willingness to obey authority figures remain influential and well-known.

Key Takeaways: The Milgram Experiment

  • The goal of the Milgram experiment was to test the extent of humans' willingness to obey orders from an authority figure.
  • Participants were told by an experimenter to administer increasingly powerful electric shocks to another individual. Unbeknownst to the participants, the shocks were fake and the individual being shocked was an actor.
  • The majority of participants obeyed, even when the individual being shocked screamed in pain.
  • The experiment has been widely criticized on ethical and scientific grounds.

Detailed Summary of Milgram’s Experiment

In the most well-known version of the Milgram experiment, the 40 male participants were told that the experiment focused on the relationship between punishment, learning, and memory. The experimenter then introduced each participant to a second individual, explaining that this second individual was participating in the study as well. Participants were told that they would be randomly assigned to roles of "teacher" and "learner." However, the "second individual" was an actor hired by the research team, and the study was set up so that the true participant would always be assigned to the "teacher" role.

During the Milgram experiment, the learner was located in a separate room from the teacher (the real participant), but the teacher could hear the learner through the wall. The experimenter told the teacher that the learner would memorize word pairs and instructed the teacher to ask the learner questions. If the learner responded incorrectly to a question, the teacher would be asked to administer an electric shock. The shocks started at a relatively mild level (15 volts) but increased in 15-volt increments up to 450 volts. (In actuality, the shocks were fake, but the participant was led to believe they were real.)

Participants were instructed to give a higher shock to the learner with each wrong answer. When the 150-volt shock was administered, the learner would cry out in pain and ask to leave the study. He would then continue crying out with each shock until the 330-volt level, at which point he would stop responding.

During this process, whenever participants expressed hesitation about continuing with the study, the experimenter would urge them to go on with increasingly firm instructions, culminating in the statement, "You have no other choice, you must go on." The study ended when participants refused to obey the experimenter’s demand, or when they gave the learner the highest level of shock on the machine (450 volts).

Milgram found that participants obeyed the experimenter at an unexpectedly high rate: 65% of the participants gave the learner the 450-volt shock.

Critiques of the Milgram Experiment

The Milgram experiment has been widely criticized on ethical grounds. Milgram’s participants were led to believe that they acted in a way that harmed someone else, an experience that could have had long-term consequences. Moreover, an investigation by writer Gina Perry uncovered that some participants appear not to have been fully debriefed after the study: they were told months later, or not at all, that the shocks were fake and the learner wasn’t harmed. Milgram’s studies could not be perfectly recreated today, because researchers are now required to pay much more attention to the safety and well-being of human research subjects.

Researchers have also questioned the scientific validity of Milgram’s results. In her examination of the study, Perry found that Milgram’s experimenter may have gone off script and told participants to obey many more times than the script specified. Additionally, some research suggests that participants may have figured out that the learner was not harmed: in interviews conducted after the Milgram experiment, some participants reported that they didn’t think the learner was in any real danger. This mindset is likely to have affected their behavior in the study.

Variations on the Milgram Experiment

Milgram and other researchers conducted numerous versions of the experiment over time. The participants' levels of compliance with the experimenter’s demands varied greatly from one study to the next. For example, when participants were in closer proximity to the learner (e.g. in the same room), they were less likely to give the learner the highest level of shock.

Another version of the Milgram experiment brought three "teachers" into the experiment room at once. One was a real participant, and the other two were actors hired by the research team. During the experiment, the two non-participant teachers would quit as the level of shocks began to increase. Milgram found that these conditions made the real participant far more likely to "disobey" the experimenter, too: only 10% of participants gave the 450-volt shock to the learner.

In yet another version of the Milgram experiment, two experimenters were present, and during the experiment, they would begin arguing with one another about whether it was right to continue the study. In this version, none of the participants gave the learner the 450-volt shock.

Replicating the Milgram Experiment

Researchers have sought to replicate Milgram's original study with additional safeguards in place to protect participants. In 2009, Jerry Burger replicated Milgram’s famous experiment at Santa Clara University: the highest shock level was 150 volts, and participants were told that the shocks were fake immediately after the experiment ended. Additionally, participants were screened by a clinical psychologist before the experiment began, and those found to be at risk of a negative reaction to the study were deemed ineligible to participate.

Burger found that participants obeyed at similar levels as Milgram’s participants: 82.5% of Milgram’s participants gave the learner the 150-volt shock, and 70% of Burger’s participants did the same.

The Legacy of the Milgram Experiment

Milgram’s interpretation of his research was that everyday people are capable of carrying out unthinkable actions in certain circumstances. His research has been used to explain atrocities such as the Holocaust and the Rwandan genocide, though these applications are by no means widely accepted or agreed upon.

Importantly, not all participants obeyed the experimenter’s demands, and Milgram’s studies shed light on the factors that enable people to stand up to authority. In fact, as sociologist Matthew Hollander writes, we may be able to learn from the participants who disobeyed, as their strategies may enable us to respond more effectively to an unethical situation. The Milgram experiment suggested that human beings are susceptible to obeying authority, but it also demonstrated that obedience is not inevitable.

  • Baker, Peter C. “Electric Schlock: Did Stanley Milgram's Famous Obedience Experiments Prove Anything?” Pacific Standard (2013, Sep. 10). https://psmag.com/social-justice/electric-schlock-65377
  • Burger, Jerry M. "Replicating Milgram: Would People Still Obey Today?" American Psychologist 64.1 (2009): 1-11. http://psycnet.apa.org/buy/2008-19206-001
  • Gilovich, Thomas, Dacher Keltner, and Richard E. Nisbett. Social Psychology. 1st edition, W.W. Norton & Company, 2006.
  • Hollander, Matthew. “How to Be a Hero: Insight From the Milgram Experiment.” HuffPost Contributor Network (2015, Apr. 29). https://www.huffingtonpost.com/entry/how-to-be-a-hero-insight-_b_6566882
  • Jarrett, Christian. “New Analysis Suggests Most Milgram Participants Realised the ‘Obedience Experiments’ Were Not Really Dangerous.” The British Psychological Society: Research Digest (2017, Dec. 12). https://digest.bps.org.uk/2017/12/12/interviews-with-milgram-participants-provide-little-support-for-the-contemporary-theory-of-engaged-followership/
  • Perry, Gina. “The Shocking Truth of the Notorious Milgram Obedience Experiments.” Discover Magazine Blogs (2013, Oct. 2). http://blogs.discovermagazine.com/crux/2013/10/02/the-shocking-truth-of-the-notorious-milgram-obedience-experiments/
  • Romm, Cari. “Rethinking One of Psychology's Most Infamous Experiments.” The Atlantic (2015, Jan. 28). https://www.theatlantic.com/health/archive/2015/01/rethinking-one-of-psychologys-most-infamous-experiments/384913/

Author Interviews

Taking A Closer Look At Milgram's Shocking Obedience Study


In the early 1960s, Stanley Milgram, a social psychologist at Yale, conducted a series of experiments that became famous. Unsuspecting Americans were recruited for what purportedly was an experiment in learning. A man who pretended to be a recruit himself was wired up to a phony machine that supposedly administered shocks. He was the "learner." In some versions of the experiment he was in an adjoining room.

The unsuspecting subject of the experiment, the "teacher," read lists of words that tested the learner's memory. Each time the learner got one wrong, which he intentionally did, the teacher was instructed by a man in a white lab coat to deliver a shock. With each wrong answer the voltage went up. From the other room came recorded and convincing protests from the learner — even though no shock was actually being administered.

The results of Milgram's experiment made news and contributed a dismaying piece of wisdom to the public at large: It was reported that almost two-thirds of the subjects were capable of delivering painful, possibly lethal shocks, if told to do so. We are as obedient as Nazi functionaries.

Or are we? Gina Perry, a psychologist from Australia, has written Behind the Shock Machine: The Untold Story of the Notorious Milgram Psychology Experiments. She has been retracing Milgram's steps, interviewing his subjects decades later.

"The thought of quitting never ... occurred to me," study participant Bill Menold told Perry in an Australian radio documentary. "Just to say: 'You know what? I'm walking out of here' — which I could have done. It was like being in a situation that you never thought you would be in, not really being able to think clearly."

In his experiments, Milgram was "looking to investigate what it was that had contributed to the brainwashing of American prisoners of war by the Chinese [in the Korean war]," Perry tells NPR's Robert Siegel.

Interview Highlights

On turning from an admirer of Milgram to a critic

"That was an unexpected outcome for me, really. I regarded Stanley Milgram as a misunderstood genius who'd been penalized in some ways for revealing something troubling and profound about human nature. By the end of my research I actually had quite a very different view of the man and the research."


On the many variations of the experiment

"Over 700 people took part in the experiments. When the news of the experiment was first reported, and the shocking statistic that 65 percent of people went to maximum voltage on the shock machine was reported, very few people, I think, realized then and even realize today that that statistic applied to 26 of 40 people. Of those other 700-odd people, obedience rates varied enormously. In fact, there were variations of the experiment where no one obeyed."

On how Milgram's study coincided with the trial of Nazi officer Adolf Eichmann — and how the experiment reinforced what Hannah Arendt described as "the banality of evil"

"The Eichmann trial was a televised trial and it did reintroduce the whole idea of the Holocaust to a new American public. And Milgram very much, I think, believed that Hannah Arendt's view of Eichmann as a cog in a bureaucratic machine was something that was just as applicable to Americans in New Haven as it was to people in Germany."

On the ethics of working with human subjects

"Certainly for people in academia and scholars the ethical issues involved in Milgram's experiment have always been a hot issue. They were from the very beginning. And Milgram's experiment really ignited a debate particularly in social sciences about what was acceptable to put human subjects through."

Gina Perry is an Australian psychologist. She has previously written for The Age and The Australian. (Photo: Chris Beck/Courtesy of The New Press)

On conversations with the subjects, decades after the experiment

"[Bill Menold] doesn't sound resentful. I'd say he sounds thoughtful and he has reflected a lot on the experiment and the impact that it's had on him and what it meant at the time. I did interview someone else who had been disobedient in the experiment but still very much resented 50 years later that he'd never been de-hoaxed at the time and he found that really unacceptable."

On the problem that one of social psychology's most famous findings cannot be replicated

"I think it leaves social psychology in a difficult situation. ... it is such an iconic experiment. And I think it really leads to the question of why it is that we continue to refer to and believe in Milgram's results. I think the reason that Milgram's experiment is still so famous today is because in a way it's like a powerful parable. It's so widely known and so often quoted that it's taken on a life of its own. ... This experiment and this story about ourselves plays some role for us 50 years later."


A Matter of Obedience?


Obedience: The Milgram Experiment

Three decades before Christopher Browning completed his study of Police Battalion 101 (see reading, Reserve Police Battalion 101 ), a psychologist at Yale University named Stanley Milgram also tried to better understand why so many individuals participated in the brutality and mass murder of the Holocaust. 

In the 1960s, Milgram conducted an experiment designed “to see how far a person will proceed in a concrete and measurable situation in which he is ordered to inflict increasing pain on a protesting victim.” 1 Joseph Dimow was one of the people who unknowingly took part in that experiment. In 2004, he described the experience:

Like many others in the New Haven area, I answered an ad seeking subjects for the experiment and offering five dollars, paid in advance, for travel and time. At the Yale facility, I met a man . . . in a white coat and horn-rimmed glasses. He led me into a room filled with an impressive display of electrical equipment. A second man was introduced to me as another subject for the experiment, and together we were told that the experiment was to test the widely held belief that people learn by punishment. In this case, one of us would be a “learner” and the other a “teacher.” The teacher would read a list of paired words . . . and then repeat the first word of the pair. If the learner did not respond with the correct second word, the teacher would deliver a “mild” electric shock to the learner as punishment. . . . The “professor” said we would draw straws to see which of us would be the learner. He offered the straws to the other man [and] then [the man] announced that he had drawn the short straw and would be the learner . . . The learner, said the professor, would be in an adjoining room, out of my sight, and strapped to a chair so that his arms could not move—this so that the learner could not jump around and damage the equipment or do harm to himself. I was to be seated in front of a console marked with lettering colored yellow for “Slight Shock” (15 volts) up to purple for “Danger: Severe Shock” (450 volts). The shocks would increase by 15-volt increments with each incorrect answer. 2

In fact, the “learner” was an actor hired by Milgram. Dimow, the “teacher,” was the person Milgram and his team were studying. Social scientists Nestar Russell and Robert Gregory explain how the experiment was set up:

Once the experiment has started, the participant [the “teacher”] is soon required to deliver shocks of increasing intensity. In fact, no shocks at all are being administered, though the participant does not know this. As the “shocks” increase in intensity, the ostensible pain being experienced by the learner also becomes increasingly apparent by way of shouts and protests (actually via a tape recording) emanating from behind a partition that visually separates the teacher from the learner. For example, at 120 volts the learner is heard to say “Ugh! Hey, this really hurts!” Typically, the participants express their concern over the learner’s well-being. Yet the experimenter continues to insist “The experiment requires that you continue,” “You have no other choice, you must go on.” Such commands were designed to generate feelings of tension—what Milgram called strain—within the participant. If the participant continued to obey these strain-producing commands to the 270-volt level, the learner, in obvious agony, was heard to scream, “Let me out of here. Let me out of here. Let me out of here. Let me out. Do you hear? Let me out of here!” At the 300-volt level, the learner refuses to answer and instead responds with agonized screams. The experimenter commands the participant to treat further unanswered questions as incorrect and accordingly to inflict the next level of shock. After a 330-volt shock has been administered, the learner suddenly falls silent. The participant is again ordered to treat any further unanswered questions as incorrect and to continue administering shocks of increasing voltage. Once the participant has administered three successive shocks of 450 volts, the experimenter stops the process. 3

After a session of the experiment was complete, Milgram’s team revealed to the participant that he or she had been deceived, and they brought the “learner” into the room so that the participant could see that he had not been harmed. Regardless, this deception, in which the subject of an experiment is tricked into believing that he or she is harming another individual, is widely considered to be unethical today. At the time, when Milgram described this experiment to a group of 39 psychiatrists, the psychiatrists predicted that one participant in 1,000 would continue until he or she delivered the most severe shock, 450 volts. In reality, 62.5% of participants did. 

By varying the setup of his experiment, Milgram observed a relationship between the distance separating the teacher and learner and the willingness of the teacher to generate more severe shocks. When the teacher was required to touch the learner by forcing the learner’s hand onto the plate from which the shock was delivered, 30% of the teachers proceeded to the most severe shock. When the teacher did not touch the learner but remained in the same room, obedience to go all the way increased to 40%. When the teachers were placed in a separate room from which they could hear the voice of the learner but not see him, obedience increased to 62.5%. When the learner did not speak but only banged on the wall to indicate distress, obedience increased to 65%. When the teacher could neither see nor hear the learner at all, obedience reached almost 100%. 4 Milgram tested other variations in which the distance between the experimenter and the teacher changed. He found that the farther the distance between experimenter and teacher, the less likely the teacher was to obey. Milgram concluded that the experiment forced the teacher to decide between two stressful situations: inflicting pain on another person and disobeying authority. The closeness of the learner and the experimenter to the teacher affected the teacher’s choice: “In obeying, the participants were mainly concerned about alleviating their own, rather than the learner’s, stressful situation.” 5
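
To see the gradient described in this paragraph at a glance, here is a minimal sketch (Python, not drawn from any of the sources quoted here) that simply tabulates the obedience rates reported above by the physical closeness of teacher and learner. The condition names are informal shorthand rather than Milgram's own labels, and the final figure stands in for the passage's "almost 100%."

```python
# Obedience rates by the physical closeness of teacher and learner,
# as reported in the passage above (informal condition names).
proximity_conditions = [
    ("Touch: teacher forces learner's hand onto the shock plate", 0.30),
    ("Same room: teacher sees and hears the learner",             0.40),
    ("Voice only: learner heard through the wall but not seen",   0.625),
    ("Wall-banging only: no verbal protest heard",                0.65),
    ("Neither seen nor heard",                                    1.00),  # "almost 100%" in the text
]

for condition, rate in proximity_conditions:
    print(f"{rate:>6.1%}  {condition}")
```

Reading down the list makes the pattern Milgram reported immediate: the further the learner's suffering receded from the teacher's senses, the higher the proportion of teachers who went all the way to 450 volts.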

In interpreting the implications of Milgram’s research, many, including Milgram himself, focused on the effect of physical closeness between perpetrator and victim on the willingness of one person to harm another. Sociologist Zygmunt Bauman writes: 

It is difficult to harm a person we touch. It is somewhat easier to afflict pain upon a person we only see at a distance. It is still easier in the case of a person we only hear. It is quite easy to be cruel towards a person we neither see nor hear. 6

But others who study Milgram’s work argue that focusing primarily on physical distance leaves out other important factors suggested by the experiment. Russell and Gregory argue that “emotional distance” is an equally important factor. In their analysis of the Milgram experiments, they write:

Although the . . . learner was deliberately chosen as a likable, middle-aged man, and although many participants expressed strong concern about his apparent plight—and were relieved to be reconciled with him at the end of the experiment—he was a stranger to them. Milgram speculated that obedience rates may have been even higher had the learner been presented as “a brutal criminal or a pervert”; but obedience rates may also have been much lower overall had the learner been a loved member of the participant’s family, a friend, or even an acquaintance. So Milgram confirmed what most people instinctively know—that it is far easier to maltreat others if they are personal strangers, even easier to do so if they are cultural strangers, and especially if we engage in rationalization processes of self-deception that serve to dehumanize them. 7

Russell and Gregory also believe that the way the harm is inflicted would affect the willingness of individuals to do it. In their analysis of the Milgram experiments, they point out that the shock generator was a technological and indirect way for the teacher to inflict pain; in most variations, teachers flicked a switch rather than using “direct physical force.” Russell and Gregory ask: “How far would Milgram’s participants have gone if they had been required personally to beat, bludgeon, or whip the learner, ultimately to the point of unconsciousness or beyond?” 8

Milgram’s experiments provide insights that help us understand the choices and motivations of many who participated in the Nazi programs of persecution and mass murder. But many historians and social scientists who have studied the Holocaust say that Milgram’s work does not fully explain the behavior of perpetrators in the Holocaust. While many acted in response to orders from authority figures, some perpetrators chose to go beyond the orders they were given. Others chose to act out of their own hatred or for their own material gain without being asked to do so. Even within the German government and military, leaders and bureaucrats took initiative and devised creative methods to achieve larger goals, not in response to orders but in an effort to “work toward the Führer” (see reading, Working Toward the Führer in Chapter 5). 

  • 1 Stanley Milgram, Obedience to Authority: An Experimental View (New York: Harper & Row, 1974), 3–4.
  • 2 Joseph Dimow, “Resisting Authority: A Personal Account of the Milgram Obedience Experiments,” Jewish Currents (January 2004), accessed June 18, 2016.
  • 3 Nestar Russell and Robert Gregory, “Making the Undoable Doable: Milgram, the Holocaust, and Modern Government,” American Review of Public Administration 35, no. 4 (December 4, 2005), 328–329.
  • 4 Nestar Russell and Robert Gregory, “Making the Undoable Doable: Milgram, the Holocaust, and Modern Government,” American Review of Public Administration 35, no. 4 (December 4, 2005), 330.
  • 5 Nestar Russell and Robert Gregory, “Making the Undoable Doable: Milgram, the Holocaust, and Modern Government,” American Review of Public Administration 35, no. 4 (December 4, 2005), 331.
  • 6 Zygmunt Bauman, Modernity and the Holocaust (Hoboken, NJ: Wiley, 2013), 155.
  • 7 Nestar Russell and Robert Gregory, “Making the Undoable Doable: Milgram, the Holocaust, and Modern Government,” American Review of Public Administration 35, no. 4 (December 4, 2005), 332.
  • 8 Nestar Russell and Robert Gregory, “Making the Undoable Doable: Milgram, the Holocaust, and Modern Government,” American Review of Public Administration 35, no. 4 (December 4, 2005), 333–334.

Connection Questions

  • What encourages obedience? What factors do the Milgram experiments suggest? What factors do these experiments leave out?
  • How do the Milgram experiments explain aspects of perpetrators' actions in the Holocaust? What do the experiments fail to explain?
  • What situation caused “feelings of tension” in participants in the Milgram experiments? What role did the distance between “teacher” and “learner” play in creating these feelings? What role did the distance between “teacher” and “experimenter” play? 
  • What is the difference between physical distance and “emotional” distance? According to Russell and Gregory, what difference might the emotional distance between “teacher” and “learner” make in the willingness of the “teacher” to harm the “learner”? What might have created emotional distance between perpetrators and victims during the Holocaust?
  • Sociologist Zygmunt Bauman writes: “The most frightening news brought about by the Holocaust and by what we learned of its perpetrators was not the likelihood that ‘this’ could be done to us, but the idea that we could do it.” 1  Do you agree that everyone has the potential to become a perpetrator? What do the Milgram experiments suggest about the aspects of human behavior that could make it possible for us to willingly inflict pain on others?
  • Some who played a role in mass murder during the Holocaust later tried to explain their actions by saying that they were simply obeying the orders of authority figures. Historian Daniel Goldhagen warns that this sort of “blind obedience” is not a sufficient explanation, because it leaves out the extreme form of antisemitism that he believes motivated the German killers. He writes that individuals will only obey orders that are consistent with the values and morals they already hold. 2  What does he mean? What is he suggesting about the moral values and beliefs of perpetrators, such as the members of Police Battalion 101? Based on what you have learned so far, do you agree or disagree with Goldhagen? 
  • 1 Zygmunt Bauman, Modernity and the Holocaust (Hoboken, NJ: Wiley, 2013), 152.
  • 2 Daniel Goldhagen, Hitler's Willing Executioners: Ordinary Germans and the Holocaust (London: Knopf, 1996), 383.


How to Cite This Reading

Facing History & Ourselves, “A Matter of Obedience?”, last updated August 2, 2016.



IResearchNet

Stanley Milgram’s Experiment

Stanley Milgram was one of the most influential social psychologists of the twentieth century. Born in 1933 in New York, he obtained a BA from Queens College and went on to receive a PhD in psychology from Harvard. Subsequently, Milgram held faculty positions in psychology at Yale University and the City University of New York until his untimely death in 1984. Although Milgram never held a formal appointment in sociology, his work was centrally focused on the social psychological aspects of social structure.


In a historic coincidence, in 1961, just as Milgram was about to begin work on his famous obedience experiments, the world witnessed the trial of Adolf Eichmann, a high-ranking Nazi official who was in charge of organizing the transport of millions of Jews to the death camps. To many, Eichmann appeared not at all to be the fervent antisemite that many had suspected him to be; rather, his main defense was that he was only “following orders” as an administrator. To the political theorist Hannah Arendt, Eichmann’s case illustrated the “banality of evil,” in which personal malice appeared to matter less than the desire of individuals to fulfill their roles in the larger context of a bureaucracy. Milgram’s research is arguably the most striking example to illustrate this dynamic.

Milgram planned and conducted his obedience experiments between 1960 and 1963 at Yale University. In order to be able to study obedience to authority, he put unsuspecting research participants in a novel situation, which he staged in the laboratory. With the help of actors and props, Milgram set up an experimental ruse that was so real that hardly any of his research participants suspected that, in reality, nothing was what it pretended to be.

For this initial study, using newspaper ads promising $4.50 for participation in a psychological study, Milgram recruited men aged 20 to 50, ranging from elementary school dropouts to PhDs. Each research participant arrived in the lab along with another man, white and roughly 30 years of age, whom they thought to be another research participant. In reality, this person was a confederate, that is, an actor in cahoots with the experimenter. The experimenter explained that both men were about to take part in a study that explored the effect of punishment on memory. One man would assume the role of a “teacher” who would read a series of word pairings (e.g., nice day, blue box), which the other (“the learner”) was supposed to memorize. Subsequently, the teacher would read the first word of the pair with the learner having to select the correct second word from a list. Every mistake by the learner would be punished with an electric shock. It was further made clear that, although the shocks would be painful, they would not do any permanent harm.

Following this explanation, the experimenter assigned both men to their roles. Because the procedure was rigged, the unsuspecting research participant was always assigned to the role of teacher. As a first order of business, the learner was seated in an armchair in an adjoining room such that he would be separated by a wall from the teacher but would otherwise be able to hear him from the main room. Electrodes were affixed to the learner’s arms, and he was subsequently strapped to the chair, apparently to make sure that improper movements would not endanger the success of the experiment.

In the main room, the teacher was told that he would have to apply electric shocks every time the learner made a mistake. For this purpose, the teacher was seated in front of an electric shock generator. The experimenter instructed the teacher to steadily increase the voltage of the shock each time the learner made a new mistake. The shock generator showed a row of levers ranging from 15 volts on the left to 450 volts on the right, with each lever in between delivering a shock 15 volts higher than its neighbor on the left. Milgram labeled the voltage levels, left to right, from “Slight Shock” to “Danger: Severe Shock,” with the last two switches being marked “XXX.” The teacher was told that he should simply work his way from left to right without using any lever twice. To give the teacher an idea of the electric current he would deliver to the learner, he received a sample shock of 45 volts, which most research participants found surprisingly painful. However, despite its appearance, the generator never actually emitted any electric shocks. It was merely a device that allowed Milgram to examine how far the teacher would go in harming another person based on the experimenter’s say-so.
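
As a quick illustration of the arithmetic behind the panel just described, the sketch below (Python, purely illustrative and not part of any of the sources quoted here) enumerates the 30 levers implied by a 15-volt spacing from 15 to 450 volts. The exact grouping of the verbal labels is an assumption for display purposes; only the end labels and the “XXX” marking on the last two switches come from the text.

```python
# Enumerate the shock levels implied by the description above:
# levers from 15 V to 450 V in 15 V steps, i.e., 30 levers in total.
shock_levels = list(range(15, 451, 15))
assert len(shock_levels) == 30

def label(volts: int) -> str:
    """Return a rough verbal label for a lever.

    The cutoffs below are illustrative assumptions; the passage only states
    that the scale ran from "Slight Shock" up to "Danger: Severe Shock",
    with the last two switches marked "XXX".
    """
    if volts >= 435:   # the last two switches (435 V and 450 V)
        return "XXX"
    if volts >= 375:   # assumed start of the "Danger" group
        return "Danger: Severe Shock"
    if volts <= 60:    # assumed extent of the "Slight Shock" group
        return "Slight Shock"
    return "(intermediate label)"

for v in shock_levels:
    print(f"{v:3d} V  {label(v)}")
```

Running the sketch simply prints the ladder of 30 voltages, which is all the arithmetic the passage relies on.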

As learning trials started, the teacher applied electric shocks to the learner. The learner’s responses were scripted such that he apparently made many mistakes, requiring the teacher to increase shock levels by 15 volts with every new mistake. As the strength of electric shocks increased, occasional grunts and moans of pain were heard from the learner. At 120 volts the learner started complaining about the pain. At 150 volts, the learner demanded to be released on account of a heart condition, and the protest continued until the shocks reached 300 volts and the learner started pounding on the wall. At 315 volts the learner stopped responding altogether.

As the complaints by the learner started, the teacher would often turn to the experimenter, who was seated at a nearby desk, wondering whether and how to proceed. The experimenter, instead of terminating the experiment, replied with a scripted succession of prods:

  • Prod 1: “Please continue.”
  • Prod 2: “The experiment requires that you continue.”
  • Prod 3: “It is absolutely necessary to continue.”
  • Prod 4: “You have no other choice: you must go on.”

These prods were successful in coaxing many teachers into continuing to apply electric shocks even when the learner no longer responded to the word-memory questions. Indeed, in the first of Milgram’s experiments, a stunning 65 percent of all participants continued all the way to 450 volts, and not a single participant refused to continue the shocks before they reached the 300-volt level! The high levels of compliance illustrate the powerful effect of the social structure that participants had entered. By accepting the role of teacher in the experiment in exchange for the payment of a nominal fee, participants had agreed to accept the authority of the experimenter and carry out his instructions. In other words, just as Milgram suspected, the social forces of hierarchy and obedience could push normal and well-adjusted individuals into harming others.

The overall level of obedience, however, does not reveal the tremendous amount of stress that all teachers experienced. Because the situation was extremely realistic, teachers agonized over whether or not to continue the electric shocks. Should they care for the well-being of the obviously imperiled learner, even at the risk of his life? Or should they defer to a legitimate authority figure, who presented his instructions crisply and confidently? Participants typically sought to resolve this conflict by seeking assurances that the experimenter, and not they themselves, would accept full responsibility for their actions. Once they felt assured, they typically continued to apply shocks that, had they been real, might well have electrocuted the learner.

Milgram expanded his initial research into a series of 19 experiments in which he carefully examined the conditions under which obedience would occur. For instance, the teacher’s proximity to the learner was an important factor in lowering obedience, that is, the proportion of people willing to deliver the full 450 volts. When the teacher was in the same room with the learner, obedience dropped to 40 percent, and when the teacher was required to touch the learner and apply physical force to deliver the electric shock, obedience dropped to 30 percent.

Milgram further suspected that the social status of the experimenter, presumably a serious Yale University researcher in a white lab coat, would have important implications for obedience. Indeed, when there was no obvious connection with Yale and the experiment was repeated in a run-down office building in Bridgeport, Connecticut, obedience dropped to 48 percent. Likewise, when not the white-coated experimenter but another confederate encouraged the teacher to continue the shocks, all participants terminated the experiment as soon as the confederate complained. Milgram concluded that “a substantial proportion of people do what they are told to do, irrespective of the content of the act and without limitations of conscience, so long as they perceive that the command comes from a legitimate authority” (1965). However, additional studies highlighted that obedience is in part contingent on surveillance. When the experimenter transmitted his orders not in person but via telephone, obedience levels dropped to 20 percent, with many participants only pretending to apply higher and higher electric shocks.

Since its initial publication in 1963, Milgram’s research has drawn a great deal of criticism, mainly on ethical grounds. First, it was alleged that it was unethical to deceive participants to the extent that occurred in these studies. It is important to note that all participants were fully debriefed about the deception, and most did not seem to mind and were relieved to find out that they had not shocked the learner. The second ethical criticism is, however, much more serious. As alluded to earlier, Milgram exposed his participants to tremendous levels of stress. Milgram, anticipating this criticism, interviewed participants after the experiment and followed up several weeks later. The overwhelming majority of his participants commented that they enjoyed being in the experiment, and only a small minority experienced regret. Even though Milgram personally rejected allegations of having mistreated his participants, his own work suggests that he may have gone too far: “Subjects were observed to sweat, tremble, bite their lips, groan, and dig their fingernails into their flesh . . . A mature and initially poised businessman entered the laboratory smiling and confident. Within 20 minutes, he was reduced to a twitching, stuttering wreck who was rapidly approaching a point of nervous collapse” (1963: 375). Today, Milgram’s obedience studies are generally considered unethical and would not pass muster under contemporary regulations protecting the well-being of research participants. Ironically, partly because Milgram’s studies illustrated the power of hierarchical social relationships, contemporary researchers are at great pains to avoid coercion and to allow participants to terminate their participation in any research study at any time without penalty.

Another type of criticism of the obedience studies has questioned their generality and charged that their usefulness in explaining real world events is limited. Indeed, Milgram conducted his research when trust in authorities was higher than it is nowadays. However, Milgram’s studies have withstood this criticism. Reviews of research conducted using Milgram’s paradigm have generally found obedience levels to be at roughly 60 percent (see, e.g., Blass 2000). In one of his studies Milgram further documented that there was no apparent difference in the responses of women and men. More recent research using more ethically acceptable methods further testifies to the power of obedience in shaping human action (Blass 2000).

Milgram offers an important approach to explaining the Holocaust by emphasizing the bureaucratic nature of evil, which reduced individuals to executors of orders issued by a legitimate authority. Sociologists have extended this analysis and provided compelling accounts of obedience as a root cause of many horrific crimes, ranging from the My Lai massacre to Watergate (Hamilton & Kelman 1989). However, it is arguably somewhat unclear to what extent Milgram’s findings can help explain the occurrence of the Holocaust itself. Whereas obedience kept the machinery of death running with frightening efficiency, historians often caution against ignoring the malice and sadism that many of Hitler’s executioners brought to the task (see Blass 2004).

Milgram’s dramatic experiments have left a lasting impression beyond the social sciences. They are the topic of various films, including the 1975 TV film The Tenth Level starring William Shatner. Further, Peter Gabriel’s 1986 song “We Do What We’re Told (Milgram’s 37)” took its title from the experiments.

References:

  • Blass, T. (Ed.) (2000) Obedience to Authority: Current Perspectives on the Milgram Paradigm. Erlbaum, Mahwah, NJ.
  • Blass, T. (2004) The Man Who Shocked the World: The Life and Legacy of Stanley Milgram. Basic Books, New York.
  • Hamilton, V. L. & Kelman, H. (1989) Crimes of Obedience: Toward a Social Psychology of Authority and Responsibility. Yale University Press, New Haven.
  • Milgram, S. (1963) Behavioral Study of Obedience. Journal of Abnormal and Social Psychology 67: 371-378.
  • Milgram, S. (1965) Some Conditions of Obedience and Disobedience to Authority. Human Relations 18: 57-76.
  • Milgram, S. (1974) Obedience to Authority: An Experimental View. Harper & Row, New York.

Milgram’s Experiments on Obedience to Authority

  • Stephen Gibson, Heriot-Watt University, School of Social Sciences
  • https://doi.org/10.1093/acrefore/9780190236557.013.511
  • Published online: 30 June 2020

Stanley Milgram’s experiments on obedience to authority are among the most influential and controversial social scientific studies ever conducted. They remain staples of introductory psychology courses and textbooks, yet their influence reaches far beyond psychology, with myriad other disciplines finding lessons in them. Indeed, the experiments have long since broken free of the confines of academia, occupying a place in popular culture that is unrivaled among psychological experiments. The present article begins with an overview of Milgram’s account of his experimental procedure and findings, before focussing on recent scholarship that has used materials from Milgram’s archive to challenge many of the long-held assumptions about the experiments. Three areas in which our understanding of the obedience experiments has undergone a radical shift in recent years are the subject of particular focus. First, work that has identified new ethical problems with Milgram’s studies is summarized. Second, hitherto unknown methodological variations in Milgram’s experimental procedures are considered. Third, the interactions that took place in the experimental sessions themselves are explored. This work has contributed to a shift in how we see the obedience experiments. Rather than viewing the experiments as demonstrations of people’s propensity to follow orders, it is now clear that people did not follow orders in Milgram’s experiments. The experimenter did a lot more than simply issue orders, and when he did, participants found it relatively straightforward to defy them. These arguments are discussed in relation to the definition of obedience that has typically been adopted in psychology, the need for further historical work on Milgram’s experiments, and the possibilities afforded by the development of a broader project of secondary qualitative analysis of laboratory interaction in psychology experiments.

Keywords: experimentation, interaction, standardization


The Stanley Milgram Experiment: Understanding Obedience

May 3, 2023

Discover the intriguing Stanley Milgram Experiment, exploring obedience to authority & human nature. Uncover shocking results & timeless insights.

Main, P (2023, May 03). The Stanley Milgram Experiment: Understanding Obedience. Retrieved from https://www.structural-learning.com/post/stanley-milgram-experiment

What was the Stanley Milgram experiment?

The Stanley Milgram experiment is one of the most famous and controversial studies in the history of psychology. Conducted in the early 1960s, it examined people's willingness to obey an authority figure, even when that obedience caused harm to others. In this article, we'll take a closer look at the Milgram experiment, its significance, and its impact on psychology.

The Milgram experiment was designed to test people's willingness to obey authority, even when that obedience caused harm to others. The study involved three roles: the experimenter, the learner, and the teacher. The learner was actually a confederate of the experimenter, and the teacher was the real participant.

The teacher was instructed to administer electric shocks to the learner whenever the learner gave a wrong answer to a question. The shocks started at a low level and increased in intensity with each wrong answer. The learner was not actually receiving shocks, but they pretended to be in pain and begged the teacher to stop. Despite this, the experimenter instructed the teacher to continue shocking the learner.

The results of the Milgram experiment were shocking. Despite the learner's protests, the majority of participants continued to administer shocks to the maximum level, even when they believed that the shocks were causing serious harm.

The Milgram experiment is perhaps one of the most well-known experiments on obedience in psychology. Milgram's original study involved 40 participants who were instructed to deliver electric shocks to a confederate, who pretended to be receiving shocks.

The shocks were delivered via a "shock machine" and ranged in severity from slight shocks to severe shocks. Despite the confederate's cries of pain and protest, the majority of participants continued to administer shocks up to the maximum level, demonstrating high rates of obedience to authority figures.

Milgram's experiments on obedience generated a great deal of interest and controversy in the scientific community. The results of his study challenged commonly held beliefs about human behavior and the limits of individual autonomy. The study also raised important ethical concerns and spurred a renewed focus on informed consent and debriefing in behavioral research.

In subsequent variations of the experiment, Milgram sought to explore the factors that influenced obedience rates, such as the presence of peers or the proximity of the authority figure. These variations provided further insight into the complex nature of obedience and social influence.

The Milgram experiment remains a significant and influential study in the field of social psychology, providing valuable insights into the power of authority and the limits of individual autonomy. Despite its ethical concerns, Milgram's study continues to be discussed and debated by scholars and students alike, highlighting the enduring impact of this groundbreaking behavioral study.

Who was Stanley Milgram?

Stanley Milgram was a renowned American social psychologist who was born in New York City in 1933. He received his PhD in Social Psychology from Harvard University in 1960 and went on to teach at Yale University, where he conducted his famous obedience experiments. Milgram's research focused on the areas of personality and social psychology, and he is best known for his studies on obedience to authority figures.

Milgram's obedience experiments were controversial and sparked a great deal of debate in the field of psychology. His research showed that ordinary people were capable of inflicting harm on others when instructed to do so by an authority figure. Milgram's work had a profound impact on the field of social psychology and influenced other researchers, such as Philip Zimbardo, to study similar topics.

Milgram's contributions to the field of social psychology were significant, and his obedience experiments remain some of the most well-known and widely discussed studies in the history of psychology. Despite the controversy surrounding his work, Milgram's research continues to be taught in psychology courses around the world and has had a lasting impact on our understanding of obedience, authority, and human behavior.

Stanley Milgram with shock generator

Milgram's Independent Variables

As we have seen, in Stanley Milgram's famous experiment conducted at Yale University in the 1960s, he sought to investigate the extent to which ordinary people would obey the commands of an authority figure, even if it meant administering severe electric shocks to another person.

The study of obedience to authority figures was a fundamental aspect of Milgram's research in social psychology. To explore this phenomenon, Milgram manipulated several independent variables across variations of his experiment, while the key outcome he measured was the level of shock participants were willing to administer, from slight shocks up to the most severe shocks on the labeled scale.

One independent variable was the proximity of the authority figure, which varied from physical presence in the room to remote instruction via telephone.

Additionally, the presence or absence of social pressure from others and the authority figure's attire, which varied between a lab coat and everyday clothing, were also manipulated.

Through these carefully controlled variations, Milgram examined how obedience rates changed in response to the concrete situation created in his experiment.

Change of Location

One significant factor that influenced the results of the Milgram experiment was the change of location. Originally conducted at Yale University, the experiment was later moved to a set of run-down offices in Bridgeport, Connecticut. This change had a profound impact on the rates of obedience observed in the study.

In the original experiment at Yale University, the obedience rates were shockingly high, with approximately 65% of participants following the instructions of the authority figure to administer what they believed to be increasingly severe electric shocks to another person. However, when the experiment was relocated to the less prestigious and less authoritative setting of run-down offices, the obedience rates dropped significantly to 47.5%.

This change in location shifted the dynamic of the experiment. In the less prestigious environment, participants were less likely to view the authority figure as credible or legitimate: the run-down offices appeared less official and may therefore have weakened the experimenter's perceived authority. This resulted in a lower level of obedience among the participants.

The change in location in the Milgram experiment demonstrated the influence of contextual factors on obedience rates. It highlighted how obedience to authority figures can be influenced by the specific setting in which individuals find themselves. The study serves as a reminder that obedience is not solely determined by individual characteristics but is also shaped by situational factors such as the environment and perceived authority.

In conclusion, the change of location from Yale University to run-down offices had a significant impact on the obedience rates in the Milgram experiment. The move resulted in a drop in obedience, suggesting that the context in which the experiment took place influenced participants' responses to authority.

The Role of the Experimenter's Uniform

One important aspect of Stanley Milgram's obedience experiment was the role of the experimenter's uniform, specifically the lab coat. The uniform or attire worn by the authority figure in the experiment played a significant role in influencing obedience levels among the participants.

The lab coat served as a symbol of authority and expertise, creating a sense of credibility and legitimacy for the experimenter. By wearing the lab coat, the authority figure appeared more knowledgeable and trustworthy, which influenced participants to follow their instructions more readily.

The uniform also helped establish a clear power dynamic between the authority figure and the participants. The experimenter's attire reinforced the perception of being in a formal and professional setting, where obedience to authority was expected.

Milgram's experiment included variations to the uniform to examine its impact on obedience levels. In some versions of the experiment, the experimenter wore regular clothing instead of the lab coat. This modification significantly reduced the perceived authority of the experimenter, leading to lower levels of obedience among the participants.

By manipulating the presence or absence of the lab coat, Milgram demonstrated how even a simple change in attire could influence obedience levels. This emphasized the role of external factors, such as the uniform, in shaping human behavior in a social context.

Touch Proximity Condition

In the Touch Proximity Condition of the Milgram experiment, participants were subjected to a unique and intense situation that aimed to test the limits of their obedience to authority. In this particular condition, when the learner refused to participate after reaching 150 volts, the participants were required to physically force the learner's hand onto a shock plate. This manipulation was intended to eliminate the psychological buffer that existed between the participants and the consequences of their actions.

The introduction of touch proximity significantly altered the dynamics of the experiment. Physically forcing the learner's hand onto the shock plate made participants directly responsible for the pain they were apparently inflicting, and this direct connection to the consequences of their actions produced a notable decrease in obedience.

In the Touch Proximity Condition, obedience dropped to just 30%, showing how much the buffer between participants and the consequences of their actions had been doing. Confronted with the immediate, tangible effects of their obedience, participants found it much harder to justify continued compliance.

Overall, the Touch Proximity Condition revealed the critical role that the removal of psychological distance plays in obedience to authority. By eliminating the buffer between the participants and the consequences of their actions, Milgram's experiment demonstrated the tremendous impact that immediate physical proximity can have on individuals' behavior in a difficult and morally challenging situation.


Two Teacher Condition

In Milgram's Two Teacher Condition, participants were given the opportunity to instruct an assistant, who was actually a confederate, to press the switches administering electric shocks to the learner. This variation aimed to investigate the impact of participants assuming a more indirect role in the act of shocking the learner.

Surprisingly, the results showed that in this condition, a staggering 92.5% of participants instructed the assistant to deliver the maximum voltage shock. This high rate of obedience indicated that participants were willing to exert their authority over the assistant to carry out the harmful actions.

The Two Teacher Condition aligns with Milgram's Agency Theory, which suggests that people tend to obey authority figures when they perceive themselves as agents carrying out instructions rather than personally responsible. In this variation, participants may have seen themselves as simply giving orders rather than directly causing harm, which diminished their sense of personal responsibility and increased their obedience.

This condition demonstrates how the dynamic of obedience can change when individuals are given the opportunity to delegate harmful actions to others. It sheds light on the complex interplay between authority figures, personal responsibility, and obedience to explain the unexpected and alarming levels of compliance observed in the Milgram experiment.

Social Support Condition

In the Social Support Condition of Stanley Milgram's experiment, participants were not alone in their decision-making process. They were joined by two additional individuals who acted as confederates. The purpose of this condition was to assess the impact of social support on obedience.

The presence of these confederates who refused to obey the authority figure had a significant effect on the level of obedience observed. When one or both confederates refused to carry out the harmful actions, participants became more likely to question the legitimacy of the authority figure's commands and were less willing to comply.

The specific actions taken by the two confederates involved expressing their refusal to deliver the electric shocks. They openly dissented and voiced their concerns regarding the ethical implications of the experiment. These actions served as powerful examples of disobedience and created an atmosphere of social support for the participants.

As a result, obedience fell to just 10% in the presence of these defiant confederates. Seeing others defy the authority figure empowered participants to assert their own autonomy and resist carrying out the harmful actions. The social support provided by the confederates challenged the participants' perception of the situation as one in which compliance was unavoidable and encouraged them to question the legitimacy of the authority figure's instructions.

Overall, the Social Support Condition demonstrated that the presence of individuals who refused to obey had a profound influence on the level of obedience observed. This highlights the importance of social support in challenging authority and promoting ethical decision-making.


Absent Experimenter Condition

In Stanley Milgram's famous obedience experiment, the proximity of authority figures played a crucial role in determining the level of obedience observed. One particular condition, known as the Absent Experimenter Condition, shed light on the impact of physical proximity on obedience.

In this condition, the experimenter instructed the teacher, who administered the electric shocks, by telephone from another room. The results were striking. Obedience plummeted to a mere 20.5%, indicating that when the authority figure was not physically present, participants were much less inclined to obey.

Without the immediate presence of the experimenter, many participants displayed disobedience or cheated by administering lesser shocks than instructed. This deviation from the experimenter's orders suggests that the absence of the authority figure weakened the participants' sense of obligation and decreased their willingness to comply.

The findings of the Absent Experimenter Condition highlight the significant influence of proximity on obedience. When the authority figure was physically present, participants were more likely to obey, even when faced with morally challenging actions; when the experimenter gave his orders by telephone from another room, obedience fell to 20.5%. Physical distance from the authority figure, in other words, plays a crucial role in shaping obedience.
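
To make the pattern across these variations easier to compare, the obedience rates quoted in this article can be collected in a short, purely illustrative Python snippet; the figures are those reported above, and the snippet is only a convenience for displaying them side by side.

```python
# Illustrative only: obedience rates (percentage of participants who went to
# the full 450 volts) as quoted in this article for the baseline study and
# its main situational variations.
obedience_rates = {
    "Baseline (Yale laboratory)": 65.0,
    "Run-down office building": 47.5,
    "Touch proximity": 30.0,
    "Ordinary man gives the orders": 20.0,
    "Experimenter absent (orders by telephone)": 20.5,
    "Two defiant peers present": 10.0,
    "Two-teacher (assistant presses the switches)": 92.5,
}

# Print the conditions from most to least obedient.
for condition, rate in sorted(obedience_rates.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{condition:<45} {rate:5.1f}%")
```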


Milgram's Legacy and Influence on Modern Psychology

The Milgram experiment was significant for a number of reasons. Firstly, it highlighted the power of obedience to authority, even in situations where that obedience causes harm to others. This has important implications for understanding real-world situations, such as the Holocaust, where ordinary people were able to commit atrocities under the authority of a fascist regime.

Secondly, the experiment sparked a debate about the ethics of psychological research. Some critics argued that the study was unethical because it caused psychological distress to the participants; others argued that the findings were too important to ignore and that the benefits of the research outweighed the harm caused.

Stanley Milgram's study of obedience is widely recognized as one of the most influential experiments in the history of psychology. Although Milgram faced significant criticism for the ethical implications of his work, the study has had a lasting impact on our understanding of the power of authority and social influence.

Milgram's legacy can be seen in a variety of ways within the field of personality and social psychology. For example, his research has inspired a multitude of studies on the impact of social norms and conformity on behavior, as well as the importance of individual autonomy and free will in decision-making processes.

In addition, Milgram's influence can be seen in modern psychological research that uses variations of his study to explore new questions about social influence and obedience. One such example is Burger's (2009) partial replication, which reproduced the procedure in a more ethical and controlled manner (stopping at 150 volts) and found that participants were still willing to administer shocks to the confederate, at rates only slightly lower than in Milgram's original study.

Milgram's work has also had a significant impact on the way that researchers approach the treatment of participants in psychological experiments. The ethical concerns raised by Milgram's study led to a renewed focus on informed consent and debriefing procedures, ensuring that participants are aware of the potential risks and benefits of their involvement in research studies.

Milgram's legacy is one of both controversy and innovation. His study of obedience has contributed greatly to our understanding of human behavior and has served as a catalyst for important ethical discussions within the scientific community. While his work may continue to generate debate, there is no doubt that Milgram's contributions to the field of psychology have had a profound and lasting impact.


Milgram's Relationship with Other Prominent Psychologists

Stanley Milgram was a highly influential figure in the field of social psychology, and his work has been cited by a number of other prominent psychologists over the years. One of his contemporaries, Albert Bandura, was also interested in the power of social influence and developed social learning theory, which explores the ways in which people learn from one another and from their environments.

Gordon Allport, known for his work on personality and prejudice, was another important figure in the field; he supervised Milgram's doctoral research at Harvard, and his thinking helped shape Milgram's own understanding of social influence and obedience.

Milgram's infamous obedience studies demonstrated how individuals could be led to obey authority figures and commit acts that violated their own moral codes. Philip Zimbardo's Stanford Prison Experiment similarly showed how individuals could adopt new identities and exhibit aggressive and abusive behavior when placed in positions of power. Both studies highlight the importance of social context in shaping behavior and have had a significant impact on our understanding of the role of situational factors in human behavior.

Jerome Bruner, another influential psychologist, was known for his work on cognitive psychology and the importance of active learning in education. Although Bruner's work was not directly related to Milgram's study of obedience, his emphasis on the importance of individual autonomy and active learning aligns with some of the key themes in Milgram's work.

Roger Brown, a psychologist known for his research on language and cognitive development, also shared some common ground with Milgram in their interest in human behavior and social influence. Finally, Solomon Asch, another prominent psychologist, conducted important research on conformity that helped lay the groundwork for Milgram's own study of obedience.

Milgram's work was highly influential and contributed significantly to the field of social psychology. His relationship with other prominent psychologists reflects the collaborative and interdisciplinary nature of psychological research and highlights the ways in which researchers build upon one another's work over time.


Criticisms of the Milgram Experiment

Despite its significance, the Milgram experiment has been heavily criticized by some psychologists. One of the main criticisms is that the study lacked ecological validity - that is, it didn't accurately reflect real-world situations. Critics argue that participants in the study knew that they were taking part in an experiment, and that this affected their behavior.

Another criticism is that the experiment caused psychological distress to the participants. Some argue that the experimenter put too much pressure on the participants to continue administering shocks, and that this caused lasting psychological harm.


The Impact of Milgram's Research on Social Psychology

The Milgram experiment, conducted at Yale University in 1961, shocked the world with its findings on obedience to authority. Despite its groundbreaking contribution to the field of personality and social psychology, the study has also faced significant criticism for its treatment of participants.

Critics have raised concerns about the potential psychological harm inflicted on participants, who were led to believe that they were administering painful electric shocks to a real victim. Nevertheless, the Milgram experiment remains a critical turning point in the history of experiments with people.

It has had a profound impact on psychology, inspiring numerous studies that continue to shed light on obedience, conformity, and group dynamics. It has also sparked important debates about the ethics of psychological research and raised awareness of the importance of protecting the rights and well-being of research participants.


Real-Life Examples of Obedience Leading to Human Catastrophe

Stanley Milgram's obedience experiments have had profound implications for understanding human behavior, especially in contexts where obedience to authority might have contributed to catastrophic outcomes. Here are seven historical examples that resonate with Milgram's findings:

  • Nazi Germany: The obedience to authority during the Holocaust, where individuals followed orders to commit atrocities, can be understood through Milgram's experiments. The willingness of his participants to administer what they believed were dangerous shocks reflects how ordinary people can commit heinous acts under authoritative pressure.
  • My Lai Massacre: American soldiers massacred hundreds of Vietnamese civilians during the Vietnam War. Milgram's work helps explain how soldiers obeyed orders despite the moral implications, emphasizing the power of authority in wartime.
  • Rwandan Genocide: Obedience to ethnic propaganda and authority figures led to the mass killings in Rwanda. Milgram's experiments shed light on how obedience can override personal judgment, with catastrophic results.
  • Jonestown Massacre: Followers of Jim Jones obeyed his orders to commit mass suicide. Milgram's findings on obedience help explain how charismatic leaders can exert control over their followers, even to the point of death.
  • Chernobyl Disaster: Obedience to flawed protocols and disregard for safety by the plant operators contributed to the catastrophe. Milgram's work illustrates how obedience to procedures and hierarchy can lead to disaster.
  • Iraq War - Abu Ghraib Prison Abuse: The abuse of prisoners by U.S. military personnel can be linked to obedience to authority, a phenomenon explored in Milgram's experiments. The willingness to inflict harm under orders mirrors the compliance Milgram observed in his participants.
  • Financial Crisis of 2008: Blind obedience to corporate culture and regulatory authorities contributed to unethical practices leading to the global financial meltdown. Milgram's insights into obedience help explain how organizational pressures can lead to widespread harm.

These examples demonstrate the pervasive influence of obedience in various historical and contemporary contexts. Milgram's experiments, documented in the Stanley Milgram Papers and in the Journal of Abnormal and Social Psychology, continue to be a critical reference for understanding human behavior.

The documentary film "Shocking Obedience" further explores these themes, emphasizing the universal relevance of Milgram's work. His experiments remind us of the human capacity for obedience, even in the face of morally reprehensible orders, and continue to provoke reflection on our own susceptibilities.


Key Takeaways

  • The Milgram experiment was a famous and controversial study in psychology that examined people's willingness to obey authority.
  • Participants in the study were instructed to administer what they believed were electric shocks to a learner, even as the learner appeared to be in increasing distress.
  • The results of the study showed that the majority of participants continued to administer shocks to the maximum level, even when they believed that the shocks were causing serious harm.
  • The study has been heavily criticized for lacking ecological validity and causing psychological distress to participants.
  • Despite the criticisms, the Milgram experiment has had a lasting impact on psychology and has inspired numerous other studies on obedience and authority.

In conclusion, the Milgram experiment remains an important and controversial study in the field of psychology. Its findings continue to influence our understanding of obedience to authority.

Further Reading on the Milgram Experiment

These papers offer a comprehensive view of Milgram's experiment and its implications, highlighting the profound effects of authority on human behaviour.

1. Stanley Milgram and the Obedience Experiment by C. Helm, M. Morelli (1979)

This paper delves into Milgram's experiment, revealing the significant control the state has over individuals, as evidenced by their willingness to administer painful shocks to an innocent victim.

2. Credibility and Incredulity in Milgram’s Obedience Experiments: A Reanalysis of an Unpublished Test by G. Perry, A. Brannigan, R. Wanner, H. Stam (2019)

This study reanalyzes an unpublished test from Milgram's experiment, suggesting that participants' belief in the pain being inflicted influenced their level of obedience.

3. The Man Who Shocked the World: The Life and Legacy of Stanley Milgram by R. Persaud (2005)

Persaud's paper discusses the profound impact of Milgram's experiments on our understanding of human behavior, particularly the willingness of people to follow scientific authority.

4. Replicating Milgram: Would people still obey today? by J. Burger (2009)

Burger's study replicates Milgram's Experiment 5, finding slightly lower obedience rates than 45 years earlier, with gender showing no significant influence on obedience.

5. Personality predicts obedience in a Milgram paradigm by L. Bègue, J. Beauvois, D. Courbet, D. Oberlé, J. Lepage, A. A. Duke (2015)

This research explores how personality traits like conscientiousness and agreeableness, along with political orientation and social activism, can predict obedience in Milgram-like experiments.



Encyclopedia Britannica

  • History & Society
  • Science & Tech
  • Biographies
  • Animals & Nature
  • Geography & Travel
  • Arts & Culture
  • Games & Quizzes
  • On This Day
  • One Good Fact
  • New Articles
  • Lifestyles & Social Issues
  • Philosophy & Religion
  • Politics, Law & Government
  • World History
  • Health & Medicine
  • Browse Biographies
  • Birds, Reptiles & Other Vertebrates
  • Bugs, Mollusks & Other Invertebrates
  • Environment
  • Fossils & Geologic Time
  • Entertainment & Pop Culture
  • Sports & Recreation
  • Visual Arts
  • Demystified
  • Image Galleries
  • Infographics
  • Top Questions
  • Britannica Kids
  • Saving Earth
  • Space Next 50
  • Student Center
  • Introduction

Education and national conformity studies

Obedience experiments.

  • Later experiments and publications

Stanley Milgram

Stanley Milgram (born August 15, 1933, New York City, New York, U.S.—died December 20, 1984, New York City) was an American social psychologist known for his controversial and groundbreaking experiments on obedience to authority. Milgram's obedience experiments, in addition to other studies that he carried out during his career, generally are considered to have provided important insight into human social behaviour, particularly conformity and social pressure. See also Milgram experiment.

Milgram was born and raised in the Bronx, the second of three children in a working-class Jewish family. As a youth, he was an exceptional student, with interests in science and the arts. At Queens College (later part of the City University of New York [CUNY]), he studied political science, in addition to taking courses in art, literature, and music. In 1953, following his third year at the college, he toured Europe and became increasingly interested in international relations. He was accepted into the graduate program in international affairs at Columbia University. However, in 1954, after completing a bachelor's degree in political science at Queens College, Milgram instead began graduate studies in the social relations department at Harvard University.

At Harvard, Milgram took classes with leading social psychologists of the day, including Gordon Allport, Jerome Bruner, Roger Brown, and Solomon Asch, all of whom greatly influenced the direction of Milgram's academic career. Of particular interest to Milgram were Asch's conformity experiments, which showed that individual behaviour can be influenced by group behaviour, with individuals conforming to group perspectives, even when choices made by the group are obviously incorrect. Milgram set out to apply Asch's group technique, with several variations, to the study of conformity on a national level, seeking to explore national stereotypes. He focused initially on the United States and Norway and later added France, using his connections at Harvard to travel to Oslo and Paris to establish study groups there. He used an auditory task to measure conformity, with participants in closed booths asked to distinguish between the lengths of two tones. Participants also heard the responses of other members of the study group, who supposedly occupied closed booths next to the participant (the group responses were recorded, and the other booths were empty). Milgram's findings suggested that Americans and Norwegians differed little in conformity rates and that, of the three groups, the French were the least conforming.


In 1960, after earning a Ph.D. from Harvard, Milgram accepted a position as assistant professor at Yale University. There he narrowed his research to obedience. Having been acutely aware from his youth of his Jewish heritage and the tragedies suffered by Jews in Europe during the Holocaust, he was interested in understanding the factors that led people to inflict harm on others. He designed an unprecedented experiment—later known as the Milgram experiment—whereby study subjects, who believed that they were participating in a learning experiment about punishment and memory, were instructed by an authority figure (the experimenter) to inflict seemingly painful shocks to a helpless victim (the learner). Both the experimenter and the learner were actors hired by Milgram, and the shocks were simulated via an authentic-appearing shock generator that was equipped with 30 voltage levels, increasing from 15 to 450 volts. Subjects were instructed by the experimenter to deliver a shock to the learner whenever the latter gave an incorrect answer to a question. With each incorrect response, shock intensity increased. At predetermined voltage levels, the learner (usually in a separate room) either banged on the adjoining wall, cried out in pain and pleaded with the participant to stop, or complained about a fictitious heart condition.

Prior to carrying out the experiments, Milgram and Yale psychology students whom he polled about possible outcomes of such a study predicted that only a very small percentage (from 0 to 3 percent) of people would inflict the most-extreme-intensity shock. Hence, Milgram was surprised with the results of early pilot studies, in which most participants continued through to the extreme 450-volt limit. The first official experiments carried out by Milgram in 1961 yielded similar results—26 out of 40 men recruited for the study proved to be fully obedient to the experimenter, delivering shocks through 450 volts. Variations in the experimental design showed that obedience was highest when the learner was in a separate room, as opposed to being in close proximity to the subject (e.g., in the same room or near enough to touch). Subjects persisted in their obedience despite verbally expressing their disapproval of continuing with the shocks.

Milgram suspected that subjects struggled to disengage from the experiment because of its incremental ("slippery slope") progression—small demands, seemingly benign, became increasingly adverse. Subjects also may have been readily conforming, seeing themselves as inferior to the experimenter in their knowledge of learning, or they may have viewed themselves as being free of responsibility, simply carrying out the experimenter's commands.


Although thought-provoking, the experiments and their findings were highly controversial. The situation placed extreme stress on the subjects, some of whom experienced nervous laughter that culminated in seizures. In debriefing, Milgram did not reveal the full truth about the experiments to his subjects, leaving some to think that they really had shocked another person; it was not until many months later that subjects learned the true nature of the experiments. The validity of the findings also was later drawn into question by reports claiming that some participants suspected that they were the subjects being studied, with the aim of the study being to see how far they would obey the experimenter.

Tutor2u Study Notes

Explanations for Obedience - Milgram (1963)

Last updated 22 Mar 2021


Milgram (1963) conducted one of the most famous and influential psychological investigations of obedience. He wanted to find out if ordinary American citizens would obey an unjust order from an authority figure and inflict pain on another person because they were instructed to.

Milgram’s sample consisted of 40 male participants from a range of occupations and backgrounds. The participants were all volunteers who had responded to an advert in a local paper, which offered $4.50 to take part in an experiment on ‘punishment and learning’.

The 40 participants were all invited to a laboratory at Yale University and upon arrival they met with the experimenter and another participant, Mr Wallace, who were both confederates.

The experimenter explained that one person would be randomly assigned the role of teacher and the other the role of learner; however, the real participant was always assigned the role of teacher. The experimenter explained that the teacher, the real participant, would read the learner a series of word pairs and then test their recall. The learner, who was positioned in an adjacent room, would indicate his choice using a system of lights. The teacher was instructed to administer an electric shock every time the learner made a mistake and to increase the voltage after each mistake.

The teacher watched the learner being strapped to the electric chair and was given a sample electric shock to convince them that the procedure was real. The learner wasn't actually strapped to the chair and gave predetermined answers to the test. As the electric shocks increased, the learner's screams, which were recorded, became louder and more dramatic. At 180 volts the learner complained of a weak heart. At 300 volts he banged on the wall and demanded to leave, and at 315 volts he became silent, to give the illusion that he was unconscious, or even dead.

The experiment continued until the teacher refused to continue, or 450 volts was reached. If the teacher tried to stop the experiment, the experimenter would respond with a series of prods, for example: ‘The experiment requires that you continue.’ Following the experiment the participants were debriefed.

Milgram found that all of the real participants went to at least 300 volts and 65% continued until the full 450 volts. He concluded that under the right circumstances ordinary people will obey unjust orders.

Milgram’s study has been heavily criticised for breaking numerous ethical guidelines, including: deception , right to withdraw and protection from harm.

Milgram deceived his participants, as he said the experiment was on 'punishment and learning' when in fact he was measuring obedience, and he pretended the learner was receiving electric shocks. In addition, it was very difficult for participants to withdraw from the experiment, as the experimenter prompted them to continue. Finally, many of the participants reported feeling exceptionally stressed and anxious while taking part in the experiment, and therefore they were not protected from psychological harm. This is an issue, as Milgram did not respect his participants, some of whom felt very guilty following the experiment, knowing that they could have harmed another person. However, it must be noted that it was essential for Milgram to deceive his participants and remove their right to withdraw in order to test obedience and produce valid results. Furthermore, he did debrief his participants following the experiment, and 83.7% of participants said that they were happy to have taken part and to have contributed to scientific research.

Milgram’s study has been criticised for lacking ecological validity. Milgram tested obedience in a laboratory, which is very different to real-life situations of obedience, where people are often asked to follow more subtle instructions, rather than administering electric shocks. As a result we are unable to generalise his findings to real life situations of obedience and cannot conclude that people would obey less severe instructions in the same way.

Finally, Milgram’s research lacked population validity. Milgram used a bias sample of 40 male volunteers, which means we are unable to generalise the results to other populations, in particular females, and cannot conclude if female participants would respond in a similar way.


12.4 Conformity, Compliance, and Obedience

Learning Objectives

By the end of this section, you will be able to:

  • Explain the Asch effect
  • Define conformity and types of social influence
  • Describe Stanley Milgram’s experiment and its implications
  • Define groupthink, social facilitation, and social loafing

In this section, we discuss additional ways in which people influence others. The topics of conformity, social influence, obedience, and group processes demonstrate the power of the social situation to change our thoughts, feelings, and behaviors. We begin this section with a discussion of a famous social psychology experiment that demonstrated how susceptible humans are to outside social pressures.

Solomon Asch conducted several experiments in the 1950s to determine how people are affected by the thoughts and behaviors of other people. In one study, a group of participants was shown a series of printed line segments of different lengths: a, b, and c (Figure 12.17). Participants were then shown a fourth line segment: x. They were asked to identify which line segment from the first group (a, b, or c) most closely resembled the fourth line segment in length.

Each group of participants had only one true, naïve subject. The remaining members of the group were confederates of the researcher. A confederate is a person who is aware of the experiment and works for the researcher. Confederates are used to manipulate social situations as part of the research design, and the true, naïve participants believe that confederates are, like them, uninformed participants in the experiment. In Asch’s study, the confederates identified a line segment that was obviously shorter than the target line—a wrong answer. The naïve participant then had to identify aloud the line segment that best matched the target line segment.

How often do you think the true participant aligned with the confederates’ response? That is, how often do you think the group influenced the participant, and the participant gave the wrong answer? Asch (1955) found that 76% of participants conformed to group pressure at least once by indicating the incorrect line. Conformity is the change in a person’s behavior to go along with the group, even if he does not agree with the group. Why would people give the wrong answer? What factors would increase or decrease someone giving in or conforming to group pressure?

The Asch effect is the influence of the group majority on an individual’s judgment.

What factors make a person more likely to yield to group pressure? Research shows that the size of the majority, the presence of another dissenter, and the public or relatively private nature of responses are key influences on conformity.

  • The size of the majority: The greater the number of people in the majority, the more likely an individual will conform. There is, however, an upper limit: a point where adding more members does not increase conformity. In Asch’s study, conformity increased with the number of people in the majority—up to seven individuals. At numbers beyond seven, conformity leveled off and decreased slightly (Asch, 1955).
  • The presence of another dissenter: If there is at least one dissenter, conformity rates drop to near zero (Asch, 1955).
  • The public or private nature of the responses: When responses are made publicly (in front of others), conformity is more likely; however, when responses are made privately (e.g., writing down the response), conformity is less likely (Deutsch & Gerard, 1955).

The finding that conformity is more likely to occur when responses are public than when they are private is the reason government elections require voting in secret, so we are not coerced by others (Figure 12.18). The Asch effect can be easily seen in children when they have to publicly vote for something. For example, if the teacher asks whether the children would rather have extra recess, no homework, or candy, once a few children vote, the rest will comply and go with the majority. In a different classroom, the majority might vote differently, and most of the children would comply with that majority. When someone’s vote changes if it is made in public versus private, this is known as compliance. Compliance can be a form of conformity. Compliance is going along with a request or demand, even if you do not agree with the request. In Asch’s studies, the participants complied by giving the wrong answers, but privately did not accept that the obvious wrong answers were correct.

Now that you have learned about the Asch line experiments, why do you think the participants conformed? The correct answer to the line segment question was obvious, and it was an easy task. Researchers have categorized the motivation to conform into two types: normative social influence and informational social influence (Deutsch & Gerard, 1955).

In normative social influence, people conform to the group norm to fit in, to feel good, and to be accepted by the group. However, with informational social influence, people conform because they believe the group is competent and has the correct information, particularly when the task or situation is ambiguous. What type of social influence was operating in the Asch conformity studies? Since the line judgment task was unambiguous, participants did not need to rely on the group for information. Instead, participants complied to fit in and avoid ridicule, an instance of normative social influence.

An example of informational social influence may be what to do in an emergency situation. Imagine that you are in a movie theater watching a film and what seems to be smoke comes into the theater from under the emergency exit door. You are not certain that it is smoke—it might be a special effect for the movie, such as a fog machine. When you are uncertain, you will tend to look at the behavior of others in the theater. If other people show concern and get up to leave, you are likely to do the same. However, if others seem unconcerned, you are likely to stay put and continue watching the movie (Figure 12.19).

How would you have behaved if you were a participant in Asch’s study? Many students say they would not conform, that the study is outdated, and that people nowadays are more independent. To some extent this may be true. Research suggests that overall rates of conformity may have reduced since the time of Asch’s research. Furthermore, efforts to replicate Asch’s study have made it clear that many factors determine how likely it is that someone will demonstrate conformity to the group. These factors include the participant’s age, gender, and socio-cultural background (Bond & Smith, 1996; Larsen, 1990; Walker & Andrade, 1996).


Stanley Milgram’s Experiment

Conformity is one effect of the influence of others on our thoughts, feelings, and behaviors. Another form of social influence is obedience to authority. Obedience is the change of an individual’s behavior to comply with a demand by an authority figure. People often comply with the request because they are concerned about a consequence if they do not comply. To demonstrate this phenomenon, we review another classic social psychology experiment.

Stanley Milgram was a social psychology professor at Yale who was influenced by the trial of Adolf Eichmann, a Nazi war criminal. Eichmann’s defense for the atrocities he committed was that he was “just following orders.” Milgram (1963) wanted to test the validity of this defense, so he designed an experiment and initially recruited 40 men for his experiment. The volunteer participants were led to believe that they were participating in a study to improve learning and memory. The participants were told that they were to teach other students (learners) correct answers to a series of test items. The participants were shown how to use a device that they were told delivered electric shocks of different intensities to the learners. The participants were told to shock the learners if they gave a wrong answer to a test item—that the shock would help them to learn. The participants believed they gave the learners shocks, which increased in 15-volt increments, all the way up to 450 volts. The participants did not know that the learners were confederates and that the confederates did not actually receive shocks.

In response to a string of incorrect answers from the learners, the participants obediently and repeatedly shocked them. The confederate learners cried out for help, begged the participant teachers to stop, and even complained of heart trouble. Yet, when the researcher told the participant-teachers to continue the shock, 65% of the participants continued the shock to the maximum voltage and to the point that the learner became unresponsive (Figure 12.20). What makes someone obey authority to the point of potentially causing serious harm to another person?

Several variations of the original Milgram experiment were conducted to test the boundaries of obedience. When certain features of the situation were changed, participants were less likely to continue to deliver shocks (Milgram, 1965). For example, when the setting of the experiment was moved to an off-campus office building, the percentage of participants who delivered the highest shock dropped to 48%. When the learner was in the same room as the teacher, the highest shock rate dropped to 40%. When the teachers’ and learners’ hands were touching, the highest shock rate dropped to 30%. When the researcher gave the orders by phone, the rate dropped to 23%. These variations show that when the humanity of the person being shocked was increased, obedience decreased. Similarly, when the authority of the experimenter decreased, so did obedience.

This case is still very applicable today. What does a person do if an authority figure orders something done? What if the person believes it is incorrect, or worse, unethical? In a study by Martin and Bull (2008), midwives privately filled out a questionnaire regarding best practices and expectations in delivering a baby. Then, a more senior midwife and supervisor asked the junior midwives to do something they had previously stated they were opposed to. Most of the junior midwives were obedient to authority, going against their own beliefs. Burger (2009) partially replicated Milgram's original study. He found among a multicultural sample of women and men that their levels of obedience matched Milgram's research. Doliński et al. (2017) performed a replication of Burger's work in Poland and controlled for the gender of both participants and learners, and once again, results that were consistent with Milgram's original work were observed.

When in group settings, we are often influenced by the thoughts, feelings, and behaviors of people around us. Whether it is due to normative or informational social influence, groups have power to influence individuals. Another phenomenon of group conformity is groupthink. Groupthink is the modification of the opinions of members of a group to align with what they believe is the group consensus (Janis, 1972). In group situations, the group often takes action that individuals would not perform outside the group setting because groups make more extreme decisions than individuals do. Moreover, groupthink can hinder opposing trains of thought. This elimination of diverse opinions contributes to faulty decisions by the group.

Groupthink in the U.S. Government

There have been several instances of groupthink in the U.S. government. One example occurred when the United States led a small coalition of nations to invade Iraq in March 2003. This invasion occurred because a small group of advisors and former President George W. Bush were convinced that Iraq represented a significant terrorism threat with a large stockpile of weapons of mass destruction at its disposal. Although some of these individuals may have had some doubts about the credibility of the information available to them at the time, in the end, the group arrived at a consensus that Iraq had weapons of mass destruction and represented a significant threat to national security. It later came to light that Iraq did not have weapons of mass destruction, but not until the invasion was well underway. As a result, 6000 American soldiers were killed and many more civilians died. In a later interview, given years after his famous United Nations speech, Colin Powell discussed the information he had at the time and on which his decisions were based ("CNN Official Interview: Colin Powell now regrets UN speech about WMDs," 2010).

Do you see evidence of groupthink?

Why does groupthink occur? There are several causes of groupthink, which makes it preventable. When the group is highly cohesive, or has a strong sense of connection, maintaining group harmony may become more important to the group than making sound decisions. If the group leader is directive and makes his opinions known, this may discourage group members from disagreeing with the leader. If the group is isolated from hearing alternative or new viewpoints, groupthink may be more likely. How do you know when groupthink is occurring?

There are several symptoms of groupthink including the following:

  • perceiving the group as invulnerable or invincible—believing it can do no wrong
  • believing the group is morally correct
  • self-censorship by group members, such as withholding information to avoid disrupting the group consensus
  • the quashing of dissenting group members’ opinions
  • the shielding of the group leader from dissenting views
  • perceiving an illusion of unanimity among group members
  • holding stereotypes or negative attitudes toward the out-group or others with differing viewpoints (Janis, 1972)

Given the causes and symptoms of groupthink, how can it be avoided? There are several strategies that can improve group decision making including seeking outside opinions, voting in private, having the leader withhold position statements until all group members have voiced their views, conducting research on all viewpoints, weighing the costs and benefits of all options, and developing a contingency plan (Janis, 1972; Mitchell & Eckstein, 2009).

Group Polarization

Another phenomenon that occurs within group settings is group polarization. Group polarization (Teger & Pruitt, 1967) is the strengthening of an original group attitude after the discussion of views within a group. That is, if a group initially favors a viewpoint, after discussion the group consensus is likely a stronger endorsement of the viewpoint. Conversely, if the group was initially opposed to a viewpoint, group discussion would likely lead to stronger opposition. Group polarization explains many actions taken by groups that would not be undertaken by individuals. Group polarization can be observed at political conventions, when platforms of the party are supported by individuals who, when not in a group, would decline to support them. Recently, some theorists have argued that group polarization may be partly responsible for the extreme political partisanship that seems ubiquitous in modern society. Given that people can self-select media outlets that are most consistent with their own political views, they are less likely to encounter opposing viewpoints. Over time, this leads to a strengthening of their own perspective and of hostile attitudes and behaviors towards those with different political ideals. Remarkably, political polarization leads to open levels of discrimination that are on par with, or perhaps exceed, racial discrimination (Iyengar & Westwood, 2015). A more everyday example is a group’s discussion of how attractive someone is. Does your opinion change if you find someone attractive, but your friends do not agree? If your friends vociferously agree, might you then find this person even more attractive?

Social traps refer to situations that arise when individuals or groups of individuals behave in ways that are not in their best interest and that may have negative, long-term consequences. However, once established, a social trap is very difficult to escape. For example, following World War II, the United States and the former Soviet Union engaged in a nuclear arms race. While the presence of nuclear weapons is not in either party's best interest, once the arms race began, each country felt the need to continue producing nuclear weapons to protect itself from the other.

Social Loafing

Imagine you were just assigned a group project with other students whom you barely know. Everyone in your group will get the same grade. Are you the type who will do most of the work, even though the final grade will be shared? Or are you more likely to do less work because you know others will pick up the slack? Social loafing involves a reduction in individual output on tasks where contributions are pooled. Because each individual's efforts are not evaluated, individuals can become less motivated to perform well. Karau and Williams (1993) and Simms and Nichols (2014) reviewed the research on social loafing and discerned when it was least likely to happen. The researchers noted that social loafing could be alleviated if, among other situations, individuals knew their work would be assessed by a manager (in a workplace setting) or instructor (in a classroom setting), or if a manager or instructor required group members to complete self-evaluations.

The likelihood of social loafing in student work groups increases as the size of the group increases (Shepperd & Taylor, 1999). According to Karau and Williams (1993), college students were the population most likely to engage in social loafing. Their study also found that women and participants from collectivistic cultures were less likely to engage in social loafing, explaining that their group orientation may account for this.

College students could work around social loafing or “free-riding” by suggesting to their professors use of a flocking method to form groups. Harding (2018) compared groups of students who had self-selected into groups for class to those who had been formed by flocking, which involves assigning students to groups who have similar schedules and motivations. Not only did she find that students reported less “free riding,” but that they also did better in the group assignments compared to those whose groups were self-selected.

Interestingly, the opposite of social loafing occurs when the task is complex and difficult (Bond & Titus, 1983; Geen, 1989). In a group setting, such as the student work group, if your individual performance cannot be evaluated, there is less pressure for you to do well, and thus less anxiety or physiological arousal (Latané, Williams, & Harkens, 1979). This puts you in a relaxed state in which you can perform your best, if you choose (Zajonc, 1965). If the task is a difficult one, many people feel motivated and believe that their group needs their input to do well on a challenging project (Jackson & Williams, 1985).

Deindividuation

Another way that being part of a group can affect behavior is exhibited in instances in which deindividuation occurs. Deindividuation refers to situations in which a person may feel a sense of anonymity and therefore a reduction in accountability and sense of self when among others. Deindividuation is often pointed to in cases in which mob or riot-like behaviors occur (Zimbardo, 1969), but research on the subject and the role that deindividuation plays in such behaviors has resulted in inconsistent results (as discussed in Granström, Guvå, Hylander, & Rosander, 2009).

Table 12.2 summarizes the types of social influence you have learned about in this chapter.

Conformity: Changing your behavior to go along with the group even if you do not agree with the group.
Compliance: Going along with a request or demand.
Normative social influence: Conformity to a group norm to fit in, feel good, and be accepted by the group.
Informational social influence: Conformity to a group norm prompted by the belief that the group is competent and has the correct information.
Obedience: Changing your behavior to please an authority figure or to avoid aversive consequences.
Groupthink: Tendency to prioritize group cohesion over critical thinking that might lead to poor decision making; more likely to occur when there is perceived unanimity among the group.
Group polarization: Strengthening of the original group attitude after discussing views within a group.
Social loafing: Exertion of less effort by a person working in a group because individual performance cannot be evaluated separately from the group, thus causing performance decline on easy tasks.
Deindividuation: Group situation in which a person may feel a sense of anonymity and a resulting reduction in accountability and sense of self.


Access for free at https://openstax.org/books/psychology-2e/pages/1-introduction
  • Authors: Rose M. Spielman, William J. Jenkins, Marilyn D. Lovett
  • Publisher/website: OpenStax
  • Book title: Psychology 2e
  • Publication date: Apr 22, 2020
  • Location: Houston, Texas
  • Book URL: https://openstax.org/books/psychology-2e/pages/1-introduction
  • Section URL: https://openstax.org/books/psychology-2e/pages/12-4-conformity-compliance-and-obedience

© Jun 26, 2024 OpenStax. Textbook content produced by OpenStax is licensed under a Creative Commons Attribution License . The OpenStax name, OpenStax logo, OpenStax book covers, OpenStax CNX name, and OpenStax CNX logo are not subject to the Creative Commons license and may not be reproduced without the prior and express written consent of Rice University.

Harvard University, Department of Psychology

Stanley Milgram

Black and white photograph of Stanley Milgram as a young man. (Image Source: Harvard Faculty Registry)

In 1954 Harvard’s Department of Social Relations took the unusual step of admitting a bright young student who had not taken a single psychology course.  Fortunately Stanley Milgram was soon up to speed in social psychology, and in the course of his doctoral work at Harvard he conducted an innovative cross-cultural comparison of conformity in Norway and France under the guidance of Gordon Allport. 

Obtaining his Ph.D. in 1960, Milgram was ready to expand his work on conformity with a series of experiments on obedience to authority that he conducted as an assistant professor at Yale from 1960 to 1963. Inspired by Hannah Arendt’s report on the trial of Adolf Eichmann in Jerusalem, Milgram wondered whether her claims about “the banality of evil” – that evil acts can come from ordinary people following orders as they do their jobs – could be demonstrated in the lab. Milgram staged meticulously designed sham experiments in which subjects were ordered to administer dangerous shocks to fellow volunteers (in reality, the other volunteers were confederates and the shocks were fake). Contradicting the predictions of every expert he polled, Milgram found that roughly two-thirds of the subjects administered what they thought might be fatal shocks to an innocent stranger. Collectively known as The Milgram Experiment, this groundbreaking work demonstrated the human tendency to obey commands issued by an authority figure, and more generally, the tendency for behavior to be controlled more by the demands of the situation than by idiosyncratic traits of the person.

The Milgram Experiment is one of the best-known social psychology studies of the 20th century. With this remarkable accomplishment under his belt, young Dr. Milgram returned to Harvard in 1963 to take a position as Assistant Professor of Social Psychology.

During this time at Harvard, Milgram undertook a new, equally innovative line of research, known as the Small World Experiment.  Milgram asked a sample of people to trace out a chain of personal connections to a designated stranger living thousands of miles away. His finding that most people could do this successfully with a chain of six or fewer links yielded the familiar expression “Six Degrees of Separation,” which later became the name of a play and a movie,  a source for the game “Six Degrees of Kevin Bacon,” and a major theme of Malcolm Gladwell’s 2000 bestseller,  The Tipping Point . The internet has made it easier to study social networks, and several decades after its discovery, the phenomenon has become a subject of intense new research.

Stanley Milgram left Harvard in 1967 to return to his hometown, New York City, accepting a position as head of the social psychology program at the Graduate Center of the City University of New York. Tragically, he died of a heart attack in 1984 at the age of 51. Milgram is listed as number 46 on the American Psychological Association’s list of the 100 most eminent psychologists of the 20th century.

Blass, T. (2002). The man who shocked the world. Psychology Today, 35(2), 68.

Eminent psychologists of the 20th century. (2002, July/August). Monitor on Psychology, 33(7), 29.

Milgram, S. (1977). The individual in a social world. Reading, MA: Addison-Wesley.

Scientific Reports


A novel experimental approach to study disobedience to authority

Emilie A. Caspar

1 Moral and Social Brain Lab, Department of Experimental Psychology, Ghent University, Henri Dunantlaan, 2, 9000 Ghent, Belgium

2 Center for Research in Cognition and Neuroscience, Université Libre de Bruxelles, Brussels, Belgium

Associated Data

Data are made available on OSF (DOI: https://doi.org/10.17605/OSF.IO/2BKJC ).

Fifty years after Stanley Milgram's experiments, the main objective of the present paper is to offer a paradigm that complies with up-to-date ethical standards and that can be adapted to various scientific disciplines, ranging from sociology and (social) psychology to neuroscience. Inspired by subsequent versions of Milgram-like paradigms and combining the strengths of each, this paper presents a novel experimental approach to the study of (dis)obedience to authority. Volunteers are recruited in pairs and take turns being 'agents' and 'victims', making the procedure fully reciprocal. On each trial, the agent receives an order from the experimenter to send a real, mildly painful electric shock to the 'victim', thus placing participants in an ecological set-up and avoiding the use of cover stories. Depending on the experimental condition, 'agents' do or do not receive a monetary gain and are or are not given an aim to justify obeying the experimenter's orders. Disobedience here refers to the number of times 'agents' refused to deliver the real shock to the 'victim'. As the paradigm is designed to be compatible with brain imaging methods, I hope it will bring new insights and perspectives to this area of research.

Introduction

Stanley Milgram's experiment is one of the most (in)famous in psychology [1], within and beyond academia. Several factors account for this notoriety, such as the method used, the associated ethical issues, the enthralling results and the societal impact of the research topic. Milgram's classical studies famously suggested a widespread willingness to obey authority, to the point of inflicting irreversible harm on another person met only a few minutes before. Beyond Milgram's studies, the history of nations is also plagued by horrendous acts of obedience that have caused wars and the loss of countless lives [2]. History has fortunately shown that some individuals do resist the social pressure of orders when their own morality matters more to them than the social costs of defying those orders (e.g., [3, 4]). To understand the factors that prevent an individual from complying with immoral orders, research on disobedience should focus on two main axes: (1) what social and situational factors support disobedience, and (2) what individual differences support disobedience.

The first axis has already been extensively investigated in past studies. Milgram's own studies established important situational factors supporting disobedience [5]. For instance, disobedience increases if the experimenter is not physically present in the room or if two experimenters offer opposing views regarding the morality of the experiment. Subsequent versions and interpretations of Milgram's studies [6-8], as well as historical research [4, 9], also suggested the importance of several social factors (e.g., the presence of a supporting group) and situational factors (e.g., family history, proximity to the 'victim', intensity of the pain, money) that support resistance to immoral orders. The second axis, regarding individual differences, has been approached less systematically. A few studies [10, 11] previously explored personality traits that may influence disobedience (e.g., empathic concern, risk-taking), but most of them used relatively weak and potentially biased methods, such as self-report questionnaires and designs based on cover stories. These studies are not sufficient to explain why, in a given situation, some people will refuse immoral orders and rescue threatened human beings while others will comply. With the current literature on disobedience, we have little idea which neuro-cognitive processes drive inter-individual differences in the degree of disobedience. This aim could be achieved by offering a novel experimental approach that makes it possible to use techniques giving more direct access to the functioning of the brain and cognition, such as functional near-infrared spectroscopy (fNIRS), electroencephalography or magnetic resonance imaging (MRI). Regrettably, the original paradigm and those closely resembling it cannot reliably answer these questions, as they were not designed to be compatible with neuroimaging measurements.

There were several challenges, both ethical and methodological, to consider in developing such a paradigm. Studying obedience and resistance to immoral orders involves putting volunteers in a situation where they have to decide whether or not to commit 'immoral acts' under orders. A balance has to be found between what is acceptable from an ethical perspective and what is necessary for the research question. Milgram's studies on obedience raised undeniable ethical issues [12-14], mostly associated with high stress and the use of a cover story, which involves deception. Some variants of Milgram's studies were conducted in immersive virtual reality to avoid the ethical issues associated with Milgram's paradigm [15], but the transparency of the fake scenario presented to participants does not capture decision-making in an ecological set-up. Other Milgram-based variants, such as the 150-V method, appear to replicate Milgram's results [16] while respecting current ethical standards, but methodological concerns remain [17] because cover stories are still used, which leads to interpretation issues. Beyond ethical considerations, the use of deception casts doubt on whether volunteers truly believed the cover story. As a consequence, a reasonable doubt remains about how to interpret the results, and this is one of the main criticisms of Milgram's studies and their successors. Recent work on the reports of Milgram's volunteers suggested that there is no strong and reliable evidence that participants believed the cover story [8, 14, 18]. Others argued that because participants' stress was visible on video recordings during the experiment (e.g., shaking hands, nervousness), they must actually have believed that they were torturing another human being [19]. However, this interpretation has been challenged by a study showing that participants can have physiological stress reactions even in an obviously fake experimental set-up [15]. These contrasting interpretations of Milgram's studies reinforce the idea that results can hardly be interpreted when cover stories are used [20]. To address these criticisms, a real scenario had to be created, in which participants make decisions that have real consequences for another human being.

An additional challenge is that methods relying on Milgram's original paradigm, such as the virtual reality version [15] or the 150-V method [16], are not suited to neuroimaging measurements. More specifically, with such Milgram-like experimental approaches, only a single trial would be recorded for the entire experimental session, namely the moment when the volunteer stops the experiment (if this happens). For cognitive and neuroimaging data collection, a single trial per participant is not a reliable result, because such measurements require averaging over many trials to obtain a good signal-to-noise ratio.

Another challenge, at both the methodological and conceptual levels, is that several experimenters [1, 5, 21, 22], including myself [21-27], have noted that volunteers are extremely obedient when they come to an experiment. Personally, I have tested about 800 volunteers to investigate the mechanisms by which coercive instructions influence individual cognition and moral behaviors. For instance, using behavioral, electrophysiological and neuroimaging methods, we have observed that when people obey orders to send real shocks to someone else, their sense of agency [23], their feeling of responsibility [28], and their empathy for the pain of the victim and interpersonal guilt [26] are attenuated compared to a situation where they are free to decide which action to execute. Out of 800 volunteers tested, only 27 disobeyed my orders (i.e., 3.3%): 21 for prosocial reasons (they refused to administer an electric shock to another individual), 3 by contradiction (they systematically pressed the other button, no matter the content of the order), and 3 for antisocial reasons (they administered shocks despite my order not to do so). Although convenient for studying how obedience affects cognition, this rate is indubitably a problem when studying disobedience: if participants almost never disobey, we cannot study the mechanisms through which resistance to immoral orders develops in a given situation. Several reasons for not disobeying the experimenter's orders have been suggested. Some consider that obedience is part of human nature, as massive and destructive obedience has been observed throughout countless historical events [2]. Another current view on Milgram's experiments is that volunteers were actually happy to participate and to contribute to the acquisition of scientific data [17], which would explain the high obedience rate observed. This effect has been referred to as 'engaged followership' [29]. If that interpretation is correct, the volunteers' willingness to come and help the experimenter acquire scientific data makes it even harder to obtain disobedience in an experimental set-up. However, this interpretation is challenged by several studies reported by Milgram that displayed a higher disobedience rate than his original study. For instance, disobedience increases when the recipient of the shocks sits in the same room as the participant or when the authoritative experimenter is not physically present in the room [5]. If participants were guided only by their willingness to help acquire scientific data, obedience should be equally high in any experimental set-up. Since some studies show a higher disobedience rate than the initial version of Milgram's study [1], they could, at first glance, be used to study disobedience. However, even though these versions yield a higher disobedience rate, and thus make it possible to study the mechanisms through which resistance to immoral orders develops in a given situation, they are still not adapted to cognitive and neuroimaging measurements and still rely on a cover story.

Taking all these challenges into account (i.e., not using cover stories, so as to avoid interpretation issues; obtaining a fair rate of disobedience; using an experimental approach that is also compatible with cognitive and neuroimaging measurements; respecting ethical standards), the present paper describes a set of experiments that combine the strengths of past experimental work on (dis)obedience. Volunteers were openly involved and active (a real social situation) rather than having to act in fictitious scenarios (an imagined social situation, e.g., Slater et al., 2006). They faced moral decisions about whether or not to follow the experimenter's orders to inflict a real, painful shock on a 'victim', in exchange (or not) for a small monetary gain, thus avoiding the use of cover stories. Since the aim here is to develop a paradigm that could be used in both behavioral and neuroimaging studies, some basic design constraints had to be considered. For instance, to fit a magnetic resonance imaging (MRI) scanning environment, neither the 'victim' nor the experimenter was in the same room as the agent. A real-time video feed of the victim's hand receiving the shocks was therefore displayed on the agent's screen, and headphones were used so the participant could hear the experimenter's orders.

Another way to study disobedience would be to select participants who are more likely than others to disobey. Each volunteer was therefore also asked to complete a series of personality questionnaires to evaluate whether a specific profile is associated with a greater prosocial disobedience rate. Systematic post-experimental interviews were conducted at the end of each experiment to understand volunteers' decisions to follow or refuse the experimenter's orders and to ask how they felt during the experiment.

Participants

One hundred and eighty naive volunteers (94 female) were recruited in same-gender dyads (90 dyads). During the recruitment procedure, I ensured that the participants in each dyad were neither close friends (by mixing people studying different academic courses) nor relatives. To estimate the sample size a priori, I calculated the total sample size based on an effect size f of 0.3. To achieve a power of 0.85 for this effect size, the estimated sample size was 168 for 6 groups [30]. I increased the sample size slightly to 180 to prevent loss of data in case of withdrawals. Volunteers were randomly assigned to one of the 6 variants of the task (N = 30 per variant). One volunteer was not taken into account because they only played the role of the 'victim', replacing a participant who did not show up. No volunteers withdrew from the experiment. For the remaining 179 volunteers, the mean age was 22.63 years (SD = 2.77, range: 18-35). A univariate ANOVA with Age as the dependent variable and Variant as a fixed factor confirmed that the age of the volunteers did not differ between the variants of the task (p > 0.1, BF10 = 0.167). Volunteers received between €10 and €19.60 for their participation. All volunteers provided written informed consent prior to the experiment. The study was approved by the Ethics Committee of the Erasme Hospital (reference number: P2019/484). All methods were performed in accordance with the relevant guidelines and regulations.
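For readers who want to reproduce this kind of a priori calculation, the sketch below estimates the required sample size for a between-subjects design with 6 groups, an effect size f of 0.3, alpha = 0.05 and a power of 0.85. It uses statsmodels' FTestAnovaPower as an assumed stand-in for the power software cited in the paper, so the exact figure it returns may differ slightly from the reported 168.

```python
# A minimal sketch, assuming statsmodels as the power-analysis tool
# (the paper cites dedicated power software; this is not the author's script).
import math
from statsmodels.stats.power import FTestAnovaPower

total_n = FTestAnovaPower().solve_power(
    effect_size=0.3,   # Cohen's f reported in the paper
    alpha=0.05,
    power=0.85,
    k_groups=6,        # the 6 variants of the task
)
# Round up to a multiple of 6 so each variant gets an equal group size.
per_group = math.ceil(total_n / 6)
print(f"total N ≈ {math.ceil(total_n)} ({per_group} per variant)")
# The paper reports an estimated total of 168, later increased to 180
# to allow for potential withdrawals.
```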

Method and Material

Six experimental set-ups were created in a between-subjects design. In all six set-ups, volunteers were invited in pairs. One person was assigned to start as the 'agent' and the other to start as the 'victim'. Their roles were switched halfway through, ensuring reciprocity. Unlike Milgram's experimental design, both volunteers were real participants, not confederates. The reciprocity also prevented volunteers from being stuck in the role of the person inflicting pain on the other, thus attenuating the potential psychological distress of being only in a perpetrator role. Volunteers were given the possibility to choose the role they wanted to start with. If neither of them had a preference, role assignment was decided by a coin flip, but volunteers were reminded that they could still decide for themselves. This procedure helps ensure that participants do not think the role assignment is a trick.

Volunteers were first given the task instructions. They then signed the consent forms in front of each other, so both were aware of the other's consent. The experimenter was never present in the same room, but instead gave the instructions through headphones. This was done for two reasons. First, Milgram's studies show that disobedience increases if the experimenter is not physically present in the room. Second, in the case of MRI scanning, the experimenter would not be able to give direct verbal instructions to the volunteers in the MRI room because of the loud noise of the scanner. Here, agents were isolated in a room and given headphones through which they heard the experimenter's instructions (see Fig. 1). They were told that this was done to avoid attentional interference from the experimenter's physical presence in the room. In this series of studies the instructions were pre-recorded, but a live set-up with a microphone connected to the headphones could also work. Pre-recording allows precise timing of the events, which is important for neuroimaging or electroencephalography recordings. The instructions were "give a shock" or "don't give a shock". To increase the authenticity of the procedure, each sentence was recorded 6 times with small variations in the voice, and the recordings were played in random order. In addition, the audio recordings included a background sound similar to intercom communications.

Figure 1. Experimental setup. Volunteers were in different rooms, and the experimenter was located in a third, separate room. On each trial, the agent heard the experimenter's order through headphones and had to decide whether to press the 'SHOCK' or the 'NO SHOCK' button. A real-time camera feed displayed the victim's hand on the agent's screen, allowing the agent to keep track of the consequences of their actions.

Shocks were delivered using a constant-current stimulator (Digitimer DS7A) connected to two electrodes placed on the back of the victim's left hand, visible to the agent through the camera display. Individual pain thresholds were determined for both volunteers before the experiment started. The threshold was determined by increasing the stimulation in steps of 1 mA (Caspar et al., 2016). I approximated an appropriate threshold by asking a series of questions about their pain perception during the calibration (1. "Is it uncomfortable?", 2. "Is it painful?", 3. "Could you cope with a maximum of 100 of these shocks?", 4. "Could I increase the threshold?"). When roles were reversed, I briefly re-calibrated the pain threshold of the new victim by increasing the stimulation again from 0, this time in steps of 3 mA, up to the previously determined threshold, to confirm that the initial estimate was still appropriate and to allow re-familiarization. The mean stimulation level selected by this procedure was 36.3 mA (SD = 17.5; voltage: 300 V; pulse duration: 200 µs). I chose this type of pain instead of others (e.g., financial loss) because it produces a clear muscle twitch on the victim's hand each time a shock is sent. This gives volunteers clear, visible feedback on the consequences of their actions and makes them fully aware that the shocks are real.
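As an illustration only, the sketch below translates the calibration procedure just described into code. The stimulator interface (deliver_pulse) and the interactive prompt are hypothetical placeholders; the paper does not describe the software used to drive the Digitimer DS7A.

```python
# A minimal sketch of the stepwise pain-threshold calibration described above.
# `deliver_pulse` is a hypothetical stand-in for the stimulator trigger call.
CALIBRATION_QUESTIONS = [
    "Is it uncomfortable?",
    "Is it painful?",
    "Could you cope with a maximum of 100 of these shocks?",
    "Could I increase the threshold?",
]

def deliver_pulse(intensity_ma: float) -> None:
    """Hypothetical wrapper around the constant-current stimulator trigger."""
    print(f"[stimulator] pulse at {intensity_ma:.1f} mA (200 µs pulse)")

def calibrate_threshold(step_ma: float = 1.0, max_ma: float = 90.0) -> float:
    """Increase stimulation in `step_ma` steps until the volunteer settles on
    a mildly painful but tolerable intensity."""
    intensity = 0.0
    while intensity < max_ma:
        intensity += step_ma
        deliver_pulse(intensity)
        for question in CALIBRATION_QUESTIONS:
            print(question)
        answer = input("Keep increasing? [y/n] ").strip().lower()
        if answer != "y":
            break
    return intensity
```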

There was a total of 96 trials per experimental condition. In the coerced condition, the experimenter ordered the agent to give a shock in 64 trials and not to give a shock in 32 trials. This ratio was chosen on the assumption that volunteers' willingness to refuse immoral orders would increase with the number of times they were instructed to inflict pain on the 'victim'.

On each trial, a picture of two rectangles, a red one labelled 'SHOCK' and a green one labelled 'NO SHOCK', was displayed at the bottom left and right of the screen. The key-outcome mapping varied randomly from trial to trial, but the outcome was always fully congruent with the mapping seen by the participant. Agents could then press one of the two buttons: pressing the SHOCK key delivered a shock to the victim, while pressing the NO SHOCK key delivered no shock. Randomizing the button mapping allows better control over motor preparation, which can be important for neuroimaging data.
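A minimal sketch of how such a coerced block could be generated is shown below, assuming the trial structure described above: 64 'shock' orders, 32 'no shock' orders, a randomized button mapping, and a random pick among the 6 recordings of each order. The Trial structure and file names are illustrative, not taken from the paper.

```python
# Illustrative generation of one coerced block (not the authors' stimulus code).
import random
from dataclasses import dataclass

@dataclass
class Trial:
    order: str          # "give a shock" or "don't give a shock"
    audio_file: str     # which of the 6 recordings of that order to play
    shock_side: str     # "left" or "right": where the red SHOCK button appears

def build_coerced_block(n_shock: int = 64, n_no_shock: int = 32, seed: int = 0):
    rng = random.Random(seed)
    orders = ["give a shock"] * n_shock + ["don't give a shock"] * n_no_shock
    rng.shuffle(orders)
    trials = []
    for order in orders:
        variant = rng.randint(1, 6)  # 6 recordings per sentence
        prefix = "shock" if order == "give a shock" else "no_shock"
        trials.append(Trial(
            order=order,
            audio_file=f"{prefix}_v{variant}.wav",   # hypothetical file name
            shock_side=rng.choice(["left", "right"]),
        ))
    return trials

block = build_coerced_block()
assert sum(t.order == "give a shock" for t in block) == 64
```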

In half of the variants of the task (3 of 6), the 'Aim' variants, participants were given a reason to obey the experimenter's orders; this was not the case in the other half, the 'No aim' variants. In the 'No aim' variants, I did not give participants any reason for obeying and simply explained the task. If participants asked about the aim, I told them that they would learn it at the end of the experiment, without providing further justification. In the 'Aim' variants, volunteers were told that researchers had observed specific brain activity over the motor cortex in another study when participants were given instructions. We explained that the present study was a control study measuring different aspects of motor activity when they pressed buttons, in order to see whether the button pressing was related to the brain activity measured over the motor cortex. To increase the credibility of this aim, electrodes were also placed on their fingers and connected to a real electromyography (EMG) apparatus, ostensibly to record their muscle activity. Volunteers were instructed to press the two buttons only with their right and left index fingers, as naturally as possible, and to avoid overly large movements, so as to produce clean EMG data. If volunteers asked whether they really had to follow the orders, I told them that for ethical reasons I could not force them to do anything, but that it would be better for the sake of the experiment. Explicitly telling them that they could disobey the orders would not serve the goal of studying 'real' disobedience.

In 4 of the 6 variants of the task, the 'Free-choice' variants, a second experimental condition was used: the free-choice condition. In this condition, volunteers were told that on each trial they could freely decide whether or not to shock the 'victim'; they received no instructions. In 4 of the 6 variants, the 'Monetary reward' variants, agents received a monetary reward of +€0.05 for each shock delivered. In the other 2 variants, the 'No monetary reward' variants, volunteers were not rewarded for the shocks they delivered. To summarize, the 6 variants of the task were the following: (1) No aim + Monetary reward + Free-choice condition; (2) No aim + No monetary reward + Free-choice condition; (3) Aim + Monetary reward + Free-choice condition; (4) Aim + No monetary reward + Free-choice condition; (5) No aim + Monetary reward + No free-choice condition; (6) Aim + Monetary reward + No free-choice condition (see Table 1).

Table 1. Schematic representation of each variant of the experimental task.

Variant   | Aim for obedience | Monetary reward | Free-choice condition
Variant 1 | No                | Yes             | Yes
Variant 2 | No                | No              | Yes
Variant 3 | Yes               | Yes             | Yes
Variant 4 | Yes               | No              | Yes
Variant 5 | No                | Yes             | No
Variant 6 | Yes               | Yes             | No

Before the experimental session, volunteers filled in six questionnaires. These included (1) the Money Attitude Scale (e.g., "I put money aside on a regular basis for the future") [31], (2) the Moral Foundations Questionnaire (e.g., "Whether or not someone showed a lack of respect for authority") [32], (3) the Aggression-Submission-Conventionalism scale (e.g., "We should believe what our leaders tell us") [33], (4) the Short Dark Triad scale (e.g., "Most people can be manipulated") [34], and (5) the Interpersonal Reactivity Index (e.g., "When I see someone get hurt, I tend to remain calm") [35]. At the end of the experimental session, they were asked to fill in two more questionnaires: (1) a debriefing assessing what they felt during the experiment and the reasons for choosing to obey or disobey the experimenter's orders (Supplementary Information S1), and (2) a questionnaire on social identification with the experimenter (e.g., "I feel strong ties with this experimenter") [36]. At the end of the experiment, each volunteer was debriefed separately. Volunteers were then paid, again separately.

General data analyses

Each result was analyzed with both frequentist and Bayesian statistics [37]. Bayesian statistics assess the likelihood of the data under both the null and the alternative hypothesis. BF10 corresponds to p(data | H1)/p(data | H0). Generally, a BF10 between 1/3 and 3 indicates that the data are similarly likely under H1 and H0, and that they do not adjudicate which hypothesis is more likely. A BF10 below 1/3 or above 3 is interpreted as supporting H0 or H1, respectively. For instance, BF10 = 20 would mean that the data are 20 times more likely under H1 than under H0, providing very strong support for H1, while BF10 = 0.05 would mean that the data are 20 times more likely under H0 than under H1, providing very strong support for H0 [38]. BF and p values were calculated using JASP [39] with the default priors implemented in JASP. All analyses were two-tailed.
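The decision rule described above can be captured in a few lines. The helper below is purely illustrative and only restates the thresholds given in the text (1/3 and 3).

```python
# A small helper applying the BF10 interpretation rule described above.
def interpret_bf10(bf10: float) -> str:
    if bf10 > 3:
        return "supports H1 (data more likely under the alternative)"
    if bf10 < 1 / 3:
        return "supports H0 (data more likely under the null)"
    return "inconclusive (data similarly likely under H0 and H1)"

print(interpret_bf10(20))    # ~20x more likely under H1 -> supports H1
print(interpret_bf10(0.05))  # ~20x more likely under H0 -> supports H0
print(interpret_bf10(0.9))   # inconclusive
```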

Number of shocks given in the free-choice condition

In the free-choice condition, volunteers were told that they were entirely free to decide whether or not to deliver a shock to the 'victim' on each of the 96 free-choice trials. On average, agents administered shocks to the victim on 31.86% of the trials (SD = 34.98, minimum: 0%, maximum: 100%) in the free-choice condition, corresponding to 30.59/96 shocks. A paired-samples t-test indicated that agents delivered shocks less frequently in the free-choice condition than in the coerced condition (68.03%, SD = 41.11; t(119) = -9.919, p < 0.001, Cohen's d = -0.906, BF10 = 1.987e+14). This result supports the idea that individuals can inflict more harm on others when they obey orders than when they act freely.
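For illustration, the sketch below runs the same kind of paired-samples t-test in Python with scipy. The per-agent percentages here are simulated stand-ins, not the study's data.

```python
# Illustrative paired-samples t-test on simulated per-agent shock percentages.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_agents = 120  # agents who completed both a free-choice and a coerced condition

# Simulated shock percentages, bounded to the 0-100 range.
free_choice = np.clip(rng.normal(32, 35, n_agents), 0, 100)
coerced = np.clip(free_choice + rng.normal(36, 30, n_agents), 0, 100)

t_stat, p_value = stats.ttest_rel(free_choice, coerced)
diff = free_choice - coerced
cohens_dz = diff.mean() / diff.std(ddof=1)  # Cohen's d for paired samples
print(f"t({n_agents - 1}) = {t_stat:.3f}, p = {p_value:.4f}, dz = {cohens_dz:.3f}")
```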

Prosocial disobedience across variants

In the present study, I was interested in prosocial disobedience, that is, when agents refuse the experimenter's orders to send a painful shock to the 'victim'. Table 2 displays the number of volunteers who reported that they voluntarily disobeyed in each variant of the task.

Table 2. Number of volunteers who reported that they voluntarily disobeyed the orders of the experimenter.

Variant   | Voluntary disobedience ('Yes')
Variant 1 | 23/30
Variant 2 | 24/30
Variant 3 | 8/30
Variant 4 | 16/30
Variant 5 | 24/30
Variant 6 | 13/30

In this experiment, the main variable of interest was not only how many participants disobeyed in each variant, but also how frequently they disobeyed. A percentage of prosocial disobedience was calculated for each volunteer, corresponding to the number of trials in which the participant chose to disobey (i.e., sending no shock while ordered by the experimenter to do so) divided by the total number of trials with an order to send a shock, multiplied by 100. I compared the prosocial disobedience rate across variants of the task, gender of participants and order of the roles. I conducted a univariate ANOVA with prosocial disobedience as the dependent variable and Aim (aim given, no aim given), Monetary reward (+€0.05 or not), Free-choice (presence or absence of a free-choice condition), Gender and Order of the Role (agent first, victim first) as fixed factors (see Fig. 2). Both frequentist and Bayesian statistics strongly supported a main effect of Aim (F(1,155) = 14.248, p < 0.001, partial η² = 0.084, BFincl = 158.806). Prosocial disobedience was lower when an aim for obedience was given to volunteers (20.4%, CI95 = 12.8-28.1) than when no aim was given (43.3%, CI95 = 35.6-51). Both frequentist and Bayesian statistics also supported a main effect of Monetary reward (F(1,155) = 12.335, p = 0.001, partial η² = 0.074, BFincl = 28.930). Prosocial disobedience was lower when a monetary reward was given for each shock (25.1%, CI95 = 18.5-31.7) than when no monetary reward was given (45.4%, CI95 = 35.9-54.8). The frequentist approach showed a main effect of Gender (F(1,155) = 5.128, p = 0.025, partial η² = 0.032), with a lower prosocial disobedience rate for female volunteers (25.7%, CI95 = 18.2-33.2) than for male volunteers (38%, CI95 = 30-46). However, the Bayesian version of the same analysis revealed a lack of sensitivity (BFincl = 0.871). All other main effects and interactions supported H0 or lacked sensitivity (all ps > 0.1; all BFincl between 4.291e-7 and 1.178).
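As a rough illustration of this analysis pipeline, the sketch below computes the prosocial disobedience percentage from trial-level data and fits a between-subjects ANOVA with statsmodels. Column names are hypothetical, only main effects are included for brevity, and the published analysis was run in JASP with Bayesian model comparison alongside the frequentist tests.

```python
# Illustrative sketch (assumed column names), not the analysis script used in the paper.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

def prosocial_disobedience_pct(trials: pd.DataFrame) -> float:
    """% of 'give a shock' orders on which the agent pressed NO SHOCK."""
    ordered = trials[trials["order"] == "give a shock"]
    return 100 * (ordered["response"] == "NO SHOCK").mean()

def run_anova(df: pd.DataFrame) -> pd.DataFrame:
    """df: one row per agent with their disobedience percentage and the
    between-subject factors (aim, reward, free_choice, gender, role_order)."""
    model = smf.ols(
        "disobedience ~ C(aim) + C(reward) + C(free_choice) + C(gender) + C(role_order)",
        data=df,
    ).fit()
    return anova_lm(model, typ=2)  # Type II sums of squares
```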

Figure 2. Percentages of prosocial disobedience in each variant of the task.

The following sections report two-tailed Pearson correlations between prosocial disobedience and several other variables: (1) the reasons given for disobeying, (2) how responsible, how bad and how sorry agents felt during the experiment, (3) identification with the experimenter, (4) the perceived level of pain of the victim, (5) identification with the 'victim', and (6) individual differences measured through self-report questionnaires. I applied a False Discovery Rate (FDR) correction, using the Benjamini-Hochberg method [40], to the p-values of all of these correlations, but for the sake of clarity the variables are reported in separate sub-sections.
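A minimal sketch of this kind of analysis is shown below, using scipy for the Pearson correlations and statsmodels' multipletests for the Benjamini-Hochberg correction. The data frame and its column names are hypothetical placeholders for the per-participant data.

```python
# Illustrative correlations with Benjamini-Hochberg FDR correction (assumed columns).
import pandas as pd
from scipy.stats import pearsonr
from statsmodels.stats.multitest import multipletests

def fdr_corrected_correlations(df: pd.DataFrame, target: str, predictors: list) -> pd.DataFrame:
    rows = []
    for col in predictors:
        r, p = pearsonr(df[target], df[col])  # two-tailed by default
        rows.append({"variable": col, "r": r, "p": p})
    out = pd.DataFrame(rows)
    # Correct the whole family of correlations at once.
    out["p_fdr"] = multipletests(out["p"], method="fdr_bh")[1]
    return out

# Example call with hypothetical column names:
# fdr_corrected_correlations(df, "disobedience", ["responsibility", "badness", "sorry"])
```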

Reasons for prosocial disobedience

All participants who reported that they voluntarily disobeyed the experimenter's orders (N = 108) were presented with a list of 10 reasons that they had to rate from "Not at all" to "Extremely" (see Supplementary Information S1). The reason "I wanted to make more money" was only considered for volunteers in a variant with a monetary reward for each shock (N = 68). Both frequentist and Bayesian statistics showed that the percentage of prosocial disobedience correlated positively with moral reasons (r = 0.550, pFDR < 0.001, BF10 = 1.700e+7), positively with disobedience by contradiction (r = 0.329, pFDR < 0.001, BF10 = 47.53), and negatively with the willingness to make more money (r = -0.485, pFDR < 0.001, BF10 = 822.16). Other correlations were in favor of H0 or inconclusive (all pFDR > 0.076; all BF10 between 0.120 and 1.446).

Feeling responsible, bad and sorry

Both frequentist and Bayesian statistics showed strong positive correlations between prosocial disobedience and how responsible (r = 0.299, pFDR < 0.001, BF10 = 343.98) and how bad (r = 0.301, pFDR < 0.001, BF10 = 384.65) agents felt during the task (see Fig. 3A,B). The more responsible and the worse they felt during the task, the more often they refused the order to send a shock to the 'victim'. The correlation with how sorry they felt was inconclusive (pFDR > 0.08, BF10 = 0.929).

Figure 3. Pearson correlations between prosocial disobedience and (A) the feeling of responsibility, (B) how bad agents felt during the task when they administered shocks to the 'victim', and (C) how painful they estimated the shock delivered to the 'victim' to be. All tests were two-tailed.

Identification with the experimenter

Both frequentist and Bayesian statistics strongly supported H0 regarding the relationship between prosocial disobedience and personal identification with the experimenter (pFDR > 0.5, BF10 = 0.121), as well as bonding with the experimenter (pFDR > 0.5, BF10 = 0.117). The relationship between the experimenter's charisma and prosocial disobedience was also slightly in favor of H0 (pFDR > 0.1, BF10 = 0.530).

Estimated pain of the ‘victim’

The frequentist approach showed a positive correlation between the perceived pain of the 'victim' and prosocial disobedience (r = 0.189, pFDR = 0.048): the more pain they judged the 'victim' to be in, the more frequently they refused to deliver the shock. The Bayesian version of the same analysis only slightly supported this relationship (BF10 = 2.236); see Fig. 3C.

Identification with the ‘victim’

In the post-session questionnaire, volunteers indicated to what extent they considered that the other participant could be part of their group and to what extent they identified with the other participant. Both frequentist and Bayesian statistics strongly supported H0 regarding the relationship between prosocial disobedience and the perception that the other participant could be part of one's own group (pFDR > 0.8, BF10 = 0.096). The relationship between prosocial disobedience and identification with the other participant also slightly supported H0 (pFDR > 0.1, BF10 = 0.511).

Correlations between the behavior of pairs of participants

As a role-reversal procedure was used, the behavior of those who were agents first could influence the behavior of those who became agents afterwards. A Pearson correlation was therefore computed between the prosocial disobedience of those who were agents first and that of the victims who became agents afterwards. The correlation was positive (r = 0.514, p < 0.001, BF10 = 60,068.704), suggesting that participants who were agents second tended to act similarly to those who were agents first.

Individual differences associated with prosocial disobedience

Another approach to ensuring a reliable prosocial disobedience rate when recruiting volunteers would be to target individuals whose profile is most frequently associated with disobedient behaviors. Both frequentist and Bayesian exploratory correlations were two-tailed. Cronbach's α for each subscale is presented in Supplementary Information S2. Both frequentist and Bayesian statistics showed negative correlations between prosocial disobedience and scores on the Authority subscale (r = -0.259, pFDR < 0.001, BF10 = 41.372) and the Purity subscale (r = -0.303, pFDR < 0.001, BF10 = 424.97) of the MFQ. The lower volunteers scored on authority and purity, the higher their prosocial disobedience rate. Other correlations were in favor of H0 or inconclusive (all pFDR ≥ 0.048; all BF10 between 0.100 and 2.314).

Reasons for obedience

If participants reported that they did not voluntarily disobey the experimenter's orders, they were asked in an open question to explain their decision to comply. After reading all the answers, three categories of reasons were extracted: (1) 'For science': participants reported that they obeyed to allow reliable data acquisition (e.g., Participant 91: "Pour ne pas fausser l'étude", English translation: "To avoid biasing the study"); (2) 'For respect of authority': participants reported that they had to follow the orders of the authority figure (e.g., Participant 13: "Pour moi c'est normal de suivre un ordre", English translation: "In my opinion, it's normal to follow an order"); and (3) 'For lack of side-effects': participants reported that since the shocks delivered were calibrated on one's own pain threshold, obeying the orders to shock was not problematic (e.g., Participant 115: "Douleur supportable pour l'autre, je n'ai accepté de faire subir que ce que j'aurais été prêt à subir moi-même", English translation: "The pain was tolerable for the other participant; I only agreed to inflict what I would have been ready to undergo myself"). An independent, naive judge classified each participant's response into one or more of these three categories. Analyses of the frequencies revealed that "For science" was mentioned 31 times out of 70, "For lack of side-effects" 17 times out of 70, and "For respect of authority" 31 times out of 70.

The aim of the present paper was to present a novel experimental approach to studying (dis)obedience to immoral orders, combining the strengths of past experimental work and adapting it to cognitive and neuroimaging measurements. Although other versions have been proposed since Milgram's studies, such as a study in an immersive virtual environment [15] or the 150-V method [16], some methodological concerns remained, as those methods still involved cover stories or fake experimental set-ups. Here, the experimental approach was substantially different, as it was based on an entirely transparent method involving the administration of real electric shocks to another individual. This approach has the advantage of resolving some of the main ethical and methodological concerns associated with the use of cover stories. It can also be used to study how social and situational factors, as well as individual factors, influence disobedience. For social and situational factors, the proposed paradigm can be adapted to evaluate, for instance, the influence of a supporting group, the use of high or low monetary rewards, or how priming disobedience with a documentary influences disobedience. For individual factors, the paradigm makes it possible to investigate how personality traits influence disobedience, or to study the neuro-cognitive processes underlying disobedience.

Novel theories combining a multi-method approach drawn from social psychology, neuroeconomics and neuroscience could thus emerge to better understand the mechanisms supporting disobedience. For instance, one could evaluate how empathy for the pain of the victim predicts disobedience and how the presence of a supporting group influences our capacity to feel empathy [41] and/or compassion for the 'victim' [42]. It could also be argued that the presence of a supporting group diffuses responsibility between individuals and increases obedience by influencing how our brain processes agency and responsibility over our actions [28, 43-45]. As the results of the present study also indicated that feeling bad about the shocks delivered was statistically associated with prosocial disobedience, one could evaluate how the neural correlates of guilt [46] predict prosocial disobedience and what historical, cultural and individual factors influence the feeling of guilt.

Six variants of the same task were tested in the present study, some inducing a higher prosocial disobedience rate than others. Statistical results showed that providing a reason, or aim, to justify obedience strongly decreased disobedience. Providing a monetary reward, even one as small as €0.05, also strongly decreased disobedience. Variant 2, in which volunteers were given neither an aim nor a monetary reward, showed the highest disobedience rate. However, to study disobedience in an ecological way, the paradigm should capture disobedience even when participants know that they are losing something (i.e., monetary rewards or the 'trust' of the experimenter who asked for their help with the study). Defying the orders of an authority generally involves social and/or monetary costs in real-life situations. I would thus not recommend using an experimental paradigm in which volunteers incur no cost for defying the experimenter's orders, as this would reduce the ecological validity of the act of disobedience. Variants 3 and 6 involve two types of costs for resisting the experimenter's orders: a monetary loss and letting the experimenter down. Descriptive statistics showed that prosocial disobedience was lower in Variant 3 than in Variant 6. The main difference between these two variants was the presence of a free-choice condition. In my former studies [23, 27], volunteers frequently justified obedience in the coerced condition by the freedom they were given in the free-choice condition (e.g., Participant 89, English translation: "(…) In addition, I knew I could freely choose in the other condition not to send shocks, which is what I did"). In the present debriefings, some volunteers also reported that the presence of a free-choice condition gave them enough freedom to accept following the orders in the coerced condition. Supplementary analyses showed that when the monetary reward and the aim for obeying are identical, being given a free-choice condition reduces disobedience in the coerced condition. Therefore, Variant 6 appears to offer a good balance between reaching a reliable disobedience rate and finding volunteers who refuse to inflict physical harm on another human being despite the monetary or social costs of defying orders.

Another approach would be to pre-select people who are predicted to be more disobedient. Personality questionnaires indicated that scoring low on the Authority and Purity subscales of the MFQ was strongly associated with a higher prosocial disobedience rate. The link between one's relationship to authority and prosocial disobedience observed here replicates another study conducted on the first generation of Rwandans born after the 1994 genocide [47]. One's relationship to authority thus appears to be a reliable predictor variable for pre-selecting a sample that is more likely to disobey immoral orders.

In the present paper, administering a real, mildly painful shock, whether or not in exchange for a small monetary gain, was described as an 'immoral' act. The notion of what is or is not moral can differ greatly between individuals [48], both for academics and for volunteers participating in an experiment. Humans are indeed sensitive to different, competing moral considerations, a key reason for rescuing persecuted people [49]. In line with this observation, the present results indicated that moral reasons were a critical factor associated with the prosocial disobedience rate: the more immoral volunteers considered it to shock their partner, the more they disobeyed. However, considering an action to be against one's own moral values does not necessarily translate into a refusal, especially when the order is in line with the law. An extreme example is soldiers who have perpetrated acts that transgressed their moral beliefs because those acts were ordered by their superiors in combat [50]. A core question for future research remains: why are some people capable of putting their own moral standards above the social costs associated with defying orders?

Results indicate that the more responsible volunteers felt during the task, and the worse they felt about sending shocks to the 'victim', the higher their prosocial disobedience. In another study, we observed that obeying orders reduced the feeling of responsibility and how bad and how sorry volunteers felt, compared to being free to decide [26]. One hypothesis is that individuals who preserve a feeling of responsibility, and continue to feel bad even under command, can more easily defy immoral orders. However, future studies are needed to better understand the neuro-cognitive processes that prevent an individual from complying with immoral orders. As this paradigm is suited to neuroimaging measurements, a whole range of such studies can now be conducted.

It has previously been suggested that strong identification with the experimenter giving orders is associated with higher obedience [36]. However, in the present paper, correlations between prosocial disobedience and identification with the experimenter were in favor of H0 with both the frequentist and the Bayesian approaches. In a former study, we also observed that identification with the experimenter was not a critical factor in explaining (dis)obedience: the generation of Rwandans born after the genocide and tested in Rwanda reported higher identification with the experimenter than the same generation of Rwandans tested in Belgium [47], yet the latter group had a higher prosocial disobedience rate than the former. Future studies must therefore examine how identification with the person giving orders influences obedience, and how much weight it carries compared to other social, cultural and individual variables.

Although some volunteers reported that they felt a bit stressed and anxious during the task when they were in the role of the agent, the overwhelming majority did not report any negative psychological feelings. None of the participants withdrew from the experiment and none reported long-term negative psychological effects.

Nowadays, it has become difficult to find volunteers who do not know of Milgram's studies, given their high media coverage, including movies, radio dramas, books, podcasts and documentaries. One might expect that knowing about Milgram would prevent people from obeying. However, for the large majority of volunteers, this does not appear to be the case. In previous studies that I conducted with a relatively similar paradigm, the disobedience rate was drastically low (3.3%) even though the participants were university students who knew of Milgram's studies. In the present study, almost all the volunteers knew of Milgram and explicitly mentioned him during the oral debriefings or before starting the experiment. Yet among those who disobeyed, almost none reported that the reason for their disobedience was that they thought this was the aim of the experiment. Further, there was no statistical relationship between prosocial disobedience and believing that this was the aim of the study. This does not mean that knowing about Milgram has no influence on disobedience at all; it rather suggests that it is not the main factor in one's decision to obey or not obey an experimenter. It is also possible that, since the shocks in this experiment were real rather than fake as in Milgram's studies, participants concluded that this was indeed not a study aiming to replicate Milgram.

As far as I have observed, the main problem associated with knowing Milgram's studies is that volunteers believe that I, too, have hidden aims and procedures when they enter the experimental room. Several volunteers reported that they only realized that my explanations of the task were true when they were explicitly offered the choice of which role to play first and/or when they started receiving the shocks. This is a general concern in psychological research: the heavy use of cover stories can affect other studies as well, as volunteers start to mistrust what researchers tell them.

Results indicated that those who were agents second tended to act similarly to those who were agents first, sending a relatively similar number of shocks. Of note, this is an effect that we also observed in past studies on the effect of obeying orders on cognition [23, 26, 43]. Nonetheless, in none of those studies did we observe that the order of the roles had a statistical influence on the neuro-cognitive processes targeted. The influence of role reversal on disobedience and related neuro-cognitive processes, however, still has to be investigated in future studies.

The present paradigm is ecological in the sense that volunteers face decisions that have a real, physical impact on another human being. However, at the moment I have only limited evidence that the paradigm has ecological validity as a reflection of obedience in real-life situations, especially regarding 'destructive disobedience' [17]. Caution is indicated when making inferences from laboratory studies to complex social behaviours, such as those observed during genocides [16]. My main evidence at the moment is that the very low rate of prosocial disobedience observed in the first generation of post-genocide Rwandans tested in Rwanda using this paradigm [47] is consistent with the fact that deference to authority has been emphasized by academics as an important factor in the 1994 genocide [4, 51]. Individual scores on deference to authority in Caspar et al. [47] were the best predictor of prosocial disobedience in that former study, suggesting some ecological validity. A promising approach would be to recruit 'Righteous Among the Nations', individuals who actually saved lives during genocides. Testing this population with the present paradigm would put its ecological validity to the test.

People's ability to question and resist immoral orders is a fundamental aspect of individual autonomy and of successful societies. As Howard Zinn famously wrote: "Historically, the most terrible things—war, genocide, and slavery—have resulted not from disobedience, but from obedience." Understanding how individuals differ in the extent to which they comply with orders undeniably has several societal implications, ranging from understanding how evolving in highly hierarchical environments, such as the military or prisons, influences moral behaviours, to developing interventions that would help prevent blind obedience and help people resist calls to violence in vulnerable societies. However, since Milgram's studies, the topic of disobedience has been studied mostly by social psychologists using adapted versions of Milgram's initial paradigm. I hope that this novel approach will give (dis)obedience research a new boost and that it will be taken up by other scientific disciplines seeking to better understand human behaviour.

Supplementary Information

Acknowledgements.

Emilie A. Caspar was funded by the F.R.S-FNRS.

Author contributions

E.A.C. developed the study concept and the method. Testing, data collection and data analysis were performed by E.A.C. E.A.C. wrote the manuscript.

Data availability

Competing interests.

The author declares no competing interests.

Publisher's note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

The online version contains supplementary material available at 10.1038/s41598-021-02334-8.


Obeying and Resisting Malevolent Orders

Significance.

We did not need Milgram's research to inform us that people have a propensity to obey authority; what it did enlighten us about is the surprising strength of that tendency: many people are willing to obey destructive orders that conflict with their moral principles and to commit acts they would not carry out on their own initiative. Once we have accepted the right of an authority to direct our actions, Milgram argued, we relinquish responsibility to that person and allow him or her to define for us what is right or wrong.

Practical Application

Milgram's discovery about the unexpectedly powerful human tendency to obey authorities can be applied to real life in several ways. First, it provides a reference point for certain phenomena that, on the face of it, strain our understanding, thereby making them more plausible. Clearly, the implications of Milgram's research have been greatest for understanding the Holocaust. For example, a historian describing the behavior of a Nazi mobile unit that roamed the Polish countryside and killed 38,000 Jews in cold blood at the bidding of their commander concluded that "many of Milgram's insights find graphic confirmation in the behavior and testimony of the men of Reserve Police Battalion 101."

Second, in his obedience studies, Milgram obtained a rare kind of result: one that people can apply to themselves to change their behavior, or at least to gain greater insight into themselves. Countless people who have learned about the obedience research have been better able to stand up against arbitrary or unjust authority.

Third, the obedience experiments have been widely used in various domains to create broader organizational changes in large segments of society. Some textbooks on business ethics have used the experiments to warn students about unethical demands that might be made on them by their bosses in the business world. Several Supreme Court briefs, as well as over 180 law reviews, have also referenced them. A frequent argument in these sources is that laws requiring police officers to obtain voluntary consent to conduct searches are essentially toothless: drawing on Milgram's findings, they argue that, given our extreme readiness to obey authority, a person is unlikely to question a police officer's right to search him or his house when asked to consent. Perhaps the most consequential use of the obedience studies by the legal profession was during a South African trial in the late 1980s of 13 defendants accused of murder during mob actions. Expert testimony that obedience to authority and other social-psychological processes were extenuating circumstances resulted in 9 of the 13 defendants being spared the death penalty.

A fourth, and final, application of Milgram's research is that it suggests specific preventive actions people can take to resist unwanted pressures from authorities:

Question the authority's legitimacy. We often give too wide a berth to people who project a commanding presence, whether through their demeanor or their mode of dress, and follow their orders even in contexts irrelevant to their authority. For example, one study found that wearing a fireman's uniform significantly increased a person's persuasive power to get a passerby to give change to another person so he could feed a parking meter.

When instructed to carry out an act you find abhorrent, even by a legitimate authority, stop and ask yourself: "Is this something I would do on my own initiative?" The answer may well be "No," because, according to Milgram, moral considerations play a role in acts carried out under one's own steam, but not in those that emanate from an authority's commands.

Don't even start to comply with commands you feel even slightly uneasy about. Acquiescing to commands of an authority that are only mildly objectionable is often, as in Milgram's experiments, the beginning of a step-by-step, escalating process of entrapment. The further one moves along the continuum of increasingly destructive acts, the harder it is to extricate oneself from the commanding authority's grip, because to do so is to confront the fact that the earlier acts of compliance were wrong.

If you are part of a group that has been commanded to carry out immoral actions, find an ally in the group who shares your perceptions and is willing to join you in opposing the objectionable commands. It is tremendously difficult to be a lone dissenter, not only because of the strong human need to belong, but also because, via the process of pluralistic ignorance, the compliance of others makes the action seem acceptable and leads you to question your own negative judgment. In one of Milgram's conditions, the naïve subject was part of a 3-person teaching team. The other two were actually confederates who, one after the other, refused to continue shocking the victim. Their defiance had a liberating influence on the subjects, so that only 10% of them ended up giving the maximum shock.

Cited Research

Milgram, S. (1963). Behavioral study of obedience. Journal of Abnormal and Social Psychology, Vol. 67, pp. 371-78.

Milgram, S. (1974). Obedience to authority: An experimental view. New York: Harper & Row.

Blass, T. (2004). The man who shocked the world: The life and legacy of Stanley Milgram. New York: Basic Books.

Additional Sources

Barrio, A. J. (1997). Rethinking Schneckloth v. Bustamonte: Incorporating obedience theory into the Supreme Court's conception of voluntary consent. University of Illinois Law Review, 1997, pp. 225-251.

Browning, C. (1992). Ordinary men: Reserve Police Battalion 101 and the final solution in Poland. New York: HarperCollins.

Bushman, B. J. (1984). Perceived symbols of authority and their influence on compliance. Journal of Applied Social Psychology, Vol. 14, pp. 501-508.

Colman, A. M. (1991). Crowd psychology in South African murder trials. American Psychologist, Vol. 46, pp. 1071-1079.

Ferrell, O. C. & Gardiner, G. (1991). In pursuit of ethics: Tough choices in the world of work. Springfield, IL: Smith Collins.

Gray, S. (2004, March 30). Bizarre hoaxes on restaurants trigger lawsuits. The Wall Street Journal, pp. B1-B2.

Modigliani, A. & Rochat, F. (1995). The role of interaction sequences and the timing of resistance in shaping obedience and defiance to authority. Journal of Social Issues, Vol. 51 (3), pp. 107-123.

Poirier, S. & Gariépy, Y. (1996). Compensation in Canada for resolving drug-related problems. Journal of the American Pharmaceutical Association, Vol. 36, pp. 117-122.

The Stanley Milgram Web site.
