Untangling Pop Culture’s Obsession with the Milgram Experiment

Photo: Milgram Experiment participant. Via Pacific Standard.

The Milgram Experiment, which supposedly shows that all human beings are capable of participating in torture under the watchful eye of an authority figure, has captivated popular culture for half a century. Why is that, given that there are finer social science studies out there? This post describes the experiment, along with another famous psychology study, the Stanford Prison Experiment. I critique both studies and explore the public’s fascination with them despite their methodological flaws. I provide a case study of how popular culture reproduces the Milgram Experiment as a universal “truth” about humanity’s innate propensity towards “evil.” The truth is that the Milgram Experiment is highly flawed, and it tells us very little about any genetic predisposition for torture. What the Milgram Experiment does show, however, is that storytelling falls back on simplistic narratives about good and evil. Social science, in this case psychology and neuroscience, becomes just another plot device to reproduce the basic notion that “good people” can be made to do “bad things.” The social reality is much more complex and disturbing, because it forces us to re-examine the relationship between obedience, culture and social interaction.

The Milgram Experiment

Image: Drawing of Milgram Experiment. Via Wikimedia

Stanley Milgram (1933-1984) was a psychologist who, in the 1960s, set up an experiment in which an experimenter in a lab coat instructed participants to administer electric shocks to a person in another room. In fact, no one received any shocks, but the participants did not know this. Nevertheless, most people complied with the orders (after some encouragement). The study argued that most people followed these instructions because human beings are trained not to question those perceived to hold higher authority. Supposedly, the Milgram Experiment shows that, given the “right conditions”, most “good people” do as they are told, even when they are instructed to do something “evil”. You can see a BBC replication of these experiments below.

Milgram’s study is closely affiliated with Philip Zimbardo’s Stanford Prison Experiment, in which one group of university students was asked to play the role of prison guards and another group played the part of prisoners. The students playing the guards subsequently abused their position of authority.

Even after five decades of controversy and criticism, both the Milgram and Stanford Prison experiments retain strong cultural authority by offering a similar overarching explanation of the “evil nature” of humanity. These experiments are used in various professional fields, such as management training, and they have been invoked to explain the Holocaust and Nazi war crimes, as well as human rights abuses at Abu Ghraib. In 1979, Milgram said on the American program 60 Minutes:

I would say, on the basis of having observed a thousand people in the experiment and having my own intuition shaped and informed by these experiments, that if a system of death camps were set up in the United States of the sort we had seen in Nazi Germany, one would be able to find sufficient personnel for those camps in any medium-sized American town.

American school teacher Jane Elliott‘s blue-eyed/brown-eyed exercise in A Class Divided is based on a role-playing exercise with a similar premise: people are not born racist; they are socially conditioned to discriminate. Elliott’s exercise is better managed, however, as it is explicitly designed to teach children about discrimination.

The Milgram and Stanford Prison experiments have permeated popular culture. This should be a coup for social science, as it means our ideas diffuse to the public. Unfortunately, the outcome has not been ideal: the research has been twisted and has lost its potential for educating the public on the nuances of human behaviour.

Pop Psychology Gone Wrong

Horror auteur Eli Roth re-staged the Milgram Experiment in 2011 for a Discovery Channel special titled How Evil Are You? (below). Roth’s documentary inspired me to write this post for two reasons. First, as a horror film connoisseur, I love some of his movies, like the delicious Cabin Fever (though I detest others, like Hostel). Second, this special has all the loathsome characteristics of sciencetainment (fluff entertainment posing as science): it recreates the experiment as an amusing curiosity, without serious engagement with the study and its methods, and it conveniently sidesteps five decades of critique.

Neuroscientist Jim Fallon, who has made a career of popular writing on serial killers, appears on the show. Fallon administers a “DNA and blood test” to determine whether Roth has “the evil gene.” Note that there is no such thing as an evil gene. Fallon is filmed showing Roth “brain scans” (fMRI images) that are played to dramatic effect. Fallon says they show that Roth’s brain is activated when viewing neutral pictures, which he calls “positive”, as it supposedly demonstrates that Roth is capable of empathy. Yet when Roth viewed images of horror, Fallon says, “that empathetic part is turned off… almost like a temporary psychopath.” This phrase is also misleading, as there is no such thing as a temporary psychopath. The DSM-5, the official manual used in psychiatry and psychology, recognises six personality disorders (defined as a “pattern of impairments and traits“). Psychopathy is not one of them. Its closest counterpart is the antisocial type, which is about aggression and taking advantage of others, but it is not specifically about killing, and most certainly not about an evil gene.

So: even though Roth explains that as a child he was physically unable to watch horror films without “projectile vomiting”, and that over the years he developed the ability not only to watch but to direct horror films (that is, works of fiction)… Fallon still acts as if the brain scans prove a genetic predisposition towards “evil.” Never mind that brain imaging does not establish a correlation between emotional, cognitive and genetic predispositions to violence. Brain imaging is a useful tool for mapping activity in areas of the brain in response to stimuli, but there are problems with overstating what these images mean. Even a dead salmon can show activity in an fMRI scan.* Brain scans are seductive because they are visually compelling, but the scientific community has cautioned against over-interpreting these images: “Brain scans are not pictures of cognitive processes in the brain in action.” As a neuroscientist, Fallon should know better than to stage this spectacle.

Roth, who is of Jewish descent, says he is interested in the Milgram Experiment as an explanation for the Holocaust. Ultimately, as he says on his show, he seems to believe that anyone is capable of “evil”:

There’s an even more frightening possibility – what if anyone – you, me, your next door neighbour – has the capacity for evil? What if all of us, under the right circumstances, are capable of committing the most horrible acts?

This is one of many pop culture representations of the Milgram Experiment. Known as the “just following orders” trope, a version of the experiment appears in various works of fiction, from books such as The Reader to films such as The Experiment, starring Adrien Brody and Forest Whitaker.

Never mind that the Milgram Experiment was not about proving or disproving “evil.” Rather, it was focused on how people respond to authority figures during a lab experiment.

The Critique

The study has been critiqued heavily over the past five decades on the grounds of research ethics: Milgram used deception and potentially traumatised his participants. The study has also been criticised for experimental bias. It does not measure how people behave in real life, but rather how they respond to commands in a staged setting. Some participants saw the set-up of the original experiment as absurd, and so experienced no qualms about complying with the instructions.

Advertisement recruiting participants to the original Milgram Experiment. Via Sociological Images

The study was set up to prove the ideology of the scientist; that is, it proved what Milgram already believed about human nature. The study was also not representative of a cross-section of society: it initially recruited White men who were paid to participate in a “study about memory” (see the recruitment notice on the right). Payment is not uncommon in scientific studies, as it compensates participants for their time. But a study that purposefully recruits a specific segment of the population is, by design, not representative of society as a whole; for a representative study, the sample should be random. Milgram’s sample was the former, so it is not representative of all Americans, let alone all of humanity.

As Australian social scientist John Laurent shows, Milgram’s experiment is an excellent example of the social construction of science. The Milgram Experiment continues to be presented as a psychological “truth” about the awful side of human nature, particularly in first-year undergraduate textbooks. Yet Laurent argues that Milgram’s research is a historical product, born of a time when B. F. Skinner‘s theories of behaviourism prevailed and psychologists were less interested in individuals’ subjective understandings of the experiments for which they had volunteered. That is, the experiment was set up to elicit a specific response within a lab, without seeking to understand how and why participants complied with the instructions. Psychologist Ian Nicholson argues that the study is best understood as a historical performance of Cold War American masculinity.

Subsequent studies claim to replicate the original findings; for example, see this review in A Backstage Sociologist from 2008 and The Situationist from 2007 and 2011. One study has attempted to address the limitations of the original by controlling for experimental bias. It used 31 French undergraduate students (29 of them women) who were told that the shocks they were administering were not real; in some cases, however, they could see a “victim” hooked up to a machine, and they could hear and see them reacting to the shocks. The study finds that people were less distressed about administering shocks to people of a different racial background – in this case, North Africans. Pre-existing prejudices enabled the participants to dehumanise the “victims.” Conversely, participants experienced greater discomfort, and less willingness to comply, when they were told, and could see, that the person receiving the shocks was of the same racial background as themselves (French). Participants were also more likely to exhibit mental anguish or depression two months after the experiment in relation to the French victims, but not the North African victims.

Levels of compliance differed when participants could see the “victim” receiving the (simulated) shocks. One third of the participants administered the shocks to the full capacity, as instructed; another 35% administered shocks only up to the level where the victim was in visible pain and asked for the experiment to stop; and the rest continued a little further, to the point where the victim fell silent. Those who fully complied with the orders and administered the highest shocks had pre-existing anger issues; they were also more likely to blame the researchers and the victims when asked to explain why they complied. For the rest, seeing and hearing the victims respond in pain caused distress, which is why they stopped applying the “shocks” even though they knew the shocks weren’t real. The researchers write:

it is interesting to observe that despite the fact that all participants knew for sure that nothing was real, they tended to respond to the situation as if it was real.

So you see: context, culture and social relations all play a role in people’s willingness to obey orders unquestioningly. In fact, not everyone is equally likely to comply with orders. The situation, the interaction and affinity with the victims all mediate compliance. It isn’t as simple as people being automatically predisposed to follow orders, and it isn’t really about some people being “good” and others being “evil.” Personal experience, emotion, rationalisation and detachment from Others all matter.

Pop Culture’s Fascination With Innate “Evil”

I am intrigued by how science is portrayed in popular culture, especially social science; there is sociological merit in understanding how scientific theories take hold of mass culture. The Milgram Experiment tells us very little about the limits of our universal humanity. Instead, it tells us something about how people behave in response to researchers in a lab. Sociologists generally do not subscribe to universal truths about “human nature”. The fascination with, and reproduction of, Milgram’s research in popular culture actually tells us something profound about a mainstream “Western” collective mindset that reduces human ethics and decision-making to easily digestible one-liners. Phrases such as “all human beings are inherently good or evil” or “most good people do bad things because other people tell them to” reverberate through many books, movies and TV shows.

Pop culture’s fascination with the Milgram Experiment ultimately lies in the fact that it provides a simplistic good-versus-evil narrative about human behaviour that otherwise horrifies and confuses us. The idea that people in lab coats – those in positions of authority – can coerce us into transgressing human decency is reassuring in its simplicity. We can imagine that we would rise above such immoral behaviour if we were in a similar position. We can also absolve ourselves of thinking through how our own cultural biases influence our willingness to follow orders without taking personal responsibility. The fact is that no single factor determines human obedience. Culture, time, place, social norms and other social factors shape how we act in times of duress, and not all human beings will react in the same way.

The Milgram experiment might provide some solace with its neat explanation of how authority corrupts, but the reality is much more complex.

Connect With Me

Follow me @OtherSociology or click below!



This is an elongated version of a post originally published on Tumblr.

* Hat Tip: Doctor Tommy Leung for the research link to the salmon fMRI study.

9 thoughts on “Untangling Pop Culture’s Obsession with the Milgram Experiment”

  1. Greatly appreciate the nuanced approach. I’ve often wondered about the Milgram Experiment and, in my mind, also drawn the wrong conclusions. It can make you overly pessimistic about the expected results of certain political or charitable initiatives.


  2. This article is focused on a strawman. Milgram never claimed that his experiments proved that humans had an “evil” nature. Rather he claimed that they show that humans will follow the directions of an authority even if doing so causes harm to other people. The author’s arguments against a misinterpretation misattributed to Milgram are thus irrelevant.


    1. Hi Max,

      Not sure that you understand the concept of a “strawman.” Milgram’s original study begins by making a link between obedience in Nazi Germany and what he sees as the innate aspect of behaviour to follow orders, irrespective of learned morality (a “deeply ingrained behavioural tendency”). In his discussion of his results, Milgram argues that conformity to authority overrides individual ethics. My article is about how Milgram’s work has been misused to make universal claims about “good” and “evil.” It might be useful to re-read my article and the original linked source.


Comments are closed.