Fri December 19, 2008
By Elizabeth Landau
Stanley Milgram began conducting his famous psychology experiments in 1961.
(CNN) -- If someone told you to press a button to deliver a 450-volt electrical shock to an innocent person in the next room, would you do it?
Common sense may say no, but decades of research suggest otherwise.
In the early 1960s, a young psychologist at Yale began what became one of the most widely recognized experiments in his field. In the first series, he found that about two-thirds of subjects were willing to inflict what they believed were increasingly painful shocks on an innocent person when the experimenter told them to do so, even when the victim screamed and pleaded.
The legacy of Stanley Milgram, who died 24 years ago on December 20, reaches far beyond that initial round of experiments. Researchers have spent the decades since working on the questions he posed, and they have not settled on a brighter vision of human obedience.
A new study to be published in the January issue of American Psychologist confirms those results in an experiment that mimics many of Milgram's original conditions. This and other studies have corroborated the startling conclusion that the majority of people, when placed in certain kinds of situations, will follow orders, even if those orders entail harming another person.
"It's situations that make ordinary people into evil monsters, and it's situations that make ordinary people into heroes," said Philip Zimbardo, professor emeritus of psychology at Stanford University and author of "The Lucifer Effect: Understanding How Good People Turn Evil."
How Milgram's experiments worked
Milgram, who also came up with the theory behind "six degrees of separation" -- the idea that everyone is connected to everyone else through a small number of acquaintances -- set out to figure out why people would turn against their own neighbors in circumstances such as Nazi-occupied Europe. Referring to Nazi leader Adolf Eichmann, Milgram wrote in 1974, "Could it be that Eichmann and his million accomplices in the Holocaust were just following orders? Could we call them all accomplices?"
His experiment in its standard form included a fake shock machine, a "teacher," a "learner" and an experimenter in a laboratory setting. The participant, cast as the teacher, was told to help the learner memorize word pairs, and the punishment for a wrong answer was a shock from the machine.
The teacher sat in front of the shock machine, which had 30 levers, each corresponding to an additional 15 volts. With each mistake the learner made, the teacher had to pull the next lever to deliver a more painful punishment.
The machine did not actually deliver shocks, and a recorded voice track simulated the painful reactions, but the teacher was led to believe that he or she was shocking the learner, who screamed and asked to leave at higher voltages and eventually fell silent.
If the teacher questioned continuing as instructed, the experimenter simply said, "The experiment requires that you go on," said Thomas Blass, author of the biography "The Man Who Shocked The World: The Life and Legacy of Stanley Milgram" and the Web site StanleyMilgram.com.
About 65 percent of participants pulled levers corresponding to the maximum voltage -- 450 volts -- in spite of the screams of agony from the learner.
"What the experiment shows is that the person whose authority I consider to be legitimate, that he has a right to tell me what to do and therefore I have obligation to follow his orders, that person could make me, make most people, act contrary to their conscience," Blass said.
An update
Because of revised ethical standards for human subject research, this kind of experiment cannot be replicated exactly. But Jerry Burger, professor of psychology at Santa Clara University in Santa Clara, California, made some tweaks to see if Milgram's results hold up today.
His study's design imitated Milgram's, even using the same scripts for the experimenter and the suffering learner, but with one key difference: this experiment stopped at 150 volts, the point at which the learner first asks to be let out. In Milgram's experiment, 79 percent of participants who reached that point went all the way to the maximum shock, he said.
To eliminate bias from the fame of Milgram's experiment, Burger ruled out anyone who had taken two or more college-level psychology classes and anyone who expressed familiarity with Milgram's work during the debriefing. The "teachers" in this recent experiment, conducted in 2006, also received several reminders that they could quit whenever they wanted, unlike in Milgram's study.
The new results closely match Milgram's: 70 percent of the 40 participants were willing to continue after 150 volts, compared with 82.5 percent in Milgram's study, a difference that is not statistically significant, Burger said.
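How close are those numbers, statistically? A rough check is sketched below using a Fisher's exact test in Python. It assumes 40 participants in each group; the article gives 40 for Burger's sample, while the 40 used here for Milgram's comparison condition is an assumption made only for illustration.

    # Rough significance check: 70% (28 of 40) vs. 82.5% (33 of 40)
    # continued past 150 volts. The n = 40 for Milgram's comparison
    # group is an assumption made for this illustration.
    from scipy.stats import fisher_exact

    table = [[28, 12],   # Burger 2006: continued, stopped
             [33, 7]]    # Milgram: continued, stopped
    odds_ratio, p_value = fisher_exact(table)
    print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.2f}")
    # p comes out well above the usual 0.05 threshold, consistent with
    # Burger's statement that the difference is not significant.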
Still, some psychologists quoted in the same issue of American Psychologist questioned how comparable this study is to Milgram's, given the differences in methods.
The idea of blind obedience isn't as important in these studies as the larger message about the power of the situation, Burger said. It's also significant that the participant begins with small voltages that increase in small steps over time.
"It's that gradual incremental nature that, as we know, is a very powerful way to change attitudes and behaviors," he said.
Stanford Prison Experiment
This idea of circumstances driving immoral behavior also came out in the Stanford Prison Experiment, a study done in 1971 that is the subject of a film in preproduction, written and directed by Christopher McQuarrie. Work on the film will resume in 2009 after McQuarrie's "Valkyrie" is released, his spokesperson said.
In this study, designed by Stanford's Zimbardo, two dozen male college students were randomly designated as either prison guards or prisoners, and lived in the basement of the university's psychology building playing these roles in their respective uniforms.
Within three days, participants had extreme stress reactions, Zimbardo said. The guards became abusive to the prisoners -- sexually taunting them, asking them to strip naked and demanding that they clean toilet bowls with their bare hands, Zimbardo said. Five prisoners had to be released before the study was over.
Zimbardo's own role illustrated his point: Because he took on the role of prison administrator, he became so engrossed in the jail system that he didn't stop the experiment as soon as this cruelty began, he said.
"If I were simply the principal experimenter, I would have ended it after the second kid broke down," he said. "We all did bad things in this study, including me, but it's diagnostic of the power situation."
Turning the principle around
But while ordinary people have the potential to do evil, they also have the power to do good. That's the focus of the Everyday Heroism project, in which a group of social scientists, including Zimbardo, is seeking to understand heroic activity, an area in which almost no research has been done, he said.
Acts such as learning first aid, leading others to the exit in an emergency and encouraging family members to recycle are among the everyday heroic behaviors Zimbardo hopes to promote.
"Most heroes are everyday people who do a heroic deed once in their lifetime because they have to be in a situation of evil or danger," he said.