Questioning the banality of evil


Postby nomo » Thu Jan 03, 2008 2:44 pm

PDF at link: http://www.thepsychologist.org.uk/archi ... cleID=1291

Volume 21 - Part 1 - (January 2008)
Questioning the banality of evil

S. Alexander Haslam and Stephen D. Reicher re-examine the established view, in an article based on the 2007 Argyle Lecture

Pages: 16-19

Us and them
And, after all, we’re only ordinary men.
Me and you
God only knows it’s not what we would choose to do.

Pink Floyd, Us and Them



It is relatively rare that the ideas in psychology texts become so well known that they influence popular culture. However, as the above lyrics attest, one such idea maintains that if you put ordinary decent people in groups and create a division between ‘us’ and ‘them’ then they will descend mindlessly into brutality, to the extent that they might even be prepared to commit mass murder. And in a world where the brutality of groups is as apparent as ever, this idea continues to have widespread appeal.

A search for the roots of this position takes us on a well-worn trail through many of the most famous psychological studies ever conducted. It starts with the work of Solomon Asch, which has been understood to show that fellow group members can influence people to deny the evidence of their own eyes and judge lines of clearly different length to be the same. It continues through the obedience studies of Stanley Milgram (1974), in which well-adjusted men participating in a bogus memory experiment proved willing to deliver electric shocks of murderous magnitude to another person who posed as a ‘learner’. It culminates with Philip Zimbardo’s Stanford Prison Experiment. Here college students who were randomly assigned to be guards in a simulated prison adopted their roles with such brutality and vigour that the study had to be halted before it was half-way through.

But the evidence is not just psychological. At the very same time as Milgram was running his studies in the United States, the influential political theorist Hannah Arendt was in a courtroom in Jerusalem watching the trial of Adolf Eichmann — one of the chief architects of the Endlösung der Judenfrage (the ‘final solution to the Jewish question’). In many ways, then, Eichmann should be the personification of evil – akin to the monster Saturn depicted devouring his own son on the cover of this issue. Yet at his trial what struck Arendt was not how evil he appeared but, on the contrary, how utterly normal he was. He came across as a bland, passionless, simple man. This, for her, was the truly frightening thing, because it meant that Eichmann could not be dismissed as mad or as different from the rest of us. In Arendt’s (1963) words, the lesson of the trial was that of the ‘fearsome, word-and-thought-defying banality of evil’.

This view that ordinary people can do monstrous things derives strength neither from psychology alone nor from history alone, but from the convergence between the two. And that convergence extends beyond identification of the phenomenon, to the way in which it is explained. According to Arendt, Eichmann and his fellow bureaucrats became obsessed with the technical details of genocide (e.g. timetabling transport to the death camps) and, in so doing, they lost sight of the larger picture. They had no awareness that their acts were wrong. They simply followed orders mechanically, unimaginatively, unquestioningly.

When Milgram sought to make sense of what had happened in his own obedience studies, he explicitly adopted this explanation, noting that ‘Arendt’s conception of the banality of evil comes closer to the truth than one might dare to imagine’ (1974, p.23). In his own writings, though, he translated Arendt’s ideas into the concept of an ‘agentic state’ in which people suspend their capacity to make informed moral judgments and relinquish responsibility for what they do to those in authority. Regardless of what it is that they are being asked to do, once in an agentic state, the person’s sole concern becomes how well they do the bidding of these authorities.

These ideas were later taken even further by Zimbardo. He argued that the sense of obligation and duty to which Milgram referred was not dependent on the presence of strong authority figures. Instead, he suggested that people can be led to perpetrate atrocities not because they blindly follow orders, but because they conform blindly to what is expected of them as a group member. Thus, in the specific case of the Stanford study, Zimbardo and colleagues argued that ‘acts of guard aggression were emitted simply as a “natural” consequence of being in the uniform of a “guard” and asserting the power inherent in that role’ (Haney et al., 1973). The message here is that even the most thoughtful and humane individual will become a brutal zombie if put in the wrong sort of group. Accordingly, tyranny is not something we have control over or responsibility for. It is not only ‘natural’, in many situations it is inevitable.

To bring things full circle, just as Milgram based his analysis on Arendt’s historical observations, so the historian Christopher Browning (1992) later used Zimbardo’s psychological explanations to explain historical evidence related to the activities of Reserve Police Battalion 101, a Nazi killing unit that murdered around 40,000 Polish Jews at the height of WWII. Browning argues that the members of this unit were just ‘ordinary men’ (the title of his book) and that the situation in 1940s Poland, together with the role expectations placed upon Nazi battalions, was enough to make mass murderers of them – just as, in Stanford, ‘the prison system alone was a sufficient condition to produce aberrant, anti-social behaviour’ (Browning, 1992, p.168). Browning ends his book with the disturbing question: ‘If the men of Reserve Police Battalion 101 could become killers under such circumstances, what group of men cannot?’ (p.189).

The ingenuity of evil

Until recently, there has been a clear consensus amongst social psychologists, historians and philosophers that everyone succumbs to the power of the group and hence no one can resist evil once in its midst. But now, suddenly, things don’t seem quite so certain. On the historical side, a number of new studies – notably David Cesarani’s (2004) meticulous examination of Eichmann’s life and crimes – have suggested that Arendt’s analysis was, at best, naive. Not least, this was because she only attended the start of his trial. In this opening phase, Eichmann worked hard to undermine the charge that he was a dangerous fanatic by presenting himself as an inoffensive pen-pusher. Arendt then left. Had she stayed, though, she (and we) would have discovered a very different Eichmann: a man who identified strongly with anti-semitism and Nazi ideology; a man who did not simply follow orders but who pioneered creative new policies; a man who was well aware of what he was doing and was proud of his murderous ‘achievements’.

A spate of books have made similar arguments about the psychology of Nazi functionaries in general (see Haslam & Reicher, 2007a, for a review). They all suggest that very few Nazis could be seen as ‘simply following orders’ – not least because the orders issued by the Nazi hierarchy were typically very vague. As a result, individuals needed to display imagination and initiative in order to interpret the commands they were given and to act upon them. As Ian Kershaw notes, Nazis didn’t obey Hitler; they worked towards him, seeking to surpass each other in their efforts. But by the same token, they also had a large degree of discretion. Indeed, as Laurence Rees (2005) notes in his recent book on Auschwitz and the ‘final solution’, it was this that made the Nazi system so dynamic. Even in the most brutal of circumstances, people did not have to kill, and only some chose to do so. So, far from simply ‘finding themselves’ in inhumane situations or inhumane groups, the murderers actively committed themselves to such groups. They actively created inhumane situations and placed themselves at their epicentre. This was true even of concentration camp regimes:

Individuals demonstrated commitment by acting, on their own initiative, with greater brutality than their orders called for. Thus excess did not spring from mechanical obedience. On the contrary; its matrix was a group structure where it was expected that members exceed the limits of normal violence.

(Sofsky, 1993, p.228)

In short, the true horror of Eichmann and his like is not that their actions were blind. On the contrary, it is that they saw clearly what they did, and believed it to be the right thing to do.

But even if Hitler’s killers were not the mindless functionaries of fable, doesn’t the work of Milgram and Zimbardo still show that ‘ordinary men’ can become brutal by becoming mindless under the influence of leaders and groups? Not really. For if the studies of Milgram and Zimbardo are subjected to the same close critical scrutiny that has transformed Holocaust scholarship, their explanations are also found wanting. In arguing this, we are not questioning the fact that both studies are of great importance in showing that ordinary people can do extreme things. The issue, rather, is why they do them. In Milgram’s case there are three key problems with his ‘agentic state’ account. First, there is no relationship between the extent to which people cede responsibility to experimenters and the extent to which they obey them. It is not true that people obey because they have put themselves in the hands of an authority figure. Second, if one listens to the conversations that occurred between participants and experimenters, it is clear that people are not concerned simply with how well they are ‘following orders’. Rather they are very aware of the shocks they are inflicting on ‘learners’ and they wrestle with their consciences in seeking to determine whether what they are doing is morally justifiable in general terms or even noble in terms of assuring scientific progress.

Third, there is considerable variation in the level of obedience displayed across different variants of the study, variation that is hard to explain in Milgram’s own terms. For instance, the study was conducted both at prestigious Yale and in down-market Bridgeport. One might expect the relative authority of the experimenter to be greater in the less privileged area, thus leading to more of an agentic state and hence more compliance. Yet obedience was actually lower in Bridgeport than Yale. An alternative approach is to suggest that participants were less likely to identify with experimenters in Bridgeport and hence less likely to take on board their scientific priorities (relative to the welfare needs of fellow participants). This suggests that whether we listen to authorities or support victims depends upon the extent to which we perceive ourselves to share social identification with them (Turner, 1991).

There are even stronger grounds for doubting the received wisdom in Zimbardo’s case. As we have already outlined, the Stanford study is often used to argue that researchers do not have to go to the lengths that Milgram did in order to elicit brutality. Brutal roles will suffice even in the absence of brutal leaders. But the problem with this argument is that, in reality, Zimbardo went further than Milgram in encouraging his participants to abuse others (Banyard, 2007). In briefing guards at the very start of his study, he told them:

We can create in the prisoners feelings of boredom, a sense of fear to some degree, we can create a notion of arbitrariness that their life is totally controlled by us, by the system, you, me… They’ll have no freedom of action, they can do nothing, or say nothing that we don’t permit. We’re going to take away their individuality in various ways. In general what all this leads to is a sense of powerlessness.
(Zimbardo, 2007, p.55)

What is striking here is the way in which Zimbardo both defines the guards and himself as part of a common group (‘we’) and then defines the tyrannical ways in which all should behave. That is, not only is he the source of malevolent leadership (like Milgram’s experimenter), but he also actively encourages the guards to identify with his leadership.

Even so, not all the guards went along with him. Zimbardo notes how some sided with the prisoners, some were strict but fair, and only a minority became truly brutal – notably one guard dubbed ‘John Wayne’ on account of his arrogant swagger. After the study, ‘John Wayne’ explained his actions, and it is apparent that he identified so fully with Zimbardo’s leadership that he fancied himself an experimenter in his own right – using his creativity and imagination to invent new humiliations and pushing people ever further to see how far they would go before they snapped (Zimbardo, 2007).

So from Stanford, as from the obedience studies, it is not valid to conclude that people mindlessly and helplessly succumb to brutality. Rather both studies (and also the historical evidence) suggest that brutality occurs when people identify strongly with groups that have a brutal ideology. This leads them to advance that ideology knowingly, creatively and even proudly. The question we need to address then is ‘What leads people to create and maintain such social identifications?’ We suggest there are three parts to the answer.

1) Individual differences

In a simple but powerful study, Carnahan and McFarland (2007) placed two adverts in a newspaper. The first advert was an invitation for individuals to participate in a standard psychological experiment. The second followed the wording of the original advert for Zimbardo’s Stanford study — calling for people to participate ‘in a psychological study of prison life’. Those who responded to the second advert were very different from those who responded to the first. They were much more likely to believe in the harsh and hierarchical world that exists in prison. This finding suggests that, where there is a free choice, not just anyone would elect to put themselves in a ‘prison’ situation and take on a ‘prison’ role.

The simplest way of explaining such choices would be to put them down to personality, level of authoritarianism, social dominance, or some other such individual factor. However, our own prison study (conducted in collaboration with the BBC; Reicher & Haslam, 2006, and www.bps.org.uk/pris) suggests a more nuanced explanation. Here (as in Zimbardo’s study) several of those assigned to be guards refused to embrace this role. The primary issue for these individuals was how an enthusiastic embrace of the guard group membership would impact upon their other valued group memberships. Would tyrannical behaviour undermine their social identities at home, at work, at leisure? This suggests that people will be less likely to identify with groups with tyrannical norms the more that their membership of groups with different norms is salient and the more that they are made accountable to those alternative groups.


2) Contexts of crisis and group failure

It may be that there are certain people who, in any given context, are more likely to identify with tyrannical and brutal groups, but equally there are some contexts which make everyone more likely to accept such groups. Perhaps the most surprising finding from the BBC Prison Study was its demonstration of the way in which our participants, who started off holding democratic views and opposing inequality, gradually became more authoritarian as their groups failed to function effectively and the overall system fell into chaos. In such situations, the notion of a strong leader who would forcibly – even brutally – impose and maintain order became, if not actually attractive, at least less unattractive (Haslam & Reicher, 2007b). What we saw here, then, was that authoritarianism – often seen as the key personality variable that explains the dynamics of tyranny – was itself changed as a function of social dynamics.

There are strong parallels here with historical studies of the context in which the Nazis ascended to power (e.g. Hobsbawm, 1995). The Weimar Republic, which preceded Nazi rule, was riven between democrats and those who dreamed of a strong domineering leader. As the republic fell into economic and political crisis, so the middle classes deserted democracy and embraced Hitler as the man who would save them. This process is encapsulated in the words of a school teacher who, writing in 1934, explained why he had joined the Nazis:

I reached the conclusion that no party but a single man alone could save Germany. This opinion was shared by others, for when the cornerstone of a monument was laid in my hometown, the following words were inscribed on it: ‘Descendants who read these words, know ye that we eagerly await the coming of the man whose strong hand may restore order’.

(quoted in Abel, 1986, p.151)

Theoretically and practically, the dynamics through which such views emerge point to ways in which standard personality-based accounts of tyranny need to be radically rethought.

3) Leadership

Whatever is going on in the world, however great the crisis, it is still necessary for people to make sense of events, to explain how current difficulties came about and to have a vision of how they can be resolved. But we do not interpret the world on our own, as many social psychological models tend to imply. Rather, people are surrounded by would-be leaders who tell them what to make of the world around them. For this reason, the study of leadership must be a central component of any analysis of tyranny and outgroup hostility. Indeed, tyrannical leaders only thrive by convincing us that we are in crisis, that we face threat and that we need their strong decisive action to surmount it. In the BBC study, participants as a whole may have become relatively more authoritarian, but it still needed active leadership to exploit this and to make the case for a new tough regime.

The role of leaders becomes particularly pernicious when they suggest that ‘our’ problems come about because of the threats posed by a pernicious outgroup. In this way they can begin to take the groups with which we already identify and develop norms of hostility against outsiders. Their role becomes even more dangerous when they tell us that ‘we’ are the sum of all virtues so that the defence of virtue requires the destruction of the outgroup that threatens us. These are the conditions which allow groups to make genocide normative and to represent mass murder as something honourable (Reicher et al., 2006). It was the logic to which Eichmann subscribed when, after the end of the war, he said: ‘If, of the 10.3 million Jews…we had killed 10.3 million, then I would be satisfied. I would say “All right. We have exterminated an enemy”’ (quoted in Cesarani, 2004, p.219).

Changing the mantra

Until recently, psychologists and historians have agreed that ordinary people commit evil when, under the influence of leaders and groups, they become blind to the consequences of their actions. This consensus has become so strong that it is repeated, almost as a mantra, in psychology textbooks and in society at large. However, critical scrutiny of both historical and psychological evidence – along with a number of new studies, e.g. Krueger (in press); Staub (in press) – has produced a radically different picture. People do great wrong, not because they are unaware of what they are doing but because they consider it to be right. This is possible because they actively identify with groups whose ideology justifies and condones the oppression and destruction of others.

As we have suggested, this raises a whole set of new questions: Who identifies with such groups? When does identification become more likely? How do genocidal ideologies develop? What is the role of leaders in shaping group ideology? We do not pretend to have a full set of answers to these questions. But we do insist that, unless one asks the right questions, any answers will be of little use.

Our complaint against the old consensus is that, for far too long, it has asked the wrong questions and led us to seek the key to human malevolence in the wrong place. Cesarani’s study of Eichmann led him to conclude that: ‘the notion of the banality of evil, combined with Milgram’s theses on the predilection for obedience to authority, straitjacketed research for two decades’ (2004, p.15). We agree. As John Turner (2006) argues, it is time to escape our theoretical prisons.

BOX: Ordinary women?

When we think of torturers, tyrants and their lackeys, we tend to think of men. And it is certainly true that most of the psychological studies of conformity, obedience and inhumanity have only involved male participants. However, history shows that women can be every bit as inhumane as men. One need only think of Irma Grese, the most brutal of the 200 female SS guards at Auschwitz, or Ilse Koch, the head female guard at Buchenwald. As with their male counterparts, research by Claus Christensen (2006) suggests that the female SS killers did not have extraordinary backgrounds but were ‘ordinary women’ who, for a range of reasons, became highly identified with the ideology and goals of the Nazi regime.

This is not to say that gender is irrelevant to the issue of tyranny. For numerous women, the content and norms of their gender identity would be strongly at odds with any form of hierarchy or inequality. Yet for many women, gender was integral to the appeal of Hitler, a powerful patriarchal leader. So, rather than making general statements about how sex and gender might relate to tyranny, we need to examine how different definitions of gender identity may be more or less compatible with authoritarianism and hence facilitate or impede identification with authoritarian groups.

S. Alexander Haslam is Professor of Social Psychology at the University of Exeter
a.haslam@exeter.ac.uk

Stephen D. Reicher is Professor of Social Psychology and Head of the School of Psychology at the University of St Andrews
sdr@st-andrews.ac.uk

References

Abel, T. (1986). Why Hitler came to power. Cambridge, MA: Harvard University Press.
Arendt, H. (1963). Eichmann in Jerusalem: A report on the banality of evil. New York: Penguin.
Banyard, P. (2007). Tyranny and the tyrant. The Psychologist, 20(8), 494–495. See www.thepsychologist.org.uk.
Browning, C. (1992). Ordinary men. London: Penguin.
Carnahan, T. & McFarland, S. (2007). Revisiting the Stanford Prison Experiment. Personality and Social Psychology Bulletin, 33, 603–614.
Cesarani, D. (2004). Eichmann: His life and crimes. London: Heinemann.
Christensen, C.B. (2006). The women from Lublin: The guards of Majdanek. Historisk Tidsskrift, 106, 583–585.
Haney, C., Banks, C. & Zimbardo, P. (1973). Interpersonal dynamics in a simulated prison. International Journal of Criminology and Penology, 1, 69–97.
Haslam, S.A. & Reicher, S.D. (2007a). Beyond the banality of evil. Personality and Social Psychology Bulletin, 33, 615–622.
Haslam, S.A. & Reicher, S.D. (2007b). Identity entrepreneurship and the consequences of identity failure. Social Psychology Quarterly, 70, 125–147.
Hobsbawm, E. (1995). Age of extremes: The short twentieth century 1914–1991. London: Abacus.
Krueger, J.I. (in press). Lucifer’s last laugh. American Journal of Psychology.
Milgram, S. (1974). Obedience to authority. New York: Harper & Row.
Rees, L. (2005). Auschwitz: The Nazis and the ‘Final Solution’. London: BBC Books.
Reicher, S.D. & Haslam, S.A. (2006). Rethinking the psychology of tyranny: The BBC Prison Study. British Journal of Social Psychology, 45, 1–40.
Reicher, S., Hopkins, N., Levine, M. & Rath, R. (2006). Entrepreneurs of hate and entrepreneurs of solidarity. International Review of the Red Cross, 87, 621–637.
Sofsky, W. (1993). The order of terror. Princeton, NJ: Princeton University Press.
Staub, E. (in press). Evil. PsyCritiques.
Turner, J.C. (1991). Social influence. Milton Keynes: Open University Press.
Turner, J.C. (2006). Tyranny, freedom and social structure. British Journal of Social Psychology, 45, 41–46.
Zimbardo, P. (2007). The Lucifer effect. New York: Random House.

Re: Questioning the banality of evil

Postby elfismiles » Tue May 21, 2013 8:57 am

Just saw this last night...

WARNING: Abu Ghraib pics.


https://www.youtube.com/watch?v=OsFEV35tWsg

Philip Zimbardo: The psychology of evil
TEDtalksDirector·1,401 videos
Uploaded on Sep 23, 2008

http://www.ted.com Philip Zimbardo knows how easy it is for nice people to turn bad. In this talk, he shares insights and graphic unseen photos from the Abu Ghraib trials. Then he talks about the flip side: how easy it is to be a hero, and how we can rise to the challenge.


RigInt forum search:
search.php?keywords=zimbardo+lucifer++&terms=all&author=&sc=1&sf=all&sr=topics

Re: Questioning the banality of evil

Postby Wombaticus Rex » Tue May 21, 2013 9:49 am

Fantastic, fantastic OP. What a read. A lot to unpack here.

Weird side note: Zimbardo sets off my shit-weasel radar, I find him very hard to trust or take seriously. Is that just me?

Re: Questioning the banality of evil

Postby brekin » Tue May 21, 2013 9:34 pm

Wombaticus Rex wrote:

Weird side note: Zimbardo sets off my shit-weasel radar, I find him very hard to trust or take seriously. Is that just me?


Not just you. I've also wondered why more hasn't been made of who funded the prison experiment:

Q: Who funded the experiment?
A: The study was funded by a government grant from the U.S. Office of Naval Research to study antisocial behavior.
http://www.prisonexp.org/faq.htm


Also in an old thread we talked about Dr. Col. (ret.) Larry James and how Zimbardo wrote a preface to his book, Fixing Hell: An Army Psychologist Confronts Abu Ghraib. James has come under criticism for whitewashing his involvement in the environment that created Abu Ghraib & Guantánamo and then portraying himself as a savior. Read about it here (especially the counterpunch article):
viewtopic.php?f=8&t=24990&p=294266&hilit=zombardo#p294266

Strange company to be keeping. Also I wonder about Zimbardo's continuing ties to military intelligence. And really, is the experiment he conducted anything that the military didn't know? You take complete strangers, you create an unequal power structure in a closed punitive system, and abuses happen. What military doesn't have first-hand experience with this, and for eons? In fact the military is usually so busy covering up instances of this that they would have enough material to make the Stanford experiment look like the strange little blip it was. Just one case in point: during WWII, while gathering material for his Tales of the South Pacific, James Michener was sent by the U.S. Navy to investigate a marine unit on an island that had gone completely gay spartan under the control of a devious sergeant who undermined the ranking officer and slowly created his own power cult. This was totally squashed by the navy and we only know about it from Michener's memoir: http://www.amazon.com/World-My-Home-Mem ... 0812978137

I think the (ok, and I'm going to throw a conspiracy mary jane here) Stanford experiment was done to show the whiny Vietnam-era college-age kids that they were as capable of evil as any of the oppressors they were protesting against. Once that was established, every college-age and high-school kid would now and forever, in every Psych 101 class, watch this propagandist video that basically tells them that, given the right environment, all of them would be sadistic torturers. When ironically, after surviving junior high, many (more or less) of them have already proven not to be complete sociopaths.

Also of note is that one of the main aggressors (if not the main one) in the Stanford experiment admitted on camera that his impression was that Zimbardo wanted to see something about abuse of power, and so that was what he decided to give him. To me that seems less like an experimental condition and more like an invitation to performance art. Well anyways, yeah, long story short, Zimbardo, I'd watch my wallet around him.
If I knew all mysteries and all knowledge, and have not charity, I am nothing. St. Paul
I hang onto my prejudices, they are the testicles of my mind. Eric Hoffer

Re: Questioning the banality of evil

Postby Wombaticus Rex » Wed May 22, 2013 8:23 am

That Counterpunch piece was a revelation. I know way too many guys like that, and it's remarkable how little difference there is between the egomania of an internet marketing guru, a GOP party "visionary," and a celebrity intellectual college professor....oh, the jobs I've had so far...but I recognize that craven confabulating self-promotion. I've worked for Zimbardos before.

I also find your reading of the Stanford experiment to be pretty fascinating. I'm in no position to evaluate the veracity, of course, but it's insightful -- after all, most of the older folks who introduced me to this material did so in the precise context you're discussing. Their takeaway was a little different, though: they took it as evidence-level proof they had to get involved, politically and economically, in their communities and shape that learning environment before the Fascists could ruin Vermont.

Fun times.

Random (but not at all) Question: Has Martin Orne done similar work in this field? Off to use some fancy search tools...

Re: Questioning the banality of evil

Postby bks » Wed May 22, 2013 10:16 am

Rex, do you have a copy of Memory, Trauma Treatment and the Law? You might want to get one if you don't. I knew you would happen onto Orne given your statements about your current research milieu. When I read the Ritzer diary excerpt, I thought of Orne's work immediately.

I have notes about him from a project I undertook a couple of years ago, which produced email exchanges with Alan Scheflin and others about Orne. Orne was top staff at a psych hospital here in Philly at which my sister was a patient in the 1980s, and he also ran this somewhat mysterious Philly-based psych organization (mysterious to me at least) that I wanted to probe. Got started, lost a bit of steam when it looked like my thesis may have been misguided, but other things impinged and it's kind of still hanging there waiting to be finished.

My basic thesis was that Orne, given his CIA history, had an interest in seeing the FMSF's memory-confounding project succeed, since it meant that the memories of the folks he had fucked with would be less believable in a court setting. I wondered: did the Orlikow and Sims cases spook him at all? Enough to align himself with the FMSF? Did he have realistic cause for concern that he might be a target for prosecution, and thus an interest in seeing recovered memories of abuse discredited? Scheflin (who battled with Orne) thought it was unlikely that this was his motivation, and offered a different assessment than mine (which I'm inclined to accept).

PM me if interested in more.

Re: Questioning the banality of evil

Postby Elvis » Thu Jan 29, 2015 3:01 pm

I studied Milgram's "Obedience to Authority" in college (still have it); naturally it made a big impression on me, and I reread it years later. So I perked up when I saw this new Atlantic piece about it:

http://www.theatlantic.com/health/archi ... ts/384913/

Rethinking One of Psychology's Most Infamous Experiments

In the 1960s, Stanley Milgram's electric-shock studies showed that people will obey even the most abhorrent of orders. But recently, researchers have begun to question his conclusions—and offer some of their own.

Cari Romm Jan 28 2015, 12:23 PM ET

In 1961, Yale University psychology professor Stanley Milgram placed an advertisement in the New Haven Register. “We will pay you $4 for one hour of your time,” it read, asking for “500 New Haven men to help us complete a scientific study of memory and learning.”

Only part of that was true. Over the next two years, hundreds of people showed up at Milgram’s lab for a learning and memory study that quickly turned into something else entirely. Under the watch of the experimenter, the volunteer—dubbed “the teacher”—would read out strings of words to his partner, “the learner,” who was hooked up to an electric-shock machine in the other room. Each time the learner made a mistake in repeating the words, the teacher was to deliver a shock of increasing intensity, starting at 15 volts (labeled “slight shock” on the machine) and going all the way up to 450 volts (“Danger: severe shock”). Some people, horrified at what they were being asked to do, stopped the experiment early, defying their supervisor’s urging to go on; others continued up to 450 volts, even as the learner pled for mercy, yelled a warning about his heart condition—and then fell alarmingly silent. In the most well-known variation of the experiment, a full 65 percent of people went all the way.

Until they emerged from the lab, the participants didn’t know that the shocks weren’t real, that the cries of pain were pre-recorded, and that the learner—railroad auditor Jim McDonough—was in on the whole thing, sitting alive and unharmed in the next room. They were also unaware that they had just been used to prove the claim that would soon make Milgram famous: that ordinary people, under the direction of an authority figure, would obey just about any order they were given, even to torture. It’s a phenomenon that’s been used to explain atrocities from the Holocaust to the Vietnam War’s My Lai massacre to the abuse of prisoners at Abu Ghraib. “To a remarkable degree,” Peter Baker wrote in Pacific Standard in 2013, “Milgram’s early research has come to serve as a kind of all-purpose lightning rod for discussions about the human heart of darkness.”

In some ways, though, Milgram’s study is also—as promised—a study of memory, if not the one he pretended it was.

More than five decades after it was first published in the Journal of Abnormal and Social Psychology in 1963, it’s earned a place as one of the most famous experiments of the 20th century. Milgram’s research has spawned countless spinoff studies among psychologists, sociologists, and historians, even as it’s leapt from academia into the realm of pop culture. It’s inspired songs by Peter Gabriel (lyrics: “We do what we’re told/We do what we’re told/Told to do”) and Dar Williams (“When I knew it was wrong, I played it just like a game/I pressed the buzzer”); a number of books whose titles make puns out of the word “shocking”; a controversial French documentary disguised as a game show; episodes of Law and Order and Bones; a made-for-TV movie with William Shatner; a jewelry collection (bizarrely) from the company Enfants Perdus; and most recently, the biopic The Experimenter, starring Peter Sarsgaard as the title character—and this list is by no means exhaustive.

But as with human memory, the study—even published, archived, enshrined in psychology textbooks—is malleable. And in the past few years, a new wave of researchers have dedicated themselves to reshaping it, arguing that Milgram’s lessons on human obedience are, in fact, misremembered—that his work doesn’t prove what he claimed it does.

The problem is, no one can really agree on what it proves instead.

* * *

To mark the 50th anniversary of the experiments’ publication (or, technically, the 51st), the Journal of Social Issues released a themed edition in September 2014 dedicated to all things Milgram. “There is a compelling and timely case for reexamining Milgram’s legacy,” the editors wrote in the introduction, noting that they were in good company: In 1964, the year after the experiments were published, fewer than 10 published studies referenced Milgram’s work; in 2012, that number was more than 60.

It’s a trend that surely would have pleased Milgram, who crafted his work with an audience in mind from the beginning. “Milgram was a fantastic dramaturg. His studies are fantastic little pieces of theater. They’re beautifully scripted,” said Stephen Reicher, a professor of psychology at the University of St. Andrews and a co-editor of the Journal of Social Issues’ special edition. Capitalizing on the fame his 1963 publication earned him, Milgram went on to publish a book on his experiments in 1974 and a documentary, Obedience, with footage from the original experiments.

But for a man determined to leave a lasting legacy, Milgram also made it remarkably easy for people to pick it apart. The Yale University archives contain boxes upon boxes of papers, videos, and audio recordings, an entire career carefully documented for posterity. Though Milgram’s widow Alexandra donated the materials after his death in 1984, they remained largely untouched for years, until Yale’s library staff began to digitize all the materials in the early 2000s. Able to easily access troves of material for the first time, the researchers came flocking.

“There’s a lot of dirty laundry in those archives,” said Arthur Miller, a professor emeritus of psychology at Miami University and another co-editor of the Journal of Social Issues. “Critics of Milgram seem to want to—and do—find material in these archives that makes Milgram look bad or unethical or, in some cases, a liar.”

One of the most vocal of those critics is Australian author and psychologist Gina Perry, who documented her experience tracking down Milgram’s research participants in her 2013 book Behind the Shock Machine: The Untold Story of the Notorious Milgram Psychology Experiments. Her project began as an effort to write about the experiments from the perspective of the participants—but when she went back through the archives to confirm some of their stories, she said, she found some glaring issues with Milgram’s data. Among her accusations: that the supervisors went off script in their prods to the teachers, that some of the volunteers were aware that the setup was a hoax, and that others weren’t debriefed on the whole thing until months later. “My main issue is that methodologically, there have been so many problems with Milgram’s research that we have to start re-examining the textbook descriptions of the research,” she said.

But many psychologists argue that even with methodological holes and moral lapses, the basic finding of Milgram’s work, the rate of obedience, still holds up. Because of the ethical challenge of reproducing the study, the idea survived for decades on a mix of good faith and partial replications—one study had participants administer their shocks in a virtual-reality system, for example—until 2007, when ABC collaborated with Santa Clara University psychologist Jerry Burger to replicate Milgram’s experiment for an episode of the TV show Basic Instincts titled “The Science of Evil,” pegged to Abu Ghraib.

Burger’s way around an ethical breach: In the most well-known experiment, he found, 80 percent of the participants who reached a 150-volt shock continued all the way to the end. “So what I said we could do is take people up to the 150-volt point, see how they reacted, and end the study right there,” he said. The rest of the setup was nearly identical to Milgram’s lab of the early 1960s (with one notable exception: “Milgram had a gray lab coat and I couldn’t find a gray, so I got a light blue.”)

At the end of the experiment, Burger was left with an obedience rate around the same as the one Milgram had recorded—proving, he said, not only that Milgram's numbers had been accurate, but that his work was as relevant as ever. "[The results] didn't surprise me," he said, "but for years I had heard from my students and from other people, 'Well, that was back in the 60s, and somehow we're more aware of the problems of blind obedience, and people have changed.'"

In recent years, though, much of the attention has focused less on supporting or discrediting Milgram's statistics, and more on rethinking his conclusions. With a paper published earlier this month in the British Journal of Social Psychology, Matthew Hollander, a sociology Ph.D. candidate at the University of Wisconsin, is among the most recent to question Milgram's notion of obedience. After analyzing the conversation patterns from audio recordings of 117 study participants, Hollander found that Milgram's original classification of his subjects—either obedient or disobedient—failed to capture the true dynamics of the situation. Rather, he argued, people in both categories tried several different forms of protest—those who successfully ended the experiment early were simply better at resisting than those who continued shocking.

“Research subjects may say things like ‘I can’t do this anymore’ or ‘I’m not going to do this anymore,’” he said, even those who went all the way to 450 volts. “I understand those practices to be a way of trying to stop the experiment in a relatively aggressive, direct, and explicit way.”

It’s a far cry from Milgram’s idea that the capacity for evil lies dormant in everyone, ready to be awakened with the right set of circumstances. The ability to disobey toxic orders, Hollander said, is a skill that can be taught like any other—all a person needs to learn is what to say and how to say it.

* * *

In some ways, the conclusions Milgram drew were as much a product of their time as they were a product of his research. At the time he began his studies, the trial of Adolf Eichmann, one of the major architects of the Holocaust, was already in full swing. In 1963, the same year that Milgram published his studies, writer Hannah Arendt coined the phrase “the banality of evil” to describe Eichmann in her book on the trial, Eichmann in Jerusalem.

Milgram, who was born in New York City in 1933 to Jewish immigrant parents, came to view his studies as a validation of Arendt’s idea—but the Holocaust had been at the forefront of his mind for years before either of them published their work. “I should have been born into the German-speaking Jewish community of Prague in 1922 and died in a gas chamber some 20 years later,” he wrote in a letter to a friend in 1958. “How I came to be born in the Bronx Hospital, I’ll never quite understand.”

And in the introduction of his 1963 paper, he invoked the Nazis within the first few paragraphs: “Obedience, as a determinant of behavior, is of particular relevance to our time,” he wrote. “Gas chambers were built, death camps were guarded; daily quotas of corpses were produced … These inhumane policies may have originated in the mind of a single person, but they could only be carried out on a massive scale if a very large number of persons obeyed orders.”

Though the term didn’t exist at the time, Milgram was a proponent of what today’s social psychologists call situationism: the idea that people’s behavior is determined largely by what’s happening around them. “They’re not psychopaths, and they’re not hostile, and they’re not aggressive or deranged. They’re just people, like you and me,” Miller said. “If you put us in certain situations, we’re more likely to be racist or sexist, or we may lie, or we may cheat. There are studies that show this, thousands and thousands of studies that document the many unsavory aspects of most people.”

But carried to its logical extreme, situationism "has an exonerating effect," he said. "In the minds of a lot of people, it tends to excuse the bad behavior … it's not the person's fault for doing the bad thing, it's the situation they were put in." Milgram's studies were famous because their implications were also devastating: If the Nazis were just following orders, then he had proved that anyone at all could be a Nazi. If the guards at Abu Ghraib were just following orders, then anyone was capable of torture.

The latter, Reicher said, is part of why interest in Milgram’s work has seen a resurgence in recent years. “If you look at acts of human atrocity, they’ve hardly diminished over time,” he said, and news of the abuse at Abu Ghraib was surfacing around the same time that Yale’s archival material was digitized, a perfect storm of encouragement for scholars to turn their attention once again to the question of what causes evil.

He and his colleague Alex Haslam, the third co-editor of The Journal of Social Issues' Milgram edition and a professor of psychology at the University of Queensland, have come up with a different answer. "The notion that we somehow automatically obey authority, that we are somehow programmed, doesn't account for the variability [in rates of obedience] across conditions," he said; in some iterations of Milgram's study, the rate of compliance was close to 100 percent, while in others it was closer to zero. "We need an account that can explain the variability—when we obey, when we don't."

“We argue that the answer to that question is a matter of identification,” he continued. “Do they identify more with the cause of science, and listen to the experimenter as a legitimate representative of science, or do they identify more with the learner as an ordinary person? … You’re torn between these different voices. Who do you listen to?”

The question, he conceded, applies as much to the study of Milgram today as it does to what went on in his lab. “Trying to get a consensus among academics is like herding cats,” Reicher said, but “if there is a consensus, it’s that we need a new explanation. I think nearly everybody accepts the fact that Milgram discovered a remarkable phenomenon, but he didn’t provide a very compelling explanation of that phenomenon.”

What he provided instead was a difficult and deeply uncomfortable set of questions—and his research, flawed as it is, endures not because it clarifies the causes of human atrocities, but because it confuses more than it answers.

Or, as Miller put it: “The whole thing exists in terms of its controversy, how it’s excited some and infuriated others. People have tried to knock it down, and it always comes up standing.”


Cari Romm is an editorial fellow with The Atlantic.
“The purpose of studying economics is not to acquire a set of ready-made answers to economic questions, but to learn how to avoid being deceived by economists.” ― Joan Robinson
User avatar
Elvis
 
Posts: 7413
Joined: Fri Apr 11, 2008 7:24 pm
Blog: View Blog (0)

