On the television show The Bachelor, Rachel lies to her fellow contestants about last night's date. Over on The Amazing Race, Jonathan shoves his wife after she slows them down en route to the finish line. On The Apprentice, Maria attacks Wes, then Donald Trump fires them both.
In just a few years, more than 100 reality television shows have been striving to help contestants act like jerks, and audiences love it.
Sure, contestants sometimes form noble alliances, and the occasional romance blossoms, but the behavior that viewers talk about the next day at the watercooler invariably involves contestants behaving maliciously or embarrassing themselves by cracking under pressure. Although it's clear that participants are purposely placed in coercive situations, we nonetheless think we are seeing something real and noteworthy about the character and the psychology of fellow humans.
Perhaps that fascination explains why so many experiments in the field of psychology read like the premise for a reality TV series. Consider the most famous of all social psychology experiments, Stanley Milgram's "Behavioral Study of Obedience," published in 1963. After answering a newspaper ad, volunteers (all men) arrive at a Yale University laboratory, where a man in a gray lab coat asks for help in a "learning experiment." The subject is instructed to administer a shock to a stranger in an adjoining room when the stranger answers a question incorrectly. The shocks are mild at first, but after each wrong answer the experimenter asks the subject to deliver a stronger voltage. The cries from the stranger in the other room grow more agonized as the shock is increased in 15-volt increments. (The shocks aren't real; the "stranger" is merely acting.) If the subject hesitates, the man in the lab coat says sternly, "Please continue." If the subject still balks, he is first told, "The experiment requires that you go on," then, "It is absolutely essential that you continue," and then, "You have no other choice, you must go on."
By the time the subjects deliver what they believe to be a "very strong shock," some are sweating, trembling, stuttering, or biting their lips. In the most interesting reaction, which would have made for great television, some of the subjects experience uncontrollable fits of nervous laughter. One 46-year-old encyclopedia salesman is so overcome by a seizure of laughter that the experiment has to be stopped to allow him to recover.
What drew attention to Milgram's paper was his report that most of the randomly selected men were coaxed into hitting a switch labeled "Danger: Severe Shock," administering a supposed 420-volt zap. Milgram was surprised that although "subjects have learned from childhood that it is a fundamental breach of moral conduct to hurt another person against his will," most were willing to do so.
Milgram was inspired to figure out why prison guards at World War II Nazi death camps willingly followed horrifying orders. That question still rings out today, not only on TV shows like Survivor or The Apprentice but also on the network news, as corporate executives steal millions, terrorists behead innocents, and prison-camp guards in Afghanistan, Iraq, and Cuba mistreat inmates. We are fascinated, troubled, and desperate to know how human behavior can go so wrong, fearful that we, too, might behave badly in a similar situation.
For more than a century, psychologists have attempted to get to the root of evil and error. What they have discovered is not encouraging. Milgram and earlier researchers demonstrated that the ability to act rationally can be subverted by crowds or by pressure from authority figures. Recent studies show that humans, even when left alone, are prone to bewildering mistakes and biases.
"Basically, the job of the social psychologist has been to demonstrate how people screw up," says Joachim Krueger, associate professor of psychology at Brown University. By night, he has been mesmerized by Survivor and, more recently, by the naked ambition and displays of status on The Apprentice. By day, however, he has become convinced that misconduct is only half the story. Evil and error, he argues, cannot be grasped without first understanding why humans often do the right thing. If he is correct, the first century of social psychology study may one day be likened to the early days of medicine, when doctors sought cures for diseases by practicing procedures like trepanning without any true inkling of how the body functions.
Recently, Krueger and a colleague, David Funder at the University of California at Riverside, published a paper calling for a reorientation of the field. Without a greater effort to examine how humans do things well, they argue, a "distorted view" emerges that "yields a cynical outlook on human nature." Another researcher summarized their argument this way: Krueger and Funder are asking researchers to abandon the "people are stupid school of social psychology."
Krueger is tall and soft-spoken, his voice accented by his native German. His cinder-block office is neat and unadorned. One afternoon, to explain why social psychology became so obsessed with human errors and why that obsession may itself be in error, Krueger began pulling books off his shelves, offering a trip through the history of this science.
Social psychology crystallized in the 19th century around a concern with crowd behavior: Why do otherwise reasonable individuals become irrational or even dangerous when placed in a mob of people? By the middle of the 20th century, social psychologists had widened their research to examine how people can be influenced to make incorrect judgments or cross moral boundaries. In the 1950s, Solomon Asch, a pioneer in social psychology, pitted naive test subjects against a group of strangers who made bizarre judgments about the relative lengths of lines. Pressured to conform to the group, subjects often disregarded the obvious visual evidence and adopted the prevailing judgment.
About the time of Milgram's experiment, Princeton University professor John Darley studied why bystanders, when confronted with strangers in distress, sometimes respond by walking away or closing the drapes. Inspired by the case of Kitty Genovese, a New York City murder victim whose cries for help failed to rouse her neighbors to action, Darley showed that test subjects were less likely to aid a stranger if they thought they were just one among several witnesses.
Despite evidence of sheeplike behavior, many researchers still assumed that individuals, on their own, could be counted on to be rational and moral. The sea change came in the 1970s, from insights gleaned through economics research. In a series of articles and books, psychologists Daniel Kahneman, who later won the Nobel Prize in Economics, and Amos Tversky rejected the long-held notion that humans are rational actors in a marketplace. Rather than using all the information available and calculating the best decision, they argued, the human mind relies on "quick and dirty" heuristics, mental shortcuts or rules of thumb, to make decisions.
Social psychologists, including Krueger, jumped in to investigate these rules of thumb. Because the rules aren't always rational, researchers thought they would be exposed in situations where test subjects were led to make mistakes. In effect, the psychologists started looking for errors, and for experiments that would prompt them to occur.
"Like many other graduate students, I thought this stuff was so cool," Krueger says, holding a book containing some of Kahneman and Tversky's work. "The task before us was to set up experiments that would show errors and biases, and those mistakes would tell us what was really going on with human cognition. Of course, what was really going on was always something bad, a departure from some researcher's idea of how the mind should work."
Krueger's interest was stereotyping. In the late 1980s and early 1990s, he published papers showing how people use arbitrary categories to make judgments. On hot August days, for instance, people look forward to the first day in September, as if turning a page on the calendar would suddenly make the weather cooler. Krueger found that people make two errors in this case: They underestimate temperature changes within a month (assuming, for instance, that August will be uniformly hot) and overestimate the changes in temperature that will occur when the month ends.
Since then, revelations of human misperception and bias have popped up in social psychology studies like toadstools after a rain. We humans have a variety of ways of perceiving ourselves as smarter, more skilled, and more appealing than we are in reality. Most drivers, for example, say they drive more safely than the average person, even though that is a statistical impossibility. People also tend to consider themselves more attractive than others say they are. We tend to underestimate the chance that past events will reoccur, like winning two poker hands in a row (the "hot hand" fallacy). Likewise, we incorrectly assume that because a basketball player has made the last five shots he will make the sixth. We overestimate small risks, like being killed by a terrorist, yet underestimate much larger ones, like being killed in a traffic accident.
The list goes on: the "hindsight bias," the "systematic distortion effect," the "false uniqueness effect," the "just world bias," the "clouded judgment effect," and the "external agency illusion." And just in case you think you're hip to your own biases, researchers have unveiled the "bias blind spot," in which you see biases in others but overlook them in yourself.
Taking this research at face value, one might conclude that when people are not misjudging the world around them, they are lying to themselves about their own abilities and motivations. In one famous study, people were found to be "insensitive," beset by "ignorance," "general misconceptions," and a whole range of "shortcomings and biases." Krueger remembers a popular debate among social psychologists over which metaphor best drives home the depth of the mind's failings: Should researchers view the mind as a "cognitive miser," emphasizing our limited resources and reliance on irrelevant clues, or is the mind more accurately depicted as a "totalitarian ego," pursuing self-esteem at the cost of self-deception? Is your mind a Scrooge or a Stalin?
By the mid-1990s, Krueger began to wonder about the value of finding mistakes in human reasoning. His daughter was a toddler, and like many parents, he had become fascinated with her development. "I was overawed with the day-to-day advances in her thinking," he recalls. "What I was admiring was not her rational thought but her development of intuitive, associative, and automatic reasoning. In other words, I was admiring the same kind of thinking that social psychology researchers were finding fault with when they studied adults."
Was human reasoning really so flawed? Perhaps the errors lay in the means by which psychologists sought to explain them. Human thinking, Krueger notes, is of two broad types. There are the snap judgments we make on the fly, like assessing whether a person approaching us on the street is welcoming or threatening. And there are the activities to which we apply the full force of our minds, like preparing a business presentation or solving a math problem. That laborious reasoning has long been assumed to represent the gold standard of human thinking. It is the type of reasoning that social psychologists themselves employ. Test subjects, however, are typically placed in a situation and required to guess, react, or estimate. Later, the researcher analyzes the behavior at length, through the lens of statistics or logic. Whenever there is a disparity, the test subject is assumed to be displaying the error or bias, not the researcher.
Another problem with the studies, Krueger says, is that researchers are "null-hypothesis testing." Basically, they begin with the premise that the human mind is rational and then look for any deviation. Good behavior or moments of rationality are ignored because the intent is to study bad behavior. It's not unlike reality television: Unless there is some bad behavior, the research has nothing to show.
"I began to think that by comparing human judgment to objective reality, we were missing a bigger picture," says Krueger. "We were chronicling mistakes but stopping short of asking why such behavioral or cognitive tendencies existed or what general purpose they might serve. I began to think that bias and error couldn't be the end of the story."
The mind wants to believe that the line between good and bad behavior is clear. Looking again at the Milgram shock experiment, one wants to consider the subjects who administered "shocks" under orders as cowards and those who refused as heroes. But imagine a different Milgram study. What if, when subjects showed up at the lab, they instead were confronted with smoke pouring out of the windows and a firefighter who told them, "Quick, help me carry this hose into this burning building"? What would we think of those who followed authority in that situation? What would we think of those who refused?
It is an uncomfortable fact that the soldiers who ran the Nazi death camps and the soldiers who liberated them were all acting under orders from superiors. There is a world of difference in the moral implications of what they did, but the human tendency to obey authority resulted in both evil and good. Krueger's challenging question is this: Wouldn't scientists learn as much or more about mental mechanisms like obedience if they took its advantages into account? Couldn't we learn more about the bad by studying the good, or at least by examining bad and good behavior in the same context?
A rethinking of one particular classic of error research had a dramatic influence on Krueger's thinking. In a now famous study, Lee Ross and colleagues at Stanford University asked students if they would walk around campus wearing a sandwich board that read "Eat at Joe's." The test subjects who agreed to do this embarrassing task predicted that 62 percent of others approached to carry the sign would do it. But test subjects who refused to carry the sign thought that only 33 percent of others would agree to do it when asked. Researchers concluded that they had found a new bias in reasoning, which they called the "false consensus effect," meaning that people have the naive tendency to project their individual attitudes, values, and behaviors onto the majority.
Krueger was impressed by a critique of the study. Robyn Dawes, a professor Krueger had studied under, countered that the students who predicted that their opinions would be in the majority were not making an error at all but rather were taking their own opinion as a legitimate piece of data. "By definition, most people are in the majority most of the time," explains Krueger. "Therefore, if you assume that your opinion will match that of the majority, you will be right more often than not."
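Dawes's point is statistical, and it can be checked with a few lines of simulation. The sketch below is my own illustration, not code from Dawes or Krueger; the function name, parameters, and the 70 percent opinion split are arbitrary choices for the example:

```python
import random

def projection_accuracy(p_yes=0.7, group_size=101, trials=2000, seed=0):
    """Estimate how often a person who projects their own opinion onto
    the majority turns out to be correct.

    Each trial draws a group of opinions (True with probability p_yes),
    then asks whether one arbitrary member's opinion matches the group
    majority. Because most people are, by definition, in the majority,
    the match rate should exceed 50 percent."""
    rng = random.Random(seed)
    correct = 0
    for _ in range(trials):
        opinions = [rng.random() < p_yes for _ in range(group_size)]
        majority = sum(opinions) > group_size / 2
        me = opinions[0]  # an arbitrary member "projecting" their view
        correct += (me == majority)
    return correct / trials

print(projection_accuracy())  # comfortably above 0.5
```

Whatever split you choose, projecting beats chance, which is exactly why Dawes argued that the "false consensus" subjects were not making an error at all.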
Krueger has taken this thinking a step further, to study the personal and social benefits of such behavior. In doing so, he may have cracked the "prisoner's dilemma," a classic experiment of both social psychology and economics. In the prisoner's dilemma, you are asked to imagine yourself alone in a cell, with an unseen companion isolated in a separate cell. You both are under suspicion of having committed a crime together, but the police don't have the evidence to convict you—yet. If you agree to betray your companion by testifying against him and he chooses to remain silent, you will be freed (zero years); if you both rat on each other, you receive a near-maximum sentence of three years. If you remain silent, and your companion does, too, you both receive only minimal time (one year), but if you stay quiet and your companion betrays you, you receive full punishment—five years, the sucker's outcome. Which choice, betrayal or silence, assures you the least time?
Many researchers have assumed that the logical choice is betrayal, since your potential outcomes, depending on what the other prisoner does, are zero or three years—less time on average than the consequences of staying silent (one or five years). Yet when faced with this problem, most laypeople make the illogical choice to remain silent. Why?
The answer, Krueger believes, is that they are employing social projection: They assume that the second prisoner will act the same way they will, and then they incorporate that assumption into the decision-making process. By that reasoning, the choice comes down to mutual betrayal (three years) or mutual cooperation (one year). Cooperation becomes the logical choice.
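The two lines of reasoning can be made concrete with the sentences from the article's version of the dilemma. This is a hypothetical sketch of my own, not code from Krueger's work; only the four payoffs come from the text above:

```python
# Sentence lengths in years for (my_choice, other_choice); lower is better.
# These four values are the ones given in the article.
YEARS = {
    ("betray", "silent"): 0,  # you testify, he stays quiet: you go free
    ("betray", "betray"): 3,  # you rat on each other
    ("silent", "silent"): 1,  # mutual cooperation
    ("silent", "betray"): 5,  # the sucker's outcome
}
CHOICES = ("betray", "silent")

def standard_choice():
    """Classic reasoning: average over what the other prisoner might do.
    Betrayal averages (0 + 3) / 2 = 1.5 years; silence averages 3."""
    return min(CHOICES,
               key=lambda me: sum(YEARS[(me, other)] for other in CHOICES))

def projection_choice():
    """Social projection: assume the other prisoner chooses whatever I
    choose, so only the diagonal outcomes (3 vs. 1 year) are compared."""
    return min(CHOICES, key=lambda me: YEARS[(me, me)])

print(standard_choice())    # betray
print(projection_choice())  # silent
```

The code changes nothing about the game itself; it only changes which outcomes are treated as live possibilities, which is Krueger's point about how projection turns cooperation into the logical choice.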
The mind-bending part of Krueger's theory is that participants are assuming that other people will act like them before they themselves decide how they are going to act. People don't decide on a strategy and then assume people will act similarly. Rather, they assume similarity and then act on that assumption. Krueger believes this may explain why we do many socially conscious acts, such as taking time to vote even though we know that our individual vote probably won't make a difference. The assumption that people will act like us actually influences our decision to participate.
"The result is that there are higher levels of cooperation in groups where people project their beliefs on others," says Krueger. "The collective good is a by-product of this. In this model there is no conflict between acting selfishly and acting for the public good. The latter comes from the former."
Walking off the Brown campus one evening, I ask Krueger about evil. If human reasoning has all these heretofore unknown positive aspects, how does one account for the horrors on the nightly news? Does social psychology have any hope of really understanding human misbehavior?
I'm not alone in wondering. Commenting on Krueger and Funder's paper, developmental psychologist Michael Maratsos of the University of Minnesota argues that the truly troubling revelation of Milgram's experiment was the extent of conformity and cruelty, "given how little the subjects had at stake." Throughout history, people have willingly done horrible things to avoid punishment or gain status; Maratsos cites foot binding, slavery, and recent corporate scandals as examples. Isn't it reasonable to begin studying humans, as Maratsos does, with the conclusion that people are "basically a disappointment"?
On the subject of morality, Krueger seems uncomfortable. He has talked admiringly, almost longingly, of research on vision, where issues of "good" and "bad" don't apply. No one expresses alarm when a researcher figures out a way to trick our visual perceptions. Visual misperceptions produced in the laboratory are assumed to reveal the mechanisms by which vision functions well in the real world. That isn't so with the science of human interactions.
"I'm not making the case that human behavior is wonderful and is the way it should be," Krueger says at last. "What I'm saying is that the field has been out of balance in pursuing errors and biases, and because of that we don't know as much about either the good or the bad behavior as we should. You can't understand the bad without understanding the good."
As Krueger and I walk, our attention is drawn across the street. A group of high school students is gathered at a bus stop. Suddenly there is a quick movement, some shouting, and a young man jumps up and begins running. We stand there straining to determine what is happening. Are the voices raised in distress? Is the young man running in retreat? My first thought is that I am the subject of a social psychology experiment. I glance at Krueger, then around me. Are there cameras or grad students hidden in the bushes, recording my reaction?
We walk on, but I'm slightly shaken. Should we have done something? We agree that there was nothing to do, but someone apparently thought differently: A police car soon comes by with its siren on. Over dinner, I harangue Krueger with questions. What was his impression of what we saw? How did that situation compare to the classic studies of bystander intervention? All the questions boil down to one: Did I do the right thing?
"Most things that happen to you in the world happen quickly," Krueger says.
"Fortunately, our fast and frugal reasoning tends to serve us very well in the long run. Life is an experiment without a control group. You will never know how your actions would be different had the situation been slightly different. That's why we do experiments. One thing is for certain: You can't carry all the research around and have a bird's-eye view of your own behavior in every moment. You'd break down."
Perhaps, I suggest, there is solace, even absolution, to be gained by viewing human misbehavior in a wider context. "Yes," Krueger says. "I think the next wave of research will take us to a place of greater balance and acceptance. If we come to a more realistic and accurate self-understanding, we may be better able to forgive ourselves and others."
Of course, nothing will stop us from categorizing behavior as right, wrong, good, or evil. But understanding behavior and judging it are two different tasks; the first is scientific, the second is not. When it comes to understanding, it might be more fruitful to approach ourselves with wonderment instead of disappointment.
"I watch my kids," Krueger says, "and even when they are doing something that annoys me, I'm thinking that they are acting just the way they should, as the highly evolved mammals that they are. There is a Zen master who said something like 'Humans are perfect, but they could use a little improvement.' To the Aristotelian mind that idea would be a contradiction; it would be gibberish. To me it has great appeal."
Towards a Balanced Social Psychology: Causes, Consequences and Cures for the Problem-seeking Approach to Social Behavior and Cognition. J. I. Krueger and D. C. Funder in Behavioral and Brain Sciences, Vol. 27, No. 3, pages 313–327; June 2004.