An even more disquieting study asked a group of subjects to play a game known as the “prisoner’s dilemma,” which requires pairs of people to choose either to cooperate with each other or to “defect” (act uncooperatively). You get the biggest payoff (say, $10) if you defect while your partner cooperates, an intermediate reward (say, $3) if you both cooperate, and no reward if you both defect. The general procedure is a staple of psychology research; the catch in this particular study was that before people began to play the game, they sat in a waiting room where an ostensibly unrelated news broadcast was playing in the background. Some subjects heard prosocial news (about a clergyman donating a kidney to a needy patient); others heard a broadcast about a clergyman committing murder. What happened? You guessed it: people who heard about the good clergyman were a lot more cooperative than those who heard about the bad clergyman.
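To keep the payoffs straight, it may help to lay the structure out as a matrix (your payoff is listed first in each cell; the study as described doesn’t say what a lone cooperator receives, so zero is assumed here purely for illustration):

                     Partner cooperates     Partner defects
    You cooperate         $3, $3                $0, $10
    You defect           $10, $0                $0, $0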
In all these studies, emotions of one sort or another prime memories, and those memories in turn shape choice. A different sort of illustration comes from what economist George Loewenstein calls “the attraction of the visceral.” It’s one thing to turn down chocolate cheesecake in the abstract, another when the waiter brings in the dessert cart. College students who are asked whether they’d risk wasting 30 minutes in exchange for a chance to win all the freshly baked chocolate chip cookies they could eat are more likely to say yes if they actually see (and smell) the cookies than if they are merely told about them.
Hunger, however, is nothing compared to lust. A follow-up study exposed young men to either a written or a (more visceral) filmed scenario depicting a couple who met earlier in the evening and are now discussing the possibility of (imminently) having sex. Both are in favor, but neither has a condom, and there is no store nearby. The woman reports that she is taking a contraceptive pill and is disease-free; she leaves it up to the man to decide whether to proceed unprotected. Subjects were then asked to rate their own probability of having unprotected sex if they were in the male character’s shoes. Guess which group of men — readers or video watchers — was more likely to throw caution to the wind? (Undergraduate men are also apparently able to persuade themselves that their risk of contracting a sexually transmitted disease goes down precisely as the attractiveness of their potential partner goes up.) The notion that men might think with organs below the brain is not new, but the experimental evidence highlights rather vividly the degree to which our choices don’t necessarily follow from purely “rational” considerations. Hunger, lust, happiness, and sadness are all factors that most of us would say shouldn’t enter into rational thought. Yet evolution’s progressive overlay of technology has guaranteed that each wields an influence, even when we insist otherwise.
The clumsiness of our decision-making ability becomes especially clear when we consider moral choices. Suppose, for example, that a runaway trolley is about to run over and kill five people. You (and you alone) can hit a switch to divert the trolley onto a different set of tracks, where it would kill only one person instead of five. Do you hit the switch?
Now, suppose instead that you are on a footbridge, standing above the track that bears the runaway trolley. This time, saving the five people would require you to push a rather large person (considerably bigger than you, so don’t bother to volunteer yourself) off the footbridge and into the path of the oncoming trolley. Should you toss him over, the large person would die, but the other five would survive. Would that be okay? Although most people answer yes to the scenario involving the switch, most say no to pushing someone off the footbridge — even though in both cases five lives are saved at the cost of one.
Why the difference? Nobody knows for sure, but part of the answer seems to be that there is something more visceral about the second scenario; it’s one thing to flip a switch, which is inanimate and somewhat removed from the actual collision, and another to forcibly send someone to his death.
One historical example of how visceral feelings affect moral choice is the unofficial truce called by British and German soldiers during Christmas 1914, early in World War I. The original intention was to resume battle afterward, but the soldiers got to know one another during the truce; some even shared a Christmas meal. In so doing, they shifted from conceptualizing one another as enemies to seeing each other as flesh-and-blood individuals. The consequence was that after the Christmas truce, many soldiers found themselves unable to fire on the men they had come to know. As the former president Jimmy Carter put it in his Nobel Peace Prize lecture (2002), “In order for us human beings to commit ourselves personally to the inhumanity of war, we find it necessary first to dehumanize our opponents.”
Both the trolley problem and the Christmas truce remind us that though our moral choices may seem to be the product of a single process of deliberative reasoning, our gut often plays a huge role in the end, whether we are choosing something mundane, like a new car, or making decisions with lives at stake.
The trolley scenarios illustrate the split by showing how we can get two different answers to essentially the same question, depending on which system we tap into. The psychologist Jonathan Haidt has tried to go a step further, arguing that we can have strong moral intuitions even when we can’t back them up with explicit reasons. Consider, for example, the following scenario:
Julie and Mark are brother and sister. They are traveling together in France on summer vacation from college. One night they are staying alone in a cabin near the beach. They decide that it would be interesting and fun if they tried making love. At the very least it would be a new experience for each of them. Julie was already taking birth control pills, but Mark uses a condom too, just to be safe. They both enjoy making love, but they decide not to do it again. They keep that night as a special secret, which makes them feel even closer to each other. What do you think about that? Was it okay for them to make love?
Every time I read this passage, I get the creeps. But why exactly is it wrong? As Haidt describes it, most people who hear the above story immediately say that it was wrong for the siblings to make love, and they then begin searching for reasons. They point out the dangers of inbreeding, only to remember that Julie and Mark used two forms of birth control. They argue that Julie and Mark will be hurt, perhaps emotionally, even though the story makes it clear that no harm befell them. Eventually, many people say something like “I don’t know, I can’t explain it, I just know it’s wrong.”
Haidt calls this phenomenon — where we feel certain that something is wrong but are at a complete loss to explain why — “moral dumbfounding.” I call it an illustration of how the emotional and the judicious can easily decouple. What makes moral dumbfounding possible is the split between our ancestral system — which looks at an overall picture without being analytical about the details — and a judicious system, which can parse things piece by piece. As is so often the case, where there is conflict, the ancestral system wins: even though we know we can’t give a good reason, our emotional queasiness lingers.
When you look inside the skull, using neuroimaging, you find further evidence that our moral judgments derive from two distinct sources: people’s choices on moral dilemmas correlate with how they use their brains. In experimental trials like those mentioned earlier, the subjects who chose to save five lives at the expense of one tended to rely on regions of the brain known as the dorsolateral prefrontal cortex and the posterior parietal cortex, which are known to be important for deliberative reasoning. On the other hand, people who decided to spare the single individual, at the cost of the five, tended to rely more on regions of the limbic cortex, which are more closely tied to emotion.[26]
[26] Fans of the history of neuroscience will recognize this as the brain region skewered in the skull of one Phineas Gage, injured by an iron tamping rod on September 13, 1848.