To consider something well, of course, is to evaluate both sides of an argument, but unless we go the extra mile of deliberately forcing ourselves to consider alternatives — not something that comes naturally — we are more prone to recall evidence consistent with an accepted proposition than evidence inconsistent with it. And since we most clearly remember information that seems consistent with our beliefs, it becomes very hard to let those beliefs go, even when they are erroneous.
The same, of course, goes for scientists. The aim of science is to take a balanced approach to evidence, but scientists are human beings, and human beings can’t help but notice evidence that confirms their own theories. Read any science texts from the past and you will stumble on not only geniuses, but also people who in hindsight seem like crackpots — flat-earthers, alchemists, and so forth. History is not kind to scientists who believed in such fictions, but a realist might recognize that in a species so dependent on memory driven by context, such slip-ups are always a risk.
In 1913 Eleanor Porter wrote one of the more influential children’s novels of the twentieth century, Pollyanna, a story of a girl who looked on the bright side of every situation. Over time, the name Pollyanna has become a commonly used term with two different connotations. It’s used in a positive way to describe eternal optimists and in a negative way to describe people whose optimism exceeds the rational bounds of reality. Pollyanna may have been a fictional character, but there’s a little bit of her in all of us, a tendency to perceive the world in positive ways that may or may not match reality. Generals and presidents fight on in wars that can’t be won, and scientists retain beliefs in pet theories long after the weight of evidence is stacked against them.
Consider the following study, conducted by the late Ziva Kunda. A group of subjects comes into the lab. They are told they’ll be playing a trivia game; before they play, they get to watch someone else, who, they are told, will play either on their team (half the subjects hear this) or on the opposite team (that’s what the other half are told). Unbeknownst to the subjects, the game is rigged; the person they’re watching proceeds to play a perfect game, getting every question right. The researchers want to know whether each subject is impressed by this. The result is straight out of Pollyanna: people who expect to play with the perfect-game-playing confederate are impressed; the guy must be great, they think. People who expect to play against the confederate are dismissive; they attribute his good performance to luck rather than skill. Same data, different interpretation: both groups of subjects observe someone play a perfect game, but what they make of that observation depends on the role they expect the observed man to play in their own life.
In a similar study, a bunch of college students viewed videos of three people having a conversation; they were asked to judge how likable each of the three was. The subjects were also told (prior to watching the video) that they would be going out on a date with one of those three people (selected at random for each subject). Inevitably, subjects tended to give their highest rating to the person they were told they would be dating — another illustration of how easily our beliefs (in this case, about someone’s likability) can be contaminated by what we wish to believe. In the words of a musical I loved as a child, Harry Nilsson’s The Point!, “You see what you want to see, and you hear what you want to hear. Dig?”
Our tendency to accept what we wish to believe (what we are motivated to believe) with much less scrutiny than what we don’t want to believe is a bias known as “motivated reasoning,” a kind of flip side to confirmation bias. Whereas confirmation bias is an automatic tendency to notice data that fit with our beliefs, motivated reasoning is the complementary tendency to scrutinize ideas more carefully if we don’t like them than if we do. Take, for example, a study in which Kunda asked subjects, half men, half women, to read an article claiming that caffeine was risky for women. In line with the notion that our beliefs — and reasoning — are contaminated by motivation, women who were heavy caffeine drinkers were more likely to doubt the conclusion than were women who were light caffeine drinkers; meanwhile, men, who thought they had nothing at stake, exhibited no such effect.
The same thing happens all the time in the real world. Indeed, one of the first scientific illustrations of motivated reasoning was not a laboratory experiment but a clever bit of real-world fieldwork conducted in 1964, just after the publication of the first Surgeon General’s report on smoking and lung cancer. The Surgeon General’s conclusion — that smoking appears to cause lung cancer — would hardly seem like news today, but at the time it was a huge deal, covered widely by the media. Two enterprising scientists went out and interviewed people, asking them to evaluate the Surgeon General’s conclusion. Sure enough, smokers were less persuaded by the report than were nonsmokers, who pretty much accepted what the Surgeon General had to say. Smokers, meanwhile, came up with all kinds of dubious counterarguments: “many smokers live a long time” (which ignored the statistical evidence that was presented), “lots of things are hazardous” (a red herring), “smoking is better than excessive eating or drinking” (again irrelevant), or “smoking is better than being a nervous wreck” (an assertion that was typically not supported by any evidence).
The reality is that we are just not born to reason in balanced ways; even sophisticated undergraduates at elite universities tend to fall prey to this weakness. One famous study, for example, asked students at Stanford University to evaluate a set of studies on the effectiveness of capital punishment. Some of the students had prior beliefs in favor of capital punishment, some against. Students readily found holes in studies that challenged what they believed but often missed equally serious problems with studies that led to conclusions that they were predisposed to agree with.
Put the contamination of belief, confirmation bias, and motivated reasoning together, and you wind up with a species prepared to believe, well, just about anything. Historically, our species has believed in a flat earth (despite evidence to the contrary), ghosts, witches, astrology, animal spirits, and the benefits of self-flagellation and bloodletting. Most of those particular beliefs are, mercifully, gone today, but some people still pay hard-earned money for psychic readings and séances, and even I sometimes hesitate before walking under a ladder. Or, to take a political example, some 18 months after the 2003 invasion of Iraq, 58 percent of people who voted for George W. Bush still believed there were weapons of mass destruction in Iraq, despite the evidence to the contrary.
And then there is President George W. Bush himself, who reportedly believes that he has a personal and direct line of communication with an omniscient being. Which, as far as his getting elected was concerned, was a good thing; according to a February 2007 Pew Research Center survey, 63 percent of Americans would be reluctant to vote for anyone who doesn’t believe in God.
To critics like Sam Harris (author of the book The End of Faith), that sort of thing seems downright absurd:
To see how much our culture currently partakes of… irrationality… just substitute the name of your favorite Olympian for “God” wherever this word appears in public discourse. Imagine President Bush addressing the National Prayer Breakfast in these terms: “Behind all of life and all history there is a dedication and a purpose, set by the hand of a just and faithful Zeus.” Imagine his speech to Congress (September 20, 2001) containing the sentence “Freedom and fear, justice and cruelty have always been at war and we know that Apollo is not neutral between them.”