Similarly, there is no logical relation between having a capacity to make inferences and having a memory that is prone to errors. In principle, it is entirely possible to have both perfect records of past events and a capacity to make inferences about the future. That’s exactly how computer-based weather-forecasting systems work, for example; they extrapolate the future from a reliable set of data about the past. Degrading the quality of their memory wouldn’t improve their predictions, but rather it would undermine them. And there’s no evidence that people with an especially distortion-prone memory are happier than the rest of us, no evidence that they make better inferences or have an edge at predicting the future. If anything, the data suggest the opposite, since having an above-average memory is well correlated with general intelligence.
None of which is to say that there aren’t compensations. We can, for example, have a great deal of fun with what Freud called “free associations”; it’s entertaining to follow the chains of our memories, and we can put that to use in literature and poetry. If connecting trains of thought with chains of memories tickles your fancy, by all means, enjoy! But would we really and truly be better off if our memory were less reliable and more prone to distortion? It’s one thing to make lemonade out of lemons, another to proclaim that lemons are what you’d hope for in the first place.
In the final analysis, the fact that our ability to make inferences is built on rapid but unreliable contextual memory isn’t some optimal tradeoff. It’s just a fact of history: the brain circuits that allow us to make inferences make do with distortion-prone memory because that’s all evolution had to work with. To build a truly reliable memory, fit for the requirements of human deliberate reasoning, evolution would have had to start over. And, despite its power and elegance, that’s the one thing evolution just can’t do.
3. BELIEF
Alice laughed: “There’s no use trying,” she said; “one can’t believe impossible things.”
“I daresay you haven’t had much practice,” said the Queen. “When I was younger, I always did it for half an hour a day. Why, sometimes I’ve believed as many as six impossible things before breakfast.”
“You HAVE A NEED for other people to like and admire you, and yet you tend to be critical of yourself. While you have some personality weaknesses, you are generally able to compensate for them. You have considerable unused capacity that you have not turned to your advantage. Disciplined and self-controlled on the outside, you tend to be worrisome and insecure on the inside.”
Would you believe me if I told you that I wrote that description just for you? It’s actually a pastiche of horoscopes, constructed by a psychologist named Bertram Forer. Forer’s point was that we have a tendency to read too much into bland generalities, believing that they are (specifically) about us — even when they aren’t. Worse, we are even more prone to fall victim to this sort of trap if the bland description includes a few positive traits. Televangelists and late-night infomercials prey upon us in the same way — working hard to sound as if they are speaking to the individual listener rather than a crowd. As a species, we’re only too ready to be fooled. This chapter is, in essence, an investigation of why.
The capacity to hold explicit beliefs that we can talk about, evaluate, and reflect upon is, like language, a recently evolved innovation — ubiquitous in humans, rare or perhaps absent in most other species.[13] And what is recent is rarely fully debugged. Instead of an objective machine for discovering and encoding Truth with a capital T, our human capacity for belief is haphazard, scarred by evolution and contaminated by emotions, moods, desires, goals, and simple self-interest — and surprisingly vulnerable to the idiosyncrasies of memory. Moreover, evolution has left us distinctly gullible, which smacks more of evolutionary shortcut than good engineering. All told, though the systems that underlie our capacity for belief are powerful, they are also subject to superstition, manipulation, and fallacy. This is not trivial stuff: beliefs, and the imperfect neural tools we use to evaluate them, can lead to family conflicts, religious disputes, and even war.
In principle, an organism that trafficked in beliefs ought to have a firm grasp on the origins of its beliefs and how strongly the evidence supports them. Does my belief that Colgate is a good brand of toothpaste derive from (1) my analysis of a double-blind test conducted and published by Consumer Reports, (2) my enjoyment of Colgate’s commercials, or (3) my own comparisons of Colgate against the other “leading brands”? I should be able to tell you, but I can’t.
Because evolution built belief mainly out of off-the-shelf components that evolved for other purposes, we often lose track of where our beliefs come from — if we ever knew — and even worse, we are often completely unaware of how much we are influenced by irrelevant information.
Take, for example, the fact that students rate better-looking professors as teaching better classes. If we have positive feelings toward a given person in one respect, we tend to automatically generalize that positive regard to other traits, an illustration of what is known in psychology as the “halo effect.” The opposite applies too: see one negative characteristic, and you expect all of an individual’s traits to be negative, a sort of “pitchfork effect.” Take, for example, the truly sad study in which people were shown pictures of one of two children, one more attractive, the other less so. The subjects were then told that the child, let’s call him Junior, had just thrown a snowball, with a rock inside it, at another child; the test subjects then were asked to interpret the boy’s behavior. People who saw the unattractive picture characterized Junior as a thug, perhaps headed to reform school; those shown the more attractive picture delivered judgments that were rather more mild, suggesting, for example, that Junior was merely “having a bad day.” Study after study has shown that attractive people get better breaks in job interviews, promotions, admissions interviews, and so on, each one an illustration of how aesthetics creates noise in the channel of belief.
In the same vein, we are more likely to vote for candidates who (physically) “look more competent” than the others. And, as advertisers know all too well, we are more likely to buy a particular brand of beer if we see an attractive person drinking it, more likely to want a pair of sneakers if we see a successful athlete like Michael Jordan wearing them. And though it may be irrational for a bunch of teenagers to buy a particular brand of sneakers so they can “be like Mike,” the halo effect, ironically, makes it entirely rational for Nike to spend millions of dollars to secure His Airness’s endorsement. And, in a particularly shocking recent study, children aged three to five gave higher ratings to foods like carrots, milk, and apple juice if they came in McDonald’s packaging. Books and covers, carrots and Styrofoam packaging. We are born to be suckered.
[13] Animals often behave as if they too have beliefs, but scientific and philosophical opinion remains divided as to whether they really do. My interest here is the sort of belief that we humans can articulate, such as “On rainy days, it is good to carry an umbrella” or “Haste makes waste.” Such nuggets of conventional wisdom aren’t necessarily true (if you accept “Absence makes the heart grow fonder,” then what about “Out of sight, out of mind”?), but they differ from the more implicit “beliefs” of our sensorimotor system, which we cannot articulate. (For example, our sensorimotor system behaves as if it believes that a certain amount of force is sufficient to lift our legs over a curb, but nonphysicists would be hard pressed to say how much force is actually required.) I strongly suspect that many animals have this sort of implicit belief, but my working assumption is that beliefs of the kind that we can articulate, judge, and reflect upon are restricted to humans and, at most, a handful of other species.