The trouble with extending this “Shoot first, ask questions later” approach to belief is that the linguistic world is much less trustworthy than the visual world. If something looks like a duck and quacks like a duck, we are licensed to think it’s a duck. But if some guy in a trenchcoat tells us he wants to sell us a duck, that’s a different story. Especially in this era of blogs, focus groups, and spin doctors, language is not always a reliable source of truth. In an ideal world, the basic logic of perception (gather information, assume true, then evaluate if there is time) would be inverted for explicit, linguistically transmitted beliefs; but instead, as is often the case, evolution took the lazy way out, building belief out of a progressive overlay of technologies, consequences be damned. Our tendency to accept what we hear and read with far too little skepticism is but one more consequence.

Yogi Berra once said that 90 percent of the game of baseball was half mental; I say, 90 percent of what we believe is half cooked. Our beliefs are contaminated by the tricks of memory, by emotion, and by the vagaries of a perceptual system that really ought to be fully separate — not to mention a logic and inference system that is as yet, in the early twenty-first century, far from fully hatched.

The dictionary defines the act of believing both as “accepting something as true” and as “being of the opinion that something exists, especially when there is no absolute proof.” Is belief about what we know to be true or what we want to be true? That it is so often difficult for members of our species to tell the difference is a pointed reminder of our origins.

Evolved from creatures that were often forced to act rather than think, Homo sapiens simply never evolved a proper system for keeping track of what we know and how we’ve come to know it, uncontaminated by what we simply wish were so.

4. CHOICE

People behave sometimes as if they had two selves, one who wants clean lungs and long life and another who adores tobacco, one who yearns to improve himself by reading Adam Smith on self-command (in The Theory of Moral Sentiments) and another who would rather watch an old movie on television. The two are in continual contest for control.

— THOMAS SCHELLING

IN THE LATE 1960s and early 1970s, in the midst of the craze for the TV show Candid Camera (forerunner of YouTube, reality TV, and shows like America’s Funniest Home Videos), the psychologist Walter Mischel offered four-year-old preschoolers a choice: a marshmallow now, or two marshmallows if they could wait until he returned. And then, cruelly, he left them alone with nothing more than themselves, the single marshmallow, a hidden camera, and no indication of when he would return. A few of the kids ate the oh-so-tempting marshmallow the minute he left the room. But most kids wanted the bigger bonus and endeavored to wait. So they tried. Hard. But with nothing else to do in the room, the torture was visible. The kids did just about anything they could to distract themselves from the tempting marshmallow that stood before them: they talked to themselves, bounced up and down, covered their eyes, sat on their hands — strategies that more than a few adults might on occasion profitably adopt. Even so, for about half the kids, the 15 or 20 minutes until Mischel returned was just too long to wait.

Giving up after 15 minutes is a choice that could only really make sense under two circumstances: (1) the kids were so hungry that having the marshmallow now could stave off true starvation, or (2) their prospects for a long and healthy life were so remote that the 20-minute future versions of themselves, which would get the two marshmallows, simply weren’t worth planning for. Barring these rather remote possibilities, the children who gave in were behaving in an entirely irrational fashion.

Toddlers, of course, aren’t the only humans who melt in the face of temptation. Teenagers often drive at speeds that would be unsafe even on the autobahn, and people of all ages have been known to engage in unprotected sex with strangers, even when they are aware of the risks. The preschoolers’ marshmallows have a counterpart in my raspberry cheesecake, which I know I’ll regret later but nevertheless want desperately now. If you ask people whether they’d rather have a certified check for $100 that they can cash now, or a check for twice as much that they can’t cash for three years, more than half will take the $100 now. (Curiously — and I will come back to this later — most people’s preferences reverse when the time horizon is lengthened, preferring $200 in nine years to $100 in six years.) Then there are the daily uncontrollable choices made by alcoholics, drug addicts, and compulsive gamblers. Not to mention the Rhode Island convict who attempted to escape from jail on day 89 of a 90-day prison sentence.
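To see how such a reversal is arithmetically possible at all, consider a minimal sketch of one standard account: hyperbolic discounting, in which the felt value of a future reward A falls off as A / (1 + kt). An exponential discounter, whose valuations fall off as A * e^(-r*t), can never flip a preference merely because both options slide further into the future, since both values shrink by the same factor; a hyperbolic discounter can. The particular discount functions and rates below are illustrative assumptions, not figures from the studies just described.

```python
import math

# A minimal sketch of how hyperbolic (but not exponential) discounting
# reverses preferences. The rates k and r are illustrative assumptions.

def hyperbolic(amount, years, k=0.5):
    # Felt value falls off as A / (1 + k*t).
    return amount / (1 + k * years)

def exponential(amount, years, r=0.3):
    # Felt value falls off as A * e^(-r*t).
    return amount * math.exp(-r * years)

for value in (hyperbolic, exponential):
    near = "$100 now" if value(100, 0) > value(200, 3) else "$200 in 3 years"
    far = "$100 in 6 years" if value(100, 6) > value(200, 9) else "$200 in 9 years"
    print(f"{value.__name__}: near-term pick = {near}, long-term pick = {far}")
```

Under the hyperbolic curve the near-term $100 wins, yet pushing both options six years out flips the choice to the $200; the exponential curve picks the $100 both times.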

Collectively, the tendencies I just described exemplify what philosophers call “weakness of the will,” and they are our first hint that the brain mechanisms that govern our everyday choices might be just as klugey as those that govern memory and belief.

Wikipedia defines Homo economicus, or Economic man, as the assumption, popular in many economic theories, that man is “a rational and self-interested actor who desires wealth, avoids unnecessary labor, and has the ability to make judgments towards those ends.”

At first glance, this assumption seems awfully reasonable. Who among us isn’t self-interested? And who wouldn’t avoid unnecessary labor, given the chance? (Why clean your apartment unless you know that guests are coming?)

But as the architect Mies van der Rohe famously said, “God is in the details.” We are indeed good at dodging unnecessary labor, but true rationality is an awfully high standard, frequently well beyond our grasp. To be truly rational, we would need, at a minimum, to face each decision with clear eyes, uncontaminated by the lust of the moment, prepared to make every decision with appropriately dispassionate views of the relevant costs and benefits. Alas, as we’ll see in a moment, the weight of the evidence from psychology and neuroscience suggests otherwise. We can be rational on a good day, but much of the time we are not.

Appreciating what we as a species can and can’t do well — when we are likely to make sound decisions and when we are likely to make a hash of them — requires moving past the idealization of economic man and into the more sticky territory of human psychology. To see why some of our choices appear perfectly sensible and others perfectly foolish, we need to understand how our capacity for choice evolved.

I’ll start with good news. On occasion, human choices can be entirely rational. Two professors at NYU, for example, studied what one might think of as the world’s simplest touch-screen video game — and found that, within the parameters of that simple task, people were almost as rational (in the sense of maximizing reward relative to risk) as you could possibly imagine. Two targets appear (standing still) on a screen, one green, one red. In this task, you get points if you touch the green circle; you lose a larger number of points if you touch the red one. The challenge comes when the circles overlap, as they often do, and if you touch the intersection between the circles, you get both the reward and the (larger) penalty, thus accruing a net loss. Because people are encouraged to touch the screen quickly, and because nobody’s hand-eye coordination is perfect, the optimal thing to do is to point somewhere other than the center of the green circle. For example, if the green circle overlaps but is to the right of the red circle, pointing to the center of the green circle is risky business: an effort to point at the exact center of the green circle will sometimes wind up off target, left of center, smack in the point-losing region where the green and red circles overlap. Instead, it makes more sense to point somewhere to the right of the center of the green circle, keeping the probability of hitting the green circle high, while minimizing the probability of hitting the red circle. Somehow people figure all this out, though not necessarily in an explicit or conscious fashion. Even more remarkably, they do so in a manner that is almost perfectly calibrated to the specific accuracy of their own individual system of hand-eye coordination. Adam Smith couldn’t have asked for more.
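The underlying computation can be captured in a short simulation. The sketch below is a hypothetical reconstruction, not the NYU group’s actual model: the circle geometry, the point values, and the spread of the motor noise (sigma) are all illustrative assumptions. It estimates, by Monte Carlo sampling, the expected score for each horizontal aim point and finds the best one, which indeed lies to the right of the green circle’s center.

```python
import numpy as np

# Hypothetical sketch of the aiming task: geometry, point values, and
# motor noise are illustrative assumptions, not the experiment's numbers.
rng = np.random.default_rng(0)

GREEN = np.array([0.0, 0.0])    # center of the green (reward) circle
RED = np.array([-12.0, 0.0])    # red (penalty) circle overlapping from the left
RADIUS = 9.0                    # radius of both circles, in mm
SIGMA = 4.0                     # std. dev. of motor noise around the aim point
REWARD, PENALTY = 100, -500     # a touch in the overlap earns both at once

def expected_score(aim_x, n=100_000):
    # Sample n noisy touches around the aim point (aim_x, 0) and average
    # the points earned per touch.
    touches = rng.normal([aim_x, 0.0], SIGMA, size=(n, 2))
    in_green = np.linalg.norm(touches - GREEN, axis=1) <= RADIUS
    in_red = np.linalg.norm(touches - RED, axis=1) <= RADIUS
    return (REWARD * in_green + PENALTY * in_red).mean()

# Search aim points from the green center rightward, away from the red circle.
best = max(np.linspace(0.0, RADIUS, 40), key=expected_score)
print(f"best aim: about {best:.1f} mm right of the green circle's center")
```

Note that the smaller sigma is (that is, the better a pointer’s hand-eye coordination), the less the optimal aim needs to shift away from the center, which fits the finding that people calibrate their strategy to their own individual accuracy.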