
Yet imperfections of the mind have rarely been discussed in the context of evolution. Why should that be? My guess is that there are at least two reasons. The first, plain and simple, is that many of us just don’t want human cognition to turn out to be less than perfect, either because it would be at odds with our beliefs (or fondest desires) or because it leads to a picture of humankind that we find unattractive. The latter factor arises with special force in scientific fields that try to characterize human behavior; the more we stubbornly deviate from rationality, the harder it is for mathematicians and economists to capture our choices in neat sets of equations.

A second factor may stem from the almost mystifying popularity of creationism, and its recent variant, intelligent design. Few theories are as well supported by evidence as the theory of evolution, yet a large portion of the general public refuses to accept it. To any scientist familiar with the facts — ranging from those garnered through the painstaking day-to-day studies of evolution in the contemporary Galapagos Islands (described in Jonathan Weiner’s wonderful book The Beak of the Finch) to the details of molecular change emerging from the raft of recently completed genomes — this continued resistance to evolution seems absurd.[54] Since so much of it seems to come from people who have trouble accepting the notion that well-organized structure could have emerged without forethought, scientists often feel compelled to emphasize evolution’s high points — the cases of well-organized structure that emerged without any designer’s foresight.

Such emphasis has led to a deep understanding of how a blind process like evolution can produce systems of tremendous beauty — but at the expense of an equally impassioned exploration of the illuminating power of imperfection. While there is nothing inherently wrong in examining nature’s greatest hits, one can’t possibly get a complete and balanced picture by looking only at the highlights.

The value of imperfections extends far beyond simple balance, however. Scientifically, every kluge contains a clue to our past; wherever there is a cumbersome solution, there is insight into how nature layered our brain together; it is no exaggeration to say that the history of evolution is a history of overlaid technologies, and kluges help expose the seams.

Every kluge also underscores what is fundamentally wrongheaded about creationism: the presumption that we are the product of an all-seeing entity. Creationists may hold on to the bitter end, but imperfection (unlike perfection) beggars the imagination. It’s one thing to imagine an all-knowing engineer designing a perfect eyeball, another to imagine that engineer slacking off and building a half-baked spine.

There’s a practical side too: investigations into human idiosyncrasy can provide a great deal of useful insight into the human condition; as they say in Alcoholics Anonymous, recognition is the first step. The more we can understand our clumsy nature, the more we can do something about it.

When we look at imperfections as a source of insight, the first thing to realize is that not every imperfection is worth fixing. I’ve long since come to terms with the fact that my calculator is better than I am at extracting square roots, and I see little point in cheering for Garry Kasparov over his computer opponent, Deep Blue, in their chess matches. If computers can’t beat us now at Trivial Pursuit, they will someday soon. John Henry’s fin-de-siècle Race Against the Machine was noble but, in hindsight, a lost cause. In many ways machines have (or eventually will have) the edge, and we might as well accept it. The German chemist Ernst Fischer mused that “as machines become more and more efficient and perfect, so it will become clear that imperfection is the greatness of man.” A creature designed by an engineer might never know love, never enjoy art, never see the point of poetry. From the perspective of brute rationality, time spent making and appreciating art is time that could be “better” spent gathering nuts for winter. From my perspective, the arts are part of the joy of human existence. By all means, let us make poetry out of ambiguity, song and literature out of emotion and irrationality.

That said, not every quirk of human cognition ought to be celebrated. Poetry is good, but stereotyping, egocentrism, and our species-wide vulnerability to paranoia and depression are not. To accept everything that is inherent to our biological makeup would be to commit a version of the “naturalistic fallacy,” confusing what is natural with what is good. The trick, obviously, is to sort through our cognitive idiosyncrasies and decide which are worth addressing and which are worth letting go (or even celebrating).

For example, it makes little sense to worry about ambiguity in everyday conversation because we can almost always use context and interaction to figure out what our conversational partners have in mind. It makes little sense to try to memorize the phone numbers of everyone we know because our memory just isn’t built that way (and now we have cell phones to do that for us). For much of our daily business, our mind is more than sufficient. It generally keeps us well-fed, employed, away from obstacles, and out of harm’s way. As much as I envy the worry-free life of the average domesticated cat, I wouldn’t trade my brain for Fluffy’s for all the catnip in China.

But that doesn’t mean that we can’t, as thinkers, do even better. In that spirit, I offer, herewith, 13 suggestions, each founded on careful empirical research:

1. Whenever possible, consider alternative hypotheses. As we have seen, we humans are not in the habit of evaluating evidence in dispassionate and objective ways. One of the simplest things we can do to improve our capacity to think and reason is to discipline ourselves to consider alternative hypotheses. Something as simple as merely forcing ourselves to list alternatives can improve the reliability of reasoning.

One series of studies has shown the value of the simple maxim “Consider the opposite”; another set has shown the value of “counterfactual thinking” — contemplating what might have been or what could be, rather than focusing on what currently is.

The more we can reflect on ideas and possibilities other than those to which we are most attached, the better. As Robert Rubin (treasury secretary under Bill Clinton) said, “Some people I’ve encountered in various phases of my career seem more certain about everything than I am about anything.” Making the right choice often requires an understanding of the road not traveled as well as the road ultimately taken.

2. Reframe the question. Is that soap 99.4 percent pure or 0.6 percent toxic? Politicians, advertisers, and even our local supermarket staff routinely spin just about everything we hear, see, and read. Everything is presented to be as positive as possible. Our job — as consumers, voters, and citizens — must be to perpetually cast a skeptical eye and develop a habit of rethinking whatever we are asked. (Should I construe this “assisted suicide” legislation as an effort to protect people from murderous doctors or as a way of allowing folks to die with dignity? Should I think about the possibility of reducing my hours to part-time work as a pay cut or as an opportunity to spend more time with my kids?) If there’s another way to think about a problem, do it. Contextual memory means that we are always swimming upstream: how we think about a question invariably shapes what we remember, and what we remember affects the answers we reach. Asking every question in more than one way is a powerful way to counter that bias.

[54] It is sometimes said, pejoratively, that evolution is “just a theory,” but this statement is true only in the technical sense of the word theory (that is, evolution is an explanation of data), not in the lay sense of being an idea about which there is reasonable doubt.