The line between positive and negative mutations isn’t always clear, however. Some mutations do terrible harm to those who carry them and yet flourish because they also provide a benefit that outweighs the harm. The classic example can be found in West Africa, where about 10 percent of the population carries a genetic mutation that causes sickle-cell anemia—a disease that, without modern medical intervention, is likely to kill the victim before adolescence. Ordinarily, natural selection would quickly eliminate this mutation. It hasn’t, because the mutation isn’t always deadly. Only if a child is unlucky enough to get the mutant gene from both parents does it cause sickle-cell anemia. If the child gets it from only one parent, it instead boosts her resistance to malaria—a disease that routinely kills children younger than five and that is rife all over West Africa. So the mutation kills in some circumstances and saves lives in others. As a result, natural selection has spread the mutation in the West African population, but only up to a certain level—because beyond that level, more children would get the mutation from both parents and it would take more lives than it saved.
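The logic can be made concrete with a back-of-the-envelope calculation. What follows is a minimal sketch using the textbook population-genetics model of heterozygote advantage; the selection coefficients are illustrative guesses, not measured values from the text. Call the normal allele A and the sickle-cell allele S, and suppose the three genotypes have relative fitnesses

\[ w_{AA} = 1 - s, \qquad w_{AS} = 1, \qquad w_{SS} = 1 - t, \]

where \(s\) is the toll malaria takes on non-carriers and \(t\) is the near-total toll sickle-cell anemia takes on children with two copies. Selection pushes the frequency \(q\) of the S allele toward the balance point at which both alleles do equally well, which works out to

\[ \hat{q} = \frac{s}{s + t}. \]

With the illustrative values \(s = 0.05\) and \(t = 0.9\), the equilibrium is \(\hat{q} \approx 0.053\), and under random mating the single-copy carrier class is \(2\hat{p}\hat{q} \approx 0.10\): roughly the 10 percent carrier rate cited above. Push \(q\) higher and too many children get two copies; push it lower and too many non-carriers die of malaria. Selection parks the allele in between.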
Most people get this as far as physical traits go. The opposable thumb is mighty useful. Thank you, natural selection. And we also have no trouble talking this way about the brains and behavior of other species. Why do chimpanzee mothers nurture and protect their young? Simple: Natural selection favored this behavior and, in time, it became hardwired into chimp brains.
But the moment this conversation turns to human brains and actions, people get uncomfortable. The idea that much human thought is unconscious, and that evolutionary hardwiring is its foundation, is too much for many to accept. “I am not willing to assume,” wrote David Brooks, the New York Times columnist, “that our brains are like computers. . . . Isn’t it just as possible that the backstage part of the brain [meaning unconscious thought] might be more like a personality, some unique and nontechnological essence that cannot be adequately generalized about by scientists in white coats with clipboards?” What Brooks is saying here is what many of us vaguely sense: that the brain is a big, complex, physical organ at the center of which is some indefinable thing or entity that makes decisions and issues commands for reasons scientists in white coats will never be able to fathom.
For this, we can thank René Descartes. Even those who have never heard of the French philosopher have imbibed his idea that body and mind are separate. On this view, the mind is not merely the lump of gray matter we carry on our shoulders; it contains something we vaguely refer to as spirit, soul, or “nontechnological essence,” to use Brooks’s strange term. In 1949, three centuries after Descartes, the philosopher Gilbert Ryle scornfully dubbed this idea “the ghost in the machine.” In the almost six decades since, science has made enormous progress in understanding how humans think, and everything we have learned supports Ryle. There is no ghost, no spirit, no nontechnological essence. There is only the brain, and the brain is entirely physical. It was and is subject to the same pressures of natural selection that gave us the opposable thumb and sickle-cell anemia.
This is not to denigrate the brain; quite the opposite. The human brain is magnificent. We have to give it credit for everything our species has accomplished—from surviving and multiplying to putting a man on the moon and unlocking the secrets of the universe and even the brain itself—because, truth be told, we humans are the scrawny, four-eyed nerds in nature’s schoolyard. Our senses of sight, smell, and hearing were never as good as those of the animals we wanted to catch and eat. Our arms, legs, and teeth were always puny compared to the muscles and fangs of the predators who competed with us for food and occasionally looked at us as lunch.
The brain was our only advantage. It alone kept us from becoming nature’s Edsel. Because we relied on it so heavily, the dimmer among us lost out to the smarter. The brain developed new capabilities. And it got bigger and bigger. Between the time of our earliest hominid ancestors and the first appearance of modern man, it quadrupled in mass.
This radical transformation happened even though having huge brains caused serious problems. Housing them required skulls so large that when they passed through a woman’s pelvis during childbirth, they put the lives of both mother and baby in peril. They made our heads so heavy that we ran a far greater risk of broken necks than chimpanzees and other primates. They sucked up one-fifth of the body’s entire supply of energy. But as serious as these drawbacks were, they were outweighed by the advantages of having an onboard supercomputer. And so big brains were selected for, and the species survived.
The transformation of the human brain into its modern form occurred entirely during the “Old Stone Age”—the Paleolithic era that lasted from roughly two million years ago until the introduction of agriculture some 12,000 years ago. Not that the advent of agriculture suddenly transformed how most people lived. It took thousands of years for the new way of life to spread, and it was only about 6,000 years ago that the first city—not much more than a town by modern standards—was founded.
If the history of our species were written in proportion to the amount of time we lived at each stage of development, two hundred pages would be devoted to the lives of nomadic hunter-gatherers. One page would cover agrarian societies. The world of the last two centuries—the modern world— would get one short paragraph at the end.
Our brains were simply not shaped by life in the world as we know it now, or even the agrarian world that preceded it. They are the creation of the Old Stone Age. And since our brains really make us what we are, the conclusion to be drawn from this is unavoidable and a little unsettling. We are cavemen. Or cavepersons, if you prefer. Whatever the nomenclature, we sophisticated moderns living in a world of glass, steel, and fiber optics are no different, in a fundamental sense, from the prehistoric humans for whom campfires were the latest in high tech and bison hides were haute couture.
This is the central insight of evolutionary psychology—a field that came to prominence only in the last thirty years, although Darwin himself saw the implications of evolution for the study of human thoughts and actions. Our minds evolved to cope with what evolutionary psychologists call the “Environment of Evolutionary Adaptedness.” If we wish to understand the workings of the mind today, we must first examine the lives of ancient humans on the savannas of Africa.
Of course, the full truth is a little more complicated than this. For one thing, the brain that our oldest human ancestors had was a hand-me-down from earlier species. Human experience later rewired some of what was inherited and added greatly to it, but much of that original, prehuman brain remained. It’s still there today, in the amygdala and other structures of what is sometimes called the reptilian brain or, even less elegantly, the lizard brain.
It’s also true that not all of the Paleolithic history of ancient humans was spent hunting gazelles and dodging lions on the golden plains of Africa. Our ancestors were wanderers who moved from one strange land to another. So there wasn’t one “Environment of Evolutionary Adaptedness.” There were many. And that meant humans and their giant brains had to learn and adapt. Flexibility became a quintessential human trait: The same brain that figured out how to chip flint into an arrowhead also learned how to keep warm in cold climates by stripping other animals of their hides and how to ensure a supply of breathable oxygen on the moon.