Another idiosyncrasy of language, considerably more subtle, has to do with words like some, every, and most, known to linguists as “quantifiers” because they quantify, answering questions like “How much?” and “How many?”: some water, every boy, most ideas, several movies.

The peculiar thing is that in addition to quantifiers, we have another whole system that does something similar. This second system traffics in what linguists call “generics,” somewhat vague, generally accurate statements, such as Dogs have four legs or Paperbacks are cheaper than hardcovers. A perfect language might stick only to the first system, using explicit quantifiers rather than generics. An explicitly quantified sentence such as Every dog has four legs makes a nice, strong, clear statement, promising no exceptions. We know how to figure out whether it is true. Either all the dogs in the world have four legs, in which case the sentence is true, or at least one dog lacks four legs, in which case the sentence is false — end of story. Even a quantifier like some is fairly clear in its application; some has to mean more than one, and (pragmatically) ought not to mean every.
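
The truth conditions of formal quantifiers are crisp enough to check mechanically. Here is a minimal Python sketch (the leg counts are invented purely for illustration) in which the built-ins all and any play the roles of every and some:

```python
# Hypothetical leg counts for a handful of dogs, invented for illustration.
dog_legs = [4, 4, 3, 4]

# "Every dog has four legs": true only if there are no exceptions at all.
print(all(legs == 4 for legs in dog_legs))   # False: one three-legged dog falsifies it

# "Some dog has three legs": true as soon as at least one case exists.
print(any(legs == 3 for legs in dog_legs))   # True
```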

Generics are a whole different ball game, in many ways much less precise than quantifiers. It’s just not clear how many dogs have to have four legs before the statement Dogs have four legs can be considered true, and how many dogs would have to exhibit three legs before we’d decide that the statement is false. As for Paperbacks are cheaper than hardcovers, most of us would accept the sentence as true as a general rule of thumb, even if we knew that lots of individual paperbacks (say, imports) are more expensive than many individual hardcovers (such as discounted bestsellers printed in large quantities). We agree with the statement Mosquitoes carry the West Nile virus, even if only (say) 1 percent of mosquitoes carry the virus, yet we wouldn’t accept the statement Dogs have spots even if all the dalmatians in the world did.
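
The mosquito/dalmatian asymmetry can be made concrete with a deliberately naive Python sketch (the prevalence figures are hypothetical, echoing the examples above): no single prevalence threshold sorts the two generics the way our intuitions do.

```python
def generic_true(prevalence, threshold):
    """Naive model: call a generic statement true whenever the trait's
    prevalence in the kind exceeds some fixed threshold."""
    return prevalence > threshold

mosquitoes_wnv = 0.01    # "Mosquitoes carry the West Nile virus": we accept it
dogs_with_spots = 0.05   # "Dogs have spots" (suppose all dalmatians do): we reject it

# Any threshold low enough to accept the mosquito generic (prevalence 0.01)
# also accepts the dog generic (prevalence 0.05), so no threshold choice
# can match our intuitions.
for t in (0.005, 0.02, 0.5):
    print(t, generic_true(mosquitoes_wnv, t), generic_true(dogs_with_spots, t))
```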

Computer-programming languages admit no such imprecision; they have ways of representing formal quantifiers ([DO THIS THING REPEATEDLY UNTIL EVERY DATABASE RECORD HAS BEEN EXAMINED]) but no way of expressing generics at all. Human languages are idiosyncratic — and verging on redundant — inasmuch as they routinely exploit both systems, generics and the more formal quantifiers.
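
A Python rendering of that bracketed instruction makes the contrast plain (the records and the examine routine are hypothetical, for illustration only): the loop quantifies explicitly over every record, while a generic such as “records are small” has no direct expression and must be approximated with yet more quantification.

```python
# Hypothetical database records, invented for illustration.
records = [
    {"id": 1, "size_kb": 12},
    {"id": 2, "size_kb": 7},
    {"id": 3, "size_kb": 480},
]

def examine(record):
    """Stand-in for whatever per-record work the program does."""
    print("examined record", record["id"])

# [DO THIS THING REPEATEDLY UNTIL EVERY DATABASE RECORD HAS BEEN EXAMINED]:
# the loop is an explicit universal quantifier, visiting each record exactly once.
for record in records:
    examine(record)

# A generic like "records are small" has no direct counterpart; the closest we
# can come is an explicitly quantified approximation ("most records are under
# 100 KB"):
mostly_small = sum(r["size_kb"] < 100 for r in records) / len(records) > 0.5
print(mostly_small)   # True
```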

Why do we have both systems? Sarah-Jane Leslie, a young Princeton philosopher, has suggested one possible answer. The split between generics and quantifiers may reflect the divide in our reasoning capacity, between a sort of fast, automatic system on the one hand and a more formal, deliberative system on the other. Formal quantifiers rely on our deliberative system (which, when we are being careful, allows us to reason logically), while generics draw on our ancestral reflexive system. Generics are, she argues, essentially a linguistic realization of our older, less formal cognitive systems. Intriguingly, our sense of generics is “loose” in a second way: we are prepared to accept as true generics like Sharks attack bathers or Pit bulls maul children even though the circumstances they describe are statistically very rare, provided that they are vivid or salient — just the kind of response we might expect from our automatic, less deliberative system.

Leslie further suggests that generics seem to be learned first in childhood, before formal quantifiers; moreover, they may have emerged earlier in the development of language. At least one contemporary language (Pirahã, spoken in the Amazon Basin) appears to employ generics but not formal quantifiers. All of this suggests one more way in which the particular details of human languages depend on the idiosyncrasies of how our mind evolved.

For all that, I doubt many linguists would be convinced that language is truly a kluge. Words are one thing, sentences another; even if words are clumsy, what linguists really want to know about is syntax, the glue that binds words together. Could it be that words are a mess, but grammar is different, a “near-perfect” or “optimal” system for connecting sound and meaning?

In the past several years, Noam Chomsky, the founder and leader of modern linguistics, has taken to arguing just that. In particular, Chomsky has wondered aloud whether language (by which he means mainly the syntax of sentences) might come close “to what some super-engineer would construct, given the conditions that the language faculty must satisfy.” As linguists like Tom Wasow and Shalom Lappin have pointed out, there is considerable ambiguity in Chomsky’s suggestion. What would it mean for a language to be perfect or optimal? That one could express anything one might wish to say? That language is the most efficient possible means for obtaining what one wants? Or that language was the most logical system for communication anyone could possibly imagine? It’s hard to see how language, as it now stands, can lay claim to such grand credentials. The ambiguity of language, for example, seems unnecessary (as computers have shown), and language works in ways neither logical nor efficient (just think of how much extra effort is often required in order to clarify what our words mean). If language were a perfect vehicle for communication, infinitely efficient and expressive, I don’t think we would so often need “paralinguistic” information, like that provided by gestures, to get our meaning across.

As it turns out, Chomsky actually has something different in mind. He certainly doesn’t think language is a perfect tool for communication; to the contrary, he has argued that it is a mistake to think of language as having evolved “for” the purposes of communication at all. Rather, when Chomsky says that language is nearly optimal, he seems to mean that its formal structure is surprisingly elegant, in the same sense that string theory is. Just as string theorists conjecture that the complexity of physics can be captured by a small set of basic laws, Chomsky has, since the early 1990s, been trying to capture what he sees as the superficial complexity of language with a small set of laws.[33] Building on that idea, Chomsky and his collaborators have gone so far as to suggest that language might be a kind of “optimal solution… [to] the problem of linking the sensory-motor and conceptual-intentional systems” (or, roughly, connecting sound and meaning). They suggest that language, despite its obvious complexity, might have required only a single evolutionary advance beyond our inheritance from ancestral primates, namely, the introduction of a device known as “recursion.”

Recursion is a way of building larger structures out of smaller structures. Like mathematics, language is a potentially infinite system. Just as you can always make a number bigger by adding one (a trillion plus one, a googolplex plus one, and so forth), you can always make a sentence longer by adding a new clause. My favorite example comes from Maxwell Smart on the old Mel Brooks TV show Get Smart: “Would you believe that I know that you know that I know that you know where the bomb is hidden?” Each additional clause requires another round of recursion.
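
A toy Python function makes the recursion in the Smart line explicit (the function is my own illustration, not anyone's analysis of the sentence); each additional clause is one more recursive call, and calling it with depth + 1 always yields a longer sentence, just as adding one always yields a bigger number:

```python
def smart_clause(depth):
    """Build Maxwell Smart's nested clause with `depth` rounds of recursion."""
    if depth == 0:
        return "where the bomb is hidden"   # base case: the innermost clause
    # Recursive case: wrap one more "that I/you know" layer around a smaller clause.
    speaker = "I" if depth % 2 == 0 else "you"
    return f"that {speaker} know {smart_clause(depth - 1)}"

print("Would you believe " + smart_clause(4) + "?")
# -> Would you believe that I know that you know that I know
#    that you know where the bomb is hidden?
```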

There’s no doubt that recursion — or something like it — is central to human language. The fact that we can put together one small bit of structure (the man) with another (who went up the hill) to form a more complex bit of structure (the man who went up the hill) allows us to create arbitrarily complex sentences with terrific precision (The man with the gun is the man who went up the hill, not the man who drove the getaway car). Chomsky and his colleagues have even suggested that recursion might be “the only uniquely human component of the faculty of language.”
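
That combinatory step can be sketched as a recursive data structure (the Phrase class is an illustrative stand-in, not a claim about how the mind actually stores syntax): a structure may embed another structure of the same type, and that self-embedding is the recursion.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Phrase:
    """One bit of structure that may embed another, smaller bit of structure."""
    words: str
    embedded: Optional["Phrase"] = None

    def render(self) -> str:
        if self.embedded is None:
            return self.words                            # base case
        return f"{self.words} {self.embedded.render()}"  # recursive case

# (the man) + (who went up the hill) -> (the man who went up the hill)
climber = Phrase("the man", Phrase("who went up the hill"))
print(climber.render())   # -> the man who went up the hill
```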

[33] Although I have long been a huge fan of Chomsky’s contributions to linguistics, I have serious reservations about this particular line of work. I’m not sure that elegance really works in physics (see Lee Smolin’s recent book The Trouble with Physics), and in any case, what works for physics may well not work for linguistics. Language, after all, is a property of biology — the biology of the human brain — and as the late Francis Crick once put it, “In physics, they have laws; in biology, we have gadgets.” So far as we know, the laws of physics have never changed, from the moment of the big bang onward, whereas the details of biology are constantly in flux, evolving as climates, predators, and resources change. As we have seen so many times, evolution is often more about alighting on something that happens to work than about finding what might in principle work best or most elegantly; it would be surprising if language, among evolution’s most recent innovations, were any different.