EXTEL OUTSIDE

PROGRESS MEANS SMOKE ... The human race has certainly made a lot of progress over the years, then. How did we do that? Because we're intelligent, we've got brains. Minds, even. But other creatures are intelligent - dolphins, especially. And all they seem to do is enjoy themselves in the sea. What have we got that they haven't?

Many discussions of the mind treat it essentially as a question about the architecture of the brain. The viewpoint is that this determines what brains can do, and then the various things that we associate with minds, the difficult problems of free will, consciousness and intelligence, come out of neurophysiology. That's one approach. The other common one is to view the problem through the eyes of a social scientist or an anthropologist. From this viewpoint the mind's capabilities are pretty much taken as 'given', and the main questions are how human culture builds on those capabilities to create minds able to think original thoughts, feel emotions, have concepts like love and beauty, and so on. It may seem that between them these two approaches pretty much cover the territory. Link them, and you have a complete answer to the question of mind.

However, neurophysiology and culture aren't independent: they are 'complicit'. By this we mean that they have evolved together, each changing the other repeatedly, and their mutual coevolution built on the unpredictable results of that ongoing interaction. The view of culture building on, and changing, brains is incomplete, because brains also build on, and change, culture. The concept of complicity captures this recursive, mutual influence.

We call the brain's internal capabilities 'intelligence'. It is convenient to give a similar name to all of the external influences, cultural or otherwise, that affect the evolution of the brain, and with it, the mind. We shall call these influences extelligence, a term that HEX has picked up thanks to once-and-future computing. Mind is not just intelligence plus extelligence, its inside and outside, so to speak. Instead, mind is a feedback loop in which intelligence influences extelligence, extelligence influences intelligence, and the combination transcends the capabilities of both.

Intelligence is the ability of the brain to process information. But intelligence is only part of what is needed to make a mind. And even intelligence is unlikely to evolve in isolation.

Culture is basically a collection of interacting minds. Without individual minds you can't have a culture. The converse is perhaps less obvious, but equally true: without a shared culture, the human mind cannot evolve. The reason is that there is nothing in the environment of the evolving mind that can drive it towards self-complication, becoming more sophisticated, unless that brain has something else fairly sophisticated to interact with. And the main sophisticated thing around to interact with is the minds of other people. So the evolution of intelligence and that of extelligence are inextricably linked, and complicity between them is inevitable.

In the world around us are things that we, or other human beings, have created, things which play a similar role to intelligence but sit outside us. They are things like libraries, books, and the Internet, which from the viewpoint of extelligence would be better named the 'Extranet'. The Discworld concept of 'L-space' - library-space - is similar: it's all one thing. These influences, sources not just of information but of meaning, are 'cultural capital'. They are things that people put out into the culture, which can then sit there, or even reproduce, or interact in a way that individuals can't control.

The old artificial intelligence question, 'Can we create an intelligent machine?', viewed the machine as a one-off object in its own right. The problem, people assumed, was to get the machine's architecture right, and then program intelligent behaviour into it.

But that's probably the wrong approach. Of course, it is certainly conceivable that the collective extelligence of all the human beings interacting with that machine could put a mind into it, and in particular endow it with intelligence. But it seems much more likely that, unless you had a whole community of machines interacting with each other and evolving, providing the requisite extelligence too, you wouldn't actually be able to structure the Ant Country of the neural connections of the machine in a way that could generate a mind. So the story of the mind is one of complicity and emergence. Indeed, mind is one of the great examples of complicity.

The internal story of the development of the mind can be summed up as a series of steps in which the key 'player' is the nerve cell. A nerve cell is an extended object that can send signals from one place to another. Once you've got nerve cells you can have networks of nerve cells; and once you've got networks, then a whole pile of stuff comes along free of charge. For example, there is an area of complexity theory called 'emergent computation'. It turns out that when you evolve networks - randomly chosen, arbitrary networks, not constructed with specific purposes - they do things. They do something, which may or may not seem meaningful; they do whatever it is that that network does. But you can often look at what a network does, and spot emergent features. You discover that even though its architecture was random, it evolved the ability to compute things. It carries out algorithmic processes (or something close to algorithmic processes). The ability to do calculations, computations, algorithms seems to come free of charge once you've invented devices that send signals from one place to another and react to those signals to send new signals. If you allow evolution you don't have to work hard to create the ability to do some kind of processing.
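The flavour of this is easy to convey with a toy model. The sketch below is a minimal illustration of the idea, not anything from the emergent-computation literature: it wires up a handful of threshold 'nerve cells' entirely at random, lets signals bounce around for a few steps, and then checks whether the unplanned network happens to agree with a recognisable computation, such as 'are most of the inputs on?'. All the names, sizes and numbers are made up for the example.

```python
import random

# A toy 'nerve cell' network: each cell fires (1) when the weighted sum of the
# signals it receives crosses a threshold. The wiring, weights and thresholds
# are chosen entirely at random - no purpose is built in.
N_CELLS = 8
N_INPUTS = 5

def random_network(rng):
    """Random wiring: every cell listens to the inputs and to every other cell."""
    return {
        "weights": [[rng.uniform(-1, 1) for _ in range(N_INPUTS + N_CELLS)]
                    for _ in range(N_CELLS)],
        "thresholds": [rng.uniform(-1, 1) for _ in range(N_CELLS)],
    }

def run(network, inputs, steps=10):
    """Let signals bounce around for a few steps; report the last cell's state."""
    state = [0] * N_CELLS
    for _ in range(steps):
        signals = list(inputs) + state
        state = [
            1 if sum(w * s for w, s in zip(network["weights"][i], signals))
                 > network["thresholds"][i] else 0
            for i in range(N_CELLS)
        ]
    return state[-1]  # arbitrarily treat the last cell as the 'output'

rng = random.Random(1)
net = random_network(rng)

# Look for an emergent feature: does this unplanned network happen to compute
# something recognisable, such as 'are most of the inputs on?' (majority)?
agree, trials = 0, 200
for _ in range(trials):
    inputs = [rng.randint(0, 1) for _ in range(N_INPUTS)]
    majority = 1 if sum(inputs) > N_INPUTS // 2 else 0
    if run(net, inputs) == majority:
        agree += 1
print(f"agrees with 'majority of inputs' on {agree}/{trials} trials")
```

The point is not that any particular random network computes majority; it is that the network does *something*, and you can often find a meaningful description of what that something is.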

Once you've got that facility, it's a relatively short step to the ability to do specific kinds of processing that happen to be useful - that happen to offer survival value. All you need is the standard Darwinian selection procedure. Anything that's got that ability survives, anything that hasn't, doesn't. The ability to process incoming information in ways that extract an interesting feature of the outside world, react to it, and thereby make it easier to evade a predator or to spot food, gets reinforced. The brain's internal architecture comes from a phase space of possible structures, and evolution selects from that phase space. Put those two together and you can evolve structures in the brain that have specific functions. The brain's surroundings certainly influence the development of the brain.
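The selection step is just as easy to sketch. In the hypothetical toy below, each 'brain' is nothing more than a vector of connection strengths feeding one threshold cell; its fitness is how often the cell reacts exactly when a survival-relevant feature is present (here, 'more than half the sensory inputs are active' stands in for 'predator nearby'); and each generation keeps the best reactors and mutates them. Everything in it is an illustrative assumption, not a claim about real neural evolution.

```python
import random

rng = random.Random(42)
N_INPUTS = 6

def fitness(weights, trials=100):
    """Survival value: how often the single threshold 'cell' reacts exactly
    when the useful feature (more than half the inputs active) is present."""
    score = 0
    for _ in range(trials):
        inputs = [rng.randint(0, 1) for _ in range(N_INPUTS)]
        feature = 1 if sum(inputs) > N_INPUTS // 2 else 0
        # The last weight doubles as the cell's firing threshold.
        reaction = 1 if sum(w * s for w, s in zip(weights, inputs)) > weights[-1] else 0
        score += (reaction == feature)
    return score / trials

def mutate(weights):
    """Small random tweaks to the connection strengths."""
    return [w + rng.gauss(0, 0.1) for w in weights]

# Start from a population of completely random 'brains'.
population = [[rng.uniform(-1, 1) for _ in range(N_INPUTS + 1)] for _ in range(30)]

for generation in range(40):
    scored = sorted(population, key=fitness, reverse=True)
    survivors = scored[:10]  # whatever reacts usefully, survives
    population = survivors + [mutate(rng.choice(survivors)) for _ in range(20)]
    if generation % 10 == 0:
        print(f"generation {generation}: best fitness {fitness(scored[0]):.2f}")
```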

Do animals have minds? They do to some extent, depending on the animal. Even simple animals can have surprisingly sophisticated mental abilities. One of the most surprising is a funny creature called a mantis shrimp.

It's like the shrimps you put inside a sandwich and eat, except that it's about 5 inches (12 cm) long and it's more complex. You can keep a mantis shrimp in a tank, as part of a miniature marine ecology. If you do, you'll find that mantis shrimps cause havoc. They tend to destroy things, but they also build things. One thing they love building is tunnels, which they then live in. The mantis shrimp is a bit of an architect, and it decorates the front of its tunnel with bits and pieces of things, especially bits and pieces of what it has just killed. Hunting trophies. It doesn't like to have just one tunnel - it's discovered that if you have one tunnel with one entrance, that's more correctly known as a 'trap'. So it likes to have a back entrance too, and more. By the time it's been in the tank for about two months, it's riddled the entire tank with tunnels, and you find it sticking its head out at one end or another without ever seeing it pass between them.