"Yes," Bobby said, "it does take three people, El Rooto, because it's complicated."

"Why? And don't call me El Rooto."

"I obey, Mr. Root."

"Just get on with it…"

"Well," Bobby said, "I started to check the sensors after this morning's episode, and it looks to me like they're miscalibrated. But since nobody is going outside, the question is whether we're reading them wrong, or whether the sensors themselves are faulty, or just scaled wrong on the equipment in here. Mae knows these sensors, she's used them in China. I'm making code revisions now. And Charley is here because he won't go away and leave us alone."

"Shit, I have better things to do," Charley said. "But I wrote the algorithm that controls the sensors, and we need to optimize the sensor code after they're done. I'm just waiting until they stop screwing around. Then I'll optimize." He looked pointedly at Bobby. "None of these guys can optimize worth a damn."

Mae said, "Bobby can."

"Yeah, if you give him six months, maybe."

"Children, children," Ricky said. "Let's not make a scene in front of our guest." I smiled blandly. The truth was, I hadn't been paying attention to what they were saying. I was just watching them. These were three of my best programmers-and when they had worked for me, they had been self-assured to the point of arrogance. But now I was struck by how nervous the group was. They were all on edge, bickering, jumpy. And thinking back, I realized that Rosie and David had been on edge, too.

Charley started humming in that irritating way of his.

"Oh, Christ," Bobby Lembeck said. "Would you tell him to shut up?"

Ricky said, "Charley, you know we've talked about the humming."

Charley continued to hum.

"Charley…"

Charley gave a long, theatrical sigh. He stopped humming.

"Thank you," Bobby said.

Charley rolled his eyes, and looked at the ceiling.

"All right," Ricky said. "Finish up quickly, and get back to your stations."

"Okay, fine."

"I want everybody in place as soon as possible."

"Okay," Bobby said.

"I'm serious. In your places."

"For Christ's sake, Ricky, okay, okay. Now will you stop talking and let us work?" Leaving the group behind, Ricky took me across the floor to a small room. I said, "Ricky, these kids aren't the way they were when they worked for me."

"I know. Everybody's a little uptight right now."

"And why is that?"

"Because of what's going on here."

"And what is going on here?"

He stopped before a small cubicle on the other side of the room. "Julia couldn't tell you, because it was classified." He touched the door with a keycard.

I said, "Classified? Medical imaging is classified?"

The door latch clicked open, and we went inside. The door closed behind us. I saw a table, two chairs, a computer monitor and a keyboard. Ricky sat down, and immediately started typing. "The medical imaging project was just an afterthought," he said, "a minor commercial application of the technology we are already developing."

"Uh-huh. Which is?"

"Military."

"Xymos is doing military work?"

"Yes. Under contract." He paused. "Two years ago, the Department of Defense realized from their experience in Bosnia that there was enormous value to robot aircraft that could fly overhead and transmit battlefield images in real time. The Pentagon knew that there would be more and more sophisticated uses for these flying cameras in future wars. You could use them to spot the locations of enemy troops, even when they were hidden in jungle or in buildings; you could use them to control laser-guided rocket fire, or to identify the location of friendly troops, and so on. Commanders on the ground could call up the images they wanted, in the spectra they wanted-visible, infrared, UV, whatever. Real-time imaging was going to be a very powerful tool in future warfare."

"Okay…"

"But obviously," Ricky said, "these robot cameras were vulnerable. You could shoot them down like pigeons. The Pentagon wanted a camera that couldn't be shot down. They imagined something very small, maybe the size of a dragonfly-a target too small to hit. But there were problems with power supply, with small control surfaces, and with resolution using such a small lens. They needed a bigger lens."

I nodded. "And so you thought of a swarm of nanocomponents."

"That's right." Ricky pointed to the screen, where a cluster of black spots wheeled and turned in the air, like birds. "A cloud of components would allow you to make a camera with as large a lens as you wanted. And it couldn't be shot down because a bullet would just pass through the cloud. Furthermore, you could disperse the cloud, the way a flock of birds disperses with a gunshot. Then the camera would be invisible until it re-formed again. So it seemed an ideal solution. The Pentagon gave us three years of DARPA funding."

"And?"

"We set out to make the camera. It was of course immediately obvious that we had a problem with distributed intelligence."

I was familiar with the problem. The nanoparticles in the cloud had to be endowed with a rudimentary intelligence, so that they could interact with each other to form a flock that wheeled in the air. Such coordinated activity might look pretty intelligent, but it occurred even when the individuals making up the flock were rather stupid. After all, birds and fish could do it, and they weren't the brightest creatures on the planet.

Most people watching a flock of birds or a school of fish assumed there was a leader, and that all the other animals followed the leader. That was because human beings, like most social mammals, had group leaders.

But birds and fish had no leaders. Their groups weren't organized that way. Careful study of flocking behavior (frame-by-frame video analysis) showed that, in fact, there was no leader. Birds and fish responded to a few simple stimuli among themselves, and the result was coordinated behavior. But nobody was controlling it. Nobody was leading it. Nobody was directing it.

Nor were individual birds genetically programmed for flocking behavior. Flocking was not hard-wired. There was nothing in the bird brain that said, "When thus-and-such happens, start flocking." On the contrary, flocking simply emerged within the group as a result of much simpler, low-level rules. Rules like, "Stay close to the birds nearest you, but don't bump into them." From those rules, the entire group flocked in smooth coordination. Because flocking arose from low-level rules, it was called emergent behavior. The technical definition of emergent behavior was behavior that occurred in a group but was not programmed into any member of the group. Emergent behavior could occur in any population, including a computer population. Or a robot population. Or a nanoswarm.
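
The flavor of those low-level rules can be sketched in a few lines of Python. Everything below is an illustrative guess, not anything from the story: the neighbor radius, the separation distance, and the blend weights are all invented values, and real implementations tune them endlessly.

```python
import math
import random

# Three local rules per bird; no rule anywhere mentions "the flock".
NEIGHBOR_RADIUS = 10.0   # how far a boid can "see" (illustrative value)
SEPARATION_DIST = 2.0    # closer than this means "too close: veer away"
MAX_SPEED = 1.0

class Boid:
    def __init__(self):
        self.x, self.y = random.uniform(0, 50), random.uniform(0, 50)
        self.vx, self.vy = random.uniform(-1, 1), random.uniform(-1, 1)

def step(boids):
    for b in boids:
        neighbors = [o for o in boids if o is not b
                     and math.hypot(o.x - b.x, o.y - b.y) < NEIGHBOR_RADIUS]
        if not neighbors:
            continue
        n = len(neighbors)
        # Rule 1: cohesion -- drift toward the center of nearby birds.
        cx = sum(o.x for o in neighbors) / n - b.x
        cy = sum(o.y for o in neighbors) / n - b.y
        # Rule 2: alignment -- match the average heading of nearby birds.
        ax = sum(o.vx for o in neighbors) / n - b.vx
        ay = sum(o.vy for o in neighbors) / n - b.vy
        # Rule 3: separation -- "stay close, but don't bump into them."
        sx = sy = 0.0
        for o in neighbors:
            d = math.hypot(o.x - b.x, o.y - b.y)
            if 0 < d < SEPARATION_DIST:
                sx += (b.x - o.x) / d
                sy += (b.y - o.y) / d
        # The blend weights are arbitrary illustrative choices.
        b.vx += 0.01 * cx + 0.05 * ax + 0.10 * sx
        b.vy += 0.01 * cy + 0.05 * ay + 0.10 * sy
        speed = math.hypot(b.vx, b.vy)
        if speed > MAX_SPEED:   # cap speed so no bird outruns its neighbors
            b.vx, b.vy = b.vx / speed * MAX_SPEED, b.vy / speed * MAX_SPEED
    for b in boids:
        b.x += b.vx
        b.y += b.vy

flock = [Boid() for _ in range(50)]
for _ in range(200):
    step(flock)
# The headings tend to line up: coordination nobody programmed directly.
mvx = sum(b.vx for b in flock) / len(flock)
mvy = sum(b.vy for b in flock) / len(flock)
print(f"mean heading after 200 steps: ({mvx:.2f}, {mvy:.2f})")
```

Each bird consults only the handful of neighbors it can see, yet the cluster wheels and turns as one; delete any single bird and nothing changes, because the flock was never stored anywhere to begin with.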

I said to Ricky, "Your problem was emergent behavior in the swarm?"

"Exactly."

"It was unpredictable?"

"To put it mildly."

In recent decades, this notion of emergent group behavior had caused a minor revolution in computer science. What that meant for programmers was that you could lay down rules of behavior for individual agents, but not for the agents acting together. Individual agents, whether programming modules, or processors, or, as in this case, actual micro-robots, could be programmed to cooperate under certain circumstances, and to compete under other circumstances. They could be given goals. They could be instructed to pursue their goals with single-minded intensity, or to be available to help other agents. But the result of these interactions could not be programmed. It just emerged, with often surprising outcomes.
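
One classic way competing agents can fight to a standstill can be shown in a toy, invented here purely for illustration (the "tools," the agent count, and the step limit are all assumptions, not anything from the Xymos system): two agents each need both tools to finish a job, and each follows the same one-line rule.

```python
import random

# Two agents, one simple per-agent rule: grab any free tool you still need.
# Each agent needs BOTH tools to finish. Nothing anywhere specifies what
# the pair will do together; that part just emerges from the interleaving.

def run(seed, max_steps=20):
    rng = random.Random(seed)
    owner = {"x": None, "y": None}       # which agent holds each tool
    for _ in range(max_steps):
        a = rng.randrange(2)             # a random agent acts next
        mine = [t for t, o in owner.items() if o == a]
        if len(mine) == 2:
            return f"agent {a} finished its job"
        free = [t for t, o in owner.items() if o is None]
        if free:
            owner[rng.choice(free)] = a  # grab a free tool and keep holding it
        # Otherwise the agent waits, still clutching whatever it holds.
    return "standstill: each agent holds one tool, waiting for the other"

for seed in range(6):
    print(f"seed {seed}: {run(seed)}")
```

Run it across a handful of seeds and roughly half of them stall: identical rules, different interleavings, different group outcomes. The standstill was never written into either agent; it emerged between them.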

In a way this was very exciting. For the first time, a program could produce results that absolutely could not be predicted by the programmer. These programs behaved more like living organisms than man-made automatons. That excited programmers, but it frustrated them, too. Because the program's emergent behavior was erratic. Sometimes competing agents fought to a standstill, and the program failed to accomplish anything. Sometimes agents were so influenced by one another that they lost track of their goal, and did something else instead. In that sense the program was very childlike: unpredictable and easily distracted. As one programmer put it, "Trying to program distributed intelligence is like telling a five-year-old kid to go to his room and change his clothes. He may do that, but he is equally likely to do something else and never return."