“Yes, very well,” Alvar said, a bit mystified.

“Then I will proceed.” Alvar sat and watched for about a minute as Donald stood in front of him, stock-still, frozen in place.

With a resumption of movement that was somehow more disconcerting than the way he had stopped moving, Donald came back to himself. “Very good,” he said to himself. “The first part of my hypothesis is correct. If it had been myself in either situation, I would have been destroyed, killed on the spot.” The satisfaction in his voice was plain.

“Is that all?” Alvar asked, feeling quite confused.

“Oh, no, sir. In a sense, I have not started yet. I was merely establishing a baseline, if you will. Now I must come to the far more difficult part of the experiment. I must put myself in the position of a being of high intellect, with great speed and strength, with superb senses and reflexes, who is placed in the same circumstances. But this hypothetical being is willing and able to defend itself by whatever means, including an attack on a human.”

Alvar gasped and looked up at Donald in shocked alarm. More robots than he cared to recall had been utterly destroyed by far more casual contemplation of harm to humans. To imagine such harm, deliberately committed by oneself, would be the most terrifying, dangerous thought possible for a robot. “Donald, I don’t know if—”

“Sir, I assure you that I understand the dangers far more thoroughly than you do. But I believe the experiment to be essential.”

Before Alvar could protest any further, Donald froze up again. But this time, he did not stay frozen. A series of twitches and tics began to appear, and grew worse and worse. One foot lurched off the ground, and Donald nearly toppled over before he recovered and regained his balance. A strange, high-pitched sound came from his speaker, sweeping up and down in frequency. The blue glow of his eyes dimmed, flared, and then went blank. His arms, held at his side, twitched. His fingers clenched and unclenched. He seemed about to topple again. Alvar stood up, rushed around his desk, and reached out to steady his old friend, his loyal servant, holding Donald by the shoulders.

Even as he acted, he found that he was astonished with himself. Friend? Loyal servant? He had never even been aware that he thought of Donald that way. But now it quite abruptly seemed possible that he might lose Donald, this moment, and he suddenly knew how deeply he did not want that to happen.

“Donald!” he called out. “Stop! Break off. Whatever it is you are doing, I am ordering you to stop!”

Donald’s body gave another strange twitch, and the robot flinched away from Alvar’s touch, backing away a step or two. His eyes flared up, painfully bright, before regaining their normal appearance. “I—I—thank you, sir. Thank you for calling to me. I do not think that I could have broken free of my own volition.”

“Are you all right? What the hell happened to you?”

“I believe that I am fine, sir, though it might be prudent if I underwent a diagnostic later.” He paused for a moment. “As to what happened, it was a severe cognitive loop-back sequence. I understand that humans are capable of holding two completely contrary viewpoints at once without any great strain. It is not so for robots. I was forced to simulate a lack of constraints on my behavior, although the Three Laws of course control my actions. It was most disconcerting.”

Donald hesitated for a moment and looked at Alvar, his head cocked to one side. “It has never occurred to me just how strange and uncertain, how unguided a thing it must be to be a human being. We robots know our duty, our purpose, our place, our limits. You humans know none of that. How strange to live a life where all things are permitted, whether or not they are possible. If I may be so bold as to ask, sir—how is it humans can cope? What is it they do with all the freedom we robots provide?”

Alvar found himself sorely confused and surprised by the question. Still thrown off guard by Donald’s experiment, he answered with more honesty than he would have permitted in a considered answer. “They waste it,” he said. “They do nothing with their lives, determined to make each day like the last.” He thought of the complaints on his desk, civilians whining that the police had disrupted their lives this morning by trying to capture Caliban, quite unconcerned that the disruption had been in the interests of protecting their lives. “They are sure change can only be for the worse. They battle against change—and so ensure there is no change for the better.”

But then Alvar stopped and turned away from Donald. “Damn it, that’s not fair. Not all of it, anyway. But I spent the morning learning how we’ve doomed ourselves with indolence and denial.”

“My apologies, sir. I did not intend to move the discussion into such irrelevant areas.”

“Irrelevant?” Alvar went back to his desk chair and sat back in it with a sigh. “I think perhaps the questions of change and freedom are very close to the issues in this case. We have looked hard, seeking to find how Fredda Leving was attacked, and who did it. But we have scarcely even stopped to ask ourselves why the blow was struck. I’ll tell you the reason we are bound to find, Donald.” Suddenly his voice was eager, excited. “The reason—the motive—is going to be change, and the fear of it. It’s got to be something mired down in the politics of all this. There is some big change coming, and someone either wants to protect that change—or stop it. That’s what we’re going to find out. But damn it, we have wandered.”

But Alvar had wandered deliberately. He wanted to give Donald a moment to settle down, a chance for his positronic brain to be focused on less frightening, unsettling thoughts for a moment. Alvar knew that the question of a crime’s motive, with the insight it provides into the human psyche, always fascinated Donald. “But your experiment, Donald. What were the results?”

“In brief, sir, it confirmed my initial hypothesis—that a—a being with the physical capabilities of a robot, but with no inhibitions on its behavior, and highly motivated to protect its own existence, could have—ki-killed all the Settlers at the warehouse and all the deputies in the tunnels. And, indeed, doing so would have been safer for this hypothetical being than acting as Caliban did.”

“What are you saying?”

“It would appear that Caliban acted to protect himself, but did not seek to harm humans. Whatever harm came to them was incidental to his self-defense, and perhaps accidental. There is no doubt that he set fire to the warehouse. There is no proof that he did it deliberately.”

“You almost make him sound human, Donald.”

“But sir, as I just observed, there are no constraints on human behavior.”

“Oh, but there are such constraints. Deep, strong constraints, imposed by ourselves and by society. They rarely fail to hold. They are not imposed from without, as the rigid code of the Three Laws is; humans learn their own codes of behavior. But let’s not go off on another tangent. I’ve been thinking about the fact that Leving Labs is an experimental facility. We have yet to ask what sort of experiment Caliban was meant to be. What was it that Fredda Leving had in mind? Did the experiment fail? Did it succeed?” A thought came to him, one that made his blood run cold. “Or is the experiment under way now, running exactly according to plan?”

“I don’t understand, sir.”

“Robots come awake for the first time knowing all they need to know. Humans start out in the world knowing nothing of how the world works. Suppose Leving wondered how a robot that had to learn would behave. Suppose that Caliban is out there, behaving in accordance with the Three Laws, but with such a reduced dataset that he does not know, for example, what a human being is. Tonya Welton reminded us that it has happened before. Suppose that Fredda Leving set him out to see how long it would take him to learn the ways of the world on his own.”