Fredda hesitated a moment, then spoke again. “Besides, there’s something you don’t know. The information from Gubber that you handed to me in the hospital? It was the full police report. I didn’t tell you about it before now because I didn’t think you’d want to know. They have very strong evidence that a robot committed the attack against me. They weren’t ready to believe that evidence before, but now it will be different. And they know a robot named Caliban was involved in a situation with a bunch of robot-bashing Settlers that ended up burning down a building. And there must be more, besides, things that have happened since then. Kresh is not the sort of man to sit still and wait for things to happen. Even if he can’t quite accept the idea of a No Law robot, by now he has a lot more than Horatio’s statement to convince him that Caliban is strange and dangerous. I doubt he’d give up looking even if Caliban loses power and vanishes without a trace.”

“Do you really think Kresh believes Caliban to be dangerous?” Jomaine Terach asked.

Fredda Leving felt an ache in the pit of her stomach and a throbbing pain in her head. It was time to speak truths she had not been able to face. “My point, Jomaine, is that Caliban is dangerous. At least we must work on the assumption that he is. Perhaps he did attack me. You and I know better than anyone else that there was nothing, literally nothing at all, to stop him. Maybe he intends to track me down and finish me off. Who knows?

“Yes, maybe Caliban will simply go into hiding, or vanish into the desert, or malfunction somehow. At first, I was hoping Caliban would allow his power pack to run down, or that he would allow himself to be caught and destroyed before he could get into serious trouble—or reveal his true nature. Those seemed reasonable hopes. After all, he was designed to be a laboratory test robot. We deliberately never programmed him to deal with the outside world. And yet he has survived, somehow, and taught himself enough that he can evade the police.”

“I suppose we can blame Gubber Anshaw for that,” Jomaine said. “The whole idea of the gravitonic brain was that it was to be more flexible and adaptive than overly rigid positronic brains.” Jomaine smiled bleakly, his face dimly visible in the semidarkness of the aircar’s cabin. “Gubber, it seemed, did his job entirely too well.”

“He’s not the only one, Jomaine.” Fredda rubbed her forehead wearily. “You and I did the basal programming on him. We took Gubber’s flexible gravitonic brain and wrote the program that would allow that brain to adapt and grow and learn in our lab tests. It’s just that he stumbled into a slightly larger laboratory than the one we planned.” She shook her head again. “But I had no idea his gravitonic brain would be adaptive enough to survive out there,” she said, speaking not so much to Jomaine as to the dark and open air.

“I don’t understand,” Jomaine said. “You say he’s dangerous, but you sound more like you’re worried about him than frightened of him.”

“I am worried about him,” Fredda said. “I created him, and I’m responsible for him, and I cannot believe he is evil or violent. We didn’t give him Laws that would prevent him from harming people, but we didn’t give him any reason to hurt people either. Half of what we did on the personality coding was compensation for the absence of the Three Laws, making his mind as stable, as well grounded, as we could. And we did our job right. I’m certain of that. He’s not a killer.”

Jomaine cleared his throat gently. “That’s all as may be,” he said. “But there is another factor. Now that we are at last discussing the situation openly, we need to consider the nature of the experiment we planned to perform with Caliban. No matter what else you say about the stability of his personality, or the flexibility of his mind, he was after all built to run one test, designed to answer one question. And when he walked out of your lab, he was primed and ready for that task. He could not help seeking out the answer. He is in all likelihood unaware of what he is looking for, or even that he is looking. But he will be looking, seeking, burning to discover it, even so.”

The aircar eased itself to a halt in midair, then began to sink lower. They had arrived at Jomaine’s house, hard by Leving Labs, close to where it had all begun. The car landed on his roof and the hatch sighed open. The cabin light came gently up. Jomaine stood and reached out to Fredda across the narrow cabin, took her hand and squeezed it. “There is a great deal you have to think about, Fredda Leving. But no one can protect you anymore. Not now. The stakes are far too high. I think you had best start asking yourself what sort of answer Caliban is likely to come up with.”

Fredda nodded. “I understand,” she said. “But remember that you are as deeply involved as I am. I can’t expect you to protect me—but remember, we will sink or swim together.”

“That’s not strictly true, Fredda,” Jomaine said. His voice was quiet, gentle, with no hint of threat or malice. His tone made it clear that he was setting out facts, not trying to scare her. “Remember that you, not I, designed the final programming of Caliban’s brain. I have the documentation to prove it, by the way. Yes, we worked together, and no doubt a court could find me guilty of some lesser charge. But it was your plan, your idea, your experiment. If that brain should prove capable of assault, or murder, the blood will be on your hands, not mine.”

With that, he looked into her eyes for the space of a dozen heartbeats, and then turned away. There was nothing left to say.

Fredda watched Jomaine leave the car, watched the door seal itself, watched the cabin light fade back down to darkness. The aircar lifted itself back up into the sky and she turned her head toward the window. She stared sightlessly out onto the night-shrouded, slow-crumbling glory that was the city of Hades. But then the car swung around, and the Leving Labs building swept across her field of view. Suddenly she saw not nothing, but too much. She saw her own past, her own folly and vaulting ambition, her own foolish confidence. There, in that lab, she had bred this nightmare, raised it on a steady diet of her own disastrous questions.

It had seemed so simple back then. The first New Law robots had passed their in-house laboratory trials. After rather awkward and fractious negotiations, it had been agreed they would be put to use at Limbo. It was a mere question of manufacturing more robots and getting them ready for shipment. That would require effort and planning, yes, but for all intents and purposes, the New Law project was complete insofar as Fredda was concerned. She had time on her hands, and her mind was suddenly free once again to focus on the big questions. Basic, straightforward questions, obvious follow-ons to the theory and practice of the New Law robots.

If the New Laws are truly better, more logical, better suited to the present day, then won’t they fit a robot’s needs more fully? That had been the first question. But more questions, questions that now seemed foolish, dangerous, threatening, had followed. Back then they had seemed simple, intriguing, exciting. But now there was a rogue robot on the loose, and a city on edge enough that riots could happen.

If the New Laws are not best suited to the needs of a robot living in our world, then what Laws would be? What Laws would a robot pick for itself?

Take a robot with a wholly blank brain, a gravitonic brain, without the Three Laws or the New Laws ingrained into it. Imbue it instead with the capacity for Laws, the need for Laws. Give it a blank spot, as it were, in the middle of its programming, a hollow in the middle of where its soul would be if it had a soul. In that place, that blank hollow, give it the need to find rules for existence. Set it out in the lab. Create a series of situations where it will encounter people and other robots, and be forced to deal with them. Treat the robot like a rat in a maze, force it to learn by trial and error.