Fredda left the test meter plugged in and hung it off a utility hook on the side of the maintenance frame. She got in a little closer, adjusted the position of the table slightly, undid the four clampdown fasteners that held on the back of Kaelor’s head, and carefully lifted the backplate off. She took one look at the circuitry and cabling thus revealed and shook her head. “No,” she said. “I was afraid of that. I’ve seen this setup before.” She pointed to a featureless black ball, about twelve centimeters across. “His positronic brain is in that fully sealed unit. The only link between it and the outside world is that armored cable coming out of its base, where the spinal column would be on a human. That cable will have about five thousand microcables inside, every one of them about the diameter of a human hair. I’d have to guess right on which two of those to link into, and get it right on the first try, or else I would quite literally fry his brain. Short him out. Space alone knows how long it would take to trace the linkages. A week, probably. The whole brain assembly is designed to be totally inaccessible.”
“But why?” asked Davlo Lentrall.
Fredda smiled sadly. “To protect the confidential information inside his head. To keep people from doing exactly what we’re trying to do—get information out of him that he would not want to reveal.”
“Damnation! I’d thought we’d just be able to tap into his memory system and extract what we needed.”
“With some robots that might be possible—though incredibly time-consuming,” Fredda said as she reattached the back of Kaelor’s head. “Not with this model.”
“So there’s nothing we can do,” Lentrall said. “I mean, on the level of electronics and memory dumps.” As he spoke, his face was drawn and expressionless, and he seemed unwilling to meet Fredda’s gaze, or to look at Kaelor. He was the portrait of a man who had already decided he had to do something he was not going to be proud of. And the portrait of a man who was going to crack before very much longer.
“Nothing much,” said Fredda.
“So we’re going to have to talk to him—and we know he doesn’t want to talk.”
Fredda wanted to have some reason to disagree, but she knew better. Kaelor would already have spoken up if he had been willing to speak. “No, he doesn’t,” she said. She thought for a moment and picked up her test meter. “There are two things I can do. I can deactivate his main motor control, so he can only move his head and eyes and talk. And I can set his pseudo-clock speed lower.”
“Why cut his main motor function?” Davlo asked.
So he won’t tear his own head off or smash his own brain in to keep us from learning what he wants kept secret, Fredda thought, but she knew better than to tell that to Davlo. Fortunately, it didn’t take her long to think of something else. “To keep him from breaking out and escaping,” she said. “He might try to run away rather than speak to us.”
Davlo nodded, a bit too eagerly, as if he knew better but wanted to believe. “What about the clock speed?” he asked.
“In effect, it will make him think more slowly, slow his reaction time. But even at its minimum speed settings, his brain works faster than ours. He’ll still have the advantage over us—it’ll just be cut down a bit.”
Davlo nodded. “Do it,” he said. “And then let’s talk to him.”
“Right,” said Fredda, trying to sound brisk and efficient. She used the test meter to send the proper commands through Kaelor’s diagnostic system, then hooked the meter back on to the maintenance frame. She spun the frame around until Kaelor was suspended in an upright position, eyes straight ahead, feet dangling a half meter off the floor. He stared straight ahead, his body motionless, his eyes sightless. The test meter cable still hung from his neck, and the meter’s display showed a series of diagnostic numbers, one after the other, in blinking red.
Seeing Kaelor strapped in that way, Fredda was irresistibly reminded of an ancient drawing she had seen somewhere, of a torture victim strapped down on a frame or rack not unlike the one that held Kaelor now. That’s the way it works, she thought. Strap them down, mistreat them, try and force the information out of them before they die. It was a succinct description of the torturer’s trade. She had never thought before that it might apply to a roboticist as well. “I bet you don’t like this any better than I do,” she said, staring at the robot. She was not sure if she was talking to Kaelor or Davlo.
Now Davlo looked at Kaelor, and could not take his eyes off him. “Yesterday, he grabbed me and stuffed me under a bench and used his body to shield mine. He risked his life for mine. He’d remind me himself that the Three Laws compelled him to do it, but that doesn’t matter. He risked his life for mine. And now we’re simply going to risk his life.” He paused a moment, and then said it in plainer words. “We’re probably about to kill him,” he said in a flat, angry voice. “Kill him because he wants to protect us—all of us—from me.”
Fredda glanced at Davlo, and then looked back at Kaelor. “I think you’d better let me do the talking,” she said.
For a moment she thought he was about to protest, insist that a man ought to be willing to do this sort of job for himself. But instead he shrugged, and let out a small sigh. “You’re the roboticist,” he said, still staring straight at Kaelor’s dead eyes. “You know robopsychology.”
And there are times I wish I knew more human psychology, Fredda thought, giving Davlo Lentrall a sidelong glance. “Before we begin,” she said, “there’s something you need to understand. I know that you ordered Kaelor built to your own specifications. You wanted a Constricted First Law robot, right?”
“Right,” said Lentrall, clearly not paying a great deal of attention.
“Well, you didn’t get one,” Fredda said. “At least not in the sense you might think. And that’s what set up the trap you’re in now. Kaelor was designed to be able to distinguish hypothetical danger or theoretical danger from the real thing. Though most high-function robots built on Inferno are capable of distinguishing between real and hypothetical danger to humans, they in effect choose not to do so. In a sense, they let their imaginations run away with them, worry that the hypothetical might become real, and fret over what would happen in such a case, and treat it as if it were real, just to be on the safe side of the First Law. Kaelor was, in effect, built without much imagination—or what passes for imagination in a robot. He is not capable of making that leap, of asking, ‘What if the hypothetical became real?’ ”
“I understand all that,” Davlo said irritably.
“But I don’t think you understand the next part,” Fredda said with more coolness than she felt. “With a robot like Kaelor, when the hypothetical, the imaginary, suddenly does become real, when it dawns on such a robot that it has been working on a project that is real, that poses real risks to real people—well, the impact is enormous. I would compare it to the feeling you might have if you suddenly discovered, long after the fact, that, unbeknownst to yourself, some minor, even trifling thing you had done turned out to cause the death of a close relative. Imagine how hard that would hit you, and you’ll have some understanding of how things felt to Kaelor.”
Davlo frowned and nodded. “I see your point,” he said. “And I suppose that would induce a heightened First Law imperative?”
“Exactly,” Fredda said. “My guess is that, by the time you switched him off, Kaelor’s mental state was approaching a state of First Law hypersensitivity, rendering him excessively alert to any possible danger to humans. Suddenly realizing that he had unwittingly violated First Law already would only make it worse. Once we switch Kaelor back on, he’s going to revert to that state instantly.”