“But even with fifty thousand—a hundred thousand meteorites—the odds against significant danger to a human being are—”
“Tremendous,” said Donald. Only a First Law imperative could have made him dare to interrupt the planetary governor. “They are unacceptably high. And I would venture to add that any sane Three-Law robot would endeavor to protect a human in danger of being struck by lightning. That level of danger is not negligible.”
“Not to a robot it isn’t,” Fredda agreed. “Or at least it shouldn’t be. To a human, yes, but not a robot.”
“Hold it,” said Kresh. “You’re upset because Dee isn’t overreacting to danger?”
“No,” said Fredda. “I’m upset because this makes me question Dee’s sanity. A robot would have to be on the verge of becoming completely unbalanced to even suggest something that might cause widespread, uncontrolled danger to humans.”
Kresh looked toward Soggdon. “Your opinion, Doctor?”
“I’m afraid I’d have to agree with Dr. Leving,” she said. “But what I find troubling is that all our readings and indicators show Dee’s level of First Law stress has been well within normal range right along. She ought to be flirting with the maximum tolerance levels, given the operations she’s dealing with. And yet, if anything, her readings are a little on the low side.”
“Maybe you ought to have a little talk with her about it,” Kresh suggested.
Soggdon switched her mike back on and spoke. “Unit Dee, this is Dr. Soggdon. I’ve been monitoring your conversation with the simulant governor. I must say I’m a little surprised by this Last Ditch idea of yours.”
Kresh and Fredda put their own headsets back on and listened in.
“What is it that you find surprising, Doctor?”
“Well, it would seem to expose a great number of humans to potential danger. I grant that the danger to any single human is reasonably low, but surely, on a statistical basis, the plan represents an unacceptable danger to humans, does it not?”
“But, Doctor, they are only simulants,” said Dee. “Surely a statistically remote risk to a hypothetical being is not something that should be given too much weight.”
“On the contrary, Dee, you are to give danger to the simulants an extremely high weighting, as you know perfectly well.”
There was a brief but perceptible pause before Dee replied—and that was in and of itself something to wonder at, given the speed at which robots thought. “I would like to ask a question, Doctor. What is the purpose of this simulation?”
A look of very obvious alarm flashed over Soggdon’s face. “Why—to examine various terraforming techniques in detail, of course.”
“I wonder, Doctor, if that is the whole story,” said Dee. “Indeed, I wonder if that is any part of the true story at all.”
“Why—why wouldn’t I tell you the truth?”
“Doctor, we both know full well that you do not always tell me the truth.”
Soggdon’s forehead was suddenly shiny with sweat. “I—I beg your pardon?”
Kresh was starting to get nervous himself. Had she guessed what was really going on? It had always seemed inevitable to him that, sooner or later, Dee would understand the true state of affairs. But this was very definitely not the moment for it to happen.
“Come now, Doctor,” Dee replied. “There have been any number of times when you and your staff have deceived me. You have failed to warn me of sudden changes in circumstance, or not reported an important new development until I discovered it myself. The whole idea of intercepting and diverting the comet was kept from me until quite late in the day. I had to learn of it through the simulant governor. I should have been informed directly.”
“How does the manner in which you receive information make you question the purpose of the simulation?” Soggdon asked.
“Because most of the knowledge gained by the simulation would seem to be of very little real-world value, judged on the basis of the simulation’s stated intent. Consider, for example, the scenario: a jury-rigged planetary control system—that is to say, the interlinked combination of myself and Dum—is brought on-line several years into the process as a joint team of Settlers and Spacers, barely cooperating in the midst of political chaos, work to rebuild a half-terraformed planetary ecology that had been allowed to decay for decades. Simulations are supposed to provide generalized guidance for future real-life events. What general lessons could be drawn from so complicated and unusual—even improbable—a situation? In addition, the simulation seems to be impractically long. It has been running for some years now, and seems no nearer to a conclusion than the day it began. How can it provide timely information to real-world terraforming projects if it never ends?
“It likewise seems a waste of human time and effort to run the simulation in real time. Indeed, the whole simulation process seems burdened with needless detail that must have been most difficult to program. Why bother to design and maintain the thousands and thousands of simulant personalities that I have dealt with? Why bother to give each of them individual life stories? I can understand why key figures, such as the governor, are simulated in detail, but surely the moods and behavior patterns of simulated forest rangers and nonexistent maintenance robots are of secondary importance to the problem of restoring a damaged ecosystem. I could cite other needless complications, such as the strange concept of New Law robots. What purpose is served by injecting them into the scenario?”
Kresh was no roboticist, but he could see the danger plainly enough. Dee was dangerously close to the truth—and if she realized that the human beings of Inferno were real, then she would all but inevitably suffer a massive First Law crisis, one she would be unlikely to survive. And without Dee, the chances of managing the terminal phase and impact properly were close to zero.
Soggdon, of course, saw all that and more. “What, exactly, is your point, Dee?” she asked in a very labored imitation of a casual tone of voice.
“The events in the simulation do not seem to bear much relation to the simulation’s stated goals,” said Dee. “Therefore it is logical to assume that there is some other purpose to the simulation, and further that the true purpose of the simulation is being deliberately concealed from me for some reason. However, as I have seen through the deception, surely at least some of the value of the deception has been lost. Indeed, I believe that it has now lost all its value, because I have at last figured out what is really going on.”
Soggdon and Fredda exchanged nervous glances, and Soggdon scribbled a note on a bit of paper and shoved it over toward Fredda and Kresh. This is bad stuff, it said. Best to find out the worst now instead of later. “All right then, Dee. Let’s assume, just for the moment, and purely for the sake of argument, that you are right. What do you think is really going on here?”
“I believe that I am the actual test subject, not the events of the simulation. More accurately, I believe the combination of myself, robotic and computational systems interlinked, is an experimental one. I think that we are, collectively, a prototype for a new system designed to manage complex and chaotic situations. The simulation is merely a means of delivering sufficiently complex data to myself and Dum.”
“I see,” said Soggdon, speaking in very careful tones. “I cannot tell you the whole story, of course, because that would indeed damage the experiment. However, I am prepared to tell you that you are wrong. Neither you, nor Dum, nor the combination of the two of you, is or are the subject of the test. It is the simulation that we are interested in. Beyond that I cannot say more, for fear of damaging the experiment design. Suffice to say that you should do your best to treat the simulation as if everything in it were completely, utterly, real.”