“Yes, sir. Of course.” Donald turned his back on Kresh and Fredda. “I have now shut down my audio receptors.”
“Very good,” said Kresh. More damn fool precautions, but that couldn’t be helped. At least now Donald would be unable to hear or eavesdrop. Now they would be able to talk without fear of saying the wrong thing in front of the robot and accidentally setting up a damn fool First Law crisis. Kresh turned toward Fredda. “What about the Robotic Planetary Control Center?” he asked. “I wanted to consult with it—and with the Computational Planetary Control Center—before I reached a decision.”
“Well, what about them?” Fredda asked.
The two control centers were the heart of the reterraforming effort, performing all the calculations and analyses of each new project before it was launched. The original intent had been to build a single control center. There were two basic designs to choose between. One was a Settler-style computational unit, basically a massively complex and powerful, but quite nonsentient, computer. The other was a Spacer-style robotic unit that would be based on a hugely powerful positronic brain, fully imbued with the Three Laws. It would, in effect, be a robot mind without a robot body.
There had been a tremendous controversy over whether to trust the fate of the planet to a mindless machine, or to a robotic brain that would refuse to take necessary risks. It was easy to imagine a robotic control unit choosing to avoid harm to one human, rather than permit a project vital to the future of the planet. The robotics experts all promised that it didn’t work that way, but experts had been wrong before. Governor Grieg had died before he could reveal his choice between the two systems. In one of the first acts of his administration, Kresh had decided to build both, and interconnect them so that the two systems worked in consensus with each other. In theory, if the two systems could not reach agreement on what to do, or not to do, they were to call in human referees to decide the issue. In practice, the two systems had agreed with each other far more often than anyone could have hoped. Thus far, there had only been a half dozen or so very minor issues that had required human decisions.
A vast planetary network of sensors and probes, orbiting satellites, mobile units, and on-site investigators, both robotic and human, fed a constant stream of information to both units—and both units fed back a constant stream of instructions and commands to the humans and robots and automatic machines in the field.
The two interconnected control centers were the only devices on the planet capable of handling the constant stream of incoming data and outgoing instructions. Obviously, the two of them would have to be consulted regarding the plan to drop a comet on the planet, but Kresh did not wish to risk the sanity of the robotic unit. “You saw what just happened to Donald,” he said. “Will I burn the Robotic Center out if I ask it what I should do?”
Fredda smiled reassuringly. “There wouldn’t be much point in having a Robotic Control Center that couldn’t consider risks to the planet without damaging itself,” she said. “It took some doing, but we installed some special… safeguards, shall we say, that should keep it from experiencing any serious First Law conflict.”
“Good, good,” said Kresh, a bit absently. “At least that’s one less thing to worry about. At least we know that part is all right.”
“Do we?” Fredda asked. “I wonder. When Lentrall asked me about Donald’s name, and how it was not from Shakespeare, that made me wonder.”
“Wonder what?”
“I was absolutely certain it was from Shakespeare. No doubt at all in my mind. I never bothered to double-check, any more than I would have bothered to double-check the spelling of my own name. I thought I knew it—and I was dead wrong.”
“We all make mistakes,” Kresh said.
“Yes, yes, of course,” Fredda said, impatiently. “But that’s not the point. In a sense, it’s a trivial mistake. But it came out of a trusted database. Who knows how long ago the database got scrambled, or what else in it is wrong? And if that database can be wrong, lots of other things can be as well. What else is there that we think we know? What other hard fact that we think we have absolutely right will turn out to be absolutely dead wrong? What else do we think we know?”
There was a long moment of uncomfortable silence.
But uncertainty surrounded all of life. To wait until one was sure was to remain frozen in place until it was too late. “We’ll never be able to answer that question,” said Kresh. He paused for a moment and thought. “You’re thinking like a scientist,” he said. “And up until now, I’ve been thinking like a politician. Maybe it’s time to think like a police officer.”
“I must admit that I do not see how the police viewpoint would be of much use in this situation,” said Fredda.
“Because back when I was a policeman, I knew I didn’t know,” said Kresh. “I knew, on every case, that some knowledge was hidden, and that I would never have absolutely complete or totally accurate information. But I still had to act. I still had to decide. I had to take the facts I had—or thought I had—and ride them as far as they would take me.” He stepped around Donald so he was facing the robot. He waved his hand in front of Donald’s face. “All right, Donald,” he said. “You can turn around and listen now.”
“Thank you, sir,” Donald replied.
Kresh smiled at Donald, then paused a moment and walked to the center of the room. He looked from Donald to Fredda, and then turned around to look at the rainstorm again, to look at nothing at all. “By the time I know enough to decide what to do, it will be too late to decide. Therefore, we will work on the assumption that we are going to divert Comet Grieg. All preparations will go forward as if we were indeed planning to do the job.”
“So we pretend that you’ve decided?” Fredda asked.
“More or less,” Kresh said. “It will buy me some time. I won’t have to decide until it’s actually time to deflect the comet.”
“That’s a dangerous move,” Fredda said. “It’s going to be hard to invest all that time and effort and money and then pull back at the last moment.”
“It’s not the best way to do it,” Kresh agreed. “But can you think of any way that’s less bad? That at least gives us time to examine our options?”
“No,” Fredda admitted.
“Then I think we’d better do it my way,” said Kresh.
“That leaves us with a hell of a lot to do,” Fredda said. “There’s the space-side interception and diversion to set up, the targeting to plan, the site survey of wherever the comet’s going to hit, evacuation of people and equipment, emergency preparations for the cities, food stockpiles to lay in—”
“Excuse me, Dr. Leving, but, if I may say so, that is the sort of organizational job I was made to do.”
Kresh smiled. Fredda ought to know that. She had made Donald in the first place. It was as close to a joke as Donald was ever likely to get. “Point taken,” Kresh said. “Donald, I want you to get started on the initial organizational tasks right now. Project management is to be your primary duty, and you are to avoid allowing other tasks to interfere. You are to perform no further personal service for me unless specifically ordered to do so. Report to me via hyperwave in three hours’ time as to project status. Thereafter, you are to consult with me as you see fit. Fredda, with Donald tied up, I’m afraid I’m going to have to borrow Oberon as a pilot. I have a feeling Donald would not permit me to do the flying myself in this weather.”
“Absolutely not,” said Donald.