“What of it?”

“Well, obviously, in the task of remaking a planet, there are going to be accidents. There are going to be people displaced from their homes, people who suffer in floods and droughts and storms deliberately produced by the actions and orders of these two control systems. They will, inevitably, cause some harm to some humans somewhere.”

“I thought that the system had been built to endure that sort of First Law conflict. I’ve read about systems that dealt with large projects and were programmed to consider benefit or harm to humanity as a whole, rather than to individuals.”

Soggdon shook her head. “That only works in very limited or specialized cases—and I’ve never heard of it working permanently. Sooner or later, robotic thinking machines programmed to think that way can’t do it anymore. They burn out or fail in any of a hundred ways—and the cases you’re talking about are robots who were expected to deal with very distant, abstract sorts of situations. Unit Dee has to worry about an endless series of day-by-day decisions affecting millions of individual people—some of whom she is dealing with directly, talking to them, sending and receiving messages and data. She can’t think that way. She can’t avoid thinking about people as individuals.”

“So what is the solution?” Kresh asked.

Soggdon took a deep breath and then went on, very quickly, as if she wanted to get it over with as soon as possible. She raised her hand and made a broad, sweeping gesture. “Unit Dee thinks this is all a simulation,” she said.

“What?” Kresh said.

“She thinks that the entire terraforming project, in fact the whole planet of Inferno, is nothing more than a very complex and sophisticated simulation set up to learn more in preparation for a real terraforming project some time in the future.”

“But that’s absurd!” Kresh objected. “No one could believe that.”

“Well, fortunately for us all, it would seem that Unit Dee can.”

“But there’s so much evidence to the contrary! The world is too detailed to be a simulation!”

“We limit what she can see, and know, very carefully,” Soggdon replied. “Remember, we control all of her inputs. She only receives the information we give her. In fact, sometimes we deliberately introduce spurious errors, or send her images and information that don’t quite make sense. Then we correct the ‘mistakes’ and move on. It makes things seem less real—and also establishes the idea that things can go wrong. That way when we do make mistakes in calculations, or discover that we’ve overlooked a variable, or have just plain let her see something she shouldn’t have, we can correct it without her getting suspicious. She thinks Inferno is a made-up place, invented for her benefit. So far as she knows, she is actually in a laboratory on Baleyworld. She thinks the project is an attempt to learn how to interact with Settler hardware for future terraforming projects.” Soggdon hesitated for a moment, and then decided she might as well give him the worst of the bad news all at once. “In fact, Governor, she believes that you are part of the simulation.”

“What!”

“It was necessary, believe me. If she thought you were a real person, she would of course wonder what you were doing in the made-up world of her simulation. We have to work very hard to make her believe the real world is something we have made up for her.”

“And so you had to tell her that I did not really exist.”

“Precisely. From her point of view, sapient beings are divided into three groups—one, those who exist in the real world, but don’t have anything to do with her; two, real-world people here in the lab and in the field who talk with her and interact with her; and three, simulants, simulated intelligences.”

“Simulants,” Kresh said, very clearly not making it into a question. He was ordering her to explain the term, not asking her to do so.

“Ah, yes, sir. That’s the standard industry term for the made-up humans and robots placed in a simulation. Unit Dee believes that the entire population of Inferno is really nothing more than a collection of simulants—and you are a member of that population.”

“Are you trying to tell me I can’t talk to her because she’ll realize that I’m not made-up?” Kresh asked.

“Oh, no, sir! There should be no problem at all in your talking with Unit Dee. She talks every day with ecological engineers and field service robots and so on. But she believes them all to be doing nothing more than playing their parts. It is essential that she believe the same thing about you.”

“Or else she’ll start wondering if her simulated reality is actually the real world, and start wondering if her actions have caused harm to humans,” said Kresh.

“She has actually caused the death of several humans already,” Soggdon replied. “Unavoidably, accidentally, and only to save other humans at other times and places. She has dealt reasonably well with those incidents—but only because she thought she was dealing with simulants. And, I might add, she does have a tendency to believe in her simulants, to care about them. They are the only world she’s ever known.”

“They are the only world there is,” said Kresh. “Her simulants are real-life people.”

“Of course, of course, but my point is that she knows they are imaginary, and yet has begun to believe in them. She believes in them in the way one might care about characters in a work of fiction, or the way a pet owner might talk to her nonsentient pet. On some level Unit Dee knows her simulants are not real. But she still takes a genuine interest in them, and still experiences genuine, if mild, First Law conflict when one of them dies and she might, conceivably, have prevented it. Causing the death of simulants has been extremely difficult for her.

“If she were to find out she had been killing real people—well, that would be the end. She might simply experience massive First Law conflict and lock up altogether, suffer brainlock and die. Or worse, she might survive.”

“Why would it be worse if she survived?” Kresh asked.

Soggdon let out a long weary sigh and shook her head. She looked up at the massive hemisphere. “I don’t know. I can guess. At best, I think she would find ways to shut down the whole operation. We’d try to stop her, of course, but she’s too well hooked in, and she’s awfully fast. I expect she’d order power shutdowns, find some way to deactivate Unit Dum so he couldn’t run the show on his own, erase computer files—that sort of thing. She’d cancel the reterraforming project because it could cause injury to humans.”

“The best sounds pretty bad. And at worst?”

“At worst, she would try to undo the damage, put things back the way they were.” Soggdon allowed herself a humorless smile. “She’d set to work trying to un-reterraform the planet. Galaxy alone knows what that would end up like. We’d shut her down, of course, or at least try to do so. But I don’t need to exaggerate the damage she could do.”

Kresh nodded thoughtfully. “No, you don’t,” he said. “But I still need to talk with her—and with Unit Dum. You haven’t said much about him, I notice.”

Soggdon shrugged. “There’s not much to say. I suppose we shouldn’t even call him a he—he’s definitely an it, a soulless, mindless machine that can do its job very, very well. When you speak with him, you’ll really be dealing with his pseudo-self-aware interface, a personality interface—and, I might add, it is quite deliberately not a very good one. We don’t want to fool ourselves into thinking Unit Dum is something he is not.”