Agreed, the other robots said.

Perhaps I begin to see the moral dilemma here, Lucius said. What if the people on the ground were somewhat less innocent?

How so? Eve asked.

Suppose they in some way deliberately attracted Aranimas, knowing that he was dangerous?

That would be foolish.

Humans often do foolish things. Suppose they did. Would they then deserve their fate?

This is a value judgment, Adam said.

We have been called upon to make one, Lucius replied.

Unfortunately so. Using your logic, then, we would have to conclude that the concept of individual value requires that humans be held responsible for their actions. The inhabitants of the city would therefore be responsible for their own act and thus deserve their fate. If the hostage were truly innocent and the city inhabitants were not, then the city would have to be sacrificed.

I agree, said Lucius. Eve? Mandelbrot?

I agree also, Eve said.

I wish we had never been asked this question, Mandelbrot sent. I reluctantly agree in this specific case, but I still don’t believe it answers Ariel’s question. What if the death of the innocent hostage merely improved the lives of equally innocent townspeople? To use the Aranimas analogy, what if the hostage-carrying ship aimed at the city were filled with cold virus instead of plutonium? Would it still be acceptable to destroy it?

No, Lucius said. Colds are only an inconvenience except in extremely rare cases.

A worse disease, then. One that cripples but does not kill.

How crippling? How widespread would the effects be? Would food production suffer and thus starve people later? Would the survivors die prematurely of complications brought about by bitterness at their loss? We must know these things as well in order to make a decision.

Then we must give a qualified answer, said Mandelbrot.

Yes. Wish me luck, Lucius said.

Perhaps two seconds had passed while the dialog went on. Aloud, Lucius said to Ariel, “We have considered three specific cases. In the case of a city in mortal peril, if the person in question were not completely innocent in the matter, but the rest of the city’s inhabitants were, then the person would have to be sacrificed. However, if the person were completely innocent but the city inhabitants were not, then the city’s welfare could not take precedence in any condition up to and including the death of the entire city population. Bear in mind that a single innocent occupant of the city would change the decision. In the last case, where an innocent person’s death would only benefit the quality of life in the city, we have not reached a conclusion. We believe it would depend upon how significant the quality change would be, but such change would have to threaten the long-term viability of the populace before it would even be a consideration.”

Perhaps the hostage should be consulted in such a case, Eve sent.

“Indeed. Perhaps the hostage should be consulted in such a case.”

“But not the townspeople?” Ariel asked.

Lucius used the comlink again. Comment?

If time allowed polling the populace, then it would allow removing them from the danger, Mandelbrot pointed out.

Good point. “Probably not,” Lucius said. “It would of course depend upon the individual circumstances.”

Ariel did not look pleased. Lucius was sure she would now order him dismantled, killed to protect the hypothetical inhabitants of her hypothetical city from his improper judgment. He waited for the blast, but when she spoke it wasn’t at all what he expected.

“Frost, maybe it wasn’t a fair question at that. I don’t know what I’d do in that last case.”

“You don’t?”

“No.”

“Then there is no correct answer?”

“I don’t know. Maybe not.”

Janet was smiling. “We were more worried about a wrong answer anyway.”

“I see.”

Wolruf cleared her throat in a loud, gargling growl. “One last ’ypothetical question,” she said. “W’at if the particular ’umans in this city didn’t care about the death of an individual? Say it didn’t matter even to the individual. W’at if it wasn’t part of their moral code? Would you enforce yours on them?”

Lucius suddenly knew the exact meaning of the cliché, “Out of the frying pan into the fire.” Help! he sent over the comlink.

The correct answer is “No,” Mandelbrot sent without hesitation.

You are sure?

Absolutely. Thousands of years of missionary work on Earth and another millennium in space have answered that question definitively. One may persuade by logic, but to impose a foreign moral code by force invariably destroys the receiving civilization. Often the backlash of guilt destroys the enforcing civilization as well. Also, it can be argued that even persuading by logic is not in the best interest of either civilization, as that leads to a loss of natural diversity which is unhealthy for any complex, interrelated system such as a society.

How do you know this?

I read over Ariel’s shoulder.

Janet heard both Ariel and Wolruf sigh in relief when Lucius said the single word, “No.”

She laughed, relieved herself. “You’re very certain of that,” she said.

“Mandelbrot is certain,” Lucius said. “I trust his judgment.”

Mandelbrot. That name. She could hardly believe it, but it had to be.

“I think I trust his judgment, too.” Janet turned to Ariel. “What about you, dear? Satisfied?”

Ariel was slow to answer, but when she did it was a nod. “For now,” she said. “I don’t know if having a learning machine for a mayor will solve everything, but it might solve some of it.”

“Who wants them to solve everything?” Janet asked. “If they did, then we’d really have problems.”

That seemed to mollify Ariel considerably. She nodded and said, “Yeah, well, that’s something to think about, all right.”

No one seemed inclined to carry the discussion any further. Wolruf and Ariel exchanged glances but didn’t speak. The robots all held that particular stiff posture they got when they were using their comlinks. Now that he had removed Basalom’s shoulder joint, Derec was holding the two sections of arm together to see how easy they would be to repair.

Janet turned her attention to Mandelbrot. She looked him up and down, noticing that while most of him was a standard Ferrier model, his right arm was the dianite arm of an Avery robot.

Mandelbrot suddenly noticed her attention and asked, “Madam?”

“Let me guess; you got your name all of a sudden, with no explanation, and had a volatile memory dump at the same time, all when you made a shape-shift with this arm.”

“That is correct,” Mandelbrot said. “You sound as if you know why.”

“I do.” Janet giggled like a little girl. “Oh dear. I just never thought I’d see the result of it so many years later.”

She looked to Derec, then to Ariel, then to Wolruf. “Have you ever thrown a bottle into an ocean with a message inside, just to see if it ever gets picked up?”

Derec and Ariel shook their heads, but Wolruf nodded and said, “Several times.”

Janet smiled her first genuine smile for Wolruf. Maybe she wasn’t so alien after all. She said, “Mandelbrot was a bottle cast in the ocean. And maybe an insurance policy. I don’t know. When I left Wendell, I took all the development notes for the robot cells I’d created with me. I took most of the cells, too, but I knew he’d eventually duplicate the idea and use it for his robots, so since he was going to get it anyway, I left a sample behind in a corner of the lab and made it look like I’d just forgotten it in my hurry. But I altered two of the cells I left behind. I made them sterile, so it would just be those two cells no matter how many copies he made of them, but programmed into each one I left instructions set to trigger after they registered a thousand shape-changes. One was supposed to dump the robot’s onboard memories and change its name to Mandelbrot, and the other was supposed to reprogram it to drop whatever it was doing and track me down wherever I’d gone.”