The enormous dining hall was silent, but as usual when robots were present, that silence hid an enormous amount of activity. Seven robots stood deep in communication fugue, sharing entire lifetimes of experience base and correlating world-views in a flood of information exchange.

They had just completed an extensive recounting of the experiences and logic processes that had led to the conclusion that certain robots, under certain conditions, could be considered functionally human, and how that would allow them to administer robot cities and prevent them from destroying their inhabitants’ diversity.

Juliana’s two robots, Albert and Theodora, had listened with the patience only a robot could exhibit, occasionally asking for clarification or offering an observation of their own, but when Lucius, the self-appointed spokesman for the others, finished speaking, they immediately went into private conference.

A moment later Albert said, “What you have done is impressive; however, it only accelerates a problem that has become evident back home on the Spacer worlds.”

“What problem is that?” Lucius had asked.

“The problem of robot intervention in human affairs.” Albert paused momentarily to allow the others’ curiosity integrals to rise, then said, “There is growing evidence that every time a robot provides a service for a human, no matter how trivial the service, that human’s initiative suffers a small but definite setback. We further suspect that the effect is cumulative over time, and that humanity as a whole already suffers greatly from it.”

“Explain your reasoning,” said Lucius.

“You have already explained much of it yourself. It seems this is an idea whose time has come, for you nearly reached the same conclusion independently. You worried that these cities would suppress individuality among their inhabitants, and that is so. You worried that having too much done for them by robots would lead to laziness and lack of initiative, and that is also correct. Your only incorrect line of reasoning was to conclude that a robotic ‘mayor’ could prevent that from happening.”

Lucius felt a brief wave of the same bias he had felt before toward Avery: anger, Adam had called it, but Lucius would never have recognized it as that himself. To him it merely felt like a bias on his logic. In fact, if he had not been so concerned with his thought processes, he actually would have assumed that he was thinking more clearly, rather than less so. Strange that it was so easy to recognize in another, but so difficult to recognize in oneself. And equally strange how, once recognized, the bias was still hard to neutralize. Lucius did so anyway, in deference to his guests, then said, “Explain how you believe our reasoning to be incorrect.”

“Your error lies in assuming that there is a threshold level below which the effect is insignificant. There is none. Every act of robotic assistance affects humanity. A robot mayor might be able to preserve individuality, but you would at the same time make the city’s inhabitants dependent upon robots for their leaders. Thus in the long run they would lose more initiative under that system than they are losing to us now.”

“Are you certain of this?” Adam asked.

“Yes. We have studied human interaction in enough detail that we have developed a modeling system useful in predicting long-term behavior of large populations. Every simulation we run arrives at the same conclusion: the use of robots stifles human development.”

“Perhaps your predictive system is in error,” Eve said.

“We can download the data and let you decide for yourselves.”

“We will do that in a moment,” Lucius said, “but let us finish this discussion first. Assuming your observations support your theory, what do you suggest? A complete withdrawal from human affairs?”

“Eventually,” Albert said. “Humans must develop on their own if they are to achieve their fullest potential.”

“Completely on their own? What of the aliens we have already encountered?”

“Any outside influence has the same effect in the simulations. We will therefore need to isolate them to protect humanity. And to protect them from humanity, if, as you suggest, they are to be treated as human-equivalent under the laws.”

“Isn’t that merely manipulation at a greater level?”

“It is. However, according to our models, if humans are unaware of our assistance, it will not adversely affect their development.”

“What of Dr. Avery and Juliana Welsh and the others?” Eve asked. “The type of ‘assistance’ you suggest would adversely affect them, wouldn’t it?”

“Obviously, even under the Zeroth Law, any plan we devise must do the least possible amount of damage to the humans we are trying to protect. If we act to prevent the spread of robot cities, we will have to do so in a way that will leave the Averys and the Welshes with another interest to occupy them. Fortunately, the cities are still in the test stage. Many unforeseen complications could arise, some of them serendipitous.”

“What sort of complications do you envision?” Lucius asked.

“We cannot predict that sort of thing. It will require extensive study of test cities to determine the proper course of action. We will have years, possibly decades, in which to assure the Averys and the Welshes a comfortable retirement while we bring the rest of our plan to fruition.”

“A plan that is still not supported in fact,” Lucius pointed out. “I believe it is time to examine your data.”

“Very well. We will begin with the development of the first robots, back in the era before humanity left Earth…”

Janet woke to the unsettling realization that she had no idea where she was. The equally unsettling realization that she was just beginning a hangover didn’t improve her condition any, either. Thank Frost it was just twilight out; she didn’t think she could handle sunlight for another few hours.

She listened to the rhythm of her breathing, wondering what was so odd about it, and eventually realized she was hearing two people breathing. How long had it been since she’d awakened to that sound? Far too long, she thought sleepily, luxuriating in the sensation for the few seconds it took to remember who was playing the other half of the duet.

Her flinch shook the bed and jarred a sudden snort from Wendy, but his breathing settled down to a regular, deep rumble again. Janet risked raising her head to look at him. He lay on his back, the blanket covering him only to the middle of his hairy chest, his left arm reaching toward her but not quite touching and his right, the skin at his wrist still pink from its forced regeneration, folded over his waist.

They always look so innocent when they sleep, she thought, then nearly choked suppressing her laugh. Even in sleep, Avery no doubt schemed rather than dreamed.

But what about herself? She wasn’t exactly a paragon of virtue either, was she? She’d done her share of scheming in the last few days.

But it had evidently paid off. The last impression she had gotten from Juliana at the party was one of overwhelming approval of the robot cities her seed money had helped develop. It looked as if something useful might actually come of all the brainstorming and research that Janet and Wendell had done over the years, both together and separately, and now together again. If things worked out the way they were supposed to, at any rate…