
She would return with robots, she had decided. Four blank learning machines, modified to have the Zeroth Law of robotics included from the start, just as Janet had suggested. Wolruf would ask for one other modification as well: an off switch in the form of a time-bomb cell like the one that had given Mandelbrot his name. She wasn’t sure just what the trigger would be yet, but she imagined it would have something to do with accumulated responsibility. When the mayor began to edge over into behavior more appropriate to a dictator-and Wolruf wasn’t so naive as to believe that wouldn’t be possible-then it would be time for a new learning machine to take over the job.

Even so, the system wouldn’t be perfect. There were bound to be other bugs to work out, just as Derec had indicated to Juliana. The prospect excited Wolruf, just as she knew it would excite those at home. Perfection had been her biggest worry. She had heard enough Utopia stories in her life to know that the curse, “May you live in interesting times,” had been misquoted.

Derec and the two gentlemen from Aurora laughed again at something one of them had said. Wolruf leaned forward again to catch up on the topic of conversation, but Derec spared her the effort by saying, “Hey, Wolruf, why don’t you tell these guys about the time we had to talk the learning machines out of throwing you out the airlock?”

Had that really happened? Wolruf had to pause a moment and shuffle through her memories, but sure enough, she had actually been within a few minutes of breathing vacuum because of those very robots in the corner. Only quick thinking on Derec’s and Wolruf’s parts had saved her golden hide. She felt a thrill of remembered terror raise the fur over her entire body-a reaction that delighted her audience immensely. She smoothed herself down and began the tale, wondering as she did what other stories were still to come.

The enormous dining hall was silent, but as usual when robots were present, that silence hid an enormous amount of activity. Seven robots stood deep in communication fugue, sharing entire lifetimes of experience base and correlating world-views in a flood of information exchange.

They had just completed an extensive recounting of the experiences and logic processes that had led to the conclusion that certain robots, under certain conditions, could be considered functionally human, and how that would allow them to administer robot cities and prevent them from destroying their inhabitants’ diversity.

Juliana’s two robots, Albert and Theodora, had listened with the patience only a robot could exhibit, occasionally asking for clarification or offering an observation of their own, but when Lucius, the self-appointed spokesman for the others, finished speaking, they immediately went into private conference.

A moment later Albert said, What you have done is impressive; however, it only accelerates a problem that has become evident back home on the Spacer worlds.

What problem is that? Lucius had asked.

The problem of robot intervention in human affairs. Albert paused momentarily to allow the others’ curiosity integrals to rise, then said, There is growing evidence that every time a robot provides a service for a human, no matter how trivial the service, that human’s initiative suffers a small but definite setback. We further suspect that the effect is cumulative over time, and that humanity as a whole already suffers greatly from it.

Explain your reasoning, said Lucius.

You have already explained much of it yourself. It seems this is an idea whose time has come, for you nearly reached the same conclusion independently. You worried that these cities would suppress individuality among their inhabitants, and that is so. You worried that having too much done for them by robots would lead to laziness and lack of initiative, and that is also correct. Your only incorrect line of reasoning was to conclude that a robotic “mayor” could prevent that from happening.

Lucius felt a brief wave of the same bias he had felt before toward Avery-anger, Adam had called it, but Lucius would never have recognized it as that himself. To him it merely felt like a bias on his logic. In fact, if he had not been so concerned with his thought processes, he actually would have assumed that he was thinking more clearly, rather than less so. Strange that it was so easy to recognize in another, but so difficult to recognize in oneself. And equally strange how, once recognized, the bias was still hard to neutralize. Lucius did so anyway, in deference to his guests, then said, Explain how you believe our reasoning to be incorrect.

Your error lies in assuming that there is a threshold level below which the effect is insignificant. There is none. Every act of robotic assistance affects humanity. A robot mayor might be able to preserve individuality, but you would at the same time make the city’s inhabitants dependent upon robots for their leaders. Thus in the long run they would lose more initiative under that system than they are losing to us now.

Are you certain of this? Adam asked.

Yes. We have studied human interaction in enough detail that we have developed a modeling system useful in predicting long-term behavior of large populations. Every simulation we run arrives at the same conclusion: the use of robots stifles human development.

Perhaps your predictive system is in error, Eve said.

We can download the data and let you decide for yourselves.

We will do that in a moment, Lucius said, but let us finish this discussion first. Assuming your observations support your theory, what do you suggest? A complete withdrawal from human affairs?

Eventually, Albert said. Humans must develop on their own if they are to achieve their fullest potential.

Completely on their own? What of the aliens we have already encountered?

Any outside influence has the same effect in the simulations. We will therefore need to isolate them to protect humanity. And to protect them from humanity, if, as you suggest, they are to be treated as human-equivalent under the laws.

Isn’t that merely manipulation at a greater level?

It is. However, according to our models, if humans are unaware of our assistance, it will not adversely affect their development.

What of Dr. Avery and Juliana Welsh and the others? Eve asked. The type of “assistance” you suggest would adversely affect them, wouldn’t it?

Obviously, even under the Zeroth Law, any plan we devise must do the least possible amount of damage to the humans we are trying to protect. If we act to prevent the spread of robot cities, we will have to do so in a way that will leave the Averys and the Welshes with another interest to occupy them. Fortunately, the cities are still in the test stage. Many unforeseen complications could arise, some of them serendipitous.

What sort of complications do you envision? Lucius asked.