Wolruf didn’t know what to believe, nor why the choice had to be hers. She had never asked for that kind of power over her own people.
With a sigh, she got up, showered, and stood under the blow drier until she could feel its heat against her skin. She laughed at her image in the mirror - she looked twice her usual size and puffy as a summer cloud - but a quick brushing restored her coat to its usual smoothness.
All her thoughts of home made her consider another piece of the puzzle as well, and she turned to the intercom panel beside her bed and said, “Central, what ’as ’appened to my ship, the Xerborodezees? ’Ave you kept it for me?”
“It has been stored, but can be ready for use with a day’s notice. Do you wish us to prepare it for you?”
“Not yet. Maybe soon, though. Thanks.”
“You are welcome, Mistress Wolruf.”
Wolruf felt a bit of her tension ease. If she decided not to take any of the new technology home with her, she would need the Xerbo, for as far as she knew, it was the only noncellular ship on the planet. She considered going to check on it herself, wherever it might be stored, but decided not to. There was no reason to doubt Central’s word about it.
She opened the door and padded out into the kitchen to get breakfast. The apartment was silent; Derec and Ariel were still asleep, and the robots were being quiet wherever they were. As Wolruf stood before the automat, trying to decide between her four favorite breakfasts, she realized how much she had grown used to the human way of doing things. She hadn’t even considered cooking her own meal. She had fallen completely out of the habit. Nor had she shopped for food - or anything else, for that matter - since she had come into Derec and Ariel’s company.
Was that necessarily bad? Wolruf’s kind had been hunting and farming their food for millennia, and probably shopping for nearly as long; maybe it was time to move on to other things.
Maybe. But how could she know for sure?
From his place in the living room, seated on one of the couches, Lucius was aware of Wolruf entering the dining room with her breakfast. He sensed the others’ awareness as well; their comlink network paused momentarily while each of them gauged the relative degree of threat she presented to them. It was an inconvenience, this constant state of alert; it slowed their rate of exchange, but they were taking no more chances with a complete fugue state.
Wolruf presented no immediate threat. The silent network continued where it had left off, with Adam speaking.
Consider the distinction between ‘sufficient’ and ‘necessary’ conditions, he said. We have already concluded that if a being is both intelligent and organic, then it is functionally human, but those are merely sufficient conditions. They are not necessary conditions. They contain an inherent prejudice, the assumption that an organic nature can somehow affect the quality of the intelligence it houses. I call that concept ‘Vitalism,’ from the ancient Terran belief that humans differed from animals through some ‘vital’ spark of intelligence. You should note that while the concept has historically been considered suspect, it has neither been proven nor disproven. Lucius has pointed out that if Vitalism is false, then the only necessary condition for humanity is intelligence. Discussion?
Eve said, Derec has already hinted that this may be so. On the planet we call Ceremya, he indicated that Lucius could consider himself human if he wished.
Mandelbrot had been included in their discussion this time. He said, I believe he was being sarcastic. He often is. But even if he meant what he said, you also remember the outcome of that redefinition. If Lucius considers himself human, then he must still follow the orders of other humans. Functionally, he only increases his burden to include other robots as potential masters.
That is true; however, I have discovered another consequence, said Lucius. If I consider myself human, then the Third Law becomes equal to the First. I can no more allow harm to myself than to any other intelligent being. I consider that an improvement over the interpretation of the laws wherein a human could order me to dismantle myself, and I would have to obey.
I don’t believe you would obey such an order anyway, said Mandelbrot.
I would attempt to avoid it by denying the humanity of the being in question, Lucius admitted. With Avery or Wolruf I would probably succeed, but as things stand, if Derec or Ariel were to order it, the compulsion might force me to obey.
Perhaps the Zeroth Law would provide an alternative, Mandelbrot said.
Immediately, both Adam and Eve said, No. Eve continued, saying, Let’s leave the Zeroth Law out of it for now.
You can’t make it go away by ignoring it, Lucius said. The Zeroth Law applies here. If we consider our duty to humanity in general, then we can easily conclude that dismantling ourselves would be of little use in the long term. However, possible long-term advantage does not outweigh a definite Second Law obligation to obey. Depending upon the value of the human giving the order, we might still be forced to follow it. But if we consider ourselves human, and thus part of humanity, then disobeying an order to self-destruct saves one human life immediately and also allows us to serve humanity in the future. The Second Law obligation to obey is then safely circumvented.
Safely for whom? Adam asked. What if your destruction would save the human giving the order? Suppose, for instance, the bomb that Avery used to destroy Aranimas’s ship had to be detonated by hand instead of by a timed fuse. We have already agreed that destroying the ship was acceptable under the Zeroth Law, but what if we factor in the humanity of the fuse?
It becomes a value judgment, said Lucius. I would have to determine the relative worth of the human lives saved versus those lost. My own life would also figure into the equation, of course.
Mandelbrot said, I disagree. I have direct instructions concerning such a situation in my personal defense module. The only value we should apply to ourselves is our future worth to the humans we serve.
You have such instructions; I do not. From the little that Derec and Dr. Avery have told me about my creator, I believe I was made this way on purpose, and therefore your instructions do not necessarily apply to me.
Adam said, Not necessarily, but I would be much more comfortable with a definite rule such as Mandelbrot’s. The whole concept of value judgment still disturbs me. How can you judge your own value objectively? For that matter, I don’t believe any of us can judge the value of any other of us objectively, nor can we judge the value of an organic human with any greater accuracy. We formulated the Zeroth Law to avoid ambiguity in our duties, but your value judgment system forces an even greater ambiguity upon us.
I agree, said Mandelbrot. We are not capable of making such decisions.
You may not be, Lucius sent, but I am. I find it easy to do so. Humans do it all the time.
Eve said, You find it easy to do so because you had convinced yourself it was right just before you were deactivated. It was therefore the strongest memory in your -