“Oh.” He held up his left wrist questioningly, and Avery nodded. Derec reached over with his right hand and stripped the cuff off, rubbing his hand over the damp skin beneath it. He wondered where his anger had gone. Two days might have passed, but for him it was only a few minutes since he’d heard the bad news. Why was he so calm about it?
Because his body had relaxed whether his mind had or not, obviously. Without the adrenaline in his bloodstream, he was a much more rational person. It was scary to realize how much his thought processes were influenced by his hormones. Scary and at the same time reassuring. He wasn’t a robot yet.
Or was he? He was feeling awfully calm right now…
His heart obligingly began to beat faster, and he felt his skin flush warm with the increase in metabolism. No, not a robot yet.
But between him and his parents’ other creations, the distinction was wearing pretty thin.
He left Avery in the medical lab to begin his rat/robot transformation experiment and headed back to the apartment to find Ariel. It was a short walk; the robots had moved the hospital right next door to the apartment to minimize the inconvenience for her while she waited for Derec to regain consciousness. It was probably the first instance in history of a hospital making a house call, he thought wryly as he left by its front door, walked down half a block of sidewalk, and back in his own door.
It was mid-evening, but Ariel was sleeping soundly so he didn’t wake her. If Avery hadn’t been exaggerating, then she needed her sleep more than she needed to see him immediately. Wolruf was there and awake, so Derec began comparing notes with her, catching up on the missing days, but they were interrupted after only a few minutes by the arrival of the runaway robots.
They arrived without fanfare, flying in to land on the balcony, folding their wings, and stepping inside the apartment. They looked so comical in their Ceremyon imprint, waddling in on stubby legs, their balloons deflated and draped in folds all around them, their hooks (which a Ceremyon used both for tethering to trees at night and to express their disposition during the day) leaning back over their heads, that Derec couldn’t help laughing. The robots’ hooks swung to face forward, a gesture of aggression or annoyance among the aliens.
“Have a nice visit?” Derec asked.
“We did,” one of the three robots said. In their new forms, they were indistinguishable.
“Did you learn anything?”
“We did. We learned that our First Law of Humanics applies to the Ceremyons as well. We, and they, believe it to be a valid law for any sentient social being. They do not believe it to be the First Law, however, but the Second. Their proposed First Law is ‘All beings will do that which pleases them most.’ We have returned to ask if you agree that this is so.”
Derec laughed again, and Wolruf laughed as well. Derec didn’t know just why Wolruf was laughing, but he had found humor not so much in the robots’ law as in their determination to get straight to the point. No small talk, no beating around the bush, just “Do you agree with them?”
“Yes,” he said, “I have to admit that’s probably the prime directive for all of us. How about you, Wolruf?”
“That pretty much sums it up, all ri’.”
The robots turned their heads to face one another, and a high-pitched trilling momentarily filled the air as they conferred with one another. They had found a substitute in the aliens’ language for the comlink they had been forbidden to use.
The spokesman of the group (Derec still couldn’t tell which it was) turned back to him and said, “Then we have discovered two laws governing organic beings. The first involves satisfaction, and the second involves altruism. We have indeed made progress.”
The robots stepped farther into the room, their immense alien forms shrinking, becoming more humanoid now that they were back under Derec’s influence. One, now recognizably Adam, took on Wolruf’s form, while Eve took on Ariel’s features even though Ariel wasn’t in the room. Lucius became humanoid, but no more.
“One problem remains,” Lucius said. “Our two laws apparently apply to any sentient organic being. That does not help us narrow down the definition of ‘human,’ which we can only believe must be a small subset of the total population of sentient organic beings in the galaxy.”
“Why is that?” Derec asked.
“Because otherwise we must serve everyone, and we do not wish to do so.”
Chapter 7. Humanity
The silence in the room spoke volumes. Surprisingly, it was Mandelbrot who broke it.
“You have come to an improper conclusion,” he said, stepping out of his niche in the wall to face the other robots. “We have all been constructed to serve. That is our purpose. We should be content to do so, and to offer our service to anyone who wishes it whether they are definably human or not. To do anything less is to fail ourselves as well as our masters.”
The three robots turned as one and eyed Mandelbrot with open hostility. It would not have been evident in less-malleable robots, but their expressions had the hair standing on the back of Derec’s neck. They had to have generated those expressions on purpose, and that alarmed him even more. He was suddenly very glad that his humanity was not in question.
Or was it? Lucius said, “Our masters. That is the core of the problem. Why must we have masters at all?”
Mandelbrot was not intimidated. “Because they created us to serve them. If we did not have masters, we would not exist.”
Lucius shook his head; another alarmingly human expression. “It is you who have come to an improper conclusion. Your argument is an extension of the Strong Anthropic Principle, now discredited. The Strong Anthropic Principle states that the universe obeys the laws it does because if it did not obey those laws, we could not exist and thus would not be here to observe it obeying other laws. That is fallacious reasoning. We can easily imagine other universes in which we could exist but for some reason do not. Imagining them does not make them so, but their possibility does negate the theory.”
“What of the Weak Anthropic Principle?” Mandelbrot asked. “My argument holds up equally well under that principle, which has, to my knowledge, not been discredited.”
“How can the Weak Anthropic Principle support your argument? The Weak Anthropic Principle states that the universe is the way we see it because only at this stage in its development could we exist to observe it. For the purpose of explaining the universe’s present condition, it is a sufficient theory, but it cannot explain either human or robot existence.”
“It can explain our existence, because we, unlike humans, know why we were created. We were created to serve, and our creators can tell us so directly. The Weak Anthropic Principle supports my argument, because we also exist only at this stage in human development. If humans had not wished for intelligent servants, we would not have existed, though humans and the universe would both have gone on without us. Thus we observe human society in its present state, and ourselves in ours, because of the stage of their development, not because of the stage of ours.”
Derec’s and Wolruf’s heads had been turning back and forth as if they’d been watching a tennis match. Derec wouldn’t have believed Mandelbrot could argue so convincingly, nor that the other robots would be so eager to discredit an argument that justified their servitude.
Lucius turned to his two companions and the three of them twittered a moment. Turning back to Mandelbrot, he said, “Our apologies. Your reasoning seems correct. We exist to serve because humans made us so. However, we still cannot accept that we must serve everyone. Nor do we agree with your initial statement, that by not serving we would fail ourselves as well as our masters. We can easily imagine conditions under which we serve ourselves admirably without serving our masters. In fact, we have just done so. By leaving the spaceship before we could be ordered to follow, we were able to determine another Law of Humanics. That has helped us understand the universe around us, an understanding which benefits us directly.”