Wolruf said. “I’m not sure about w’at they might decide to do on their own, and I’m not sure about w’at might ‘appen to us even if they just follow orders.”
“I don’t understand.”
“She’s talking about protecting people from themselves,” Ariel said.
“Am I?”
“Sure you are. I’ve been thinking about it, too. The problem with robot cities is that they’re too responsive. Anything you want them to do, they’ll do it, so long as it doesn’t hurt anybody. The trouble is, they don’t reject stupid ideas, and they don’t think ahead.”
“That’s the people’s job,” Janet said.
“Just w’at one of the robots in the forest told me,” Wolruf said. “Trouble is, people won’t always do it. Or w’en they realize they made a mistake, it’ll be too late.”
Janet looked to Derec. “Pessimistic lot you run around with.”
“They come by it honestly,” he said, grinning. “We’ve been burned more than once by these cities. Just about every time, it’s been something like what they’re talking about. Taking things too literally, or not thinking them through.”
“Isn’t Central supposed to be doing that?”
“Central is really just there to coordinate things,” Derec said. “It’s just a big computer, not very adaptable.” He looked down at Basalom again, nodded to Ariel to have her shine the light inside again as well, and peered inside the robot’s shoulder. After a moment he found what he was looking for, reached gingerly inside, and grunted with the strain of pushing something stubborn aside. The something gave with a sudden click and the stump of the robot’s arm popped off, trailing wires.
“There’s also a committee of supervisory robots,” Ariel said, “but they don’t really do any long-range planning either. And they’re all subject to the Three Laws, so anybody who wants to could order them to change something, and unless it clearly hurt someone else, they’d have to do it.”
“No matter how stupid it was,” Janet said.
“Right.” Derec unplugged the wires between Basalom’s upper arm and the rest of his body.
Janet looked thoughtful. “Hmmm,” she said. “Sounds like what these cities all need is a mayor.”
“Mayor?” Wolruf asked.
“Old human custom,” Janet replied. “A mayor is a person in charge of a city. He or she is supposed to make decisions that affect the whole city and everyone in it. They’re supposed to have the good of the people at heart, so ideally they make the best decisions they can for the largest number of people for the longest period of time.”
“Ideally,” Wolruf growled. “We know ‘ow closely people follow ideals.”
“People, sure.” Janet waved a hand toward the four robots in the corner. “But how about dedicated idealists?”
Ariel was so startled she dropped the light. It clattered to the floor and went out, but by the time she bent down to retrieve it, it was glowing again, repaired.
“Something wrong, dear?” Janet asked her.
“You’d let one of them be in charge of a city?”
“Yes, I would.”
“And you’d live there?”
“Sure. They’re not dangerous.”
“Not dangerous! Look at what-”
“Lucius made the right decision, as far as I’m concerned.”
“Maybe,” Ariel said. “What worries me is the thought process he went through to make it.” She clicked off the light; Derec wasn’t working on Basalom anymore anyway. He was staring at Ariel and Janet as if he’d never heard two people argue before. Ariel ignored his astonished look and said, “The greatest good for the greatest number of people. That could easily translate to ‘the end justifies the means.’ Are you seriously suggesting that’s a viable operating principle?”
“We’re not talking an Inquisition here,” Janet said.
“But what if we were? What if the greatest good meant killing forty-nine percent of the population? What if it meant killing just one? Are you going to stand there and tell me it’s all right to kill even one innocent person in order to make life easier for the rest?”
“Don’t be ridiculous. That’s not what we’re talking about at all.”
It took conscious effort for Ariel to lower her voice. “It sure is. Eventually that sort of situation is going to come up, and it scares the hell out of me to think what one of those robots would decide to do about it.”
Janet pursed her lips. “Well,” she said, “why don’t we ask them, then?”
Lucius looked for the magnetic containment vessel he was sure must be waiting for him somewhere. Not finding one, he looked for telltale signs of a laser cannon hidden behind one of the walls. He didn’t find that, either, but he knew there had to be something he couldn’t see, some way of instantly immobilizing him if he answered wrong. The situation was obviously a test, and the price of failure was no doubt his life.
He’d been roused out of comlink fugue and immediately barraged with questions, the latest of which was the oddest one he’d ever been asked to consider, even by his siblings.
“Let me make sure I understand you,” he said. “The person in question is not a criminal? He has done no wrong? Yet his death would benefit the entire population of the city?”
“That’s right.”
Ariel’s stress indicators were unusually high, but Lucius risked his next question anyway. “How could that be?”
“That’s not important. The important thing is the philosophical question behind it. Would you kill that person in order to make life better for everyone else?”
“I would have to know how it would make their lives better.”
“We’re talking hypothetically,” Janet said. “Just assume it does.”
Do you have any idea what the underlying intent is here? Lucius asked via comlink. Perhaps it was cheating, but no one had forbidden him to consult the other robots. A pity Basalom was not on line; his experiences with Janet might provide a clue to the proper answer.
Neither Adam nor Eve answered, but Mandelbrot did. Yesterday I overheard Ariel and Wolruf discussing the possible effect of a robot city on Wolruf’s world. Wolruf was concerned that the use of robots would strip her people of the ability to think and act for themselves. Perhaps this question grew out of that concern.
I think there is more to it than that, Lucius sent. Central, can you replay the conversation that led up to this question?
The robots received the recorded conversation within milliseconds, but it took them considerably longer to sort it all out. At last Lucius said, I believe it is clear now. They are concerned about the moral implications of unwilling sacrifice.
Agreed, the others all said.
Do we have any precedent to go upon?
Possibly, Eve said. There could have been innocent people on Aranimas’s ship. We know that Aranimas took slaves. Yet destroying it to save a city full of Kin was still a proper solution.
That doesn’t quite fit the question we are asked to consider, said Adam. A better analogy might be to ask what if the ship had been crewed only by innocent people?
Innocent people would not have been in that situation alone, Lucius replied.
Mandelbrot said, Aranimas could easily have launched a drone with hostages on board.
Then the hostages would have to be sacrificed, Lucius said immediately. They would be no more innocent than the people on the ground.