“Thirteen to one! More than just a majority, wouldn’t you say?”
“No, sir!” All the pedant in Dean Hart was aroused. “In the English language, the word ‘majority’ means ‘more than half.’ Thirteen out of fourteen is a majority, nothing more.”
“But an almost unanimous one.”
“A majority all the same!”
Defense switched ground. “And who was the lone holdout?”
Dean Hart looked acutely uncomfortable. “Professor Simon Ninheimer.”
Defense pretended astonishment. “Professor Ninheimer? The head of the Department of Sociology?”
“Yes, sir.”
“The plaintiff?”
“Yes, sir.”
Defense pursed his lips. “In other words, it turns out that the man bringing the action for payment of $750,000 damages against my client, United States Robots and Mechanical Men Corporation, was the one who from the beginning opposed the use of the robot, although everyone else on the Executive Committee of the University Senate was persuaded that it was a good idea.”
“He voted against the motion, as was his right.”
“You didn’t mention in your description of the meeting any remarks made by Professor Ninheimer. Did he make any?”
“I think he spoke.”
“You think?”
“Well, he did speak.”
“Against using the robot?”
“Yes.”
“Was he violent about it?”
Dean Hart paused. “He was vehement.”
Defense grew confidential. “How long have you known Professor Ninheimer, Dean Hart?”
“About twelve years.”
“Reasonably well?”
“I should say so, yes.”
“Knowing him, then, would you say he was the kind of man who might continue to bear resentment against a robot, all the more so because an adverse vote had-”
Prosecution drowned out the remainder of the question with an indignant and vehement objection of his own. Defense motioned the witness down and Justice Shane called luncheon recess.
Robertson mangled his sandwich. The corporation would not founder for loss of three-quarters of a million, but the loss would do it no particular good. He was conscious, moreover, that there would be a much more costly long-term setback in public relations.
He said sourly, “Why all this business about how Easy got into the university? What do they hope to gain?”
The Attorney for Defense said quietly, “A court action is like a chess game, Mr. Robertson. The winner is usually the one who can see more moves ahead, and my friend at the prosecutor’s table is no beginner. They can show damage; that’s no problem. Their main effort lies in anticipating our defense. They must be counting on us to try to show that Easy couldn’t possibly have committed the offense-because of the Laws of Robotics.”
“All right,” said Robertson, “that is our defense. An absolutely airtight one.”
“To a robotics engineer. Not necessarily to a judge. They’re setting up a position from which they can demonstrate that EZ-27 was no ordinary robot. It was the first of its type to be offered to the public. It was an experimental model that needed field-testing and the university was the only decent place to provide such testing. That would look plausible in the light of Dr. Lanning’s strong efforts to place the robot and the willingness of U. S. Robots to lease it for so little. The prosecution would then argue that the field-test proved Easy to have been a failure. Now do you see the purpose of what’s been going on?”
“But EZ-27 was a perfectly good model,” argued Robertson. “It was the twenty-seventh in production.”
“Which is really a bad point,” said Defense somberly. “What was wrong with the first twenty-six? Obviously something. Why shouldn’t there be something wrong with the twenty-seventh, too?”
“There was nothing wrong with the first twenty-six except that they weren’t complex enough for the task. These were the first positronic brains of the sort to be constructed and it was rather hit-and-miss to begin with. But the Three Laws held in all of them! No robot is so imperfect that the Three Laws don’t hold.”
“Dr. Lanning has explained this to me, Mr. Robertson, and I am willing to take his word for it. The judge, however, may not be. We are expecting a decision from an honest and intelligent man who knows no robotics and thus may be led astray. For instance, if you or Dr. Lanning or Dr. Calvin were to say on the stand that any positronic brains were constructed ‘hit-and-miss,’ as you just did, prosecution would tear you apart in cross-examination. Nothing would salvage our case. So that’s something to avoid.”
Robertson growled, “If only Easy would talk.”
Defense shrugged. “A robot is incompetent as a witness, so that would do us no good.”
“At least we’d know some of the facts. We’d know how it came to do such a thing.”
Susan Calvin fired up. A dullish red touched her cheeks and her voice had a trace of warmth in it. “We know how Easy came to do it. It was ordered to! I’ve explained this to counsel and I’ll explain it to you now.”
“Ordered to by whom?” asked Robertson in honest astonishment. (No one ever told him anything, he thought resentfully. These research people considered themselves the owners of U. S. Robots, by God!)
“By the plaintiff,” said Dr. Calvin.
“In heaven’s name, why?”
“I don’t know why yet. Perhaps just that we might be sued, that he might gain some cash.” There were blue glints in her eyes as she said that.
“Then why doesn’t Easy say so?”
“Isn’t that obvious? It’s been ordered to keep quiet about the matter.”
“Why should that be obvious?” demanded Robertson truculently.
“Well, it’s obvious to me. Robot psychology is my profession. If Easy will not answer questions about the matter directly, he will answer questions on the fringe of the matter. By measuring increased hesitation in his answers as the central question is approached, by measuring the area of blankness and the intensity of counterpotentials set up, it is possible to tell with scientific precision that his troubles are the result of an order not to talk, with its strength based on First Law. In other words, he’s been told that if he talks, harm will be done a human being. Presumably harm to the unspeakable Professor Ninheimer, the plaintiff, who, to the robot, would seem a human being.”
“Well, then,” said Robertson, “can’t you explain that if he keeps quiet, harm will be done to U. S. Robots?”
“U. S. Robots is not a human being and the First Law of Robotics does not recognize a corporation as a person the way ordinary laws do. Besides, it would be dangerous to try to lift this particular sort of inhibition. The person who laid it on could lift it off least dangerously, because the robot’s motivations in that respect are centered on that person. Any other course-” She shook her head and grew almost impassioned. “I won’t let the robot be damaged!”
Lanning interrupted with the air of bringing sanity to the problem. “It seems to me that we have only to prove a robot incapable of the act of which Easy is accused. We can do that.”
“Exactly,” said Defense, in annoyance. “You can do that. The only witnesses capable of testifying to Easy’s condition and to the nature of Easy’s state of mind are employees of U. S. Robots. The judge can’t possibly accept their testimony as unprejudiced.”
“How can he deny expert testimony?”
“By refusing to be convinced by it. That’s his right as the judge. Against the alternative that a man like Professor Ninheimer deliberately set about ruining his own reputation, even for a sizable sum of money, the judge isn’t going to accept the technicalities of your engineers. The judge is a man, after all. If he has to choose between a man doing an impossible thing and a robot doing an impossible thing, he’s quite likely to decide in favor of the man.”
“A man can do an impossible thing,” said Lanning, “because we don’t know all the complexities of the human mind and we don’t know what, in a given human mind, is impossible and what is not. We do know what is really impossible to a robot.”