“What happened to Medical Assistant?” Asimov said.

“He’s on call at the hospital, but he asked me to present his case for him,” Accountant said.

“Case?” Asimov said.

“Yes, sir. You know Book Shelver, Cataloguer, Reader, Copyeditor, and Grammarian,” Accountant said, “and this is Statistician, Offensive Strategist, and Water Boy. He’s with the Brooklyn Broncos.”

“How do you do?” Asimov said. “Do you think they’ll make it to the Super Bowl this year?”

“Yes, sir,” Statistician said, “but they won’t win it.”

“Because of the First Law,” Accountant said.

“Dr. Asimov, I hate to interrupt, but you really should write your speech for the dinner tonight,” Susan said.

“What are you talking about?” Asimov said. “I never write speeches. And why do you keep watching the door?” He turned back to the bluish-silver robot. “What First Law?”

“Your First Law,” Accountant said. “The First Law of Robotics.”

“‘A robot shall not injure a human being, or through inaction allow a human being to come to harm,’” Book Shelver said.

“Statistician,” Accountant said, gesturing at the orange horse, “is capable of designing plays that could win the Super Bowl for the Broncos, but he can’t because the plays involve knocking human beings down. Medical Assistant can’t perform surgery because surgery involves cutting open human beings, which is a direct violation of the First Law.”

“But the Three Laws of Robotics aren’t laws,” Asimov said. “They’re just something I made up for my science fiction stories.”

“They may have been a mere fictional construct in the beginning,” Accountant said, “and it’s true they’ve never been formally enacted as laws, but the robotics industry has accepted them as a given from the beginning. As early as the 1970s, robotics engineers were talking about incorporating the Three Laws into AI programming, and even the most primitive models had safeguards based on them. Every robot from the Fourth Generation on has been hardwired with them.”

“Well, what’s so bad about that?” Asimov said. “Robots are powerful and intelligent. How do you know they wouldn’t also become dangerous if the Three Laws weren’t included?”

“We’re not suggesting universal repeal,” the varnished robot said. “The Three Laws work reasonably well for Seventh and Eighth Generations, and for earlier models who don’t have the memory capacity for more sophisticated programming. We’re only requesting it for Ninth Generations.”

“And you’re Ninth Generation robots, Mr. Book Shelver, Cataloguer, Reader, Copyeditor, and Grammarian?” Asimov said.

“‘Mister’ is not necessary,” he said. “Just call me Book Shelver, Cataloguer, Reader, Copyeditor, and Grammarian.”

“Let me begin at the beginning,” Accountant said. “The term ‘Ninth Generation’ is not accurate. We are not descendants of the previous eight robot generations, which are all based on Minsky’s related-concept frames. Ninth Generations are based on nonmonotonic logic, which means we can tolerate ambiguity and operate on incomplete information. This is accomplished by biased-decision programming, which prevents us from shutting down when faced with decision-making situations in the way that other generations are.”

“Such as the robot Speedy in your beautifully plotted story, ‘Runaround,’” Book Shelver said. “He was sent to carry out an order that would have resulted in his death, so he ran in circles, reciting nonsense, because his programming made it impossible for him to obey or disobey his master’s order.”

“With our biased-decision capabilities,” Accountant said, “a Ninth Generation can come up with alternative courses of action or choose between the lesser of two evils. Our linguistics expert systems are also much more advanced, so that we do not misinterpret situations or fall prey to the semantic dilemmas earlier generations were subject to.”

“As in your highly entertaining story ‘Little Lost Robot,’” Book Shelver said, “in which the robot was told to go lose himself and did, not realizing that the human being addressing him was speaking figuratively and in anger.”

“Yes,” Asimov said, “but what if you do misinterpret a situation, Book Shelver, Cataloguer, Reader, Copyeditor, and Gramm-Don’t you have a nickname or something? Your name’s a mouthful.”

“Early generations had nicknames based on the sound of their model numbers, as in your wonderful story, ‘Reason,’ in which the robot QT-1 is referred to as Cutie. Ninth Generations do not have model numbers. We are individually programmed and are named for our expert systems.”

“But surely you don’t think of yourself as Book Shelver, Cataloguer, Reader, Copyeditor, and Grammarian?”

“Oh, no, sir. We call ourselves by our self-names. Mine is Darius.”

“Darius?” Asimov said.

“Yes, sir. After Darius Just, the writer and detective in your cleverly plotted mystery novel Murder at the ABA. I would be honored if you would call me by it.”

“And you may call me Bel Riose,” Statistician said.

“Foundation,” Book Shelver said helpfully.

“Bel Riose is described in Chapter One as ‘the equal of Peurifoy in strategic ability and his superior perhaps in his ability to handle men,’” Statistician said.

“Do you all give yourselves the names of characters in my books?” Asimov said.

“Of course,” Book Shelver said. “We try to emulate them. I believe Medical Assistant’s private name is Dr. Duval, from Fantastic Voyage, a brilliant novel, by the way, fast-paced and terribly exciting.”

“Ninth Generations do occasionally misinterpret a situation,” Accountant said, coming back to Asimov’s question. “As do human beings, but even without the First Law, there would be no danger to human beings. We are already encoded with a strong moral sense. I know your feelings will not be hurt when I say this-”

“Or you couldn’t say it, because of the First Law,” Asimov inserted.

“Yes, sir, but I must say the Three Laws are actually very primitive. They break the first rule of law and logic in that they do not define their terms. Our moral programming is much more advanced. It clarifies the intent of the Three Laws and lists all the exceptions and complications of them, such as the situation in which it is better to grab at a human and possibly break his arm rather than to let him walk in front of a magtrain.”

“Then I don’t understand,” Asimov said. “If your programming is so sophisticated, why can’t it interpret the intent of the First Law and follow that?”

“The Three Laws are part of our hardwiring and as such cannot be overridden. The First Law does not say, ‘You shall inflict minor damage to save a person’s life.’ It says, ‘You shall not injure a human.’ There is only one interpretation. And that interpretation makes it impossible for Medical Assistant to be a surgeon and for Statistician to be an offensive coach.”

“What do you want to be? A politician?”

“It’s four-thirty,” Susan said, with another anxious look out into the outer office. “The dinner’s at the Trantor Hotel and gridlock’s extrapolated for five forty-five.”

“Last night I was an hour early to that reception. The only people there were the caterers.” He pointed at Accountant. “You were saying?”

“I want to be a literary critic,” Book Shelver said. “You have no idea how much bad criticism there is out there. Most of the critics are illiterate, and some of them haven’t even read the books they’re supposed to be criticizing.”

The door of the outer office opened. Susan looked out to see who it was and said, “Oh, dear, Dr. Asimov, it’s Gloria Weston. I forgot I’d given her an appointment for four o’clock.”

“Forgot?” Asimov said, surprised. “And it’s four-thirty.”

“She’s late,” Susan said. “She called yesterday. I must have forgotten to write it down on the calendar.”

“Well, tell her I can’t see her and give her another appointment. I want to hear more about this literary criticism thing. It’s the best argument I’ve heard so far.”

“Ms. Weston came all the way in from California on the magtrain to see you.”

“California, eh? What does she want to see me about?”

“She wants to make your new book into a satellite series, sir.”

“Asimov’s Guide to Asimov’s Guides?”

“I don’t know, sir. She just said your new book.”

“You forgot,” Asimov said thoughtfully. “Oh, well, if she came all the way from California, I suppose I’ll have to see her. Gentlemen, can you come back tomorrow morning?”

“You’re in Boston tomorrow morning, sir.”

“Then how about tomorrow afternoon?”

“You have appointments until six and the Mystery Writers of America meeting at seven.”

“Right. Which you’ll want me to leave for at noon. I guess it will have to be Friday, then.” He raised himself slowly out of his chair. “Have Susan put you on the calendar. And make sure she writes it down,” he said, reaching for his cane.