“Think media, Susan, where your average person gets most of his or her information. How many movies have you seen where creatures with artificial intelligence set out to topple their human creators? Anything with superhuman, human, or near-human intelligence is presumed to eventually try to slaughter or enslave us, whether it’s machine or organic, golem, android, or alien.”

Susan had to agree. “Well, it definitely makes for better movies. It’s hard to generate fear and excitement if the bug-eyed monsters come only to hug us. Surely that doesn’t mean everyone —”

“Not everyone,” John admitted. “But enough people in power do. Then we have to worry about general public perception, political groups, religious affiliates —”

It was Susan’s turn to interrupt. “And the Society for Humanity.”

John immediately stopped speaking. “You’re full of surprise information today. How do you know about them?”

Susan chewed and swallowed completely, washing it down with juice. “They’re a thorn in my side, too. In addition to being antirobot, they have an injunction against treating one of my patients. They’re a powerful group with a megaton of cheek.”

“Yeah.” The word emerged strained.

When Susan examined her father closely, she thought she saw trembling fingers and moisture in his eyes. “Isn’t there some way to make robots absolutely safe? Something in their programming?”

John’s discomfort turned to a shaky smile. “There is. It’s the reason USR has the only legal permit to manufacture robots, at least in the United States.”

Now he had Susan’s full attention. “What is it?”

“It’s our patented Three Laws of Robotics. They’re a fundamental part of every robot we manufacture, and we do not continue building or programming until it has become an integral part of them.”

Susan had to know. “And the three laws are?”

John cleared his throat. She suspected he knew them forward and backward, that he could recite them in anagram form or even in his sleep. “Law Number One: ‘A robot may not injure a human being, or, through inaction, allow a human being to come to harm.’ ”

Susan nodded with each word, trying to commit them to memory verbatim. This first law, by itself, seemed like enough to keep the citizenry safe.

“Law Number Two: ‘A robot must obey all orders given by human beings except where such orders would conflict with the First Law.’ ”

Susan could understand that one as well. It would keep the robots definitively subservient and ensure humans could not manipulate them into killing other humans.

“Law Number Three: ‘A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.’ ”

“Hmm.” Susan grinned. “That last sounds like it protects your investment more than actually being necessary to keep humans safe.”

“At face value, yes.” John continued to stare out the window, as if accessing old memories better left buried. “But it can have important human applications. If, for example, a robot witnesses a murder or is asked to participate in a crime, this prevents the killer from ordering its robot witness to destroy itself.”

It suddenly occurred to Susan that the three laws, though cautiously worded and direct, did have deeper implications well worth exploring. She wondered how humans would behave were they wholly governed by those same three laws. It might make an interesting psychological experiment. She finished the last of her casserole and juice. “Thanks for finally sharing. I’m looking forward to learning more and to talking to Nate now that I know his underlying . . .” She did not know what to call it. “Governance” seemed closest; but, given his near humanity, “religion” or “moral philosophy” seemed better. She settled for the obvious. “Laws. It’ll be interesting to see how the need to abide by those makes him different from humans.”

“Different?” John Calvin laughed. “Now you’re psychoanalyzing robots? What’s next, the bookshelves?”

Susan did not see the humor in it. “What happens, for example, when a robot is given conflicting commands? Or he’s told to take a bath by someone who doesn’t know an electrical appliance has fallen into the tub? That could put laws two and three into conflict.” Several more potential problems came to her mind. “Let’s say a robot impervious to temperature is in the Antarctic with a freezing and comatose human. Can he figure out the human’s life is in danger from something that can’t harm the robot? Or let’s say he’s ordered into a burning building, but he knows everyone inside is already dead? Law Number Three would force him to refuse.”

John Calvin’s expression gradually changed from one of tolerant humor to interest. “You’ve thought of everything.”

A thrill of excitement swept through Susan. “Not nearly. There’re thousands of possible scenarios.”

“None of which have come up.”

Susan added, “Yet. But when robots become an everyday commodity, they will, and probably quite frequently.”

John did not seem convinced. “Maybe.” He reached for Susan’s dirty dishes and changed the subject. “Now, tell me about those fascinating patients.”

Susan rose, shaking her head. “I will. I promise. But, first, I have to do some research on those fascinating patients, or I’m not going to make it through my fascinating residency.” Walking into the main room, she snatched up her palm-pross. “By tomorrow, I have to know how to diagnose subtle abnormalities of the optic disk and their significance. Also, why a child with a normal heart might be in florid cardiac failure, how to treat chronic and resistant mutism, and what to do with a four-year-old homicidal maniac.”

John Calvin called from the kitchen. “Wow. That does sound interesting.” Dishes rattled. “We can talk when you’re finished. Until then, I won’t bug you.”

It was a lie, and Susan knew it. At strategic times during the evening, he would interrupt her studies to ascertain her every comfort. She could count on an irresistible dessert at the least. For now, though, she took her palm-pross into her bedroom and began her search.

Susan Calvin met with the Moores immediately after rounds, in the room between two sets of locked doors installed especially for such meetings. She found them sitting in the plush mauve chairs provided for parental comfort. When the door opened, Mr. Moore rose, and a wan smile appeared on Mrs. Moore’s face. They both appeared exhausted, the father’s short, black hair speckled with gray and the mother’s long and braided into permanent extensions.

Susan perched on the arm of a chair, meeting their gazes in turn. “I’m Susan Calvin, Dallas’ new resident doctor.”

They nodded, familiar with the monthly drill. They had met a parade of residents, though usually not this early in the month. Breakthroughs did not often come about this quickly.

Mrs. Moore sighed. “What did he do this time?”

Susan smiled reassuringly, though inwardly she cringed. She wondered what it must be like to have people constantly condemning one’s beloved, if difficult, child. “Nothing we wouldn’t expect from someone with his syndrome.”

“Syndrome?” Mr. Moore repeated. A tall, well-muscled man, he leaned forward in his chair. “Are they calling depression a syndrome now?”

Susan explained her findings from earlier that morning. “Have you heard of Prader-Willi syndrome?”

The Moores nodded wearily. “A resident thought Dallas had it a few months ago. Tests showed he didn’t.”

“Right.” Quailing under the intensity of the parental gazes, Susan wished she had a window on which to focus her attention. She had confronted families before, but always as a medical student observing residents or attendings. She had never held the spotlight, and it unnerved her. The parents hung on her every word, as they should. The future of their child currently rested on her diagnoses. “It’s a rare chromosomal defect that causes, among other things, inadequate development of an area of the brain called the hypothalamus.”