 I pointed out it would be difficult to write science fiction if the stories were to be centered on the Bicentennial, but Naomi said that the stories could be anything at all provided they could be seen to have arisen out of the phrase "The Bicentennial Man."

I was intrigued and agreed to do it. I was handed half the advance at once. The deadline was April 1, 1975, and by March 14 I was finished. I was a little rueful about the story at first, for the agreement had called for a 7,500-word story and I had been unable to stop it before it had reached 15,000, the longest story I had written below the level of a novel in seventeen years. I wrote an apologetic covering letter, assuring Naomi that there would be no extra charge, and she wrote back to say the extra wordage would be fine. Pretty soon, the remaining half of the advance arrived.

 But then everything went wrong. Naomi was beset by family and medical problems; some writers who it had been hoped would participate, couldn't; others who promised stories didn't deliver them; those who did deliver them did not turn out entirely satisfactory products.

 Of course, I didn't know anything about this. It never even occurred to me that anything might go wrong. Actually, my only large interest is in writing. Selling is a minor interest, and what happens afterward is of almost no interest.

 There was, however, Judy-Lynn del Rey and her enormous awareness of everything that goes on in science fiction. She knew that I had written a story for this anthology.

 "How is it," she asked dangerously, "that you wrote a story for that anthology, yet when I ask you for one you're always too busy?"*

 "Well, " I said apologetically, for Judy-Lynn is a frightening creature when she is moved, "the idea of the anthology interested me."

"How about my suggestions about a robot that has to choose between buying its own liberty and improving its body? I thought you said that was interesting."

 At that point, I must have turned approximately as white as talcum powder. A long time before, she had mentioned such things and I had forgotten. I said, "Oh, my goodness, I included something of the sort in the story."

 "Again?" she shrieked. "Again you're using my ideas for other people? Let me see that story. Let me see it!"

 So I brought her a carbon copy the next day and the day after that she called me. She said, "I tried hard not to like the story, but I didn't manage. I want it. Get the story back."

 "I can't do that," I said. "I sold it to Naomi and it's hers. I'll write you a different story."

"I'll bet you anything you like," said Judy-Lynn, "that that anthology isn't going to go through. Why don't you call and ask?"

 I called Naomi and, of course, it wasn't going through. She agreed to send me back the manuscript and grant me permission to sell the story elsewhere, and I sent back the advance she had given me. (After all, she had lost considerable money on the venture, and I didn't want any of that loss to represent a profit to me.)

 The story was then transferred to Judy-Lynn, who used it in her anthology of originals entitled Stellar Science Fiction #2, which appeared in February 1976. And I like the story so much myself that I not only am including it here, but am using its title for the book as a whole.

(Incidentally, after this book was put together, Judy-Lynn suggested I change my manuscript to make it jibe with the version in Stellar. Apparently, she had introduced numerous minor changes that improved it, she said. Well, I am not Harlan Ellison, so I don't mind, but I think that in my own collection, I'll let the story stand as I wrote it. Judy-Lynn will be annoyed, but she can't do worse than kill me.)

*This was during the Passover Seder, over which Lester del Rey presides every year with enormous effectiveness, since he is the best cook in science fiction.

The Bicentennial Man

The Three Laws of Robotics

1. A robot may not injure a human being, or, through inaction, allow a human being to come to harm.

2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.

3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

1.

Andrew Martin said, "Thank you," and took the seat offered him. He didn't look driven to the last resort, but he had been.

He didn't, actually, look anything, for there was a smooth blankness to his face, except for the sadness one imagined one saw in his eyes. His hair was smooth, light brown, rather fine; and he had no facial hair. He looked freshly and cleanly shaved. His clothes were distinctly old-fashioned, but neat, and predominantly a velvety red-purple in color.

Facing him from behind the desk was the surgeon. The nameplate on the desk included a fully identifying series of letters and numbers which Andrew didn't bother with. To call him Doctor would be quite enough.

"When can the operation be carried through, Doctor?" he asked.

Softly, with that certain inalienable note of respect that a robot always used to a human being, the surgeon said, "I am not certain, sir, that I understand how or upon whom such an operation could be performed."

There might have been a look of respectful intransigence on the surgeon's face, if a robot of his sort, in lightly bronzed stainless steel, could have such an expression, or any expression.

Andrew Martin studied the robot's right hand, his cutting hand, as it lay motionless on the desk. The fingers were long and were shaped into artistically metallic, looping curves so graceful and appropriate that one could imagine a scalpel fitting them and becoming, temporarily, one piece with them. There would be no hesitation in his work, no stumbling, no quivering, no mistakes. That confidence came with specialization, of course, a specialization so fiercely desired by humanity that few robots were, any longer, independently brained. A surgeon, of course, would have to be. But this one, though brained, was so limited in his capacity that he did not recognize Andrew, had probably never heard of him.

"Have you ever thought you would like to be a man?" Andrew asked.

The surgeon hesitated a moment, as though the question fitted nowhere in his allotted positronic pathways. "But I am a robot, sir."

"Would it be better to be a man?"

"It would be better, sir, to be a better surgeon. I could not be so if I were a man, but only if I were a more advanced robot. I would be pleased to be a more advanced robot."

"It does not offend you that I can order you about? That I can make you stand up, sit down, move right or left, by merely telling you to do so?"

"It is my pleasure to please you, sir. If your orders were to interfere with my functioning with respect to you or to any other human being, I would not obey you. The First Law, concerning my duty to human safety, would take precedence over the Second Law relating to obedience. Otherwise, obedience is my pleasure. Now, upon whom am I to perform this operation?"

"Upon me," Andrew said.

"But that is impossible. It is patently a damaging operation."

"That does not matter," said Andrew, calmly.

"I must not inflict damage," said the surgeon.

"On a human being, you must not," said Andrew, "but I, too, am a robot."

2.

Andrew had appeared much more a robot when he had first been manufactured. He had then been as much a robot in appearance as any that had ever existed, smoothly designed and functional.