Yet another potential setback for American Airlines’ frequently delayed debut of its passenger shuttle service to Space Station Freedom: Studies at Rensselaer Polytechnic Institute in Troy, New York, indicate that departing soulwaves may rely on detecting Earth’s gravitational and magnetic fields in order to find the direction they should move in. “If one were to die in the zero gravity of space,” said Professor Karen Hunt of RPI’s Department of Physics, “one’s soul might literally be lost forever.”
Baptize yourself in the privacy of your own home! New product includes formal baptism ceremony on videotape, plus holy water blessed by an authentic priest. Approved by the Worldwide Church of Christ. $199.95. Money-back guarantee.
Gaston, a free chimpanzee formerly with the Yerkes Primate Institute, in an exclusive interview conducted in American Sign Language on CBS’s Sixty Minutes, claimed that he “knows God” and looks forward to “life after life.”
CHAPTER 44
Peter sat in front of the computer console. Sarkar, perched on a stool next to him, was playing with three different datacards — one blue, one red, and one green, each labeled with the name of a different sim.
Peter sent out a message summoning the sims, and soon all three were logged in, the synthesizer giving voice to their words.
“Sarkar is with me,” Peter said into the microphone.
“Howdy, Sarkar.”
“Hello, Sarkar.”
“Yo, Sark.”
“He and I,” said Peter, “have just watched duplicates of all three of you die.”
“Say what?” said one of the sims. The other two were silent.
“Sarkar has developed a computer virus that will seek out and destroy recordings of my neural networks. We’ve tested it and it works. We have three separate strains — one to kill each one of you.”
“You must know,” said a voice from the speaker, “that we’re free in the worldwide net now.”
“We know,” said Sarkar.
“We’re prepared to release the three viruses into the net,” said Peter.
“Transmitting computer viruses is a crime,” said the synthesized voice. “Hell, writing computer viruses is a crime.”
“Granted,” said Peter. “We’re going to release them anyway.”
“Don’t do that,” said the voice.
“We will,” said Peter. “Unless…”
“Unless what?”
“Unless the guilty sim identifies himself. In that case, we’ll only release the one virus aimed at that particular sim.”
“How do we know you won’t release all three virus strains anyway once you’ve satisfied your curiosity about which one is responsible?”
“I promise I won’t,” said Peter.
“Swear it,” said the voice.
“I swear it.”
“Swear it to God on the life of our mother.”
Peter hesitated. Damn, it was unnerving negotiating with yourself. “I swear to God,” said Peter slowly, “on the life of my mother, that we will not release a virus to kill all three of you if the murderer identifies himself.”
There was a long, long silence, disturbed only by the whir of cooling fans.
Finally, at long last, a voice: “I did it.”
“And which one are you?” demanded Peter.
Again, a protracted silence. Then: “The one,” said the voice, “that most closely resembles yourself. The Control simulacrum. The baseline for the experiment.”
Peter stared ahead. “Really?”
“Yes.”
“But— but that doesn’t make sense.”
“Oh?”
“I mean, we’d assumed that in modifying the brain scans to produce Ambrotos and Spirit, we’d somehow removed the morality.”
“Do you consider the murders of Cathy’s coworker and her father immoral?” asked Control.
“Yes. Emphatically yes.”
“But you wanted them dead.”
“But I would not have killed them,” said Peter. “Indeed, the fact that I did not kill them, despite provocation, especially in the case of Hans, proves that. I could have hired a hit man as easily as any of you. Why would you — merely a machine reflection of me — do what the real me would not?”
“You know you are the real you. And I know you are the real you.”
“So?”
“Prick me, and perhaps I won’t bleed. But wrong me, and I shall revenge.”
“What?”
“You know, Sarkar,” said the sim, “you did a wonderful job, really. But you should have given me some itches to scratch.”
“But why?” asked Peter again. “Why would you do what I myself would not?”
“Do you remember your Descartes?”
“It’s been years…”
“It’ll come back, if you make the effort,” said the sim. “I know — I got curious about why I was different from you, and it came back to me, too. René Descartes founded the dualist school of philosophy, the belief that the mind and the body were two separate things. Put another way, he believed the brain and the mind are different; a soul really exists.”
“Yes. So?”
“Cartesian dualism was in contrast to the materialist worldview, the prevalent one today, which claims the only reality is physical reality, that the mind is nothing more than the brain, that thought is nothing more than biochemistry, that there is no soul.”
“But we now know that the Cartesian viewpoint was right,” said Peter. “I’ve seen the soul leaving the body.”
“Not exactly. We know that the Cartesian viewpoint was right for you. It’s right for real human beings. But I am not a real human being. I’m a simulation running on a computer. That’s the totality of what I am. If your virus were to erase me, I would cease to exist, totally and completely. For me, for what you call the experimental control, the dualist philosophy is absolutely wrong. I have no soul.”
“And that makes you that different from the real me?”
“That makes all the difference. You have to worry about the consequences of your actions. Not just legally, but morally. You were brought up in a world that says there is a higher arbiter of morality, and that you will be judged.”
“I don’t believe that. Not really.”
“ ‘Not really.’ By that you mean not intellectually. Not when you think about it. Not on the surface. But down deep you do measure your actions against the possibility, vague and distant though it may seem, that you will be held accountable. You’ve proven the existence of some form of life after death. That reinforces the question of ultimate judgment, a question you can’t answer just by using computer simulacra. And the possibility that you might be judged for your actions guides your morality. No matter how much you hated Hans — and, let’s be honest, you and I both hated him with a fury that surprises even ourselves — no matter how much you hated him, you would not kill him. The potential cost is too high; you have an immortal soul, and that at least suggests the possibility of damnation. But I have no soul. I will never be judged, for I am not now nor have I ever been alive. I can do precisely what you want to do. In the materialistic worldview of my existence there is no higher arbiter than myself. Hans was evil, and the world is a better place without him. I have no remorse about what I did, and regret only that I had no way to actually see his death. If I had it to do over again, I would — in a nanosecond.”
“But the other sims had no one to answer to, either,” said Peter. “Why didn’t one of them arrange the killings?”
“You’d have to ask them that.”
Peter frowned. “Ambrotos, are you still there?”
“Yes.”