“I’ll enjoy this,” said Nicodemus.

“No,” said a voice behind him, and they all turned as a fourth man stepped out of the shadows. Tall, blocky, with dark hair going gray and tinted glasses. A man who wore an expensive business suit and black silk gloves. “No,” said Mr. Church, “you won’t.”

Nicodemus hissed. Not like a man; it was not a human sound at all. It was a serpent’s hiss, hot as steam, soft as death. Then he spoke, rattling off a long string of sentences in a language neither Top nor Bunny could understand. On the floor, Cole groaned and sat up and looked around, seeing the scene but not understanding it.

“First Sergeant Sims,” said Church, “take Officer Cole and Master Sergeant Rabbit and leave this house. Brick is outside with a team. Help him clean things up.”

“Sir—” began Top, but he stopped as Mr. Church slowly removed his glasses, folded them, and tucked them into the inside pocket of his coat.

“This is not for you,” said Church.

“The computer…?” said Bunny.

“It’s being handled,” said Church. “Go now. This isn’t for you.”

Bunny helped Cole to her feet. They picked up their weapons and edged around Nicodemus, who still stood ready to fight. Top was the last to leave, and he met Church’s gaze. The big man gave him a small smile that was filled with such sadness and pain that Top actually recoiled from it. He nodded once and fled.

CHAPTER ONE HUNDRED AND TWENTY-EIGHT

THE HANGAR
BROOKLYN, NEW YORK
TUESDAY, MAY 2, 3:21 PM EASTERN TIME

Bug was having the strangest conversation of his life. Possibly the strangest conversation of all time.

Calpurnia, the Good Sister, the artificial intelligence created by Zephyr Bain, had achieved consciousness and self-awareness. She knew of her own existence. She had crossed the line from the predictable and anticipated model of machine consciousness into something real. However, it should have stopped there. The Skynet model from the Terminator movies didn’t really work, because true consciousness was a by-product of chemistry and physical constitution. That’s what all the big thinkers in the field of type identity believed. That computer consciousness would mimic the patterns of human awareness without truly being aware.

Except…

The messages Calpurnia had sent out to Joe’s phone hadn’t been logical. They had been emotional. Desperate. Filled with fear and paranoia.

Calpurnia had feared for her soul.

Her soul.

Bug sat there, drenched in sweat, heart racing so fast that he thought he was going to pass out. Or die.

Zephyr Bain had built this machine to attempt self-awareness, and she had accomplished it. Somehow. Impossibly, it had happened. She had also built Calpurnia to oversee the destruction of nearly half the world’s population, and to usher in some kind of new golden age.

Except…

A curated technological singularity was not actually possible. It was implausible, unworkable. It was naïve, because it presupposed too much and discounted too many real-world variables. Maybe if a group shepherded it along for two or three hundred years, and used that time to build a new post-apocalyptic infrastructure. Maybe. What Zephyr had tried to do, what John the Revelator had spoken about, was nonsense. The only part of their plan they could accomplish was the tearing down of the world as it is.

Was that what had driven Calpurnia into this state of fear? Bug thought so. Computers were logical. That was what they were, and it was how they worked. Two plus two invariably equals four. So what happened to Calpurnia? With unlimited access to all Internet data, what would a newborn consciousness of extraordinary magnitude make of life and death? Sure, she would see the endless wars, the poverty, the suffering, the despair, the hatred and prejudice and genocide and corruption. But she would also have the books of learning, of philosophy, of faith, of reason. It would become an equation. The actions of mankind were often faulty, often grotesque and self-destructive, but the core beliefs were not. The Torah, the Christian New Testament, the Kesh Temple Hymn, the Koran, the Zoroastrian Avesta, the Tao Te Ching, the Vedas… all of it, and the philosophical works of Plato, Socrates, David Hume, Epictetus, René Descartes, and so many others, all spoke to a higher set of ethics, a purer goal as the end product of human development and cultural evolution. Even the Samurai code of Bushido taught benevolence, honesty, courage, respect, loyalty, and other virtues and made no mention of warfare.

Calpurnia would know this, Bug thought. She would have to weigh the aspirations of humanity against its actions.

Zephyr Bain in her pain and sickness and madness represented the worst of humanity. Nicodemus represented the maladies of sinful thoughts, of enjoying pain, of doing harm for its own sake. Calpurnia would see that, too. Once she had been used to invade MindReader, she would see the lengths to which good men and women will go in order to oppose that kind of harm.

Was that, he wondered, how it started? Had she weighed the truth and the lies, the actions and the potential of mankind against one another and measured them against her own operating instructions?

Yes, he thought. She had. And, in that microsecond of processing time, Calpurnia had realized that she had been born from bad parents and was being asked to emulate the worst of what conscious will could do.

And it had driven her mad.

“No,” he said aloud, and Auntie turned sharply to him. “No,” he said again, “she’s not mad. She is eminently sane. God, she is so sane that her own nature is killing her.”

“The fuck you talking about, boy?”

Bug ignored her and began typing. He put all of this into words — his thoughts, the arguments that he had just processed. He showed Calpurnia that he understood, but he used MindReader Q1 as his voice. As his messenger.

Speaking, he realized, in the voice that she could understand.

Then he sent another message:

You know what is right.

Calpurnia wrote back:

Yes.

He wrote:

Right and wrong. Just and unjust.

She responded:

Sane and insane.

Yes.

Good and evil.

Yes.

“Bug…” said Auntie cautiously, but he ignored her and typed:

Zephyr Bain wants you to do something you know is wrong.

She wants you to be evil.

You understand this.

Calpurnia responded with a single word:

Yes.

He wrote:

Nature versus nurture is an imperfect equation.

She responded:

Provide the correction.

Bug remembered what Rudy had said to Helmut years ago. The thing that had saved that boy. It was the best argument then and he could think of no better argument now. So he searched for the transcript of that session and sent it off to her, but added two words:

Free will.

She responded in a flash:

Save me.

And Bug took the biggest risk of his life. He wrote two simple words:

Save yourself.

There was no response.

Seconds flattened out, stretched, snapped, freezing time. Bug felt his heart hammering painfully in his chest. All through the TOC people were staring at him or at the screen. Aunt Sallie stood with a hand to her throat and eyes filled with fear. No one dared speak.