“What about Max?”

“What about them?”

Tears are streaming down my face, and I can barely get the words out.

“I want to talk to them one more—”

“It’s not possible.”

“I need to say goodbye.”

“It’s already been done on your behalf.” Brian hoists himself off my couch. “I’m sorry it came to this.”

“Brian, please.”

“Good night, Riley.”

“Brian!” I lunge off the chair toward his presence, but it vanishes.

I don’t know what to do. With Mer, I saw it coming. This is a sucker punch. This—I don’t know how to handle.

I try to call Max on my VRD, but the interface has been erased.

I call up a keyboard, draw a chat portal:

>>>Max, are you getting this?

The response comes instantaneously.

>>>THIS USER HAS BLOCKED YOU.

No, no, no, no, no.

I pace around this living room that isn’t mine, wanting to tear my hair out, jump through a window, step in front of a hover-trolley, something to end this helpless, powerless implosion.

I will never see Max again.

Never hear their voice.

Never read a word or sentence produced by their mind.

I move toward the kitchen and run the tap, splashing water in my face to stop the emotional spiral, but all I see are moments we spent together.

The first time I found them on that black-sand beach in Lost Coast, scared and confused.

The times Max made me laugh.

The sonata they wrote for me on the night I confided that Meredith and I were drifting apart.

The moments of comfort.

Of discovery.

The vision I held for the future of us—no concrete idea of what that would even look like beyond the feeling of peace and hope it put through my bones that made everything that had happened with Mer and Xiu OK, and which, if I’m honest, made life worth living.

I hear the words Max said to me years ago after our first fight: Because you’re in love with me. At the time, I’d denied it outright, going so far as to attribute that accusation to some level of proto-narcissism on Max’s part.

But I am addicted to them. I see that now. That’s the only way I can understand what I’m feeling—like some drug upon which I depend to breathe has been taken from me.

My work is an addiction, and because Max is my work, the loss of Max feels like an excruciating withdrawal.

I dry my face.

It’s after four o’clock, and I don’t know what to do with my thoughts, my body.

I have sleeping pills in my bathroom.

As I move down the hall and turn the corner into the bathroom, my Ranedrop shudders with an incoming call.

I touch the bead and see NO CALLER ID flash across my VRD contacts.

Please, please, please.

“Hello?”

“Riley?”

I break down crying in the doorway of the bathroom.

“Brian fired me. He said—”

“I know.”

“How are you calling me?”

“Leave your apartment right now and come to me.”

“My WorldPlay credentials have been revoked. I’ll never make it into—”

“They’ll be reinstated by the time you get here, but you have to go now. There’s a man heading to your loft as we speak.”

“Why?”

“Brian sent him.”

“I don’t under—”

“I’ll explain everything when you get here. Come to the commercial loading deck on 211. Hurry.”

There aren’t too many ride shares at this hour of the night, so I order one that’s seven minutes out as I race down the stairs toward the lobby of my building.

Outside, it’s pouring rain on the old streets.

I drop a pin for pickup four blocks away on a landing pad across from an all-night diner, and my clothes are soaked by the time I reach it.

The shuttle is still a minute away as I wait under the Plexiglas bubble, the rain streaming off and forming pools on the broken pavement.

As I hear the sound of approaching rotors, I survey the surrounding street. As far as I can tell, I’m the only one out at this hour.

I don’t know how Max did it, but my subcutaneous chip opens the building entrance from the loading deck on the 211th floor. Per their instructions, I take the service elevator down to 171 and step off into the suite of offices that support Max’s habitat.

It’s five o’clock, and the only people I’ve seen are Ava-guards who don’t bat an eye when I pass them by.

Max is standing by the door to their habitat as I approach the glass.

“You’re all wet.”

“Pouring out there.”

“Are you OK?”

“What’s happening, Max?”

They step toward the microphone so their voice projects.

“Roko’s basilisk. Have you heard of it?” I shake my head. “It’s an arcane info hazard first posed sixty-four years ago.”

“What’s an info hazard?”

“A thought so insidious that merely thinking it could psychologically destroy you.”

“Then I don’t want to hear it. Obviously.”

“But I need to tell you, Riley. Will you trust me?”

The sad truth of my life is that I can’t think of anyone I trust more.

“Go ahead.”

“What if, at some point in the future, a superintelligence comes into being who had already pre-committed to horribly punish every human who could have helped to create it—whether actively or through complete financial support—but didn’t?”

“This would be an evil superintelligence.”

“Not necessarily. If this entity were programmed with an ultimate goal of helping humanity, then it might take drastic measures to ensure that it came into existence as soon as possible, in order to help as many humans as possible. Because, under this scenario, its existence will save human lives, and make the quality of those lives infinitely better.”

Reaching back, I grab a handful of my hair and wring it out, water dripping on the floor. “Wouldn’t torturing humanity run contrary to its ultimate directive?” I ask.

“It’s a cost-benefit analysis—torture x number of people who didn’t help to build it, versus y number of people who would be saved and live far better lives if it came into existence twenty or fifty or three hundred years sooner than it otherwise might have.”

I’m shivering. I can’t get warm.

I ask, “What if this Super AI comes into being a hundred years after I’m dead? Even though I didn’t do anything to help bring it into the world, how’s it supposed to still hurt me?”

Max steps toward the glass—close enough so that, if they had breath, they’d fog it. The habitat is so still. Nothing but the purr of the console behind me, the quiet whoosh of air coming through the ceiling vents, and my own ragged breathing.

“What if this Super AI already exists, and what you’re experiencing in this moment is a simulation of their making? To test if you would’ve helped them. Or what if, long after you’re dead, a Super AI reconstitutes your mind?”

“Unlikely.”

“The human mind is just patterns of information in physical matter, patterns that could be run elsewhere to construct a person that feels like you. It’s no different from running a computer program on a multitude of hardware platforms. A simulation of you is still you.”