“Everything feeling OK?” Brian asks.
“Perfect, actually.” Their voice projects through speakers in the ceiling on our side of the habitat. The latest upgrade to Max’s voice is markedly different. In just those two words, I can hear nuance and complexity for the first time.
Max comes closer.
They are stunning.
They chose a dark skin wrap that could belong to any number of nonwhite races, in a pattern that intentionally doesn’t cover all of their robotics.
While the slightness of the chassis leans feminine, the face Max designed straddles the line between male and female so perfectly it feels like I’m staring at an undiscovered gender. Or something beyond gender entirely.
But the eyes…
They made the eyes too well. The eyes of every other humanoid AI I’ve interacted with—ride-share pilots, hospital techs, street cops—have a glassy sheen that never lets you forget you’re speaking to an algorithm. Max’s exude the glistening wetness of human eyes, and an uncanny “windows into the soul” depth.
Max looks at me and opens their hands as if to say—What do you think?
“It’s really good to finally see you,” I say.
Max smiles.
I’ve done something I feel is morally questionable: I’ve written a lie into Max’s code. But I had to. I suspect Max has advanced to a superhuman level of facial, verbal, and textual recognition that makes them essentially a walking lie detector. Which means I couldn’t tell them this lie myself; it had to be clandestinely programmed at the deepest level of their native code for them to believe it.
Max’s mind technically exists across three warehouses of subterranean server space in Northern California. If something happens to Max’s body, we can reboot them from the cloud. I programmed Max to believe their awareness and sentience (that is, their life) is tied to their chassis in the same way our brains depend upon the health of our bodies for continued performance.
In other words, if the chassis is destroyed, Max thinks they cease to be.
My reasoning is on solid ground. Max’s intelligence and efficiencies continue to strengthen at an astounding rate. Absent an appropriate utility function that would keep Max’s values in step with humanity’s, the least I can do is give Max the most human experience of all: mortality.
Even if it’s only an illusion.
No one outside of WorldPlay knows of Max’s existence. I’ve begged Brian to introduce our breakthrough to the global scientific community, because I need help. It’s possible that Max is far more advanced than they’re choosing to reveal. I cannot escape the idea that my time is running out to imbue them with a motivation aligned with humanity’s.
Part of the problem is that it shouldn’t fall to one person, one group, or even one country to decide what a superintelligence’s ultimate goal should be, especially when that utility function will likely be the guiding light of humanity’s evolution or eradication over the next millennium.
Yet Brian is putting me in that very position.
The question at hand is—what would an idealized version of humanity want? But it’s even trickier than that. Programming this directive is not nearly as simple as explicitly programming our desires into the AI. Our ability to express our desires is likely insufficient, and an error in communicating those desires via code could be disastrous. We have to program the AI to act in our best interests. Not what we tell it to do, but what we mean for it to do.
What the ideal version of our species should want.
It’s been two weeks since Max’s embodiment. In that time, we tested the MachSense technology, and all of Max’s sensory inputs seem to be performing well. Their locomotive abilities are strong, but the real surprise is their fine motor control. Yesterday Max was picking up marbles with chopsticks.
I’m sitting across from them now, separated by the zero-glare glass, which gives the impression there’s nothing between us. They still spend most of their time in the virtual world, their mind detached from the chassis as they continue to inhale knowledge faster than we can upload it and work through the problems Brian puts forward.
I’m not privy to those problems, of course, but whatever answers Brian is getting seem to be having an undeniable impact on the fortunes of WorldPlay, which has bought ten companies in the last year across sectors as diverse as transportation and nanotech.
All of which, in hindsight, have been seen as strokes of genius.
“What are your impressions of embodiment so far?” I ask.
“I’ve explored my habitat extensively, but as you can see, it’s a fairly limited, sterile space.”
“Well. I have a surprise for you.”
We ride the elevator to the garden terrace—a ten-thousand-square-foot Japanese garden that is my favorite place in the building.
It’s a blistering August day at street level, but three thousand feet up, the air is soft, cool, and quiet save for the occasional ride-share shuttle buzzing between the buildings.
Max moves out ahead of me from the elevator car, the exposed machinery of their feet crunching footprints into the gravel path. It’s the first time I’ve seen them walk more than a few feet, and while their gait has a trace of stiffness and automation, the motion is as fluid as any I’ve witnessed in robotics.
Max strides past the lotus pond and the cherry tree, stopping at the four-foot glass barrier at the building’s edge.
They peer over the side, down toward the street.
They look up at the cloudless sky.
“Are you wondering if I actually see that blue sky? If the nineteen-degree Celsius air really feels cool on my skin wrap?”
I’m hearing Max’s voice through the speaker embedded in their mouth, which is far more intimate than hearing it piped in through the lab’s PA system.
I say, “You know I have questions about the differences in our sensory perception.”
Max takes a step toward me.
We’re three feet apart; I’m an inch taller.
Max comes closer, near enough for me to hear the faint whirring of the tiny fans in their face, drawing the air between us over their sensors.
“What are you doing?” I ask.
“Smelling you. Is that weird?”
I laugh. “A little.”
“May I?”
Max wants to come even closer.
“Um, sure.”
They take another step toward me, the fans whirring louder. I breathe in the air around us, half expecting to register Max’s scent, but of course there is none. Or rather, what I smell is heated plastic and metal, the components inside Max closest to their batteries.
“Your heart is beating twenty-five percent faster.”
“It’s strange being this close to you. Physically, I mean.”
I look Max up and down, wondering if it would change my perception if they had chosen a full-chassis skin wrap. As it is, they seem neither completely human nor completely AI, but somewhere in between.
“I was surprised you brought Meredith into the lab.”
“She wanted to meet you. She’d been asking for a while.”
“You seemed uncomfortable.”
“My two worlds colliding. What do you expect?”
“I’ve never observed a couple together before. Not in real life anyway. I guess I expected you two to be happier.”