Nobody says anything. I wonder if this is about ego or pride. Engineers hate a wipe and reinstall. It’s a last resort, an admission of defeat. The dreaded cry of “reboot,” which is to say we have no clue and hopefully the issue will sort itself if we start over, if we clear the cache.
“Are you sure you can’t think of anything else that might be wrong with him?” Peter asks. He and Greenie join me at the other end of the trailer. Again, that weird look on their faces. It’s more than exhaustion. It’s some kind of wonder and fear.
“What do you know that you aren’t telling me?” I ask.
“It’s what we think,” Greenie says.
“Fucking tell me. Jesus Christ.”
“We needed a clear head to look at this,” Peter says. “Another set of eyes.” He glances at Greenie. “If she doesn’t see it, then maybe we’re wrong…”
But I do see it. Right then, like a lightning bolt straight up my spine. One of those thoughts that falls like a sledgehammer and gives you a mental limp for the rest of your life, that changes how you walk, how you see the world.
“Hell no,” I say.
The boys say nothing. Max seems to twitch uncomfortably at the far end of the trailer. And I don’t think I’m projecting this time.
“Max, why don’t you want your arms?”
“Just I don’t want them,” he says. I’m watching the monitors instead of him this time. A tactical module is running, and it shouldn’t be. Stepping through each line, I can see the regroup code going into a full loop. There are other lines running in parallel, his sixty-four processors running dozens of routines all at once. I didn’t notice the regroup code until I looked for it. It’s the closest thing to a retreat we’ve ever taught him. Max has been programmed from the ground up to fight until his juice runs out. He knows sideways and forward, and that’s it.
“You have a big bout in two days,” I tell Max.
Another surge of routines, another twitch in his power harness. If his legs were plugged in, I imagine he’d be backing away from me. Which is crazy. Not only have we never taught him anything like what he’s trying to pull off, we never instructed him to teach himself anything like this.
“Tell me it’s just a glitch,” Greenie says. He almost sounds hopeful. Like he doesn’t want it to be anything else. Peter is watching me intently. He doesn’t want to guide me along any more than he has to. Very scientific of him. I ignore Greenie and focus on our robot.
“Max, do you feel any different?”
“No,” Max says.
“Are you ready for your next bout?”
“No.”
“Why not?”
No response. He doesn’t know what to say. I glance at the screen to get a read on the code, but Peter points to the RAM readout, and I see that it has spiked. No available RAM. It looks like full combat mode. Conflicting routines.
“This is emergent,” I say.
“That’s what I told him,” Peter says. He perks up.
“But emergent what?” Greenie asks. “Because Peter thinks—”
“Let her say it,” Peter says, interrupting. “Don’t lead her.” He turns to me. There’s a look on his face that makes him appear a decade younger. A look of wonder and discovery. I remember falling in love with that look.
And I know suddenly what Peter wants me to say. I know what he’s thinking, because I’m thinking it too. The word slips between my lips before I’m even aware of it. I hear myself say it, and I feel like a fool. It feels wonderful.
“Sentience,” I say.
We live for emergent behaviors. It’s what we hope for. It’s what we fight robots for. It’s what we program Max to do.
He’s programmed to learn from each bout and improve, to create new routines that will improve his odds in future fights. The first time I wrote a routine like this, it was in middle school. I pitted two chess-playing computers with basic learning heuristics against one another. Summer camp stuff. I watched as a library of chess openings was built up on the fly. Nothing new, just the centuries-old rediscovered in mere hours. Built from nothing. From learning. From that moment on, I was hooked.
Max is just a more advanced version of that same idea. His being able to write his own code on the fly and save it for the future is the font of our research. Max creates new and original software routines that we patent and sell to clients. Sometimes he introduces a glitch, a piece of code that knocks him out of commission, what evolution handles with death, and we have to back him out to an earlier revision. Other times he comes up with a routine that’s so far beyond anything else he knows, it’s what we call emergent. A sum that’s greater than its parts. The moment a pot of water begins to boil.
There was the day he used his own laser to cut a busted leg free because it was slowing him down. That was one of those emergent days. Max is programmed at a very base level not to harm himself. He isn’t allowed to turn his weapons against his own body. It’s why his guns won’t fire when part of him gets in the way, similar to how he can’t swing a leg and hurt us by accident.
But in one bout, he decided it was okay to lop off his own busted leg if it meant winning and preventing further harm. That emergent routine funded half of our following season. And his maneuver—knowing when to sacrifice himself and by how much—put us through to the finals two years ago. We’ve seen other Gladiators do something similar since. But I’ve never seen a Gladiator not want to fight. That would require one emergent property to override millions of other ones. It would be those two chess computers from middle school suddenly agreeing not to play the game.
“Max, are you looking forward to training today?”
“I’d rather not,” Max says. And this is the frustrating part. We created a facsimile of sentience in all our machines decades ago. We programmed them to hesitate, to use casual vernacular; we wanted our cell phones to seem like living, breathing people. It strikes me that cancer was cured like this—so gradually that no one realized it had happened. We had to be told. And by then it didn’t seem like such a big deal.
“Shit, look at this,” Peter says.
I turn to where he’s pointing. The green HDD indicator on Max’s server bank is flashing so fast it might as well be solid.
“Max, are you writing code?” I ask.
“Yes,” he says. He’s programmed to tell the truth. I shouldn’t even have to remind myself.
“Shut him down,” Greenie says. When Peter and I don’t move, Greenie gets off his stool.
“Wait,” I say.
Max jitters, anticipating the loss of power. His charging cables sway. He looks at us, cameras focusing back and forth between me and Greenie.
“We’ll get a dump,” Greenie says. “We’ll get a dump, load up the save from before the semis, and you two can reload whatever the hell this is and play with it later.”
“How’s my team?” a voice calls from the ramp. We turn to see Professor Hinson limping into the trailer. Hinson hasn’t taught a class in decades, but still likes the moniker. Retired on a single patent back in the twenties, then had one VC hit after another across the Valley. He’s a DARPA leech, loves being around politicians. Would probably have aspirations of being President if it weren’t for the legions of coeds who would come out of the woodwork with stories.
“SoCal is out there chewing up sparring partners,” Hinson says. “We aiming for dramatic suspense in here?”
“There might be a slight issue,” Greenie says. And I want to fucking kill him. There’s a doubling of wrinkles across Hinson’s face.
“Well then fix it,” Hinson says. “I pay you all a lot of money to make sure there aren’t issues.”