Some part of my brain seizes on every transitory pattern, every spurious rhythm—and, when each pattern unwinds, each rhythm decays, only strains harder to discern the next.
‘Up. Down. Up. Up. Down. Down. Up. Down. Up. Up. Down. Down. Up. Up. Up. Down. Up. Down. Up. Down.’
Primed, I should have no trouble shutting this out, ignoring it. But incredibly, I can’t. Lee was right—and P3 is clearly no better than Sentinel. I can’t stop listening.
‘Up. Down. Up. Down. Down. Down. Down. Up. Down. Down. Down. Up. Down. Up. Up. Up. Up. Down. Down. Down.’
Worst of all, I find myself—unwillingly, compulsively—trying to guess each direction the instant before it’s called. No, worse: trying to change it. Trying to impose some order. If I can’t shut out this meaningless droning, the next best thing would be to force it to make some kind of sense.
Chung Po-kwai, I imagine, feels the same.
Each session lasts fifteen minutes, with a ten-minute break in between. Ms Chung emerges from the ion room—wearing wrap-around sunglasses to keep her eyes from losing too much dark adaptation—to sip tea, stretch her legs, and tap out snatches of odd rhythms with her fingertips on equipment casings. She speaks to me briefly, the first time, but then conserves her voice. The scientists ignore us both, busily reviewing their data and running esoteric statistical tests.
Each time the experiment restarts, I resolve to force myself to ignore the insidious random chant; after all, P3 may have failed me, but, primed or not, I should have some vestige of native self-control. I don’t succeed, but eventually I change tactics, and manage to reach a kind of equilibrium where at least I’m no longer compounding the problem by struggling, in vain, to attain the state of perfect vigilance to which I’m accustomed.
The scientists don’t seem troubled at all—but then, it’s data to them, not noise; they’re under no obligation to try to ignore it.
So far as I can tell, the results don’t improve as the experiment progresses, but I do notice one odd thing which I hadn’t picked up before: the histograms are changing after each direction is called. It’s easiest to see this when there’s a run of ions all in one direction; most of the histograms grow steadily lopsided, and this trend doesn’t reverse until the ion that breaks the run has actually been announced. But if the computers are collecting data straight from the equipment, this order of events is puzzling; whatever elaborate calculations are required to update the histograms, it’s unlikely that they’d take more than a couple of microseconds to perform—which is certainly less than the time-lag between a human seeing a flash of light and announcing that it’s ‘up’ or ‘down’. Meaning what? The computers aren’t plugged into the experiment? They’re getting their data second-hand, by listening to Chung Po-kwai’s words? That makes no sense at all. Maybe the scientists simply find the results easier to follow this way, so they’ve programmed in an intentional delay.
Dr Leung finally calls a halt at 20:35. While the three remain huddled about the console, debating the sensitivity of the sixth moment of the binomial distribution, Ms Chung nudges me and whispers, ‘I’m starving. Let’s get out of here.’
In the elevator, she takes out a small vial and sprays her throat. She explains, ‘I’m not allowed to use this during the experiment—it’s full of analgesics and anti-inflammatory drugs, and they insist that I remain unsullied by pharmaceuticals.’ She coughs a few times, then says, no longer hoarse, ‘And who am I to argue?’
The ASR tower has its own private restaurant, on the eighteenth floor. Ms Chung informs me, gleefully, that her contract includes unlimited free food. She slips her ID card into a slot in the table, and illustrated menus appear, embedded in the table’s surface. She orders quickly, then glances up at me, puzzled.
‘Aren’t you going to eat?’
‘Not while I’m on duty.’
She laughs, disbelieving. ‘You’re going to fast for twelve hours? Don’t be ridiculous. Lee Hing-cheung ate on duty. Why shouldn’t you?’
I shrug. ‘I expect we have different mods. The mod that controls my metabolism is designed to cope with short periods of fasting—in fact, it does a better job keeping my blood sugar at the optimal level if I don’t complicate things by eating.’
‘What do you mean, “complicate things”?’
‘After a meal, there’s usually an insulin overshoot—you know, that slightly drowsy feeling that comes with satiety. That can be controlled, to some degree, but it’s simpler if I rely on steady glycogen conversion.’
She shakes her head, half amused, half disapproving, and looks around the crowded restaurant. Steam rises from every table, drawn up in neat columns by the silent tug of the ceiling ducts. ‘But… isn’t the smell of all this enough to make you ravenous?’
‘The connection is decoupled.’
‘You mean you have no sense of smell?’
‘No, I mean it has no effect on my appetite. All the usual sensory and biochemical cues are disabled. I can’t feel hungry; it’s impossible.’
‘Ah.’ A robot cart arrives and deftly unloads her first course. She takes a mouthful of what I think is squid, and chews it rapidly. ‘Isn’t that potentially dangerous?’
‘Not really. If my glycogen reserves dropped below a certain level, I’d be informed—with a simple, factual message from the relevant mod, which it would then be up to me to act on. As opposed to persistent hunger pangs, which might distract me from something more pressing.’
She nods. ‘So you’ve forced your body to stop treating you like a child. No more crude punishments and rewards to encourage correct behaviour; animals might need that shit to survive, but we humans are smart enough to set our own priorities.’ She nods again, begrudgingly. ‘I can see the attraction in that. But where do you draw the line?’
‘What line?’
‘The line between “you” and “your body”… between the drives you acknowledge as “your own”, and the ones you treat as some kind of imposition. Sure, why be inconvenienced by hunger? But then, why be distracted by sex? Or why give in to the urge to have children? Why let yourself be affected by grief? Or guilt? Or compassion? Or logic? If you’re going to set your own priorities, there has to be someone left to have priorities.’ She looks at me pointedly, as if she half expects me to leap onto the table and publicly renounce appetite suppression forever, now that I’ve been warned of the horrors to which it might lead. I don’t have the heart to tell her that she’s too late, on every count.
I say, ‘Everything you do changes who you are. Eating changes who you are. Not eating changes who you are. Spraying your throat with analgesics changes who you are. What’s the difference between using a mod to switch off hunger, and using a drug to switch off pain? It’s just the same.’
She shakes her head. ‘You can trivialize anything that way; everything’s “just the same as” something less extreme. But neural mods are not “just the same as” analgesics. There are mods that change people’s values—’
‘And they never changed before?’
‘Slowly. For good reasons.’
‘Or bad reasons. Or none at all. What do you think: the average person sits down one day and constructs some kind of meticulously rational moral philosophy—which they modify appropriately, if and when they discover its flaws? That’s pure fantasy. Most people are just pushed around by the things they live through, shaped by influences they can’t control. Why shouldn’t they alter themselves—if it’s what they want, if it makes them happy?’