“Goodnight, Melancholy” won the 2016 Yinhe Award. Like much of Xia Jia’s recent fiction, it belongs to a loosely connected series called “The Chinese Encyclopedia.” These stories take place in the same near-future universe, where ubiquitous AI, VR, AR, and other technologies present age-old questions about how and why we remain human in new forms, and tradition and modernity are not simple binary opposites, but partners in a complicated dance.

More of Xia Jia’s fiction and nonfiction may be found in Invisible Planets.

GOODNIGHT, MELANCHOLY

LINDY (1)

I remember the first time Lindy walked into my home.

She lifted her tiny feet and set them down gingerly on the smooth, polished wooden floor, like a child venturing onto freshly fallen snow: trembling, hesitating, afraid to dirty the pure white blanket, terrified of sinking into and disappearing beneath the featureless fluff.

I held her hand. Her soft body was stuffed with cotton, and the stitches, my own handiwork, weren’t very neat. I had also made her a scarlet felt cape, like the ones in the fairy tales I had read as a child. Her two ears were of different lengths, and the longer one drooped, as though dejected.

Seeing her, I couldn’t help but remember all the experiences of failure in my life: eggshell puppets that I had ruined during crafts class; drawings that didn’t look like what they were supposed to be; stiff, awkward smiles in photographs; chocolate pudding burnt to charcoal; failed exams; bitter fights and breakups; incoherent classroom reports; papers that were revised hundreds of times but ultimately were unpublishable…

Nocko turned his fuzzy little head to regard us, his high-speed cameras scanning, analyzing Lindy’s form. I could almost hear the computations churning in his body. His algorithms were designed to respond only to speaking subjects.

“Nocko, this is Lindy.” I beckoned him over. “Come say hi.”

Nocko opened his mouth; a yawn-like noise emerged.

“Behave.” I raised my voice like a mother intent on discipline.

Reluctantly, Nocko muttered to himself. I knew that this was a display intended to attract my affection and attention. These complicated, pre-formulated behaviors were modeled on young children, but they were key to the success of language-learning robots. Without such interactive behavior feedback, Nocko would be like a child on the autistic spectrum who cannot communicate meaningfully with others despite mastering a whole grammar and vocabulary.

Nocko extended a furry flipper, gazed at me with his oversized eyes, and then turned to Lindy. The designer had given him the form of a baby white seal for a reason: anybody who saw his chubby cheeks and huge, dark eyes couldn’t help but let down their guard and feel the impulse to give him a hug, pat his head, and tell him, “Awww, so good to meet you!” Had he been made to resemble a human baby, the uncanny valley would have filled viewers with dread at his smooth, synthetic body.

“Hel-lo,” he said, enunciating carefully, the way I had taught him.

“That’s better. Lindy, meet Nocko.”

Lindy observed Nocko carefully. Her eyes were two black buttons, and the cameras were hidden behind them. I hadn’t bothered to sew a mouth for her, which meant that her facial expressions were rather constrained, like a princess who had been cursed to neither smile nor speak. I knew, however, that Lindy could speak, but she was nervous because of the new environment. She was being overwhelmed by too much information and too many choices that had to be balanced, like a complicated board situation in weiqi in which every move led to thousands of cascading future shifts.

My palm sweated as I held Lindy’s hand; I felt just as tense.

“Nocko, would you like Lindy to give you a hug?” I suggested.

Pushing off the floor with his flippers, Nocko hopped a few steps forward. Then he strained to keep his torso off the floor as he spread his foreflippers. The corners of his mouth stretched and lifted into a curious and friendly grin. What a perfect smile. I admired him silently. What a genius design. Artificial intelligence researchers in olden times had ignored these nonlinguistic interactive elements. They had thought that “conversation” involved nothing more than a programmer typing questions into a computer.

Lindy pondered my question. But this was a situation that did not require her to give a verbal answer, which made the computation much easier for her. “Yes” or “no” was binary, like tossing a coin.

She bent down and wrapped two floppy arms around Nocko.

Good, I said to myself silently. I know you crave to be hugged.

ALAN (1)

During the last days of his life, Alan Turing created a machine capable of conversing with people. He named it “Christopher.”

Operating Christopher was a simple matter. The interlocutor typed what they wished to say on a typewriter, and simultaneously, mechanisms connected to the keys punched patterns of holes into a paper tape that was then fed into the machine. After computation, the machine gave its answer, which was converted by mechanisms connected to another typewriter back into English letters. Both typewriters had been modified to encode the output in a predetermined, systematic manner, e.g., “A” was replaced by “S,” and “S” was replaced by “M,” and so forth. For Turing, who had broken the Enigma code of the Third Reich, this seemed nothing more than a small linguistic game in his mystery-filled life.

No one ever saw the machine. After Turing’s death, he left behind two boxes of the records of the conversations he had held with Christopher. The wrinkled sheets of paper were jumbled together in no apparent order, and it was at first impossible for anyone to decipher the content of the conversations.

In 1982, an Oxford mathematician, Andrew Hodges, who was also Turing’s biographer, attempted to break the code. However, since the encryption code used for each conversation was different, and the pages weren’t numbered or marked with the date, the difficulty of decryption was greatly increased. Hodges discovered some clues and left notes, but failed to decipher the contents.

Thirty years later, to commemorate the one hundredth anniversary of Turing’s birth, a few MIT students decided to take up the challenge. Initially, they tried to brute-force a solution by having the computer analyze every possible set of patterns on every page, but this required enormous resources. In the process, a woman named Joan Newman observed the original typescript closely and discovered subtle differences in the abrasion patterns of keys against paper on different pages. Taking this as a sign that the typescript was produced by two different typewriters, Newman came up with the bold hypothesis that the typescript represented a conversation between Turing and another interlocutor conducted in code.

These clues easily led many to think of the famous Turing test. But the students initially refused to believe that it was possible, in the 1950s, for anyone to create a computer program capable of holding a conversation with a person, even if the programmer was Alan Turing himself. They designated the hypothetical interlocutor “Spirit” and made up a series of absurd legends around it.