The building superintendent wrote to me repeatedly to ask whether I needed an iVatar caretaker. How did he know I was sick? I had never met him, and he had never even set foot in the building. Instead, he spent his days sitting behind a desk somewhere, monitoring the conditions of residents in dozens of apartment buildings, taking care of unexpected problems that the smart home systems couldn’t deal with on their own. Did he even remember my name or what I looked like? I doubted it.
Still, I expressed my gratitude for his concern. In this age, everyone relied on others to live; even something as simple as calling for take-out required the services of thousands of workers from around the globe: taking the order by phone, paying electronically, maintaining various systems, processing the data, farming and manufacturing the raw ingredients, procuring and transporting, inspecting for food safety, cooking, scheduling, and finally dispatching the food by courier…. But most of the time, we never saw any of these people, giving each of us the illusion of living like Robinson Crusoe on a deserted island.
I enjoyed being alone, but I also treasured the kindness of strangers from beyond the island. After all, the apartment needed to be cleaned, and I was too ill to get out of bed, or at least I didn’t want to get out of bed.
When the caretaker arrived, I turned on the light-screen around my bed. From inside, I could see out, but no one outside could see or hear me. The door opened, and an iVatar entered, gliding silently along on hidden wheels. A crude, cartoonish face with an empty smile was projected onto its smooth, egg-shaped head. I knew that behind the smile was a real person, perhaps someone with deep wrinkles on their face, or someone still young but with a downcast heart. In a distant service center I couldn’t see, thousands of workers wearing telepresence gloves and remote-sensing goggles were providing domestic services to people across the globe.
The iVatar looked around and began a preset routine: cleaning off the furniture, wiping off dust, taking out the trash, even watering the taro vine on the windowsill. I observed it from behind the light-screen. Its two arms were as nimble as a human’s, deftly picking up each teacup, rinsing it in the sink, setting it facedown on the drying rack.
I remembered a similar iVatar that had been in my family’s home many years ago, when my grandfather was still alive. Sometimes he would make the iVatar play chess with him, and because he was such a good player, he always won. Then he’d happily hum some tune while the iVatar stood by, a disheartened expression on its face. The sight always made me giggle.
I didn’t want to be troubled by sad memories while sick, so I turned to Lindy, who was sitting near the pillows. “Would you like me to read to you?”
Word by word, sentence by sentence, I read from the thick novel. I focused on filling space and time with my voice, careless of the meaning behind the words. After a while, I paused from thirst. The iVatar had already left. A single bowl covered by an upturned plate sat on the clean kitchen table.
I turned off the light-screen, got out of bed, and shuffled over to the table. Lifting the plate revealed a bowl of piping hot noodle soup. On top of the broth floated red tomato chunks, yellow egg wisps, green chopped scallions, and golden oil slicks. I drank a spoonful. The soup had been made with a lot of ginger, and the hot sensation flowed right from the tip of my tongue into my belly. A familiar taste from my childhood.
Tears spilled from my eyes; I was helpless to stop them.
I finished the bowl of noodle soup, crying the whole while.
ALAN (3)
On June 9, 1949, the renowned neurosurgeon Sir Geoffrey Jefferson delivered a speech titled “The Mind of Mechanical Man,” in which he made the following remarks against the idea that machines could think:
Not until a machine can write a sonnet or compose a concerto because of thoughts and emotions felt, and not by the chance fall of symbols, could we agree that machine equals brain—that is, not only write it but know that it had written it. No mechanism could feel (and not merely artificially signal, an easy contrivance) pleasure at its successes, grief when its valves fuse, be warmed by flattery, be made miserable by its mistakes, be charmed by sex, be angry or depressed when it cannot get what it wants.
This passage was often quoted, and the Shakespearean sonnet became a symbol, the brightest jewel in the crown of the human mind, a spiritual high ground unattainable by mere machines.
A reporter from The Times called Turing to ask for his thoughts on this speech. Turing, in his habitual, uninhibited manner, said, “I do not think you can even draw the line about sonnets, though the comparison is perhaps a little bit unfair because a sonnet written by a machine will be better appreciated by another machine.”
Turing always believed that there was no reason for machines to think the same way as humans, just as individual humans thought differently from each other. Some people were born blind; some could speak but could not read or write; some could not interpret the facial expressions of others; some spent their entire lives incapable of knowing what it meant to love another; but all of them deserved our respect and understanding. It was pointless to find fault with machines by starting with the premise that humans were supreme. It was more important to clarify, through the imitation game, how humans accomplished their complex cognitive tasks.
In Shaw’s Back to Methuselah, Pygmalion, a scientist of the year AD 31920, created a pair of robots that inspired awe in all present.
ECRASIA: Cannot he do anything original?
PYGMALION: No. But then, you know, I do not admit that any of us can do anything really original, though Martellus thinks we can.
ACIS: Can he answer a question?
PYGMALION: Oh yes. A question is a stimulus, you know. Ask him one.
This was not unlike the kind of answer Turing would have given. But compared to Shaw, Turing’s prediction was far more optimistic. He believed that within fifty years, “it will be possible to programme computers, with a storage capacity of about 10⁹, to make them play the imitation game so well that an average interrogator will not have more than 70 per cent chance of making the right identification after five minutes of questioning. The original question, ‘Can machines think?’ [will] be too meaningless to deserve discussion.”
In “Computing Machinery and Intelligence,” Turing attempted to answer Jefferson’s objection from the perspective of the imitation game. Suppose a machine could answer questions about sonnets like a human; would that mean it really “felt” poetry? He drafted the following hypothetical conversation:
Interrogator: In the first line of your sonnet which reads “Shall I compare thee to a summer’s day,” would not “a spring day” do as well or better?
Witness: It wouldn’t scan.
Interrogator: How about “a winter’s day.” That would scan all right.
Witness: Yes, but nobody wants to be compared to a winter’s day.