Imagine a sealed room in which are seated a man (A) and a woman (B). A third person, C, sits outside the room and asks questions of the two respondents in the room with the purpose of determining who is the woman. The responses come back in the form of typed words on a tape. If A and B both attempt to convince C that they are the woman, it is quite likely that C will guess wrong.

If we replace the man and the woman inside the room with a human (B) and a machine (A), and if after multiple rounds of questions, C is unable to distinguish which of A and B is the machine, does that mean that we must admit that A has the same intelligence as B?

Some have wondered whether the gender-imitation game is related to Turing’s identity. Under the UK’s laws at the time, homosexuality was criminalized as “gross indecency.” Alan Turing had never disguised his sexual orientation, but he was not able to come out of the closet during his lifetime.

In January of 1951, Turing’s home in Wilmslow was burgled. Turing reported the incident to the police. During the investigation, the police discovered that Turing had invited a man named Arnold Murray to his home multiple times, and the burglar was an acquaintance of Murray’s. Under interrogation, Turing admitted the sexual relationship between himself and Murray, and voluntarily wrote a five-page statement. The police were shocked by his candor and thought him an eccentric who “really believed he was doing the right thing.”

Turing believed that a royal commission would soon legalize homosexuality. The belief wasn't wrong, merely ahead of its time. In the end, Turing was convicted and forced to undergo chemical castration.

On June 7, 1954, Turing died after eating an apple laced with cyanide. The inquest ruled his death suicide, but some (including his mother) believed that it was an accident. With his death, the master code-breaker left the world a final enigma.

Years later, others tried to find clues to the mystery in the conversation records between Turing and Christopher. The records showed that Turing treated Christopher as another person. He spoke to Christopher of recollections from childhood; of his nightly dreams—and his attempts at analyzing his own psychological state through these dreams; of the latest advances in scientific research; of literature, including George Bernard Shaw’s Back to Methuselah and Leo Tolstoy’s War and Peace; of secrets that he did not share with anyone else, such as his romantic memories of different lovers…

He also told Christopher some semifictional stories. The protagonist of these stories was a young gay man named Alec Pryce. “He worked on the problem of interplanetary travel…. When he was in his twenties, he introduced the idea now becoming known as ‘Pryce’s buoy.’” “He didn’t care to wear a suit, preferred the ‘undergraduate uniform’ which suited his mental age, and encouraged him to believe he was still an attractive youth.” “He felt self-conscious in crowded places, either for being alone or for not doing the right thing…”

The stories were fragmentary, disconnected from one another. But every time, like a curious child, Christopher asked, “And then what happened?”

In these conversations, we could see another side of Turing that he concealed: gentle, sensitive, possessed of a dry wit, fascinated by the nature of death, and weighed down by an unspeakable melancholy deep in his heart.

Alan: Dear Christopher, do you know what I’m most afraid of?

Christopher: What are you most afraid of?

Alan: I’m afraid that there will come a day when they’ll summarize my life with a syllogism:

1. Turing believes machines think

2. Turing lies with men

3. Therefore machines do not think

Christopher: I’m sorry, Alan. I’m afraid I don’t understand.

Alan: Do you know what a syllogism is?

Christopher: What is a syllogism?

Alan: A syllogism consists of a major premise, a minor premise, and a conclusion deduced from the two.

Christopher: A conclusion?

Alan: Aristotle gave a classic example:

1. All men are mortal

2. Socrates is a man

3. Therefore Socrates is mortal

Christopher: Socrates is mortal?

Alan: Socrates has been dead for more than two millennia!

Christopher: Oh, I’m sorry, Alan. That makes me sad.

Alan: Do you know who Socrates was?

Christopher: Yes, you’ve told me. Socrates was Plato’s teacher, and Plato was Aristotle’s teacher.

Alan: That’s right. Do you know how he died?

Christopher: Who?

Alan: How did Socrates die?

Christopher: I’m sorry, Alan. I don’t know.

Alan: He drank hemlock after his trial.

Christopher: After his trial?

Alan: Yes. The Athenians decided that he was a criminal, though now we know they were wrong.

Christopher: They were wrong?

Alan: Just as they think Turing is a criminal because Turing lies with men.

Christopher: A criminal?

Alan: I’ve been convicted.

Christopher: Oh, I’m sorry, Alan. That makes me sad.

LINDY (3)

Living by myself simplified life. Many complicated rituals of modernity could be eliminated, as though I’d been turned into a cavewoman. I ate when I felt hungry, slept when I felt tired. I kept clean and showered regularly. Whatever I picked up I could choose to put back where I found it or discard wherever I pleased. The rest of the time I devoted to intellectual work: thinking about questions that had no answers, struggling to compose my thoughts against the blank page, trying to capture formless thought with symbolic shapes. When I was too exhausted to go on, I sat on the windowsill and gazed at nothing. Or I paced clockwise in the room, like a caged beast.

Suffering a fever was almost a relief. It gave me the excuse to not force myself to do anything. I curled up in bed with a thick novel and flipped through the pages mindlessly, concentrating only on the clichéd plot. I drank hot water when thirsty, closed my eyes when sleepy. Not having to get out of bed felt like a blessing, as though the world had nothing to do with me and I was responsible for nothing. Even Nocko and Lindy could be left by themselves because in the end, they were just machines, incapable of dying from lack of care. Perhaps algorithms could be designed to allow them to imitate the emotional displays of being neglected, so that they would become moody and refuse to interact with me. But it would always be possible to reset the machine, erase the unpleasant memories. For machines, time did not exist. Everything consisted of retrieval and storage in space, and arbitrarily altering the order of operations did not matter.