"Life is risk, Victor. I'm keeping the best stuff for myself. Not because I intend to use it, but because if I ever needed it badly and didn't have it, I'd feel like such a fool."
She cocked her head and narrowed her eyes, which made them practically disappear.
"Tell me something, Yank. Kluge picked you out of all your neighbors because you'd been a Boy Scout for thirty years. How do you react to what I'm doing?"
"You're cheerfully amoral, and you're a survivor, and you're basically decent. And I pity anybody who gets in your way."
She grinned, stretched, and stood up.
" 'Cheerfully amoral.' I like that." She sat beside me, making a great sloshing in the bed. "You want to be amoral again?"
"In a little bit." She started rubbing my chest. "So you got into computers because they were the wave of the future. Don't you ever worry about them… I don't know, I guess it sounds corny… do you think they'll take over?"
"Everybody thinks that until they start to use them," she said. "You've got to realize just how stupid they are. Without programming they are good for nothing, literally. Now, what I do believe is that the people who run the computers will take over. They already have. That's why I study them."
"I guess that's not what I meant. Maybe I can't say it right."
She frowned. "Kluge was looking into something. He'd been eavesdropping in artificial intelligence labs, and reading a lot of neurological research. I think he was trying to find a common thread."
"Between human brains and computers?"
"Not quite. He was thinking of computers and neurons. Brain cells." She pointed to her computer. "That thing, or any other computer, is light-years away from being a human brain. It can't generalize, or infer, or categorize, or invent. With good programming it can appear to do some of those things, but it's an illusion.
"There's an old speculation about what would happen if we finally built a computer with as many transistors as the human brain has neurons. Would there be a self-awareness? I think that's baloney. A transistor isn't a neuron, and a quintillion of them aren't any better than a dozen.
"So Kluge, who seems to have felt the same way, started looking into the possible similarities between a neuron and an 8-bit computer. That's why he had all that consumer junk sitting around his house, those Trash-80's and Atari's and TI's and Sinclair's, for chrissake. He was used to much more powerful instruments. He ate up the home units like candy."
"What did he find out?"
"Nothing, it looks like. An 8-bit unit is more complex than a neuron, and no computer is in the same galaxy as an organic brain. But see, the words get tricky. I said an Atari is more complex than a neuron, but it's hard to really compare them. It's like comparing a direction with a distance, or a color with a mass. The units are different. Except for one similarity."
"What's that?"
"The connections. Again, it's different, but the concept of networking is the same. A neuron is connected to a lot of others. There are trillions of them, and the way messages pulse through them determines what we are and what we think and what we remember. And with that computer I can reach a million others. It's bigger than the human brain, really, because the information in that network is more than all humanity could cope with in a million years. It reaches from Pioneer Ten, out beyond the orbit of Pluto, right into every living room that has a telephone in it. With that computer you can tap tons of data that has been collected but nobody's even had the time to look at.
"That's what Kluge was interested in. The old 'critical mass computer' idea, the computer that becomes aware, but with a new angle. Maybe it wouldn't be the size of the computer, but the number of computers. There used to be thousands of them. Now there's millions. They're putting them in cars. In wristwatches. Every home has several, from the simple timer on a microwave oven up to a video game or home terminal. Kluge was trying to find out if critical mass could be reached that way."
"What did he think?"
"I don't know. He was just getting started." She glanced down at me. "But you know what, Yank? I think you've reached critical mass while I wasn't looking."
"I think you're right." I reached for her.
Lisa liked to cuddle. I didn't, at first, after fifty years of sleeping alone. But I got to like it pretty quickly.
That's what we were doing when we resumed the conversation we had been having. We just lay in each other's arms and talked about things. Nobody had mentioned love yet, but I knew I loved her. I didn't know what to do about it, but I would think of something.
"Critical mass," I said. She nuzzled my neck, and yawned.
"What about it?"
"What would it be like? It seems like it would be such a vast intelligence. So quick, so omniscient. God-like."
"Could be."
"Wouldn't it… run our lives? I guess I'm asking the same questions I started off with. Would it take over?"
She thought about it for a long time.
"I wonder if there would be anything to take over. I mean, why should it care? How could we figure what its concerns would be? Would it want to be worshipped, for instance? I doubt it. Would it want to 'rationalize all human behavior, to eliminate all emotion,' as I'm sure some sci-fi film computer must have told some damsel in distress in the 'fifties.
"You can use a word like awareness, but what does it mean? An amoeba must be aware. Plants probably are. There may be a level of awareness in a neuron. Even in an integrated circuit chip. We don't even know what our own awareness really is. We've never been able to shine a light on it, dissect it, figure out where it comes from or where it goes when we're dead. To apply human values to a thing like this hypothetical computer-net consciousness would be pretty stupid. But I don't see how it could interact with human awareness at all. It might not even notice us, any more than we notice cells in our bodies, or neutrinos passing through us, or the vibrations of the atoms in the air around us."
So she had to explain what a neutrino was. One thing I always provided her with was an ignorant audience. And after that, I pretty much forgot about our mythical hyper-computer.
"What about your Captain?" I asked, much later.
"Do you really want to know, Yank?" she mumbled, sleepily.
"I'm not afraid to know."
She sat up and reached for her cigarettes. I had come to know she sometimes smoked them in times of stress. She had told me she smoked after making love, but that first time had been the only time. The lighter flared in the dark. I heard her exhale.
"My Major, actually. He got a promotion. Do you want to know his name?"
"Lisa, I don't want to know any of it if you don't want to tell it. But if you do, what I want to know is did he stand by you."
"He didn't marry me, if that's what you mean. When he knew he had to go, he said he would, but I talked him out of it. Maybe it was the most noble thing I ever did. Maybe it was the most stupid.
"It's no accident I look Japanese. My grandmother was raped in '42 by a Jap soldier of the occupation. She was Chinese, living in Hanoi. My mother was born there. They went south after Dien Bien Phu. My grandmother died. My mother had it hard. Being Chinese was tough enough, but being half Chinese and half Japanese was worse. My father was half French and half Annamese. Another bad combination. I never knew him. But I'm sort of a capsule history of Vietnam."
The end of her cigarette glowed brighter once more.
"I've got one grandfather's face and the other grandfather's height. With tits by Goodyear. About all I missed was some American genes, but I was working on that for my children.
"When Saigon was falling I tried to get to the American Embassy. Didn't make it. You know the rest, until I got to Thailand, and when I finally got Americans to notice me, it turned out my Major was still looking for me. He sponsored me over here, and I made it in time to watch him die of cancer. Two months I had with him, all of it in the hospital."