The day she went in to accept, I prepared a feast. I made decorations. Funny little signs with cartoons of C. on a pyramid of brokers, cracking a whip. Hand-lettered posters reading "Book That Cruise" and "Retirement by 35."
I could tell, watching her come up the courtyard, that celebration was a horrible mistake. She pounded up the stairs, slammed the door, and held it shut behind her with all her hundred and five pounds, sobbing.
"Beauie, we need to get out of this place."
I tried to hold her, proffer all the worthless comforts. "Okay," I managed. "I'm game. Where to?"
The last place. Worse than I expected. "I want to go back to U."
Imp B already pushed the envelope. B hadn't a clue what cats were, or sacks, let alone wives. But it seemed to know how to count them, or not count them, as the case demanded.
If A had been an exercise in verbal pattern recognition, B was a foray into computational linguistics. It knew things like over and under, right of or left of, inside or out. Even that far, I doubted whether it comprehended these containers or whether it just manipulated them cleverly enough to pass. Then again, I began to doubt whether I myself could define the difference.
B could handle syntax. It had a rudimentary sense of the parts of speech and how they operated on each other. And it began to cross the threshold into semantic content. Lentz once or twice tacked on a new subnet to handle different routines — a noun-phrase decoder or a short-term recognition scratch area. In fact, I suppose we were up to Imp B.4 or better.
Lentz assured me that B would handle its own knowledge representations. The frames, the inheritance of classification qualities and exceptions, the scripts: all would fall out as a result of the way B stored associated input. But even in its minute domains, B had to deal with numbingly different kinds of knowledge. With nouns alone, what you could do with "pattern" varied without limit from what you could do with "matching" or "machine." I'd lost count of the number of neurodes involved. It had grown big, complex beyond belief. A glitch now set us back whole days at a time. The thing was a monster, distributed, unchartable, out of control. And yet Lentz's brain, or mine, was hundreds of millions of Imp B's wide. We could push that matter a little longer, if only just.
We were still experimenting with the size of layers. Bigger was not always better, Lentz told me.
"Life, Marcel, in case you on the humanistic side of the tracks haven't grokked this yet, involves a series of trade-offs."
"Yes, we're onto that."
"Now. The trade-offs in input layer size. Well, the smaller the layer, the better it generalizes. The larger, the more it can learn to fit into an associative grid."
"The better it is at generalizing, the worse it is at acquiring new associations?"
"The poet's a blooming genius. Now we know how you earn your grant money."
"Does it follow that the more facts it has, the harder it is to take in new facts?"
"Thirty-five is about when that starts to happen, Marcel. You begin to think, 'Well, I more or less understand how things work. Do I really want to disassemble tens of thousands of tangled, semiaccurate beliefs on the off chance that I might be able to bring one small receptor field into better focus?' "
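Lentz's trade-off can be glossed in a toy sketch — my gloss, nothing we ever ran on B, with numbers invented for the purpose. A big layer behaves like a lookup table, perfect on what it has seen and mute on anything new; a small one compresses the same associations into a rule and extrapolates.

```python
# Illustrative sketch only: a "large" model that memorizes
# versus a "small" one that generalizes. The data are invented.

train = {1: 2, 2: 4, 3: 6}        # pairs drawn from the rule y = 2x

# Large "layer": memorize every association exactly.
lookup = dict(train)

# Small "layer": compress the data into a single slope.
slope = sum(y / x for x, y in train.items()) / len(train)

def small_model(x):
    return slope * x

# On a novel input the memorizer has nothing to say,
# while the generalizer extrapolates from its rule.
print(lookup.get(5))       # None: never seen, never learned
print(small_model(5))      # 10.0: generalized
```

The lookup table is flawless inside its grid and helpless one step outside it; the one-parameter rule is the opposite bargain.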
"Tell me about it. I'm there."
"Don't worry, little boy. You've a few tricks yet to pick through. And a few years yet to pick through them. I mean, you're lucky I've taken you on. Aristotle wouldn't accept any student still young enough to have a sex urge."
"Not a problem, at the moment."
On days when Lentz engaged himself with design philosophy, he grew expansive, almost pleasant. On days when he built things, he was fun. I tried to keep the soldering gun in his hand and ignore the baiting as much as possible.
"Now: what about the size of the hidden layers? Do you want them bigger or smaller than your input layers?"
"I'm sorry. I give up. We're going to have to turn all the cards over."
"Come on, think it out. Consider the translation impedance. Another trade-off. The better the resolution, the more susceptible the net becomes to random noise. Think of B as a curve fitter."
"That's all our brains are? Curve fitters?"
"It's a big 'all,' friend. The curve we are trying to fit is as long as existence. As many dimensions. The fact that we can get the infinite data stream to cohere into lumps at all has turned men with as much native intelligence as your friend Plover into mystics."
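What Lentz meant by noise susceptibility can be sketched, again as my own gloss with invented data: a high-resolution fitter threads a curve through every noisy training point and extrapolates wildly, while a low-resolution fit smooths over the noise and lands nearer the true trend on a held-out point.

```python
# Hedged sketch of the "curve fitter" point; not from the novel.
# High capacity: a polynomial through every point (Lagrange form).
# Low capacity: a least-squares line.

def lagrange(points, x):
    """Interpolating polynomial through every point: the memorizer."""
    total = 0.0
    for i, (xi, yi) in enumerate(points):
        term = yi
        for j, (xj, _) in enumerate(points):
            if i != j:
                term *= (x - xj) / (xi - xj)
        total += term
    return total

def linear_fit(points):
    """Closed-form least-squares line: the generalizer."""
    n = len(points)
    sx = sum(x for x, _ in points); sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points); sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return lambda x: a * x + b

# True curve y = x, plus fixed "noise" so the run is reproducible
noise = [0.0, 0.9, -0.8, 0.7, -0.6]
train = [(float(i), i + noise[i]) for i in range(5)]
line = linear_fit(train)

x_new, y_true = 6.0, 6.0   # held-out point, no noise
print(abs(lagrange(train, x_new) - y_true))  # huge: memorized the noise
print(abs(line(x_new) - y_true))             # small: fit the trend
```

The interpolating polynomial's error at the new point runs into double digits; the line misses by a fraction. The infinite data stream Lentz talked about punishes resolution the same way.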
"Here we go. Time to slander Harold."
"It's not slander. The man makes his claims public. 'Meanings extractable from a given linguistic configuration may be neither convergent, bounded, nor recursively enumerable.' Or some such rubbish. He seems to think that because 'context' is infinitely extensible, there can be no neurological calculus of interpretation."
"And you, Engineer?" I tweaked, waiting until he had his hands full of printed circuit and his head deep in the cage. "Do you think that because you are virtuous, there will be no more cakes and ale?"
"Now, Marcel. What in the hell is that supposed to mean?"
"Dunno. Let's ask Harold."
"Hn. I'd rather get back to the subject. Sometimes I think the human brain is just one long open parenthesis. So. Tell me. How big do you want your various output layers to be?"
"I guess that would depend on how big an answer we are expecting from any given subnet."
"Well done. You're getting cagey. We'll have you writing NSF proposals in no time." Lentz's sarcasms were mellowing with age. "Would you concede, then, that many of our output layers could consist, in theory, of a single neurode, since the cyborgs think that every quest can be rephrased as a series of yes-or-no questions?"
"I wish the lit critters would catch on to that."
"Yes. Handy, isn't it? Cleans up a lot."
"Engineer, can I ask you something? If you're not a mystic and you're not a cyborg, what in creation are you?"
" 'Creation' is a loaded word, Marcel. I guess I'm a lot of little delta rules running recurrently, evaluating and updating themselves."
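Lentz's "little delta rules" can be unpacked in a minimal sketch — my gloss, not his code, and a far cry from B's millions of connections: a single output neurode, per his point that every quest can be rephrased as yes-or-no, nudging its weights by the error after each example.

```python
# Minimal delta-rule sketch (illustrative; everything here is invented).
# One linear neurode learns logical AND as a yes/no question.

def train_delta(examples, lr=0.1, epochs=200):
    w = [0.0, 0.0]   # one weight per input
    b = 0.0
    for _ in range(epochs):
        for x, target in examples:
            y = w[0] * x[0] + w[1] * x[1] + b    # the neurode's guess
            err = target - y                      # how wrong it was
            w[0] += lr * err * x[0]               # delta rule:
            w[1] += lr * err * x[1]               # dw = lr * err * input
            b += lr * err
    # Threshold the linear output into a yes-or-no answer
    return lambda x: w[0] * x[0] + w[1] * x[1] + b > 0.5

and_gate = train_delta([((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)])
print([and_gate(x) for x in [(0, 0), (0, 1), (1, 0), (1, 1)]])
# → [False, False, False, True]
```

Each weight evaluates and updates itself from nothing but the local error, over and over. Stack enough of these loops recurrently and you have Lentz's self-description, for whatever that story is worth.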
"Tell me a different story. I'm not sure I like that one."
We were well into the millions of connections when B seized up for good. We'd made so many tortuous increments, we'd stopgapped so many glitches that I did not, at first, see this collapse as fatal.
"John is a brother of Jim's," I told it. B turned the fact into a stream of hieroglyphic vectors that changed its layout imperceptibly. "Who is Jim's brother?"
"John," Imp B replied. Reliant knight. Already it outperformed some aphasics.
"Who is Jim?"
"John's sister." That much was fine. I could live with that answer. In fact, it taught me a thing or two about my own presumptive matrix.
I continued, "John gives Jim apples. Who gets apples?"
"Jim gets apples."
"Jim is given the apples by whom?"
"Jim is given the apples by John."
"Jim eats an apple. The apple is sour. Jim throws the other apples away. Why does Jim throw the other apples away?"
At that point, B's cranking time became unendurable. It returned something like, "Jim throws the other apples away because the apples are given by John."
"No," I told it, or words to that effect. "Start again. Why?"
"Jim throws the apples away. She does not want them."
A marginally acceptable answer. Maybe insight hid away somewhere in that tacit implication. But maybe the damn thing was bluffing. Its vagueness depressed me: the slow tyro during story hour, doomed from birth to a career in food service.
"Why doesn't she want them?"
"She doesn't eat them. So she can't want them."
This alien proto-intelligence differed just enough from sense to make my head throb. Still, we lay within acceptable performance margins. I went on to torture it with, "Jim hits John. Why does Jim hit John?" B had one of its damning seizures. It cranked all afternoon, resetting itself, grabbing randomly at a thousand possible but skewed associations.