“This year’s, I suppose?”
“That’s right.” He said it proudly.
“You knew that its Thorsen Auto-Pilot was one of our units, didn’t you?” He didn’t wait for an answer — it was a rhetorical question. “It was made possible by the variable-path circuits that we’ve been producing for the past four years and marketing as the Mark IV. Basically, that’s a simplified version of one type of HARLIE function module.”
“You mean HARLIE’s a giant judgment circuit?”
“HARLIE is a human brain — with solid-state circuitry instead of organic nerves. We use the judgment circuits to duplicate the human functions. The important part of the human brain is actually a series of very complex judgment paths. They don’t work exactly the same as HARLIE’s, but close enough. The difference is in mechanisms, not basic principles. If a nerve impulse is strong enough, it can trigger other nerves around it; the number of nerves reporting allows the brain to interpret the strength of the original stimulus. HARLIE’s circuits work the same way. The strength of the ‘yes’ impulses (or ‘on’ circuits) determines the interpretation. Just for HARLIE to complete one thought involves several thousand compacted judgment boxes.”
“Uh, what stage of compaction are HARLIE’s judgment boxes?” Clintwood again.
“It’s adjustable, depending on the precision HARLIE wants to bring to any one problem. Or needs to. It’s a matter of how many times a decision can be subdivided before such precision becomes redundant. He has a judgment unit to control it.”
Clintwood nodded and scratched something on his notepad.
Elzer remained unimpressed. “It’s still a computer, isn’t it?”
Auberson looked at him, frustrated by the man’s inability to understand. “Yes, in the same sense that your brain is equivalent to a toad’s.”
The reaction was immediate, a chorus of disapproving remarks. One voice, Dome’s, louder than the rest, kept insisting, “Here now! Here now! We’ll have quiet.” As the noise subsided, he continued. “Auberson, if you can’t keep your personal opinions out of this—”
“Mr. Dome — Chairman Dome — I did not mean the comment as an insult to Mr. Elzer. I was assuming that Mr. Elzer’s brain was better, more complex than a toad’s. Assuming that he has an average human brain, he is as far above a toad as HARLIE is above a simplified autopilot judgment circuit.”
The room quieted somewhat. “However,” Auberson went on, “if Mr. Elzer feels that there is not enough difference between his brain and that of a toad, I’ll have to use some other comparison — hopefully one not so open to misinterpretation. Did you get all that, Miss Stimson?”
Miss Stimson, the Executive Secretary, looked up at him, eyes twinkling. She had gotten it.
“There is a significant difference that I might note,” he added, spacing out his words carefully. “HARLIE uses all of his brain…” Auberson waited to see if Elzer would rise to this; he didn’t. “Estimates vary, but we figure that the average human being uses only ten to fifteen percent of his available brain cells. We couldn’t afford that kind of luxury with HARLIE, so we built him to use his total brain capacity. He’s not as complex as a human brain — he has nowhere near the same number of ‘cells’ — but he can still function quite well at human levels. Building HARLIE taught us quite a bit about the workings of the human brain. In fact, we were surprised to find out that in many ways it’s simpler than we thought it was.
“HARLIE’s the result of a very foresighted decision made several years ago to explore the possibilities of judgment circuitry as thoroughly as possible. I’m sure I don’t have to comment on the wisdom of that decision. An on-off circuit can’t do the things a variable pattern can. It’s only the Mark IV unit that’s given us a serious piece of the computer market. That’s why we have to keep pushing. If we ever want to catch up with IBM — and such a thing is not impossible — if we ever want to catch up, we need to be the front-runner in judgment circuits. We have to continue with the HARLIE project.”
“Why?” asked Elzer. “Certainly we can continue producing judgment circuits without HARLIE.”
“We can — but that’s the sure and certain road to corporate oblivion. Look, the Thorsen Auto-Pilot is a fine little unit; it can’t be disparaged. But it’s only the equivalent of an IBM Pixie Desktop Calculator. It isn’t any more complex than that. If we want to catch up, we have to go after their JuggerNaut Series. That’s what HARLIE was originally supposed to be — the ultimate in self-programming computers.
“When Handley came on the project, though, its direction changed; the goal became even more lofty. Or maybe I should say, the way to achieve the goal involved an even greater challenge than we had originally thought. Look, you have to understand what Don was up to before he came here. He’d been doing research with a neuro-psychology team down in Houston; they’d been diagramming the basic pattern structures of the human brain. Have you ever seen the schematic of a thought? Don has. Do you know how to program a human brain? Don does. That’s what he was working on before he came here. Anyway, when they started to design HARLIE — he was called JudgNaut One then — Handley was struck by the similarity of the schematics to those of the human brain. The basic judgment paths were too much alike for the thought patterns not to be similar.
“Because the basic structures were so similar in function, Handley felt — and Digby concurred with him — that what they were building was indeed a human brain. Electronic parts, if you will, but undeniably human. Once that was realized, they worked specifically toward that end. Don sent to Houston for his notes, and soon they had a basic schematic of the total machine they wanted. They called it HARLIE and it was to be a self-programming, problem-solving device.”
“You say, ‘it was to be,’ ” said Elzer. “Isn’t it?”
“It is and it isn’t. It isn’t what the JudgNaut was supposed to be, no. But a human brain is a self-programming, problem-solving device — so they did meet the specifications of the original problem.”
“And what were you hired for? To be its baby-sitter?”
“To be its mentor. His mentor,” he corrected.
“Same thing,” snorted Elzer.
“I was brought onto the project as soon as it was realized that HARLIE would be human. Don and I worked together to plan his programming. Don was concerned with how he would be programmed — I was concerned with what.”
“Sort of a mechanical godfather,” said Elzer.
“If you will. Somebody had to guide HARLIE and plan for his education. At the same time, we were learning quite a bit about human and mechanical psychologies. By the time HARLIE went operational, I thought I had a year’s worth of lesson plans to work with. He went through them in three months, and ever since we’ve been trying to catch up. HARLIE has no trouble at all with rote work; it’s when we get to the human stuff that we start bogging down. I don’t know whether we’re losing him or he’s losing us.”
“If you don’t know what you’re doing,” interrupted Elzer, “then how did you ever get to be in charge of the project?”
Auberson decided to ignore that. “When Digby died it was a choice between myself and Handley. We flipped a coin because it didn’t make much difference to either of us. I lost.”
His flippancy was wasted on Elzer. “You mean you don’t want the job?”
Auberson could see what was coming. But he said, “Not exactly. It’s just that there’s so damn much busy work that it keeps me away from my real job — HARLIE.”
Elzer pounced on it anyway. “You see,” he said to the rest of the Board. “This proves my point. We have a man in charge of this project who doesn’t even care about it.”
Auberson was on his feet at that. Dome was saying, “Oh, now wait a minute—”
“When we lost Digby we should have closed it down,” Elzer insisted. “All we have left are Indians and no Chief.”