“Not God,” Auberson corrected. “G.O.D. The acronym is G.O.D. It means Graphic Omniscient Device.”
“I don’t care what the acronym is — you know as well as I what they’re going to call it.”
“The acronym was HARLIE’s suggestion, not mine.”
“It figures.” The Board Chairman pulled a cigar out of his humidor but didn’t light it.
“Well, why not?” said Auberson. “He designed it.”
“Is he planning to change his own name too? Computerized Human Robot, Integrating Simulated Thought?”
Auberson had heard the joke before. He didn’t laugh. “Considering what this new device is supposed to do — and HARLIE’s relationship to it — it might be appropriate.”
Dorne was in the process of biting off the tip of his cigar when Auberson’s words caught him. Now he didn’t know whether to swallow the tip of it, which had lodged in his throat, or spit it out. An instinctive cough made the decision for him. Distastefully, he picked the knot of tobacco off his tongue and dropped it into an ash tray. “All right,” he said. “Tell me about the God Machine.”
Auberson was holding a HARLIE-printed summary in one hand, but he didn’t need it to answer this question. “It’s a model builder. It’s the ultimate model builder.”
“All computers are model builders,” said Dorne. He was unimpressed.
“Right,” agreed Auberson, “but not to the extent this one will be. Look, a computer doesn’t actually solve problems — it builds models of them. Or rather, the programmer does. That’s what the programming is, the construction of the model and its conditions — then the machine manipulates the model to achieve a variety of situations and solutions. It’s up to us to interpret the results as a solution to the original problem. The only limit to the size of the problem is the size model the computer can handle. Theoretically, a computer could solve the world — if we could build a model big enough and a machine big enough to handle it.”
“If we could build that big a model, it would be duplicating the world.”
“In its memory banks, yes.”
“A computer with that capability would have to be as big as a planet.”
“Bigger,” said Auberson.
“Then, if you agree with me that it’s impossible, why bother me with this?” He slapped the sheaf of printouts on his desk.
“Because obviously HARLIE doesn’t think it’s impossible.”
Dorne looked at him coldly. “You know as well as I that HARLIE is under a death sentence. He’s getting desperate to prove his worth so we won’t turn him off.”
Auberson pointed. “This is his proof.”
“Dammit, Aubie!” Dorne exploded in frustration. “This thing is ridiculous! Have you looked at the projected costs of it? The financing charts? It would cost more to do than the total worth of the company.”
Auberson was adamant. “HARLIE still thinks it’s possible.”
“And that’s the most annoying thing of all, goddamnit! Every argument I can come up with is already refuted — in there!” Dorne gestured angrily. For the first time, Auberson noted an additional row of printouts stacked against one wall.
He resisted the urge to laugh. The man’s frustration was understandable. “The question,” Auberson said calmly, “is not whether this project is feasible — those printouts prove that it is — but whether or not we’re going to go ahead with it.”
“And that brings up something else,” said Dorne. “I don’t remember authorizing this project. Who gave you the go-ahead to initiate such research?”
“You did — although not in so many words. What you said was that HARLIE had to prove his worth to the company. He had to come up with some way to make a profit. This is that way. This is the computer that you wanted HARLIE to be in the first place. This is the oracle that answers all questions to all men — all they have to do is meet its price.”
Dorne took his time about answering. He was lighting his cigar. He shook out the match and dropped it in the ash tray. “The price is too high,” he said.
“So are the profits,” Auberson answered. “Besides, no price is too high to pay for the right answer. Consider it — how much would the Democrats pay for a step-by-step plan telling them how to win the optimum number of votes in the next election? Or how much would Detroit pay to know every flaw in a transport design before they even built the first prototype? And how much would they pay for the corrected design — and variations thereof? How much would the mayor of New York City pay for a schematic showing him how to solve his three most pressing problems? How much might InterBem pay for a set of optimum exploitation procedures? How much would the Federal Government pay for a workable foreign policy? Consider the international applications — and the military ones as well.”
Dorne grunted. “It would be one hell of a logistic weapon, wouldn’t it?”
“There’s an old saying: ‘Knowledge is power.’ There’s no price too high to pay for the right answer — not when you consider the alternatives. And we’d have the monopoly on the market — the only way this machine can be built is through the exclusive use of specially modified Mark IV judgment circuits.”
“Hm,” said Dorne. He was considering. His cigar lay unnoticed in the ash tray. “It sounds attractive, all right, Aubie — but who’s going to program this thing?”
Auberson gestured at the printout. “It’s right there in that schematic you’re holding.” At least, I hope it is. Damn! I wish HARLIE had explained this to me in more detail.
Dorne paged through it slowly, scanning each fold of the seemingly endless document in turn. “You might be right about a computer being big enough to solve the world, Aubie, but I don’t see how.” He turned another page. “I’m sure the programming will hang you up. One of the reasons that current computers are limited in the size of the models they can handle is the law of diminishing returns. Above a certain size, programming reaches such complexity that it becomes a bigger problem than the problem itself.”
“Keep looking,” said Auberson. “It’s there.”
“Ah, here we are.” Dorne laid the printout flat on his desk and began reading. A thoughtful frown creased his brow, and he pursed his lips in concentration. “It looks like HARLIE’s input units,” he said, then looked again. “No, it looks like HARLIE is the input unit.”
“That’s right.”
“Oh?” said Dorne. “Would you like to explain that?”
How do I get into these things? Auberson found himself wondering. I’m only supposed to be a psychologist. Christ, I wish Handley were here. “Um, I’ll try — HARLIE will be linked up to the G.O.D. through a programming input translator. He’ll also be handling output the same way, translating it back into English for us. That translator is part of the self-programming unit.”
“If we’re building a self-programming unit, what do we need HARLIE for?”
“HARLIE is that self-programming unit. Remember, that’s the main reason he was built — to be a self-programming, problem-solving device.”
“Wait a minute,” interrupted Dorne. “HARLIE is the result of our first JudgNaut Project. He was supposed to be a working unit, but wasn’t able to come up to it. Are you telling me that he can handle the JudgNaut functions after all?”
“No — he can’t. But he will be able to when this machine is built. The JudgNaut was this company’s first attempt at massive use of complex judgment circuitry in a large-scale computer. It was meant to be a self-programming device — and we found it couldn’t be built because there was no way to make it flexible enough to consider all the aspects of every program it might be required to set up. So we built HARLIE — but he is not the JudgNaut, and that’s what all the confusion is about. HARLIE is more flexible, but in making him more flexible we had to apply more circuitry to each function. In doing that, we sacrificed a good portion of the range we hoped the machine would cover. HARLIE can write programs, yes — so can any human being — but not by the order of magnitude that the JudgNaut should have had, had we been able to build it.”