“Well what?”
Dome took a puff, held the flame close to the end of the cigar again. It licked at the ash, then smoke curled away from it. He took the cigar out of his mouth, well aware of the ritual aspects of its lighting. “Well, what can you tell me about HARLIE?”
“I’ve spoken to him.”
“And what did he have to say for himself?”
“You’ve seen the duplicate printouts, haven’t you?”
“I’ve seen them,” Dome said. He was a big man, leather and mahogany like his office. “I want to know what they mean. Your discussion yesterday about sensory modes and alienation was fascinating — but what’s he really thinking about? You’re the psychologist.”
“Well, first off, he’s a child.”
“You’ve mentioned that before.”
“Well, that’s how he reacts to things. He likes to play word games. I think, though, that he’s seriously interested in working for the company.”
“Oh? I thought he said the company could go to hell.”
“He was being flippant. He doesn’t like to be thought of as a piece of property.”
Dome grunted, laid his cigar down, picked up a flimsy and glanced at the few sentences written there. “What I want to know is this — can HARLIE actually do anything that’s worth money to us? I mean something that a so-called ‘finger-counter’ can’t do.”
“I believe so.” Auberson was noncommittal. Dome was leading up to something, that was for sure.
“For your sake, I hope he can.” Dome laid the flimsy aside and picked up his cigar again. Carefully he removed the ash by touching the side of it to a crystal ashtray. “He costs three times as much as a comparable-sized ‘finger-counter.’ ”
“Prototypes always cost more.”
“Even allowing for that. Judgment modules are expensive. A self-programming computer may be the ultimate answer, but if it’s priced beyond the market — we might just as well not bother.”
“Of course,” agreed Auberson. “But the problem wasn’t as simple as we thought it was — or let’s say that we didn’t fully understand the conditions of it when we began. We wanted to eliminate the programming step by allowing the computer to program itself; but we had to go considerably beyond that. A self-programming, problem-solving device has to be as flexible and creative as a human being — so you might as well build a human being. There’s no way at all to make a self-programming computer that’s as cheap as hiring a comparably trained technician. At least, not at the present state of the art. Anyone who tried would just end up with another HARLIE. You have to keep adding more and more judgment units to give it the flexibility and creativity it needs.”
“And the law of diminishing returns will defeat you in the end,” said Dome. “If it hasn’t already. HARLIE’s going to have to be able to do a hell of a lot to be worth the company’s continued investment.” His sharp eyes fixed the psychologist where he sat.
This is it, thought Auberson. This is where he pulls the knife.
“I’m concerned about something you said yesterday at the meeting.”
“Oh?” He kept his voice flat.
“Mm, yes. This thing about turning HARLIE off — would you honestly bring murder charges against the company?”
“Huh?” For a moment, Auberson was confused. “I was just tossing that off. I wasn’t seriously considering it. Not then.”
“I hope not. I’ve spent all morning in conference with Chang, just on this one subject.” Chang was one of the company’s lawyers, a brilliant student of national and international business law. “Whether you know it or not, you brought up a point that we’re going to have to cover. Is HARLIE a legal human being or not? Any kind of lawsuit might establish a dangerous legal precedent. What if it turned out he was human?”
“He already is,” said Auberson. “I thought we established that.”
“I mean, legally human.”
Auberson was cautiously silent.
Dome continued. “For one thing, we’d be stuck with him whether he was profitable or not. We’d never be able to turn him off. Ever.”
“He’d be effectively immortal…” Auberson mused.
“Do you know how much he’s costing us now?”
The psychologist’s answer held a hint of sarcasm: “I have a vague idea.”
“Almost six and a half million dollars per year.”
“Huh? That can’t be.”
“It can and is. Even amortizing the initial seventeen million dollar investment over the next thirty years doesn’t make a dent in his annual cost. There’s his maintenance as well as the research loss due to the drain he’s causing on our other projects.”
“That’s not fair — adding in the cost of other projects’ delays.”
“It is fair. If you were still on the robotic law feasibility project, we’d have completed it by now.”
“Hah! That one’s a dead end. HARLIE’s existence proves it.”
“True, but we might have realized it earlier. And cheaper. Every project we have has to be weighed against every other.” Dome puffed at his cigar. The air was heavy with its smoke. “Anyway, we’re off the track. We can’t allow that danger, the possibility that HARLIE might be declared a legal human being. We can’t even afford to be taken to court on this — we’d have to disclose our schematics — which would be just what our competitors want. And that’s a human schematic, isn’t it? The court would be asked to determine just what it is that makes a human being. If they decide it’s his mental ability or brain pattern — well, I’m sure DataCo or InterBem would just love to tie us up with a few lawsuits, the kind that drag on for years — anything to keep us from producing judgment circuits. Do you want to be sued for slaveholding?”
“I think you’re worrying about a longshot,” Auberson scoffed.
“That’s my job. I’m responsible to the stockholders of this corporation. I have to protect their investment. Right now I’m acting President, and I’m concerned about a six and a half million dollar bite on my budget.” Dome had been acting President for six months now — the Board of Directors couldn’t agree on any one person long enough to hire him. And besides, the rumor went, they were just as happy to run the company themselves — which was one of the reasons why the HARLIE project was in trouble. HARLIE had been authorized by a far-sighted president and approved by a far more liberal board of directors than the present one. Now, less than three years later, the inheritors of the project were having doubts. The market had changed, they said — conditions were different, competition was stiff, and there wasn’t enough money to finance this kind of research. What they really meant was, “It wasn’t our idea, so why should we have to pay the dues on it?”
Dome was saying, “If the other companies found out what we were trying to do with HARLIE, we’d lose all advantage in building him. The legal considerations alone are terrifying. For instance, if he were somehow declared legally human, he would be an annual bite on the budget with no way to discontinue it short of murder. The possibility exists for a permanent financial drain on this company that would effectively stifle all future growth potential of this division. Hell, it would destroy this division. We might have to take a bath on the HARLIE project, but it would be preferable to the financial shackles that could be put on us. We have to be prepared for the possibility. There’s two things we can do about it. One—” he ticked off a finger — “we can turn him off now.”
Auberson started to protest, but Dome cut him off. “Hear me out, Auberson. I know all the reasons why we want to continue the HARLIE project — but let’s consider the other side. Two—” he ticked off another finger — “we get some kind of guarantee now that HARLIE is not legally human.”
Auberson stared in disbelief. “You really are taking this seriously, aren’t you?”
“Shouldn’t I? You know a corporation is a legal individual, don’t you? And a corporation only exists on paper. Compare that with HARLIE. It wouldn’t be that hard to prove he’s human, would it?”