WHAT ELSE?
HOW MUCH OF A PROFIT DO I HAVE TO SHOW?
YOUR COST, PLUS TEN PERCENT.
ONLY TEN PERCENT?
IF YOU CAN DO MORE, THEN DO IT.
HMM.
STUMPED?
NO. JUST THINKING.
HOW MUCH TIME DO YOU NEED?
I DON’T KNOW. AS LONG AS IT TAKES.
ALL RIGHT.
Dome said, “Sit down, Auberson.”
Auberson sat. The padded leather cushions gave beneath his weight. Dome paused to light his cigar, then stared across the wide expanse of mahogany at the psychologist. “Well?” he said.
“Well what?”
Dome took a puff, held the flame close to the end of the cigar again. It licked at the ash, then smoke curled away from it. He took the cigar out of his mouth, well aware of the ritual aspects of its lighting. “Well, what can you tell me about HARLIE?”
“I’ve spoken to him.”
“And what did he have to say for himself?”
“You’ve seen the duplicate printouts, haven’t you?”
“I’ve seen them,” Dome said. He was a big man, leather and mahogany like his office. “I want to know what they mean. Your discussion yesterday about sensory modes and alienation was fascinating — but what’s he really thinking about? You’re the psychologist.”
“Well, first off, he’s a child.”
“You’ve mentioned that before.”
“Well, that’s how he reacts to things. He likes to play word games. I think, though, that he’s seriously interested in working for the company.”
“Oh? I thought he said the company could go to hell.”
“He was being flippant. He doesn’t like to be thought of as a piece of property.”
Dome grunted, laid his cigar down, picked up a flimsy and glanced at the few sentences written there. “What I want to know is this — can HARLIE actually do anything that’s worth money to us? I mean something that a so-called ‘finger-counter’ can’t do.”
“I believe so.” Auberson was noncommittal. Dome was leading up to something, that was for sure.
“For your sake, I hope he can.” Dome laid the flimsy aside and picked up his cigar again. Carefully he removed the ash by touching the side of it to a crystal ash tray. “He costs three times as much as a comparable-sized ‘finger-counter.’ ”
“Prototypes always cost more.”
“Even allowing for that. Judgment modules are expensive. A self-programming computer may be the ultimate answer, but if it’s priced beyond the market — we might just as well not bother.”
“Of course,” agreed Auberson. “But the problem wasn’t as simple as we thought it was — or let’s say that we didn’t fully understand the conditions of it when we began. We wanted to eliminate the programming step by allowing the computer to program itself; but we had to go considerably beyond that. A self-programming, problem-solving device has to be as flexible and creative as a human being — so you might as well build a human being. There’s no way at all to make a programming computer that’s as cheap as hiring a comparably trained technician. At least, not at the present state of the art. Anyone who tried would just end up with another HARLIE. You have to keep adding more and more judgment units to give it the flexibility and creativity it needs.”
“And the law of diminishing returns will defeat you in the end,” said Dome. “If it hasn’t already. HARLIE’s going to have to be able to do a hell of a lot to be worth the company’s continued investment.” His sharp eyes fixed the psychologist where he sat.
This is it, thought Auberson. This is where he pulls the knife.
“I’m concerned about something you said yesterday at the meeting.”
“Oh?” He kept his voice flat.
“Mm, yes. This thing about turning HARLIE off — would you honestly bring murder charges against the company?”
“Huh?” For a moment, Auberson was confused. “I was just tossing that off. I wasn’t seriously considering it. Not then.”
“I hope not. I’ve spent all morning in conference with Chang, just on this one subject.” Chang was one of the company’s lawyers, a brilliant student of national and international business law. “Whether you know it or not, you brought up a point that we’re going to have to cover. Is HARLIE a legal human being or not? Any kind of lawsuit might establish a dangerous legal precedent. What if it turned out he was human?”
“He already is,” said Auberson. “I thought we established that.”
“I mean, legally human.”
Auberson was cautiously silent.
Dome continued. “For one thing, we’d be stuck with him whether he was profitable or not. We’d never be able to turn him off. Ever.”
“He’d be effectively immortal…” Auberson mused.
“Do you know how much he’s costing us now?”
The psychologist’s answer held a hint of sarcasm: “I have a vague idea.”
“Almost six and a half million dollars per year.”
“Huh? That can’t be.”
“It can and is. Even amortizing the initial seventeen million dollar investment over the next thirty years doesn’t make a dent in his annual cost. There’s his maintenance as well as the research loss due to the drain he’s causing on our other projects.”
“That’s not fair — adding in the cost of other projects’ delays.”
“It is fair. If you were still on the robotic law feasibility project, we’d have completed it by now.”