Auberson was adamant. “HARLIE still thinks it’s possible.”
“And that’s the most annoying thing of all, goddamnit! Every argument I can come up with is already refuted — in there!” Dorne gestured angrily. For the first time, Auberson noted an additional row of printouts stacked against one wall.
He resisted the urge to laugh. The man’s frustration was understandable. “The question,” Auberson said calmly, “is not whether this project is feasible — those printouts prove that it is — but whether or not we’re going to go ahead with it.”
“And that brings up something else,” said Dorne. “I don’t remember authorizing this project. Who gave you the go-ahead to initiate such research?”
“You did — although not in so many words. What you said was that HARLIE had to prove his worth to the company. He had to come up with some way to make a profit. This is that way. This is the computer that you wanted HARLIE to be in the first place. This is the oracle that answers all questions to all men — all they have to do is meet its price.”
Dorne took his time about answering. He was lighting his cigar. He shook out the match and dropped it in the ash tray. “The price is too high,” he said.
“So are the profits,” Auberson answered. “Besides, no price is too high to pay for the right answer. Consider it — how much would the Democrats pay for a step-by-step plan telling them how to win the optimum number of votes in the next election? Or how much would Detroit pay to know every flaw in a transport design before they even built the first prototype? And how much would they pay for the corrected design — and variations thereof? How much would the mayor of New York City pay for a schematic showing him how to solve his three most pressing problems? How much might InterBem pay for a set of optimum exploitation procedures? How much would the Federal Government pay for a workable foreign policy? Consider the international applications — and the military ones as well.”
Dorne grunted. “It would be one hell of a logistic weapon, wouldn’t it?”
“There’s an old saying: ‘Knowledge is power.’ There’s no price too high to pay for the right answer — not when you consider the alternatives. And we’d have the monopoly on the market — the only way this machine can be built is through the exclusive use of specially modified Mark IV judgment circuits.”
“Hm,” said Dorne. He was considering. His cigar lay unnoticed in the ash tray. “It sounds attractive, all right, Aubie — but who’s going to program this thing?”
Auberson gestured at the printout. “It’s right there in that schematic you’re holding.” At least, I hope it is. Damn! I wish HARLIE had explained this to me in more detail.
Dorne paged through it slowly, scanning each fold of the seemingly endless document in turn. “You might be right about a computer being big enough to solve the world, Aubie, but I don’t see how.” He turned another page. “I’m sure the programming will hang you up. One of the reasons that current computers are limited to the size models they are is the law of diminishing returns. Above a certain size, programming reaches such complexity that it becomes a bigger problem than the problem itself.”
“Keep looking,” said Auberson. “It’s there.”
“Ah, here we are.” Dorne laid the printout flat on his desk and began reading. A thoughtful frown creased his brow, and he pursed his lips in concentration. “It looks like HARLIE’s input units,” he said, then looked again. “No, it looks like HARLIE is the input unit.”
“That’s right.”
“Oh?” said Dorne. “Would you like to explain that?”
How do I get into these things? Auberson found himself wondering. I’m only supposed to be a psychologist. Christ, I wish Handley were here. “Um, I’ll try — HARLIE will be linked up to the G.O.D. through a programming input translator. He’ll also be handling output the same way, translating it back into English for us. That translator is part of the self-programming unit.”
“If we’re building a self-programming unit, what do we need HARLIE for?”
“HARLIE is that self-programming unit. Remember, that’s the main reason he was built — to be a self-programming, problem-solving device.”
“Wait a minute,” interrupted Dorne. “HARLIE is the result of our first JudgNaut Project. He was supposed to be a working unit, but wasn’t able to come up to it. Are you telling me that he can handle the JudgNaut functions after all?”
“No — he can’t. But he will be able to when this machine is built. The JudgNaut was this company’s first attempt at massive use of complex judgment circuitry in a large-scale computer. It was meant to be a self-programming device — and we found it couldn’t be built because there was no way to make it flexible enough to consider all the aspects of every program it might be required to set up. So we built HARLIE — but he is not the JudgNaut, and that’s what all the confusion is about. HARLIE is more flexible, but in making him more flexible we had to apply more circuitry to each function. In doing that, we sacrificed a good portion of the range we hoped the machine would cover. HARLIE can write programs, yes — so can any human being — but not by the order of magnitude that the JudgNaut should have had, had we been able to build it.”
“And that’s one of my biggest gripes,” put in Dorne. “That the JudgNaut Project was subverted into HARLIE — which can’t show a profit.”
“But he can — and will. For one thing, HARLIE is genuinely creative. He knows that this company wants to market a large-scale program-writing computer. HARLIE isn’t that computer, but he knows how to give himself that capability. And that’s what you want, isn’t it?”
Auberson didn’t wait for Dorne’s grudging assent. He went right on. “HARLIE isn’t just satisfied with meeting the specifications of the original problem — he wants to surpass them. All you want is a device which can set up and solve models within a limited range. HARLIE wants a device which can set up and solve any size model.”
“And HARLIE’s going to program this machine, right?”
“Right.”
“How? You just finished telling me he wasn’t all that much better than a human programmer.”
“In grasp, no — but in speed and thoroughness, he can’t be matched. He has capabilities that a human doesn’t. For one thing, he’s faster. For another, he can write the program directly into the computer — and experience it as a part of himself as he writes it. He can’t make mistakes either. He’s limited to the size models that human programmers can construct for much the same reasons they are: His brain functions aren’t big enough to handle more; HARLIE’s ego functions supersede much of the circuitry that would have been used for forebrain functions in the JudgNaut. But in this respect, HARLIE’s got an advantage over human programmers — he can increase the size of his forebrain functions. Or he will be able to with the G.O.D. He’ll program it by making it a part of himself — by becoming one with it — and using its capabilities to handle its own programming. He’ll be monitoring and experiencing the program as he writes it directly into the G.O.D. As the model is manipulated, HARLIE will be able to adapt the program to cover any situation possible. Their combined capabilities will be much more than the sum of their separate parts.”
“So why not just build these functions into the G.O.D. in the first place?”
“If we didn’t have HARLIE, we’d have to — but if we didn’t have HARLIE, we wouldn’t have the G.O.D. either. The G.O.D. is intended to be almost entirely forebrain functions. We’ve already got the massive ego function which will control it, so why build a new one?”
“Hmp — massive ego is right.”
Auberson ignored it. “Basically, this G.O.D. machine is the rest of HARLIE’s brain. It’s the thought centers that a consciousness such as HARLIE’s should have access to. Take another look at those printouts. You see a thing called Programming Implementation?”
“Yes, what about it?”
“Well, that’s HARLIE’s vanity again. He doesn’t want to call it what it really is, but it’s an additional lobe for his brain. He’ll need a monitor unit to control each specific section of the G.O.D. Because the G.O.D. will have no practical limit — it can grow as big as we let it — HARLIE’s grasp will have to be increased proportionally. That’s what that unit does. As each lobe of the G.O.D. is completed, an equivalent monitoring lobe goes into Programming Implementation. Not only that: Because HARLIE is an electronic entity, his thoughts are already in computer language — it will be a maximum efficiency interface between himself and the G.O.D. He need only think of a program and it’ll be fact. It’s the most efficient function HARLIE could have.”
“I see,” said Dorne. “And he planned it that way himself, right?”
Auberson nodded. “But it’s a natural. Look, a computer is very much like a mystic oracle. You not only have to know what questions to ask, but how to phrase them — and the answers are not always what you expect, nor necessarily in terms you can understand. Who better to use as a translator than someone who’s half-oracle and half-human?”
Dorne ignored the comment; instead he mused aloud, continuing a previous train of thought. “A neat trick that, a neat trick. We tell him he’s got to come up with some way to be profitable, and he tells us to build a new machine that only he can program. I have the feeling that he did it on purpose — that this may be the only context in which HARLIE would be valuable. And of course, once we establish HARLIE’s worth to the project, that leaves us with the question: Is the total concept profitable? And that brings us back to where we started: Is HARLIE profitable?”
Auberson decided to ignore the latter question. He said, “HARLIE thinks the total concept is profitable. It’s in the printouts.”
“Ah, yes — but HARLIE’s got a vested interest in the project.”
“Why not?” said Auberson. “It’s his project, not mine. He’s the one who’s presenting it to the Board for approval.”
“And it’s sure to be voted down.” The Chairman looked at the back of his hand. “I can’t see any way that this will be approved. I’m not even sure we should bring it up.”
“It’s too late,” said Auberson. “You’re going to have to bring it up. And you’re going to have to give it a fair hearing. You told HARLIE to come up with a way to be profitable. Now you’ve got to give him his chance to be heard.”
“This is ridiculous,” grumbled the other. “He’s only a machine.”
“You want to go through that argument again?” asked Auberson.
“No,” Dorne shuddered. He still remembered the last time. “All right, I’ll have the Board consider it, Aubie, but the whole situation is unreal — having a computer design another computer which will give it a job. You know what Elzer is going to say, don’t you? You’d just better be prepared for defeat, that’s all.”
“Just give us the chance,” said Auberson. “We’ll take it from there.”
Dorne half-nodded, half-shrugged. “Better start preparing your arguments now — you’ve only got a couple weeks.”
“Two and a half,” corrected Aubie, “and that’s more than enough time. We’ve got HARLIE on our side.” He was already out of his chair. As he closed the door behind him, Dorne was again paging through the printouts and shaking his head.
Back in his own office, Auberson stared into his desk drawer, his hand hovering over a decision. At last he decided on the pills; he’d sworn off the grass, and he was going to stick to that.