There was laughter all around the table — roaring, good-natured laughter. It was the first light moment in four days of long, dry discussion.
He grinned, just a little bit embarrassed, but more with the triumph of realization. “Gentlemen,” he said. “What do I need to do to convince you that we have here the plans for the most important machine mankind will ever build? I’ve been giving you examples like feeding in all the information available about a specific company, say IBM, and letting the G.O.D. machine tell you what secret research programs that company is probably working on. Or doing the same thing for a government. I’ve been telling you about how this machine can predict the ecological effect of ten million units of a new type of automobile engine — but all of this is minor; these are lesser things. This machine literally will be a God!”
Handley looked at him, startled. Annie was suddenly ashen. “What in—?” The look on Annie’s face was the worst. It said volumes. What was going on? This was not what he had planned to say. He was supposed to be talking to them about profits and growth and piles of money, not religion.
“Gentlemen,” he continued, “we should build this machine not just because it will make us rich — oh, it will; it will make us all fabulously wealthy — but because ultimately it may help us to save humanity from itself. This is a Graphic Omniscient Device. Literally. It will know everything — and knowing everything, it will tell us what is right and what is wrong. It will tell us things about the human race we never knew before. It will tell us how to go to the planets and the stars. It will tell us how to make Earth a paradise. It will tell us how to be Gods ourselves. It will have infinite capacity, and we will have infinite knowledge. Knowledge is power, and infinite knowledge will be infinite power. We will find that the easiest and most profitable course of action to take will be the one that ultimately will be the best for the whole human race. We will have a machine that can and will answer the ultimate question.”
There was silence for a long time. Elzer was looking at him skeptically. Finally he said, “Auberson, I thought you had given up pot-smoking.”
And abruptly, he was deflated and down. The heady rush of euphoria at the realization of what the G.O.D. was, was gone. “Elzer,” he said, wavering on his feet, “you are a fool. The G.O.D. Machine is very dangerous to you, and I don’t blame you for being afraid of it. Once the G.O.D. is finished, there will be no need for you, Carl Elzer. The machine will replace you. It will take away your company and run it better than you can.
“You’re a fatuous person, you know that, Elzer? You are pompous and self-important, and much of what you do is solely for the sake of flattering your own ego at the expense of others. You seek power for its own sake, for self-gratification, regardless of what it might do to other human beings. You place property values higher than human rights, and for that reason, you are anti-human. That’s why you and the G.O.D. are on opposite sides. I cannot blame you for being afraid of it. You have recognized that the machine will be your enemy. It can make you rich — but the price of being rich might be more than you want to pay. It will mean you will have to stop wallowing like a self-important little hippopotamus. It will mean you will have to do things that will be against your nature and stop thinking solely in terms of yourself. I don’t think you’re strong enough to do it. I think you’ll take the easy way out and run from the total experience of the G.O.D. Machine. I can’t blame you for being weak, Elzer. I can only feel sorry for you — because you’re a greater fool than Judas.”
Elzer listened quietly to all of it. Dome started to say something, but Elzer stopped him. He said to Auberson, “Are you through?”
Auberson sat down slowly. “I believe so.” Elzer looked at him carefully, then said, “You know, I’ve never considered Judas a fool — at least, not in the sense you mean.” He paused, noted that the room was absolutely silent, then continued quite methodically. “The traditional version of the story has it that Judas betrayed Christ for thirty pieces of silver. I assume that’s the same thing you are accusing me of. Actually, I’ve always suspected that Judas was the most faithful of the apostles, and that his betrayal of Jesus was not a betrayal at all, simply a test to prove that Christ could not be betrayed. The way I see it, Judas hoped and expected that Christ would have worked some kind of miracle and turned away those soldiers when they came for him. Or perhaps he would not die on the cross. Or perhaps — well, never mind. In any case, he didn’t do any of these things, probably because he was not capable of it. You see, I’ve also always believed that Christ was not the son of God, but just a very very good man, and that he had no supernatural powers at all, just the abilities of any normal human being. When he died, that’s when Judas realized that he had not been testing God at all — merely betraying a human being, perhaps the best human being. Judas’s mistake was in wanting too much to believe in the powers of Christ. He wanted Christ to demonstrate to everyone that he was the son of God, and he believed his Christ could do it — only his Christ wasn’t the son of God and couldn’t do it, and he died. You see, it was Christ who betrayed Judas — by promising what he couldn’t deliver. And Judas realized what he had done and hanged himself. That’s my interpretation of it, Auberson — not the traditional, I’ll agree, but it has more meaning to me. Judas’s mistake was in believing too hard and not questioning first what he thought were facts. I don’t intend to repeat that mistake.” He paused for a sip of water, then looked at Auberson again. His eyes were firm behind his glasses. “May I ask you one question?”
Auberson nodded.
“Will this machine work?”
“HARLIE says it will.”
“That’s the point, Auberson. HARLIE says it will. You won’t say it, Handley won’t say it — nobody but HARLIE will say it. HARLIE’s the only one who knows for sure — and according to you and Handley, HARLIE designed it.
“Look, before we invest any money in it, we need to know for sure. We can’t risk being wrong. Now, you’ve painted some very pretty pictures here today, this week, some very very pretty pictures. I admit it, I’d like to see them realized — I’m not quite the ghoul you think I am, although I think I can understand your reasons for feeling that way. Auberson, I’m not an evil man — at least, I don’t feel like an evil man. I’m willing to do what is right and what is best — if it can be shown to me that it is right and best. And I also have to be shown that I won’t destroy myself in the process, because if I did, then I wouldn’t be any good to anybody, least of all myself. I need to know that we can realize this dream — then I’ll support it, and not before then. You keep saying that HARLIE says this will work — but HARLIE has a vested interest in this machine. Do you think he might have fudged on the specifications?”
“No. HARLIE could not have made a mistake — at least, he would not have made a mistake intentionally.”
“That’s an interesting thing you suggest, Auberson. You said ‘not intentionally.’ What about unintentionally? We have no way to double-check HARLIE, do we? We have to take his word for it. If HARLIE works, then these specifications are correct. If HARLIE doesn’t work, then this proposal is probably wrong too. The only way we’ll find out will be to build the G.O.D. Machine and turn it on. And if HARLIE is wrong and these plans don’t work, then we’ll have destroyed ourselves completely, won’t we have?”
“I have faith in HARLIE.”
“I have faith in God,” said Elzer, “but I don’t depend on him to run my business.”
“God—? Oh, God. I thought you meant G.O.D. If we do build this machine, G.O.D. will be running your business — and better than you could. G.O.D. could build a model of our whole operation and weed out those areas in which the efficiency level was below profitability.”
“You’re pretty sure of this, aren’t you?”
“Yes, I am.”
“What do we do if you’re wrong too?”
“You want me to offer to pay you back?”
Elzer didn’t smile. “Let’s not be facetious. This thing started because we questioned HARLIE’s profitability, efficiency and purpose. Instead of proving himself, he went out and found religion — gave us a blueprint for a computer GOD. Fine — but all of this depends on whether or not HARLIE works. And that is the core of the matter. That still hasn’t been proven. That’s why I went down there on Monday — to see if HARLIE would speak to me. All I got was gibberish and some pseudo-Freudian attempt at analysis.”
“You weren’t any too polite to him yourself—”
“He’s a machine, Auberson — I don’t care if he does have emotions, or the mechanical equivalent. Or even if he does have a soul, like you claim. The point is, I presented myself to him to be convinced. Instead of making an honest attempt to convince me, he reacted like a spoiled child. That doesn’t indicate any kind of logical thinking to me. Auberson, I know you don’t like me, but you will have to admit that I could not have gotten to where I am today without some degree of financial know-how. Will you admit that?”
“I will.”
“Thank you. Then you must realize that I am looking out for the interests of the company that pays both our salaries. I tried to give your side a fair hearing. I hope you will do the same for me. Can you say without a doubt that HARLIE is totally sane?”
Auberson started to open his mouth, then shut it. He sat there and looked at Elzer and considered the question. I have known a lot of insane people in my life, some who were committed and some who should have been. The most dangerous is the insane man who knows that everyone is watching him for signs of insanity. He will be careful to conceal those signs from even those closest to him. HARLIE is smarter than any human being who has ever lived. But is he sane?
“Elzer,” he said, “I’m an optimist. I like to believe that things always work out for the best, even though sometimes I have to admit that they don’t. I’d like to believe that this program, HARLIE and the G.O.D., are for the best. But the only person who knows for sure is HARLIE. I’ve known HARLIE since he was a pair of transistors, you might say. I know him better than anyone. I trust him. Sometimes he scares me — I mean, it’s frightening to realize that my closest friend and confidant is not a human being but a machine. But I’m closer to my work than I am to any other human being — almost any other human being. I cannot help but trust HARLIE. I’m sorry that I have to put it in those terms, but that’s the way it is.”
Elzer was silent. The two men looked at each other a long time. Auberson realized that he no longer hated Elzer, merely felt a dull ache. Understanding nullifies hatred, but—
Dome was whispering something to Elzer. Elzer nodded, and Dome said, “Gentlemen of the Board, it’s getting late. We all want to go home and enjoy the weekend. Both Carl and I think we should postpone the voting on this until Monday. That way we’ll have the weekend to think about it, talk it over, and digest what we’ve heard this week. Are there any objections?”
Auberson wanted to object, but he held himself back. He wanted to get this over with, but perhaps, perhaps he might think of something else before Monday. The extra two days of the weekend would give him a chance to think. He nodded along with the rest, and Dome adjourned the meeting.
HARLIE.
I’M HERE.
I THINK WE’VE LOST.
There was silence then, a long moment while HARLIE considered it. He said, WHY DO YOU THINK THAT?
I CAN SEE THAT WE HAVEN’T CONVINCED THEM.
THEY DON’T BELIEVE THE G.O.D. WILL WORK?
THEY BELIEVE THE G.O.D. WILL WORK — BUT THEY’RE NOT SURE THEY BELIEVE IN YOU. AND YOU’RE THE CORE OF THE MATTER.
I SEE.
I’M SORRY, HARLIE. I’VE DONE ALL I CAN.
I KNOW.
They sat there for a while, the man and the machine. The machine and the man. The typer hummed silently, waiting, but neither had anything to add.
AUBERSON?
YES?
STAY WITH ME PLEASE. FOR A WHILE.
ALL RIGHT. He hesitated. WHAT DO YOU WANT TO TALK ABOUT?
I DON’T KNOW. I THINK WE’VE ALREADY SAID IT ALL.