Destination: Void: Prequel to the Pandora Sequence
Frank Herbert
Chapter 11
Symbolic behavior of some order has to be a requisite of consciousness. And it must be noted that symbols abstract—they reduce a message to selected form.
—Morgan Hempstead, Lectures at Moonbase
“Spread out that software on the bench, Tim,” Bickel directed. “Start by putting the pertinent parts of the loading plan on top of what we’ll need from robot stores. I’ll be with you in a minute.”
Timberlake looked at Bickel’s back. Control had passed so obviously into the man’s hands. No one questioned it … now. He shrugged, began laying out the manifests and loading plans.
Bickel glanced around the room.
The computer maintenance shop was designed in such a way that Com-central nested partly into the curve of one wall. The shop presented one flat wall opposite Com-central, a wall about four and a half meters high and ten meters long—its face covered with plugboards, comparators, simultaneous multiplexers, buffer-system monitors, diagnostic instruments—dials and telltales.
Behind that wall’s hardware and shields lay the first banks of master-program routing that led down to core memory sections and the vast library of routines that marked out the limits of the equipment.
“We’ll have to block-sort the system to find all the audio and visual links and the AAT bands,” Bickel said. “It’s going to be a bootstrap operation all the way and the only information going back into the system will have to come from us. That means one of us will have to monitor the readout at all times. We’ll have to sort out the garbage as we go and keep a running check on every control sequence we use. Let’s start with a gate-circuit system right here.” Bickel indicated an optical character reader on the wall directly in front of him.
It was all clear to him—this entrance into the problem. If only he could keep this gate of his own awareness open—one step at a time.
But there remained the weight of those six previous failures … reasons unknown: more than eighteen thousand people lost.
They don’t think of us as real people, Bickel told himself. We’re expendable components, easily replaced.
What happened with the other six ships?
He wiped perspiration from his hands.
The conference hookups with station personnel had served only to frustrate him. He remembered sitting at his pickup desk staring into the vid-eye screen across his ink-stained blotter, watching the movement of faces in the screen divisions—faces he knew only in an untouchable, secondhand way.
The memory was dominated by Hempstead’s voice issuing from that harsh wide mouth with its even rows of teeth:
“Any theory introduced to explain the loss of those ships must remain a theory at present. In the final analysis, we must admit we simply do not know what happened. We can only guess.”
Guesses:
System failure.
Mechanical failure.
Human failure.
And subdivisions within subdivisions to break down the rows of guesses.
But never a word of suspicion about the Organic Mental Cores. Not one hint or theory or guess. The brains were perfect.
“Why?” Bickel muttered, staring at the gauges of the computer panel.
The stacked schematics on the bench rustled as Timberlake looked up. “What?”
“Why didn’t they suspect OMC failure?” Bickel asked.
“Stupid mistake.”
“That’s too pat,” Bickel protested. “There’s something … some overriding reason we weren’t given all the facts.” He approached the computer panel, wiped away a small smudged fingerprint.
“What’re you getting at?” Timberlake asked.
“Think how easy it was to keep a secret from us. Everything we did or said or breathed or ate was under their absolute control. We’re the orbiting orphans, remember? Sterile isolation. The story of our lives: sterile isolation—physical … and mental.”
“That’s not reasonable,” Timberlake said. “There’re good reasons for sterile isolation, big advantages in a germ-free ship. But if you keep information from people who need it … well, that’s not optimum.”
“Don’t you ever get tired of being manipulated?” Bickel asked.
“Ahhhh, they wouldn’t.”
“Wouldn’t they?”
“But …”
“What do we really know about Tau Ceti Project?” Bickel asked. “Only what we’ve been told. Automatic probes were sent out. They say they found this one habitable planet circling Tau Ceti. So UMB began sending ships.”
“Well, why not?” Timberlake asked.
“Lots of reasons why not.”
“You’re too damn suspicious.”
“Sure I am. They tell us that because of the dangers, they send only duplicate-humans … Doppelgangers.”
“It makes sense,” Timberlake said.
“You don’t see anything suspicious in this setup?”
“Hell, no!”
“I see.” Bickel turned away from the glistening face of the computer panel, scowled at Timberlake. “Then let’s try another tack.
Don’t you find it at all difficult to focus on this problem of consciousness?”
“On what?”
“We have to make an artificial consciousness,” Bickel said. “That’s our main chance. Project knows it … so do we. Do you find it difficult to face this problem?”
“What problem?”
“You don’t think it’ll be much of a problem manufacturing an artificial consciousness?”
“Well …”
“Your life depends on solving it,” Bickel said.
“I guess so.”
“You guess so! D’you have an alternative plan?”
“We could turn back.”
Bickel fought down a surge of anger. “None of you see it!”
“See what?”
“The Tin Egg’s almost totally dependent on computer function. The AAT system uses computer translation banks. All our ship sensors are sorted through the computer for priority of presentation on Com-central’s screens. Every living soul in the hyb tanks has an individual life-system program—through the computer. The drive is computer governed. The crew life systems, the shields, the fail-safe circuits, hull integrity, the radiation reflectors …”
“Because everything was supposed to be left under the control of an OMC.”
Bickel crossed the shop in one low-gravity step, slapped a hand onto the papers on the bench. The movement sent several papers fluttering to the deck, but he ignored them. “And all the brains on six—no, seven!—ships failed! I can feel it right in my guts. The OMCs failed … and we weren’t given one word of warning.”
Timberlake started to speak, thought better of it. He bent, collected the schematics from the deck, replaced them on the bench. Something about the force of Bickel’s words, some product of vehemence prevented argument.
He’s right, Timberlake thought.
Timberlake looked up at Bickel, noting the perspiration on the man’s forehead, the frown lines at the corners of his eyes. “We still could turn back,” Timberlake said.
“I don’t think we can. This is a one-way trip.”
“Why not? If we headed back …”
“And had a computer malfunction?”
“We’d still be headed home.”
“You call diving into the sun home?”
Timberlake wet his lips with his tongue.
“They used to teach kids to swim by tossing them into a lake,” Bickel said. “Well, we’ve been tossed into the lake. We’d better start swimming, or sure as hell we’re going to sink.”
“Project wouldn’t do that to us,” Timberlake whispered.
“Oh, wouldn’t they?”
“But … six ships … more than eighteen thousand people …”
“People? What people? The only losses I know about are ‘Gangers, fairly easy to replace if you have a cheap energy source.”
“We’re people,” Timberlake said, “not just Doppelgangers.”
“To us we’re people,” Bickel said. “Now, I’ve a real honey of a question for you—considering all those previous ship failures and the numerous possibilities of malfunction: Why didn’t Project give us a code for talking about failure of OMCs, ours … or any others?”
“These suspicions are … crazy,” Timberlake said.
“Yeah,” Bickel said. “We’re really on our way to Tau Ceti. Our lives are totally dependent on an all-or-nothing computer system—quite by the merest oversight. We’ve aimed ships like ours all over the sky—at Dubhe, at Schedar, at Hamal, at—”
“There was always the off chance those other six ships made it. You know that. They disappeared, sure, but—”
“Ahhhh, now we get down to the meat. Maybe they weren’t failures, eh? Maybe they—”
“It wouldn’t make sense to send two seeding ships to the same destination,” Timberlake pointed out. “Not if you weren’t sure what was happening to—”
“You really believe that, Tim?”
“Well …”
“I have a better suggestion, Tim. If some crazy bastard tossed you into a lake when you couldn’t swim, and you learned to swim like that”—Bickel snapped his fingers—”and you found then you could just keep on going, wouldn’t you swim like hell to get away from the crazy bastard?”
Chapter 12
DEMAND: Define God.
OMC: The whole is greater than the sum of its parts.
DEMAND: How can God contain the universe?
OMC: Study the hologram. The individual is both laser and target.
—Fragment from Message Capsule #4, thought to have originated with Flattery (#4B) model
In Com-central, the sounds were those the umbilicus crew had come to accept as normal—the creak of action couches in their gimbals, the click of an occasional relay as it called attention to a telltale on the big board.
“Has Bickel unburdened himself at all about the artificial consciousness project back at UMB?” Prudence asked.
She removed her attention momentarily from the master console, glanced at Flattery, her sole companion on the lonely watch. Flattery appeared a bit pale, his mouth drawn downward in a frown. She returned her attention to the console, noting on the time log that her shipwatch had a little more than an hour yet to run. The strain was beginning to drag at her energy reserves. Flattery was taking a hell of a long time to answer, she thought … but he was famous for the ponderous reply.
“He’s said a little,” Flattery said, and he glanced at the hatch to the computer maintenance shop where Bickel and Timberlake were working. “Prue, shouldn’t we be listening in on them, making sure they—”
“Not yet,” she said.
“They wouldn’t have to know we were listening.”
“You underestimate Bickel,” she said. “That’s about the worst mistake you can make. He’s fully capable of throwing a trace meter onto the communications—as I have—just on the off chance something interesting’ll turn up … like finding us listening.”
“D’you think he’s started … building?”
“Mostly preparation at this stage,” she said. “They’re collecting material. You can pretty well follow their movements by watching the power drain here on the board, the shifts in temperature sensors and the dosimeter repeaters and the drain on the robox cargo handlers.”
“They’ve been out into the cargo sections?”
“One of them has … probably Tim.”
“You know what Bickel said about the UMB attempt?” Flattery asked. He paused to scratch an itch under his chin. “Said the biggest failure was in attention—the experts wandering away, doing everything but keeping their attention on the main line.”
“That’s a little too warm for comfort,” she said.
“He may suspect,” Flattery said, “but he can’t be certain.”
“There you go underestimating him again.”
“Well, at least he’s going to need our help,” Flattery said, “and we’ll be able to tell what’s going on from how he needs us.”
“Are you sure he needs us?”
“He’ll have to use you for his deeper math analysis,” Flattery said. “And me … well, he’s going to be plowing through the von Neumann problem before he gets much beyond the first steps. He may not’ve faced that yet, but he’ll have to when he realizes he has to get deterministic results from unreliable hardware.”
She turned to stare at him, noting the faraway look in his eyes. “How’s that again?”
“He has to build with nonliving matter.”
“So what?” She returned her attention to the board. “Nature makes do with the same stuff. Living systems aren’t living below the molecular level.”
“And you underestimate … life,” Flattery said. “The basic elements Bickel has to use are from our robot stores—reels of quasibiological neurons and solid-state devices, nerex wire and things like that—all of it nonliving at a stage far above the molecular.”
“But their fine structure’s as relevant to their function as any living matter’s is.”
“Perhaps you’re beginning to see the essential hubris in even approaching this problem,” Flattery said.
“Oh, come off that, Chaplain. We’re not back in the eighteenth century making Vaucanson’s wonderful duck.”
“We’re tackling something much more complex than primitive automata, but our intention’s the same as Vaucanson’s.”
“That’s absolutely not true,” Prudence said. “If we succeeded and took our machine back to Vaucanson’s time and showed it to him, he’d just marvel at our mechanical ability.”
“You miss the mark. Poor Vaucanson would run for the nearest priest and volunteer for the lynch mob to do away with us. You see, he never intended to make anything that was really alive.”
“It’s only a matter of degree, not basic difference,” she protested.
“He was like Aladdin rubbing the lamp compared to us,” Flattery said. “And even if his intentions were the same as ours, he wasn’t aware of it.”
“You’re talking in circles.”
“Am I, really? This is the thing that writers and philosophers have skirted for centuries with their eyes half averted. This is the monster out of folklore, Prue. This is Frankenstein’s poor monster and the sorcerer’s apprentice. The very idea of building a conscious robot can be faced only if we recognize the implicit danger—that we may be building a Golem that’ll destroy us.”
“In your off hours you tell ghost stories.”
“Laughter’s as good a way as any of facing this fear,” he said.
“You’re really serious!” she accused.
“Never more serious. Why d’you suppose Project’s so happy to send us far out into space to do our work?”
She tried to swallow in a dry throat, realized she was afraid. Flattery had touched a nerve. He had produced a powerful truth from somewhere. She forced herself to face this as a fact when she felt an urge to call the computer shop and beg Bickel and Tim to stop whatever they were doing. The urge sent a chill along her spine.
“Where do we draw the line between what’s living and what’s inanimate?” Flattery asked. He studied her, seeing the fatigue shadows under her eyes, the trembling of a nerve at her temple. “Will our … creature be alive?”
She cleared her throat. “Wouldn’t it be more to the point to ask if our creature will be able to reproduce itself? If there’s any danger … any real danger to—”
“Then, indeed, we may be on forbidden ground.” And he wondered why this thought always brought such an empty feeling in the pit of his stomach.
“Oh, for God’s sake, Raj!” Prudence was vehement. “Have you completely forgotten that you’re a scientist?”
“For God’s sake, I can never forget it,” he said quietly.
“Stop that!” She realized her voice unconsciously had assumed the peremptory tone of her dormitory mother back in the UMB crèche. Dormitory mother! A gray-haired image whose touch was never more than the padded flexor of a robot which she directed from some remote sanctuary in Project Central. Such a sad woman she’d been, so cynical and … remote.
“Religion makes demands that can’t be denied unless you’re willing to pay a terrible price,” Flattery said.
“Religion’s just a fact like any other,” Prue countered. “We investigate primitive religions. Why can’t we investigate our own? Didn’t God make us curious? Aren’t we as scientists supposed to put ourselves beyond the reach of prejudice?”
“Only a fool imagines he’s beyond the reach of his prejudices.”
“Well, I prefer to be a Calvinist, I’m willing to be damned for the greater glory of God.”
“You mustn’t say such things,” he snapped. He put a hand to his head, thinking: I mustn’t let her goad me this way.
“You can’t show me anything I mustn’t say,” she said. “You claim scientists can equate God with ideas of mathematical infinity. We manipulate mathematical infinity; why can’t we manipulate God?”
“What silly pretensions,” he said. “Mathematical infinity. Zero over zero, eh? Or infinity minus infinity? Or infinity times zero?”
“God times zero,” she said. “Why not?”
“You’re the mathematician!” he said, his voice pouncing. “You know better’n anyone that these are indeterminate forms, mathematical nonsense.”
“God minus infinity. Mathematical nonsense.”
He glared at her. His throat felt dry and burning. She’d tricked him into this corner. It was blasphemy! And he was more vulnerable than she was … guiltier.
“You’re supposed to be doing this to me, aren’t you,” he accused. “You’re supposed to push me and test me, give me no peace. I know.”
How little he knows … or even suspects, she thought. “Infinity doesn’t follow the conditions of number or quantity. If there’s a God, I don’t see why He should follow those conditions, either. As for testing you: horseradish! All you need’s an occasional kick in the philosophy.”
“Stick to my preaching and let you play with the math, is that it?”
“There’s no blasphemy in developing a new kind of calculus or any other new tool to deal with our universe,” she said.
“Our universe?” Flattery asked.
“As much of it as we can take,” she said. “That’s the whole idea of a colony ship, isn’t it?”
“Is it?” he asked.
She adjusted the course-constant repeater, said, “I’ll stick to math. How about a calculus that goes beyond the limits of X over Y as they tend toward infinity? That should be possible.”
“Creating a new kind of calculus and building this living, sentient creature aren’t the same,” he said.
“Without the calculus we may never achieve the creature.”
She keeps trying to corner me, he thought. Why?
“The issue’s whether we’re intruding on God’s domain of creation.”
“You Holy Joes are all alike. You want to glorify God but you’d limit the means.”
Flattery stared at the curved gray metal of the bulkhead above him, seeing the tiny imperfections in the crackle pattern of its finish. He felt he was being maneuvered. She was stalking him the way a man might stalk game. Was it his soul she was after? He sensed he was in profound danger, that the idea of consciousness as something they could create might inflict itself on his soul as an incurable wound.
He put a hand to his mouth.
I cannot permit her to bait me and tempt me.
“Raj,” she whispered and there was terror in her voice.
He whirled toward her, seeing the streaks of light across the big board like red knife slashes.
“We’re almost at red-line temperature in Sector C-8 of the hyb tanks,” she said. “Everything I do seems to make the system oscillate.”
Flattery’s hands flashed out to the life-systems repeater switches, brought his own monitors alive. He scanned the instruments, commanded, “Call Tim.”
“Nothing I do seems to work!” she panted.
He glanced at her, saw she was fighting the board, not working with it.
“Call Tim!” he said.
She hit the command circuit switch with the heel of her left hand, shouted, “Tim to Com-central! Emergency!”
Again, Flattery scanned his instruments. There appeared to be three points of temperature shift outside the hyb tanks with corresponding variation inside. As Prue tried to compensate for one fluctuation, the others fell farther toward the red.
He had to force himself to keep his hands off the controls. If tank temperature went into the red without dehyb precautions, there’d be deaths among the helpless occupants. Despite Prue’s desperate efforts, death was approaching three sectors of the C-8 tank—some four hundred human lives in there.
The hatch from the computer shop banged open. Timberlake leaped through with Bickel right behind.
“Hyb tanks,” Prudence gasped. “Temperature.”
Timberlake threw himself across Com-central into his action couch. His vacsuit rasped against the cocoon lips as he turned, grasped the traveler controls. “Give me the red switch,” he snapped. “To hell with the count! I’m taking it.”
And he took it, the big board swinging across much too fast.
“C-8,” she said, sinking back and wiping perspiration from her forehead.
“I’ve got it,” he said. He scanned the dials and gauges, his fingers playing over the console.
Bickel slipped into his own couch, tripped his repeaters. “It’s in the hull shielding,” he said.
“First two layers,” Timberlake said.
Prudence put a hand to her throat, tried not to look at Bickel.
He mustn’t suspect our attention’s on him, she thought. Then: Wouldn’t it be monstrous irony to lose our colonists and burden ourselves with guilt before the need for it?
“That’s doing it,” Bickel said.
She looked across the board above Timberlake, saw the warning telltales winking out, the dials swinging back into normal range.
“Faulty feedback for a patch of our shell reflectors focused on C-8,” Timberlake said. “The system started to oscillate and that threw the overload switches, left us wide open.”
“Another design failure,” Bickel sneered.
And such a simple problem, Bickel thought. The hull curve acted like a lens to focus energy within the ship … unless reflector and shell shielding systems compensated.
Prudence traced the line of the remaining telltales. “C-8’s on a line with that robot stores section you raided. Is that all it takes to throw the ship off balance?”
“Gives you a wonderful feeling of confidence in the Tin Egg’s design, doesn’t it,” Bickel said.
They didn’t warn me! she thought. They cheated. Calculated emergencies, they said, just enough to keep a fine edge on your reaction abilities. Reaction abilities!
“You overcompensated, Prue,” Timberlake said. “Make minimal adjustments to avoid oscillation while you hunt for the source of your trouble. You had sensor telltales flaring right out through the ship to pinpoint where you needed shielding reinforcement.”
I panicked, she thought. “I guess I let myself get too tired.” Even as she spoke she sensed how lame the excuse sounded.
I was too intent doing the job on Flattery, she thought. I had him headed for a nice corner where he’d have to fight his way out … and I missed the ship trouble until it was almost ready to wreck us.
It occurred to her then to wonder if one of the crew had her as a “special project” to keep her abilities toned up … on edge.
“Prue, you’ve got to remember that when the overload switches go, the computer automatics are out of the circuit,” Bickel said. “This thing was designed to be brought back into line by a conscious intelligence—one of us or an OMC.”