The Forever Man

by Gordon R. Dickson

“You were able to duplicate them?” asked Jim.

“Oh, yes. Chemically, at least. Of course, God knows what our version tastes like to a Laagi; but we knew the two in the ship were watching us outside their hull with their ship's instruments; and, sure enough, after we'd left them alone for a while with the extra room and the two containers of cubes, one of them ventured out, picked up the container of original cubes, plus one of the duplicates we'd made and took it all back inside. Evidently our version went down all right; because they eventually came out, got the rest of our cubes and ate them.”

“Not a very happy life, being prisoners,” said Jim. He looked over at the back of Mary's head, but it did not move and she did not say a word.

“No. But then you can't expect it to be,” answered Mollen. “We're making good progress toward being able to talk with them, using that notion you passed on in your debriefing, by the way, Jim. You know, the idea of one of us with a picture box strapped to his chest showing the image of a Laagi; and whoever it is speaks to a Laagi and the picture box translates his words into the image in the box, making the body movements that translate his words.”

“And you're close to this, already?” asked Jim.

“Close, no,” answered Mollen. “I said it was a final goal, and it is. But right now we're still working to really grasp that body language of theirs. You've heard of the ‘third language’ technique?”

“No,” said Jim.

“Essentially, it means that if you've got two people, neither of whom can possibly ever speak the other's language, you invent a third language they can both speak. It's an outgrowth of the invented languages we were teaching chimpanzees and other animals as far back as the twentieth century, in order to communicate with them. There was a sign language, and a language of symbols different researchers used with different animals, and so forth…. Well, that's what we're trying to develop to use with our two Laagi, a third language.”

“And it can be done?” Jim asked.

“It can be done if both sides have enough elements in common. For example, as I say, it worked with chimpanzees and dogs and elephants and a few others, but they've never been able to make it work with cetaceans like dolphins and killer whales. Too different environmentally. We're just lucky that the Laagi've developed a technological civilization not too different from ours. We may not think the way they do, but we've got enough problems in common—like how to get from one star system to another by spaceship.”

“They're already talking about space flight with the Laagi?”

“Nowhere near that, yet, I'm afraid. First we had to build a sort of Laagi-instrument, in line with your idea. The technicians came up with a picture of a Laagi figure that could be made to make body movements the way they do. The movements were made by punching specific keys on a keyboard below the picture. Then we built a transparent section into the wall of the room we'd added around the Laagi port; and set the instrument up outside the window with a human operator seated at it, punching keys and making the figure move. Meanwhile, we were trying to isolate from the pictures you'd brought back of Laagi talking to each other at least a few body-movement words that our prisoners would recognize as attempts by us to talk to them.”

“You're using pictures from whatever got stuck to Squonk's tentacle, I suppose,” said Jim. He glanced over at Mary. But she still had not moved. Her face was still, hidden from where he sat, and there was no sign she was even listening to the conversation.

“That's right,” said Mollen, “we've got pictures of nearly every place you went in that city. The first big step was breaking the arm and body movements down into something roughly equivalent to action-units inside a given three-dimensional space, action-units small enough so that we could be sure each was all, or part of, no more than a single signal—you follow me?”

“No,” said Jim.

“The point was to get down to the basic building blocks of their body language. Where there were simultaneous movements of more than one part of the body, that was taken into account, too, but one way or another, all recorded body signals were listed and compared—thank God for thinking machines—then handed back to us in order of frequency, related to the conditions and situations under which they were being used, and so forth.”

“I figured something like that would have to be done,” said Jim. “It must have been a big job.”

“It was,” said Mollen. “But, little by little, the technicians began to pile up associations. You know—this movement goes with beginning to speak to someone else, this one with ending a conversation with that person. This one goes with greeting someone; this, with leaving an individual. Et cetera. And from all this we put together what should have meant ‘we want to talk to you.’ We gave the Laagi a screen and keyboard in their outside room hooked to the screen and keyboard they could see through the window of that room, then had a technician sit down at the keyboard and type out ‘we want to talk to you.’”

“Where were the Laagi at this time?” asked Jim.

“In their ship, of course,” said Mollen. “Where they'd gone the minute we sent men in space suits in to set up the screen and keyboard for them. But we figured they'd be watching on their inside screens what went on in the outer room we'd made for them. Anyway, the technician kept sending the same message over and over.”

“What finally happened, sir?”

“One of the Laagi came out to the keyboard and screen in their outer room, spent several hours learning which keys to punch, and finally sent back a message we couldn't understand.”

“And the techs were stuck.”

“No, because they figured on that happening. They sent another message. This one said, ‘we want to talk to you with these,’ and the screen showed some of the symbols for the artificial language they'd set up as the best bet to try to bridge the communication gap between us and the Laagi. The two of them took to the idea like ducks to water; and from there on it's been like teaching an artificial language to an animal—but a very smart one, of course.”

“And this worked?” Jim said.

“After a fashion. The artificial language's very limited, of course—you could guess that much. But now the technicians've been able to begin adding in movements of the Laagi figure where the symbols wouldn't convey precisely what we wanted, or they thought they understood a Laagi movement well enough to use it in something like the way it should be used. The Laagi caught on, as you might expect, and started correcting our errors; and from then on it's been progress.”

“How much progress?” asked Jim.

“We're just beginning to learn to talk to them, still, of course,” said Mollen. “Nobody knows how long it'll take; but I must say those two Laagi are cooperating. When one leaves the keyboard the other sits down at it.”

“Of course,” said Jim.

“Why ‘of course’?” asked Mollen curiously.

“Because they live to work. I told the debriefer that and Mary must have told hers the same thing. If you'd left them there much longer as simple prisoners with nothing to do, they'd probably have died. Now you've given them some reason to live.”

“At any rate,” said Mollen, “it's just a matter of time until we can really talk with them. Then the big job starts.”

“Getting them to understand that there's no point in both our races exhausting themselves in a war that'll do neither of us any good?” asked Jim.

Mollen looked him over.

“You've been thinking about this,” he said.

“I've had months to do it in,” said Jim. “Learning to communicate with the Laagi is one thing. Getting them to see things the way we see things is something that may never be done.”

“You? Pessimistic?” said Mollen. “That's a change.”

“I'm not pessimistic. Just realistic. We had trouble getting humans to agree with humans before this war with the Laagi started and we all had to work together. And the Laagi think a lot more differently from the way we do than any fellow human ever did. Also, they're not likely to change the way they think because of anything we can tell them. The most we can hope to do is sell them something.”

“Sell them something?” Mollen stared at him. “Jim, what makes you so sure about all this?”

“?1 and his little friends,” said Jim. “With one race of aliens to deal with, it's possible to make a whole lot of false assumptions. With two, it's possible to see from the bad guesses they make about each other where we could be making a bad guess or two about either one of them.”

“Those mind-people you ran into?” said Mollen. “Mary told us all about them.”

“With all due respect to Mary,” said Jim, looking at the unmoving back of her head, “I think she'll agree that while she's the expert with the Laagi, I'm the expert with the mind-people—in fact she said so once. Didn't you, Mary?”

“He's right,” said Mary, without moving. “You should listen to him, Louis.”

“I'll listen to anyone; but you'd be high on the list in any case,” said Mollen, throwing his heavy body back in his padded chair, so that the chair creaked and tilted slightly away from the desk. “What're you saying?”

“That I think the best we can hope to get out of being able to communicate with the Laagi is to offer them a deal where we help them to find the worlds they need so that they'll temporarily suspend trying to take over our world.”

“Oh—that,” said Mollen. “The business of giving them worlds in the mind-people's sector that you arranged with the mind-people.”

“It's far from arranged,” said Jim. “We can easily overlook the fact that the Laagi aren't built to give up on trying to take our world for living room for themselves; and we're just as likely to assume that what the mind-people agree to today they'll still be in agreement with tomorrow. Both those notions about another race can be traps because they're based on the way we think ourselves, not the way the other race thinks.”

“I don't think I follow you.” Mollen's thick eyebrows came together.

“Remember how we used to wonder why a single Laagi ship would sometimes attack a whole wing of our fighters when it ran into them; while at other times a whole wing of them would turn and run from one ship of ours under practically the same conditions?”

“Yes. What about it?”

“The answer was simple. Tell him, Mary.”

Mary said nothing.

“All right,” said Jim. “I'll tell you, then. In both cases the drivers of the Laagi vessels were doing what they'd been told to do, not what reason would dictate they do.”

“But that's stupidity!” said Mollen. “They're too bright to be that stupid.”

“No, it's not stupidity. It's a whole world of difference in thinking. If they'd been told to use their own judgment, they'd have done so. But in the cases I'm talking about they'd simply been given a general order. What they did was no different from a human following orders even when he personally disagrees with them—with one exception. It's only under certain conditions that a Laagi allows himself to disagree on any subject; and one of those conditions isn't when he's at work. And what holds true for a Laagi individual holds true for the whole race. What I'm saying is that the Laagi could very well take a million habitable worlds, if we had that many to give them and still keep on trying to conquer humanity so they could have Earth.”

“Why?” demanded Mollen.

“Because getting Earth was something they started out to do. It was a job they started and haven't finished.”

“You're telling me,” said Mollen, “that even though events proved what they'd been doing wasn't necessary, they'd keep right on doing it?”

“That's what I'm saying, General. Finishing a job isn't something a Laagi reasons about. It's something that's built into the primitive part of their brains—just like a territorial response is built into us humans. It's so deep in us, we react to it according to the patterns of our various cultures, without even realizing why we stand a certain distance apart when we talk, why we avoid looking or deliberately look into the eyes of people when we talk to them. Built into the Laagi is the fact that there's no reason for his existence unless it involves doing work of one kind or another. And a job's not abandoned until it's finished. Any unlimited number of Laagi may have to be used up finishing it; but whatever the job needs to get it done, it's going to get. They're a race that never quits and never backs up, because they can't.”

“Man!” said Mollen, “you're telling me we can never make peace with them.”

“That's right,” said Jim. “Never.”

“All right,” said Mollen. “Then you tell me. What're we supposed to do?”

“We can't make peace with them as they are now,” said Jim. “But maybe we can arrange an indefinite pause in hostilities that just happens to last until they develop a different attitude—and that'll take generations. We've got to use the fact that the mind-people are there as leverage on the Laagi to keep the indefinite pause going, while at the same time we use the presence of the Laagi to make the mind-people keep their promises to us and the Laagi.”

“And how do you plan to do that?”

“Set ourselves up,” said Jim, “to the Laagi as the only force that keeps the mind-people from stopping them from working; and to the mind-people as the only people who can talk to the Laagi and explain how they mustn't get in the way of the art that the mind-people are spending the lifetime of their race developing—and it actually is one hell of an art, General. It may turn out to be greater than all our arts put together.”

“All right,” said Mollen; and while there was nothing in the general's tone to give it away, Jim had the uncomfortable feeling that Mollen was humoring him. “You want to set up a sort of triangular, three-race agreement that makes both the Laagi and the mind-people do what we want, out of a combination of benefit and fear. Assuming you could show each of those races reasons to act that way to each other, why would they want humans in on the deal? What do they need us for, anyway?”
