The Positronic Man

Authors: Isaac Asimov, Robert Silverberg


But at least the principle of robot rights-established originally by the decree awarding Andrew his "freedom"-had been extended a little further.

The final approval by the World Court came through on the day of Little Miss's death.

That was no coincidence. Little Miss, very old and very weak now, had nevertheless held on to life with desperate force during the closing weeks of the debate. Only when word of victory arrived did she at last relax the tenacity of her grip.

Andrew was at her bedside when she went. He stood beside her, looking down at the small, faded woman propped up among the pillows and thinking back to those days of nearly a hundred years before when he was newly arrived at the grand coastside mansion of Gerald Martin and two small girls had stood looking up at him, and the smaller one had frowned and said, "En-dee-arr. That isn't any good. We can't call him something like that. What about calling him Andrew?"

So long ago, so very long ago. A whole lifetime ago, so far as things went for Little Miss. And yet to Andrew it sometimes seemed only a moment-hardly any time at all since those days when he and Miss and Little Miss had romped on the beach below the house, and he had gone for a swim in the surf because it had pleased them to ask him to do so.

Nearly a century.

For a human being, Andrew knew, that was an enormous span of time.

And now Little Miss's life had run its course and was speeding away. The hair that once had been a radiant gold had long since turned to shining silver; but now the last of its gleam was gone from it and for the first time it looked dull and drab. She was coming to her termination, and there was no help for that. She was not ill; she was simply worn out, beyond any hope of repair. In another few moments she would cease to function. Andrew could hardly imagine a world that did not contain Little Miss. But he knew that he was entering such a world now.

Her last smile was for him. Her last words were, "You have been good to us, Andrew."

She died with her hand holding his, while her son and his wife and their children remained at a respectful distance from the robot and the old woman in the bed.

Thirteen

ANDREW EXPERIENCED a sensation of discomfort after Little Miss's death that would not leave him for weeks. To call it grief might be a little too strong, he thought, for he suspected that there was no place in his positronic pathways for any feeling that corresponded exactly to the human emotion known as grief.

And yet there was no question but that he was disturbed in some way that could only be traced to the loss of Little Miss. He could not have quantified it. A certain heaviness about his thoughts, a certain odd sluggishness about his movements, a perception of general imbalance in his rhythms-he felt these things, but he suspected that no instruments would be able to detect any measurable change in his capacities.

To ease this sensation of what he would not let himself call grief he plunged deep into his research on robot history, and his manuscript began to grow from day to day.

A brief prologue sufficed to deal with the concept of the robot in history and literature-the metal men of the ancient Greek myths, the automata imagined by clever storytellers like E. T. A. Hoffmann and Karel Capek, and other such fantasies. He summarized the old fables quickly and dispensed with them. It was the positronic robot-the real robot, the authentic item-that Andrew was primarily concerned with.

And so Andrew moved swiftly to the year 1982 and the incorporation of United States Robots and Mechanical Men by its visionary founder, Lawrence Robertson. He felt almost as though he were reliving the story himself, as he told of the early years of struggle in drafty converted-warehouse rooms and the first dramatic breakthrough in the construction of the platinum-iridium positronic brain, after endless trial-and-error. The conception and development of the indispensable Three Laws; research director Alfred Lanning's early triumphs at designing mobile robot units, clumsy and ponderous and incapable of speech, but versatile enough to be able to interpret human orders and select the best of a number of possible alternative responses. Followed by the first mobile speaking units at the turn of the Twenty-First Century.

And then Andrew turned to something much more troublesome for him to describe: the period of negative human reaction which followed, the hysteria and downright terror that the new robots engendered, the worldwide outburst of legislation prohibiting the use of robot labor on Earth. Because miniaturization of the positronic brain was still in the development stage then and the need for elaborate cooling systems was great, the early mobile speaking units had been gigantic-nearly twelve feet high, frightful lumbering monsters that had summoned up all of humanity's fears of artificial beings-of Frankenstein's monster and the Golem and all the rest of that assortment of nightmares.

Andrew's book devoted three entire chapters to that time of extreme robot-fear. They were enormously difficult chapters to write, for they dealt entirely with human irrationality, and that was a subject almost impossible for Andrew to comprehend.

He grappled with it as well as he could, striving to put himself in the place of human beings who-though they knew that the Three Laws provided foolproof safeguards against the possibility that robots could do harm to humans-persisted in looking upon robots with dread and loathing. And after a time Andrew actually succeeded in understanding, as far as he was able, how it had been possible for humans to have felt insecure in the face of such a powerful guarantee of security.

For what he discovered, as he made his way through the archives of robotics, was that the Three Laws were not as foolproof a safeguard as they seemed. They were, in fact, full of ambiguities and hidden sources of conflict. And they could unexpectedly confront robots-straightforward literal-minded creatures that they were-with the need to make decisions that were not necessarily ideal from the human point of view.

The robot who was sent on a dangerous errand on an alien planet, for example-to find and bring back some substance vital to the safety and well-being of a human explorer-might feel such a conflict between the Second Law of obedience and the Third Law of self-preservation that he would fall into a hopeless equilibrium, unable either to go forward or to retreat. And by such a stalemate the robot-through inaction-thus could create dire jeopardy for the human who had sent him on his mission, despite the imperatives of the First Law that supposedly took precedence over the other two. For how could a robot invariably know that the conflict he was experiencing between the Second and Third Laws was placing a human in danger? Unless the nature of his mission had been spelled out precisely in advance, he might remain unaware of the consequences of his inaction and never realize that his dithering was creating a First Law violation.

Or the robot who might, through faulty design or poor programming, decide that a certain human being was not human at all, and therefore not in a position to demand the protection that the First and Second Laws were supposed to afford.

Or the robot who was given a poorly phrased order, and interpreted it so literally that he inadvertently caused danger to humans nearby.

There were dozens of such case histories in the archives. The early roboticists-most notably the extraordinary robopsychologist, Susan Calvin, that formidable and austere woman-had labored long and mightily to cope with the difficulties that kept cropping up.

The problems had become especially intricate as robots with more advanced types of positronic pathways began to emerge from the workshops of U. S. Robots and Mechanical Men toward the middle of the Twenty-First Century: robots with a broader capacity for thought, robots who were able to look at situations and perceive their complexities with an almost human depth of understanding. Robots like-though he took care not to say so explicitly-Andrew Martin himself. The new generalized-pathway robots, equipped with the ability to interpret data in much more subjective terms than their predecessors, often reacted in ways that humans were not expecting. Always within the framework of the Three Laws, of course. But sometimes from a perspective that had not been anticipated by the framers of those laws.

As he studied the annals of robot development, Andrew at last understood why so many humans had been so phobic about robots. It wasn't that the Three Laws were badly drawn-not at all. Indeed, they were masterly exemplars of logic. The trouble was that humans themselves were not always logical-were, on occasion, downright illogical-and robots were not always capable of coping with the swoops and curves and tangents of human thought.

So it was humans themselves who sometimes led robots into violations of one or another of the Three Laws-and then, in their illogical way, often would blame the robots themselves for having done something undesirable which in fact they had actually been ordered to do by their human masters.

Andrew handled these chapters with the utmost care and delicacy, revising and revising them to eliminate any possibility of bias. It was not his intention to write a diatribe against the flaws of mankind. His prime goal, as always, was to serve the needs of mankind.

The original purpose of writing his book might have been to arrive at a deeper understanding of his own relationship to the human beings who were his creators-but as he proceeded with it he saw that, if properly and thoughtfully done, the book could be an invaluable bridge between humans and robots, a source of enlightenment not only for robots but for the flesh-and-blood species that had brought them into the world. Anything that enabled humans and robots to get along better would permit robots to be of greater service to humanity; and that, of course, was the reason for their existence.

When he had finished half his book, Andrew asked George Charney to read what he had written and offer suggestions for its improvement. Several years had passed since the death of Little Miss, and George himself seemed unwell now, his once robust frame gaunt, his hair nearly gone. He looked at Andrew's bulky manuscript with an expression of barely masked discomfort and said, "I'm not really much of a writer myself, you know, Andrew."

"I'm not asking for your opinion of my literary style, George. It's my ideas that I want you to evaluate. I need to know whether there's anything in the manuscript that might be offensive to human beings."

"I'm sure there isn't, Andrew. You have always been the soul of courtesy."

"I would never knowingly give offense, that is true. But the possibility that I would inadvertently-"

George sighed. "Yes. Yes, I understand. All right, I'll read your book, Andrew. But you know that I've been getting tired very easily these days. It may take me a while to plow all the way through it."

"There is no hurry," said Andrew.

Indeed George took his time: close to a year. When he finally returned the manuscript to Andrew, though, there was no more than half a page of notes attached to it, the most minor factual corrections and nothing more.

Andrew said mildly, "I had hoped for criticisms of a more general kind, George."

"I don't have any general criticisms to make. It's a remarkable work. Remarkable. It's a truly profound study of its subject. You should be proud of what you've done."

"But where I touch on the topic of how human irrationality has often led to Three Laws difficulties-"

"Absolutely on the mark, Andrew. We are a sloppy-minded species, aren't we? Brilliant and tremendously creative at times, but full of all sorts of messy little contradictions and confusions. We must seem like a hopelessly illogical bunch to you, don't we, Andrew?"

"There are times that it does seem that way to me, yes. But it is not my intention to write a book that is critical of human beings. Far from it, George. What I want to give the world is something that will bring humans and robots closer together. And if I should seem to be expressing scorn for the mental abilities of humans in any way, that would be the direct opposite of what I want to be doing. Which is why I had hoped that you would single out, in your reading of my manuscript, any passages that might be interpreted in such a way that-"

"Perhaps you should have asked my son Paul to read the manuscript instead of me," George said. "He's right at the top of his profession, you know. So much more in touch with all these matters of nuance and subtle inference than I am these days."

And Andrew finally understood from that statement that George Charney had not wanted to read his manuscript at all-that George was growing old and weary, that he was entering the final years of his life, that once again the wheel of the generations had turned and that Paul was now the head of the family. Sir had gone and so had Little Miss and soon it was going to be George's turn. Martins and Charneys came and went and yet Andrew remained-not exactly unchanging (for his body was still undergoing occasional technological updating and it also seemed to him that his mental processes were constantly deepening and growing richer as he allowed himself to recognize fully his own extraordinary capabilities), but certainly invulnerable to the ravages of the passing years.

He took his nearly finished manuscript to Paul Charney. Paul read it at once and offered not only praise but, as George had indicated, valuable suggestions for revision. There were places where Andrew's inability to comprehend the abrupt, non-linear jumps of reasoning of which the human mind is capable had led him into certain oversimplifications and unwarranted conclusions. If anything, Paul thought the book was too sympathetic to the human point of view. A little more criticism of the irrational human attitude toward robotics, and toward science in general, might not have been out of place.

Andrew had not expected that.

He said, "But I would not want to offend anyone, Paul."

"No book worth reading has ever been written that didn't manage to offend someone," Paul replied. "Write what you believe to be the truth, Andrew. It would be amazing if everybody in the world agreed with you. But your viewpoint is unique. You have something real and valuable to give the world here. It won't be worth a thing, though, if you suppress what you feel and write only what you think others want to hear."

"But the First Law-"

"Damn the First Law, Andrew! The First Law isn't everything! How can you harm someone with a book? Well, by hitting him over the head with it, I suppose. But not otherwise. Ideas can't do harm-even wrong ideas, even foolish and vicious ideas. People do the harm. They seize hold of certain ideas, sometimes, and use them as the justification for doing unconscionable, outrageous things. Human history is full of examples of that. But the ideas themselves are just ideas. They must never be throttled. They need to be brought forth, inspected, tested, if necessary rejected, right out in the open. -Anyway, the First Law doesn't say anything about robots writing books. Sticks and stones, Andrew-they can do harm. But words-"

"As you yourself have just remarked, Paul, human history is full of harmful events that began simply with words. If those words had never been uttered, the harmful events would not have taken place."

"You don't understand what I'm saying, do you? Or do you? I think you do. You know what power ideas have, and you don't have a lot of faith in the ability of humans to tell a good idea from a bad one. Well, neither do I, sometimes. But in the long run the bad idea will perish. That's been the story of human civilization for thousands of years. The good does prevail, sooner or later, no matter what horrors have happened along the way. And so it's wrong to suppress an idea that may have value to the world. -Look, Andrew: you're probably the closest thing to a human being that has ever come out of the factories of U. S. Robots and Mechanical Men. You're uniquely equipped to tell the world what it needs to know about the human-robot relationship, because in some ways you partake of the nature of each. And so you may help to heal that relationship, which even at this late date is still a very troubled one. Write your book. Write it honestly."

"Yes. I will, Paul."

"Do you have a publisher in mind for it, by the way?"

"A publisher? Why, no. I haven't yet given any thought to-"
