Isaac Asimov: The Foundations of Science Fiction (Revised Edition)

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
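The Three Laws form a strict priority ordering: the First Law overrides the Second, and the Second overrides the Third. That ordering can be sketched as a toy rule evaluator; the class, field names, and scoring scheme below are illustrative assumptions, not anything drawn from Asimov's text:

```python
# Toy sketch of the Three Laws as a strict priority ordering.
# The Action fields and the ranking scheme are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Action:
    harms_human: bool       # would this course injure a human? (First Law)
    allows_harm: bool       # would it let a human come to harm through inaction?
    obeys_order: bool       # does it follow a human order? (Second Law)
    self_destructive: bool  # does it endanger the robot itself? (Third Law)

def permitted(a: Action) -> bool:
    # The First Law is absolute: a course that harms a human, or that
    # allows harm through inaction, is simply ruled out.
    return not (a.harms_human or a.allows_harm)

def choose(actions):
    # Among First-Law-permitted courses, prefer obedience (Second Law),
    # then self-preservation (Third Law), in that order.
    candidates = [a for a in actions if permitted(a)]
    return max(candidates, key=lambda a: (a.obeys_order, not a.self_destructive))
```

In this sketch a robot would pick an obedient but self-destructive course over a safe disobedient one, which is exactly the precedence the Second and Third Laws encode.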
Asimov always insisted that Campbell originated the laws, and Campbell always said that they were implicit in Asimov's stories and discussions. Whatever the exact truth of their origin, the Three Laws, as Asimov noted in his autobiography, "revolutionized science fiction":
Once they were established in a series of stories, they made so much sense and proved so popular with the readers that other writers began to use them. They couldn't quote them directly, of course, but they could simply assume their existence, knowing well that the readers would be acquainted with the Laws and would understand the assumption.
Campbell may have worked more intuitively than through conscious theory, but what he wanted was a rational inspection of all premises; a movement away from traditional responses, primarily emotional and irrational, toward pragmatism; and the construction of new and more logical systems of operation. Campbell, as a writer, also may have perceived the fictional opportunities the Laws of Robotics would provide.
As Asimov noted in The Rest of the Robots, "There was just enough ambiguity in the Three Laws to provide the conflicts and uncertainties required for new stories, and, to my great relief, it seemed always to be possible to think up a new angle out of the sixty-one words of the Three Laws."
4. The mention of the First Law in "Robbie" and of the Three Laws in "Reason" clearly are interpolations for the 1950 book, as is the mention of Susan Calvin in "Robbie" and "Runaround."
The Asimov robot stories as a whole may respond best to an analysis based on that ambiguity and on the ways in which Asimov played variations upon a theme. Their importance to the evolution of science fiction, at least in the period between 1940 and 1950, was that this was an intellectual development. The emotional response, the fear of the machine and of the creature turning on its creator, was derided. In the robot stories, such responses are characteristic of foolish, unthinking people, religious fanatics, and short-sighted labor unions. The Frankenstein complex may be observably true in human nature (and this, along with its appeal to human fears of change and the unknown, may explain its persistence in literature), but it is false to humanity's intellectual aspirations to be rational and to build rationally. Blind emotion, sentimentality, prejudice, faith in the impossible, unwillingness to accept observable truth, failure to use one's intellectual capacities or the resources available for discovering the truth: these were the evils that Campbell and Asimov saw as the sources of human misery. They could be dispelled, they thought, by exposure to ridicule and the clear, cool voice of reason, though always with difficulty and never completely.
"Robbie," for instance, considers the question of unreasoning opposition to robots: Grace Weston's concern about Robbie, the villagers' fear of him, New York's curfew for robots. Mrs. Weston, who herself has an unreasoning determination to get rid of Robbie, says, "People aren't reasonable about these things." The climax of the story, in which Robbie moves swiftly to save Gloria from being run down by a tractor, makes clear the advantages of the robot's single-minded concern for its function and its instantaneous response to a crisis that paralyzes Gloria's parents for vital heartbeats.
"Runaround" is an exercise in the conflict between two of the Three Laws. Speedy, a valuable new robot designed for use in the mines of Mercury, has been ordered to get selenium from a pool. But he is found circling the pool acting drunk, and it turns out that carbon monoxide released by volcanic activity in the area can combine with iron to form volatile iron carbonyl. At a certain distance from the pool Speedy's instinct for self-preservation (the Third Law) exactly balances the necessity to obey orders (the Second Law). Powell is able to break Speedy out of his deadly circle only by placing himself in danger so that Speedy must rescue him (the First Law).
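The deadlock Asimov describes can be modeled as a toy equilibrium: a weakly weighted order (the Second Law) pulls Speedy toward the pool at constant strength, while a danger-driven urge for self-preservation (the Third Law) pushes him away with a force that falls off with distance. Every functional form and constant below is an illustrative assumption, not anything from the story:

```python
# Toy model of Speedy's circling in "Runaround": he settles at the radius
# where a constant urge to obey exactly balances a distance-decaying urge
# to flee the danger near the pool. Forms and constants are assumptions.

def second_law_pull(r: float, order_strength: float = 1.0) -> float:
    # Urge to obey: a casually given order exerts a constant, weak drive
    # toward the pool regardless of distance r.
    return order_strength

def third_law_push(r: float, danger: float = 5.0) -> float:
    # Urge for self-preservation: danger is assumed to fall off
    # hyperbolically with distance from the pool.
    return danger / (1.0 + r)

def equilibrium_radius(order_strength: float = 1.0, danger: float = 5.0) -> float:
    # Solve order_strength == danger / (1 + r) for r:
    # the radius at which Speedy circles indefinitely.
    return danger / order_strength - 1.0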
Many of the stories develop from unforeseen consequences of the creation of new robots (sometimes complicated by inaccurate or unspecific orders, as in "Runaround"); others come about through accident. Both stem naturally from Asimov's premise that unforeseen consequences and accidents are eventualities against which even rational persons cannot fully guard.
"Catch That Rabbit" concerns a master robot, Davie, with six sub-robots, created for asteroid mining; it occasionally malfunctions when not watched and cannot remember why. It turns out that six sub-robots are too many for Davie to handle in an emergency. When Donovan and Powell discover this, partly by accident, they are able to pinpoint the affected part of Davie's positronic brain, the part that is stressed by a six-way order.
"Liar!" begins with the accidental creation of a telepathic robot, Herbie. Herbie is asked to tell each of the characters what he has learned from reading other characters' minds, and because he cannot "harm" them, according to the First Law, he tells them what they want to hear. In particular, he tells plain, spinsterish Susan Calvin that the man she loves, Milton Ashe, is in love with her. When they all discover that Herbie has been lying to them, Susan drives Herbie insane by forcing on him the dilemma that no matter what he does he will be hurting someone.
"Little Lost Robot" brings in a search for a hyperatomic (interstellar) drive at a base in the asteroids. A new kind of robot, the Nestor series, has been created to work with scientists in dangerous situations from which ordinary robots would pull the scientists to safety. Some Nestors have not been impressioned, therefore, with the entire First Law, and one of them is told (the Second Law) by an irritated scientist to get lost. The variation Asimov used here was the conditions under which the First Law would have to be relaxed, those conditions being when robots had to discriminate between dangers, and the possible problems this might involve. When the Nestor hides among identical robots and refuses to reveal itself, Susan Calvin attempts to force it into the open by placing a man in danger. At first, all the robots spring to save the man. In a second, slightly different experiment, they all remain seated, having been convinced by the hiding Nestor that any attempt to save the man could not succeed and they would only destroy themselves. In a final test, Susan places herself in danger. The malfunctioning Nestor reveals itself by recognizing that harmless infrared rays rather than dangerous gamma rays are involved and by forgetting, in its feeling of superiority, that the other robots have not been trained, as it has, to tell the difference.
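The Nestors' modification can be put in the same toy terms: a fully impressioned robot must act when inaction would let a human come to harm, while a modified Nestor is barred only from active harm and may stand by. The function names and logic below are illustrative assumptions, not the story's own formulation:

```python
# Toy contrast between a fully impressioned robot and a modified Nestor
# whose First Law lacks the "through inaction" clause ("Little Lost Robot").
# Each function returns whether a course of conduct is permissible.
# Names and logic are illustrative assumptions.

def standard_first_law(action_harms: bool, inaction_allows_harm: bool) -> bool:
    # A fully impressioned robot may neither harm a human nor, through
    # inaction, allow harm: standing by while a human is endangered
    # is itself forbidden, so it must attempt the rescue.
    return not action_harms and not inaction_allows_harm

def nestor_first_law(action_harms: bool, inaction_allows_harm: bool) -> bool:
    # A modified Nestor is barred only from active harm; it may remain
    # seated while a human appears to be in danger.
    return not action_harms
```

This is why Susan Calvin's first test fails (every robot, standard or modified, may still choose to act) while the gap only shows when standing by becomes the chosen course.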
