I study My Real Baby among children five through fourteen. Some play with the robot in my office. Some meet it in classrooms and after-school settings. Others take it home for two or three weeks. Because this is a robot that represents a baby, it gets children talking about family things, care and attention, how much they have and how much more they want. Children talk about working mothers, absent fathers, and isolated grandparents. There is much talk of divorce. Some children wonder whether one of this robot’s future cousins might be a reasonable babysitter; something mechanical might be more reliable than the caretaking they have.1
Many of the children I study return to empty homes after school and wait for a parent or older family member to come home from work. Often their only babysitter is the television or a computer game, so in comparison a robot looks like pretty good company. Nicole is eleven. Both of her parents are nurses. Sometimes their shifts overlap, and when this happens, neither is home until late. Nicole thinks a robot might be comforting: “If you cut yourself and you want some sympathy. Or you had a bad day at school—even your best friend was mad at you. It would be better to not be alone when you came home.” Twelve-year-old Kevin is not so sure: “If robots don’t feel pain, how could they comfort you?” But the philosophical conversations of the late 1970s and 1980s are cut short: these children are trying to figure out if a robot might be good for them in the most practical terms.
The twenty children in Miss Grant’s fifth-grade class, in a public school on Boston’s North Shore, are nine and ten. They have all spent time with the AIBOs and My Real Babies that I brought to their school. Now we are about to begin a home study where one group of children after another will take a My Real Baby home for two weeks. Most take the position Wilson staked out with his Furby and Lester settled into with his AIBO. They are content to be with a machine that they treat as a living creature. Noah remarks that My Real Baby is very noisy when it changes position, but he is quick to point out that this is insignificant: “The whirring doesn’t bother me,” he says. “I forget it right away.”
In the robotic moment, what you are made of—silicon, metal, flesh—pales in comparison with how you behave. In any given circumstance, some people and some robots are competent and some not. Like people, any particular robot needs to be judged on its own merits. Tia says, “Some robots would be good companions because they are more efficient and reliable,” and then she pauses. I ask her to say more, and she tells me a story. She was at home alone with her pregnant mother, who quite suddenly went into labor. On short notice, they needed to find a babysitter for Tia. Luckily, her grandmother was close by and able to take over, but nevertheless, Tia found the incident frightening. “Having a robot babysitter would mean never having to panic about finding someone at the last minute. It is always ready to take care of you.” In only a few years, children have moved from taking care of Tamagotchis and Furbies to fantasies of being watched over by benign and competent digital proctors. The Tamagotchis and Furbies were always on. Here, a robot is thought of as “always ready.”
These fifth graders know that AIBO and My Real Baby are not up to the job of babysitter, but these robots inspire optimism that scientists are within striking distance. The fifth graders think that a robot could be a babysitter if it could manage babysitter behavior. In their comments about how a robot might pass that test, one hears about the limitations of the humans who currently have the job: “They [robots] would be more efficient than a human if they had to call for an emergency and had a phone right inside them.... They are more practical because if someone gets hurt they are not going to stress or freak out.” “They would be very good if you were sick and your mother worked.” “Robots would always be sure that you would have fun. People have their own problems.” Rather than a mere understudy, a robot could be better qualified to serve. Hesitations are equally pragmatic. One fifth grader points out how often air conditioners and garbage disposals break. “The robot might shut down” too.
In the 1980s, most children drew a line—marking a kind of sacred space—between the competencies of computers and what was special about being a person. In Miss Grant’s class, the sacred space of the romantic reaction is less important than getting the job done. Most of the children are willing to place robots and humans on an almost-level playing field and debate which can perform better in a given situation. To paraphrase, these pragmatic children say that if people are better at fun, let’s put them in charge of fun. If a robot will pay more attention to them than a distracted babysitter, let the robot babysit. If the future holds robots that behave lovingly, these children will be pleased to feel loved. And they are not dissuaded if they see significant differences between their way of thinking and how they imagine robots think. They are most likely to say that if these differences don’t interfere with how a robot performs its job, the differences are not worth dwelling on.
Children are not afraid to admit that when robots become caretakers, some things will be lost, things they will miss. But they also make it clear that when they say they will “miss” something (like having a mother at home to watch them when they are sick), it is not necessarily something they have or ever hope to have. Children talk about parents who work all day and take night shifts. Conversations about families are as much about their elusiveness as about their resources.
On this almost-level playing field, attitudes about robotic companionship are something of a litmus test for how happy children are with those who care for them. So, children who have incompetent or boring babysitters are interested in robots. Those who have good babysitters would rather stick with what they have.
FROM MY REAL BABY TO MY REAL BABYSITTER
 
Jude is happy with his babysitter. “She is creative. She finds ways for us to have fun together.” He worries that a robot in her place might be too literal minded: “If parents say [to a person], ‘Take care of the kid,’ they [the person] won’t just go, ‘Okay, I’m just going to make sure you don’t get hurt.’ They’ll play with you; they’ll make sure you have fun too.” Jean-Baptiste agrees. Robot babysitters are “only in some ways alive.... It responds to you, but all it really thinks about is the job. If their job is making sure you don’t get hurt, they’re not going to be thinking about ice cream.” Or it might know that children like ice cream, but wouldn’t understand what ice cream was all about. How bad would this be? Despite his concerns, Jean-Baptiste says he “could love a robot if it was very, very nice to me.” It wouldn’t understand it was being nice, but for Jean-Baptiste, kindness is as kindness does.
Some children are open to a robot companion because people are so often disappointing. Colleen says, “I once had a babysitter just leave and go over to a friend’s house. A robot babysitter wouldn’t do that.” Even when they stayed around, her babysitters were preoccupied. “I would prefer to have a robot babysitter. . . . A robot would give me all its attention.” Octavio says that human babysitters are better than robots “if you are bored”—humans are able to make up better games. But they often get meals wrong: “What’s with the cereal for dinner? That’s boring. I should have pasta or chicken for dinner, not cereal.” Because of their “programming,” robots would know that cereal at night is not appropriate. Or, at least, says Octavio, robots would be programmed to take interest in his objections. In this way, the machines would know that cereal does not make a good dinner. Programming means that robots can be trusted. Octavio’s classmate Owen agrees. It is easier to trust a robot than a person: “You can only trust a person if you know who they are. You would have to know a person more [than a robot].... You wouldn’t have to know the robot, or you would get to know it much faster.”
Owen is not devaluing the “human kind” of trust, the trust built as people come through for each other. But he is saying that human trust can take a long time to develop, while robot trust is as simple as choosing and testing a program. The meaning of intelligence changed when the field of artificial intelligence declared it was something computers could have. The meaning of memory changed when it was something computers used. Here the word “trust” is under siege, now that it is something of which robots are worthy. But some of the children are concerned that a trustworthy, because consistent, robot might still fall short as babysitter for lack of heart. So Bridget says she could love a robot babysitter if it did a good job, but she is skeptical about the possibility. She describes what might occur if a robot babysitter were taking care of her and she scraped her knee: “It’s just going to be like, [in a robot voice] ‘Okay, what do I do, get a Band-Aid and put it on, that’s it. That’s my job, just get a Band-Aid and put it on.’ . . . [stops using robot’s voice] But to love somebody, you need a body and a heart. These computers don’t really have a heart. It’s just a brain.... A robot can get hurt, but it doesn’t really hurt. The robot just shuts down. When hurt, the robot says, ‘Right. Okay, I’m hurt, now I’ll shut down.’”
As Bridget speaks, I feel a chill. This “shutdown” is, of course, the behavior of My Real Baby, which shuts down when treated roughly. Bridget seizes upon that detail as a reason why a robot cannot have empathy. How easy it would be, how small a technical thing, to give robots “pretend empathy.” With some trepidation, I ask Bridget, “So, if the robot showed that it felt pain, would that make a difference?” Without hesitation she answers, “Oh yes, but these robots shut down if they are hurt.” From my perspective, robots lack “empathy” because they are not part of the human life cycle, because they do not experience what humans experience. But these are not Bridget’s concerns. She imagines a robot that could be comforting if it performed pain. This is the behaviorism of the robotic moment.
There is little sentimentality in this classroom. Indeed, one of Miss Grant’s students sees people as potential obstacles to relationships with robots: “If you are already attached to your babysitter, you won’t be able to bond with a robot.” And this might be a shame. For the babysitter is not necessarily better, she just got there first. The children’s lack of sentimentality does not mean that the robots always come out ahead. After a long conversation about robot babysitters, Octavio, still dreaming of pasta instead of cereal, imagines how a robot might be programmed both to play with him and feed him “chicken and pasta because that is what you are supposed to have at night.” But Bridget dismisses Octavio’s plan as “just a waste. You could have just had a person.” Jude concurs: “What’s the point of buying a robot for thousands and thousands of dollars when you could have just kept the babysitter for twenty dollars an hour?”
DON’T WE HAVE PEOPLE FOR THESE JOBS?
 
Children speak fondly of their grandparents, whose care is often a source of family tension. Children feel a responsibility, and they want their parents to take responsibility. And yet, children see that their parents struggle with this. Might robots be there to fill in the gaps?
Some children are taken with the idea that machines could help with purely practical matters. They talk about a robot “getting my grandmother water in the middle of the night,” “watching over my grandmother when she sleeps,” and being outfitted with “emergency supplies.” The robots might be more reliable than people—they would not need sleep, for example—and they might make it easier for grandparents to continue living in their own homes.
But other children’s thinking goes beyond emergencies to offering grandparents the pleasures of robotic companionship. Oliver, the nine-year-old owner of Peanut the hamster, says that his grandparents are frail and don’t get out much. He considers in detail how their days might be made more interesting by an AIBO. But the robots might come with their own problems. Oliver points out that his grandparents are often confused, and it would be easy for them to confuse the robots. “Like, the old people might tell them [the AIBOs] the wrong people to obey or to do the opposite or not listen to the right person.” His sister Emma, eleven, sees only the bright side of a robotic companion. “My grandmother had a dog and the dog died before she did. My grandmother said she would die when her dog died.... I’m not sure that it is good for old people to have dogs. I think the AIBO would have been better for her.” Back in Miss Grant’s class, Bonnie thinks a robot might be the ultimate consolation. “If you had two grandparents and one died,” she says, “a robot would help the one that was alone.”
Jude, also in Miss Grant’s class, knows that his grandmother enjoys talking about the past, when she was a young mother, during what she calls “her happiest time.” He thinks that My Real Baby can bring her back to that experience. “She can play at that.” But it is Jude who first raises a question that will come to preoccupy these children. He thinks that his grandparents might prefer a robot to visits from a real baby.
Jude thinks aloud: “Real babies require work and then, well, they stop being babies and are harder for an older person to care for.” Jude says that while he and other kids can easily tell the difference between robots and a real baby, his grandparents might be fooled. “It will cry if it’s bored; when it gets its bottle, it will be happy.”
This association to the idea that robots might “double” for family members brings to mind a story I heard when I first visited Japan in the early 1990s. The problems of the elderly loomed large. Unlike in previous generations, children were mobile, and women were in the workforce. Aging and infirm parents were unlikely to live at home. Visiting them was harder; they were often in different cities from their children. In response, some Japanese children were hiring actors to substitute for them and visit aging parents.2
The actors would visit and play their parts. Some of the elderly parents had dementia and might not have known the difference. Most fascinating were reports about the parents who knew that they were being visited by actors. They took the actors’ visits as a sign of respect, enjoyed the company, and played the game. When I expressed surprise at how satisfying this seemed for all concerned, I was told that in Japan being elderly is a role, just as being a child is a role. Visits to parents are, in large part, the acting out of scripts. The Japanese valued the predictable visits and the well-trained and courteous actors. But when I heard of it, I thought, “If you are willing to send in an actor, why not send in a robot?”
