Reclaiming Conversation


Author: Sherry Turkle


In school, when the app generation has to deal with unpredictability, they become impatient, anxious, and disoriented. At work, the problems continue. One new manager at HeartTech, the large software company in Silicon Valley, moved there so he could leave engineering and try his hand at management. “I left my previous job because it was too predictable. I wanted to work with unpredictable systems [here he means people].” But he brings old habits with him: “I'm not really used to working with unpredictable systems. I'm not that good at thinking on my toes.” He elaborates: “I'm not used to thinking fast with people in front of me . . . the back-and-forth of conversation.”

His is a common plight. Engineers who move into management are asked to do a very different kind of work than that in which they were trained. They were groomed for today's scientific attitude toward management, one that encourages research and an instrumental and hard-edged view of the world. But in daily practice, what faces any manager is a life of hard calls, ambiguous situations, and difficult conversations. In the most concrete terms, there are performance reviews, negative feedback, firing people.

A human resources officer at a high-tech firm tells me: “The catchphrase among my peers is that ‘engineers will not deliver difficult conversations.'” In the high-tech world, when I raise the topic of conversation, that's the phrase I often hear back.

Difficult conversations require empathic skills and, certainly, “thinking on your toes.” Teaching engineers how to have these conversations requires significant coaching. Yet these days, as Gardner and Davis point out, a style of thinking that prefers the predictable extends beyond engineers.

It's not just the engineers who need coaching. As we all accept a more instrumental view of life, we are all having trouble with difficult conversations. In that sense, we are all engineers now. Our challenge is to deliver those difficult conversations, the ones that include others and the ones with ourselves.

Choose the right tool for the job.
Sometimes we find a technology so amazing—and a smartphone, for example, is so amazing—that we can't stop ourselves from imagining it as a universal tool. One that should, because it is amazing, replace all the tools we had before. But when you replace a conversation with an email just because you can, there is a good chance that you have chosen the wrong tool. Not because email isn't a great tool for some jobs. Just not for all jobs.

There is nothing wrong with texting or email or videoconferencing. And there is everything right with making them technically better, more intuitive, easier to use. But no matter how good they get, they have an intrinsic limitation: People require eye contact for emotional stability and social fluency. A lack of eye contact is associated with depression, isolation, and the development of antisocial traits such as callousness. And the more we develop these psychological problems, the more we shy away from eye contact. Our slogan can be: If a tool gets in the way of our looking at each other, we should use it only when necessary. It shouldn't be the first thing we turn to.

One thing is certain: The tool that is handy is not always the right tool. An email is often the simplest response to a business problem, even as it makes the problem worse. A text has become the handy way to end a relationship, even as it upsets and diminishes all participants. As I write this, a new robot has been launched on the market as a companion for your child. It will teach a child to look for understanding from an object that has none to give.

Learn from moments of friction.
We've met professionals who feel in conflict about the role technology plays in their lives. An enthusiast for remote work ends up taping the silence in his office and sending the audio file home to his wife. Architects build open workspaces even when they know that the people they design for crave more privacy. Young lawyers don't join their colleagues in the lunchroom even though they know that taking time for these meals would cement lifelong business relationships.

If you find yourself caught in this kind of conflict, pause and reconsider: Is your relationship to technology helping or hindering you? Can you recognize these moments as opportunities for new insight?

Remember what you know about life.
We've seen that we learn the capacity for solitude by being “alone with” another. And I've found that if we distract ourselves with technology during these crucial moments, even the most passionate proponents of always-on connection admit to doubt. So, when parents email during a child's bath time or text during a beach walk, the parents may persist in their behavior but they admit they are not happy. They sense they have crossed some line. One father tells me that he takes his phone along when he and his ten-year-old son have a game of catch. The father says, “I can tell it's not as good as when I played catch with my dad.” Early in my research I meet a mother who has gotten into the habit of texting while she breast-feeds. She tells me simply, “This is a habit I might want to break.” It is a deeply human impulse to step back from these moments that endanger shared solitude.

Shared solitude grounds us.
It can bring us back to ourselves and others. For Thoreau, walking was a kind of shared solitude, a way to “shake off the village” and find himself, sometimes in the company of others. In her writing about how people struggle to find their potential, Arianna Huffington notes the special resonance of Thoreau's phrase, for these days we have a new kind of village to shake off. It is most likely to be our digital village, with its demands for performance and speed and self-disclosure.

Huffington reminds us that if we find ourselves distracted, we should not judge ourselves too harshly. Even Thoreau became distracted. He got upset that when walking in the woods, he would sometimes find himself caught up in a work problem. He said, “But it sometimes happens that I cannot easily shake off the village. The thought of some work will run in my head and I am not where my body is—I am out of my senses. . . . What business have I in the woods, if I am thinking of something out of the woods?”

We know the answer to that question. Even if Thoreau's mind did sometimes travel to work or village, he accomplished a great deal on those walks. As in any meditative practice, the mind may wander, but then it comes back to the present, to the breath, to the moment. Even if he became distracted, Thoreau was making room for that. These days, we take so many walks in which we don't look at what is around us—not at the scenery, not at our companions. We are heads-down in our phones. But like Thoreau, we can come back to what is important. We can use our technology, all of our technology, with greater intention. We can practice getting closer to ourselves and other people. Practice may not make perfect. But this is a realm where perfect is not required. And practice always affirms our values, our true north.

Don't avoid difficult conversations.
We've seen that beyond our personal and work lives, we are having trouble talking to one another in the public square. In particular, we are having trouble with new questions about privacy and self-ownership.

I've said that these matters are examples of objects-not-to-think-with. They are characterized by a lack of simple connections between actions and consequences. There is a danger, but it is hard to define the exact damage you fear. Moreover, it is hard to know if the damage has already been done. These questions vex us and we are tempted to turn our attention elsewhere. Remember Lana, who was happy she didn't have anything controversial to say so she wouldn't have to confront that online life gave her no place to say it. She didn't want to have the conversation.

To encourage these conversations, it helps to avoid generalities. We claim not to be interested in online privacy until we are asked about specifics—phone searches without warrants, or data collection by the National Security Agency. Then it turns out we are very interested indeed.

One reason we avoid conversations about online privacy is that we feel on shaky moral ground. If you complain that Google is keeping your data forever and this doesn't seem right, you are told that when you opened your Google account, you agreed to terms that gave Google the right to do just this. And to read the content of your mail. And to build a digital double. And to sell its contents. Since you didn't actually read the terms of agreement, you begin the conversation disempowered. It seems that by agreeing to be a consumer you gave away rights that you might want to claim as a citizen.

But then, if we feel that our digital doubles incorrectly represent us or block us from the information we want, we don't know how to object. Should we be talking with the companies that track and commodify us? Should our conversation be with the government, which might be in a position to regulate some of this activity? Yet the government, too, is claiming its right to our data. We are shut down by not knowing an appropriate interlocutor, just as we are shut down by not knowing exactly what they “have” on us, or how to define our rights.

But just because these conversations are difficult doesn't mean they are impossible. They are necessary and they have begun. One conversation is about how to develop a realistic notion of privacy today. It clearly can't be what it was before. But that doesn't mean that citizens can live in a world without privacy rights, which is where conversations end up when they begin with the overwhelmed stance, “This is too hard to think about.”

One proposal from the legal community would shift the discussion from the language of privacy rights to the language of control over one's own data. In this approach, the companies that collect our data would have responsibilities to protect it—the way doctors and lawyers are bound to protect the information we share with them. In both cases, the person who provides the data retains control of how they are used.

And another conversation has grown up around transparency: How much do we have a right to know about the algorithms that reflect our data back to us? Being a smartphone user puts you in a new political class that has to learn to assert its rights.

An idea builds slowly. Those who take our data have one set of interests. We who give up our data have another. We have been led to believe that giving up our personal data is a fair trade for free services and helpful suggestions; this questionable notion of the fair trade has slowed down our ability to think critically.

It will take politicizing the issue to get this conversation going. If the conversation doesn't use a political language, a language of interests and conflicts, it stalls—it moves to the language of cost-benefit analysis. Would you be willing to trade your privacy for the convenience of free email and word-processing programs? But the Constitution does not let us trade away certain freedoms. We don't get to “decide” if we want to give away freedom of speech.

And the conversation stalls if it moves too quickly to technical details. For example, I try to talk with a software engineer in his mid-sixties, one with a special interest in public policy, about the effect of knowing we are being tracked. I ask him, “Does tracking inhibit people's willingness to speak their minds online?” His response is dismissive: “Don't they [the public] know that these algorithms are stupid? They are so bad . . . they don't mean anything.” This was meant as comfort to me. But it provided none. From his point of view, the discussion of individuals' rights to their personal data can be deferred because the algorithms that regularly invade individual privacy aren't “good enough.” “Good enough” for what?

Try to avoid all-or-nothing thinking.
The digital world is based on binary choice. Our thinking about it can't be. This is true whether we are talking about computers in classrooms, distance learning, or the use of teleconferencing in large organizations. But in all these arenas, when computational possibilities are introduced, camps form and the middle ground disappears.

The complexity of our circumstances calls for a flexibility of approach. But it is hard to summon. To return to the question of the Internet and privacy, one common reaction to how vulnerable we feel is to retreat to a position where any resistance is futile. When Internet companies are saving what you say, search, and share, you are offering up so much information that it begins to seem petty to object to any particular encroachment. It becomes like living in a city filled with security cameras and objecting to a particular camera on a particular street corner. So instead of talking about what should be our rights, we adapt to rules we actually object to.

Or, instead of talking about what our rights should be, we react with rigidity. When no one can think of a way to have complete online privacy, people start to say that no change will work unless it brings total openness. Technology critic Evgeny Morozov makes a pitch for a less binary view by considering the history of another by-product of technological progress: noise. An anti-noise movement in the early twentieth century insisted that noise was not just an individual problem but also a political one. And then the anti-noise campaigners compromised to achieve realistic goals that made a difference. Morozov says, “Not all of their reforms paid off, but the politicization of noise inspired a new generation of urban planners and architects to build differently, situating schools and hospitals in quieter zones, and using parks and gardens as buffers against traffic.”

Just as industrialization “wanted” noise, the information society “wants” uncontrolled access to data. This doesn't mean it gets to have everything it wants.

The anti-noise campaigners didn't want to turn back industrialization. They didn't want silent cities, but cities that took the human need for rest, talk, and tranquillity into consideration. By analogy, in our current circumstance, we don't want to discard social media, but we may want to rewrite our social contract with it. If it operated more transparently, we might not feel so lost in our dialogue with it and about it. One way to begin this dialogue is to politicize our need for solitude, privacy, and mindspace.
