At home, issues of race, religion, sexuality and gender remain poisonous; our governments demand greater and greater rights of surveillance over our lives and seek, week by week, to curtail the historic freedoms of assembly and speech which have marked our culture’s development. Trial by jury, habeas corpus and the rules of evidence are constantly assailed, as is the independence of the judiciary. In the aftermath of the riots which took place in the UK in the summer of 2011, David Cameron vowed to crack down on ‘phoney human rights’, which seems to mean any rights that are not convenient. At the same time, and despite evidence that it would be both impractical and counterproductive, some MPs began to call for the government to be able to ‘pull the plug’ on the Internet and the cellphone network in times of civil unrest: a weird, desperate grasping for centralized power and control which seems alien to a modern government.
Perhaps in consequence of this kind of disconnection, politicians are perceived as mendacious, governmental press releases as spin. The professional political class, in return, describes the electorate as apathetic, or as unable to comprehend the issues. The standard response to a public outcry is not ‘we’ll change the policy’ but ‘clearly we’ve failed to get our plan across properly’. In the UK under the Blair government, two of the largest political demonstrations in modern British history took place within a few months of one another – one against a proposed ban on fox-hunting and the other against the Iraq War – and were played off against one another into a stasis which meant that both could be ignored. More generally, serious issues are often left untackled, or are botched, for fear of expending political capital on unpopular legislation in the face of tabloid scorn. Extremist political views are becoming more popular across Europe and in the US as mainstream political parties fail to speak substantively about what is going on, preferring instead to throw mud at one another.
In other words, before we start to look at possible digital apocalypses, we have to acknowledge that the general picture is a lot less rosy than we tell ourselves when we’re brushing our teeth in the morning. In fact, we stagger from one crisis to the next, and while we are insulated in the industrialized world from some of them, we are by no means immune. Our prosperity and our civilized behaviour are fragile, our world is unequal and – for billions – bleakly callous.
The opposing extremes I described – total immersion and passivity, and utopian liberty and creativity – are both unlikely. Patchwork is more probable than purity; if the late modern condition we inhabit has any rules, that must be one of them: everything is muddled together. (The term post-modern has a number of meanings in different disciplines, some specific and others irksomely vague, and in any case it suggests that we’re in some kind of afterparty of world history, which I think is untrue; so I use late modern, which means more or less what it sounds like and doesn’t instantly cause me to break out in sociological hives.) What is unclear, and indeed undecided, is which futures will spread out and flourish and which will fade away. But neither extreme is technologically or societally impossible. We live in a time when the boundaries of the possible are elastic, while our unconscious notions of what can and cannot be done remain lodged in a sort of spiritual 1972. Unless we can change that, we’re going to find the next twenty years even more unsettling than the last. Abandon, please, the idea that no one will ever be able to connect a computer directly with the human mind, and consider instead what will happen when they do, and what choices we might – must – make to ensure that when it becomes common practice the world is made better rather than worse.
Only one thing is impossible: that life should remain precisely as it is. Too many aspects of the society in which we presently live are unstable and unsustainable. Change is endemic already, but more is coming. This is for various reasons a time of decision.
A word about navigation:
The first section of this book deals with the common nightmares of digitization and attempts to assess how seriously we should take them and whether they really derive from digital technology or from elsewhere. It contains a potted history of the Internet and a brief sketch of my own relationship with technology from birth onwards, and asks whether our current infatuation with all things digital can last. It also examines the notion that our brains are being reshaped by the digital experience, and considers our relationship with technology and science in general.
The second section considers the wrangles between the digital and the traditional world, looks at the culture of companies and of advocates for digital change, and at the advantages and disadvantages of digital as a way of being. It deals with notions of privacy and intellectual property, design and society, revolution and riot, and looks at how digitization changes things.
The third section proposes a sort of self-defence for the new world and a string of tricks to help not only with any digital malaise but also with more general ills. It asks what it means to be authentic, and engaged, and suggests how we go forward from here in a way that makes matters better rather than worse (or even the same).
More generally: it is inevitable that I will be wrong about any number of predictions. No book which tries to see the present and anticipate the future can be both interesting and consistently right. I can only hope to be wrong in interesting ways.
I was born in 1972, which means I am the same age as Pong, the first commercially successful video game. I actually preferred Space Invaders; there was a classic wood-panelled box in the kebab shop at the end of my road, and if I was lucky I’d be allowed a very short go while my dad and my brother Tim picked up doner kebabs with spicy sauce for the whole family. In retrospect, they may have been the most disgusting kebabs ever made in the United Kingdom. When the weather’s cold, I miss them terribly.
I grew up in a house which used early (room-sized) dedicated word-processing machines. I knew what a command line interface was from around the age of six (though I wouldn’t have called it that, because there was no need to differentiate it from other ways of interfacing with a computer which did not yet exist: I knew it as ‘the prompt’, because a flashing cursor prompted you to enter a command), and since my handwriting was moderate at best, I learned to type fairly early on. Schools in London back then wouldn’t accept typed work from students, so until I was seventeen or so I had to type my work and then copy it out laboriously by hand. Exactly what merit there was in this process I don’t know: it seemed then and seems now to be a form of drudgery without benefit to anyone, since the teachers at the receiving end inevitably had to decipher my appalling penmanship, a task I assume required a long time and a large glass of Scotch.
On the other hand, I am not what was for a while called a ‘digital native’. Cellphones didn’t really hit the popular market until the 1990s, when I was already an adult; personal computers were fairly unusual when I was an undergrad; I bought my first music in the form of vinyl LPs and cassette tapes. I can remember the battle between Betamax and VHS, and the arrival and rapid departure of LaserDisc. More than that, the house I lived in was a house of narratives. More than anything else, it was a place where stories were told. My parents read to me. My father made up stories to explain away my nightmares, or just for the fun of it. We swapped jokes over dinner, and guests competed – gently – to make one another laugh or gasp with a tall tale. Almost everything could be explained by, expressed in, parsed as, couched in a narrative. It was a traditional, even oral way of being, combined with a textual one in some situations, making use of new digital tools as they arrived, drawing them in and demanding more of them for the purpose of making a story. We weren’t overrun by technology. Technology was overrun by us.
All of which makes me a liminal person, a sort of missing link. I have one foot in the pre-digital age, and yet during that age I was already going digital. More directly relevant to this book, my relationship with technology is a good one: I am a prolific but not excessive user of Twitter; I blog for my own website and for another one; I have played World of Warcraft for some years without becoming obsessive (I recently cancelled my subscription because the game has been made less and less sociable); I use Facebook, Google+, Goodreads and tumblr, but I am also professionally productive – since my first book came out in 2008, I have written three more, along with a screenplay and a number of newspaper articles. I am also a dad, an occasional volunteer for the charity of which my wife is director, and I have the kind of analogue social life everyone manages when they are the parent of a baby; so aside from whatever moderate brainpower I can bring to bear on this topic, I can speak with the authority of someone who manages their balance of digital and analogue life pretty well.
I am, for want of a better word, a digital yeti.
In the late 1950s and early 1960s, when my older brothers were being born, the Defense Advanced Research Projects Agency (DARPA) in the US planted the seed of the modern Internet. The network was constructed to emphasize redundancy and survivability; when I first started looking at the history of the Internet in the 1990s, I read that it had grown from a command and control structure intended to survive a nuclear assault. The 1993 Time magazine piece by Philip Elmer-DeWitt, which was almost the Internet’s coming-out party, cited this origin story alongside John Gilmore’s now famous quote that ‘The Net interprets censorship as damage and routes around it’. Although DARPA itself is unequivocally a military animal, this version of events is uncertain. It seems at least equally possible that the need for a robust system came from the unreliability of the early computers comprising it, which were forever dropping off the grid with technical problems, and that narrative is supported by many who were involved at the time.
That said, it’s hard to imagine DARPA’s paymasters, in the high days of the Cold War and with a RAND Corporation report calling for such a structure in their hands, either ignoring the need or failing to recognize that a durable system of information sharing and command and control was being created under their noses. For whatever it’s worth, I suspect both versions are true to a point. In either case, the key practical outcome is that the Internet is in its most basic structure an entity that is intended to bypass local blockages. From its inception, the Internet was intended to pass the word, not ringfence it.
The seed grew slowly; at the start of 1971 there were just fifteen locations connected via the ARPANET (Advanced Research Projects Agency Network). Through the 1970s and 1980s, growth came not from a single point but rather from many; educational networks such as the National Science Foundation Network and commercial ones such as CompuServe met and connected, using the basic protocols established by DARPA so that communication could take place between their users. I remember a friend at school, a maths whizz whose father was an industrial chemist, patiently logging in to a remote system using a telephone modem: you took the handset of your phone and shoved it into a special cradle and the system chirruped grasshopper noises back and forth. Eventually – and it was a long time, because the modem was transmitting and receiving more slowly than a fax machine – a set of numbers and letters in various colours appeared on a television screen. I could not imagine anything more boring. I asked my friend what it was, and he told me he was playing chess with someone on the other side of the world. He had a physical chessboard set up, and obediently pushed his opponent’s piece to the indicated square before considering his next move. Why they didn’t use the phone, I could not imagine.
Around about the same time, my mother and I went to an exhibition of some sort, and there was a booth where a computer took a picture of you and printed it out, using the letters and numbers of a daisywheel printer, double-striking to get bold text, because the inkjet and the laser printer were still years away. The resulting image was recognizably me, but more like a pencil sketch than a photo. It was considered hugely advanced.