The most acute statement of this truth was probably the 2002 Millennium Challenge, a US military exercise pitting the tiny tinpot nation of Red against the mighty forces of Blue. The challenge was planned in 2000, and Red was no doubt conceived as a generic Middle Eastern nation with a nod towards Iran, though by the time the exercise took place it was pretty clear that it was a dry run for a future invasion of Iraq.
Saddam Hussein, it would seem, was the only person not really paying attention, because if he had been, the face of March 2003 might have been rather different.
Commanded by a retired marine officer, Lieutenant General Paul van Riper, Red eschewed technological communications and used motorcycle couriers, sacrificing speed of communications for security. Van Riper launched a surprise attack against Blue’s fleet, coordinated apparently by an audio signal concealed in the muezzin’s call to prayer, and sank most of Blue’s vessels. The situation was so desperate that after a while the Blue fleet was ‘refloated’ and van Riper himself was essentially removed from command so that Blue could win the second time around.
Blue’s systemic problem, it later emerged, was literally that it had too much information. Van Riper, knowing he couldn’t hope to communicate in the field with his commanders in real time without those messages being disrupted or intercepted by Blue’s forces, had given his men a great deal of leeway within the basic structure of his plans. They were therefore able to react locally and in sympathy with one another, if not actually in concert except where some major signal united their efforts. Blue, meanwhile, had a vast array of sophisticated information-gathering equipment which it mistakenly used to re-direct forces during the battle in real time. This meant Blue was constantly trying to cover all the bases, paralysed by a glut of data which had to be interpreted and accounted for. Blue was also assuming a parallel command and control structure in Red’s forces, spending resources blocking transmissions, and presumably also trying to extrapolate a coordinated over-arching plan from the individual initiatives of Red’s distributed decision-making apparatus.
In other words, Blue was over-thinking it.
Although the first Walkman – the device which ushered in the age of portable music, beginning with cassette tapes and moving on to CDs and MP3s – belonged to Sony, the chief mover in the fetishization of the digital device since the turn of the century has been Apple, whose sleek, minimal designs have been masterfully injected into the consciousness of the high street with a mix of music, wit and supremely seamless functionality. Apple’s devices are not simply objects. They are gateways, leading to Apple’s liveried spaces in what is increasingly called the Cloud (only the US National Security Agency seems to use the term ‘cyberspace’ any more). The Cloud is a vague collective noun referring to computers in any number of locations running services that can be accessed remotely. Google offers email, document storage, translation, search and a great deal more in the Cloud. Apple customers can buy media content with a single click. The next episode of a TV show, the next album, the next book is only ever a few moments away.
Apple’s Cloud presence is replicated in its steel and glass outlet stores: a perfectly predictable and stylized shopfront which performs with a minimum of fuss. In 2000 the Canadian design guru Bruce Mau described a selling environment in which ‘the brand identity, signage systems, interiors, and architecture would be totally integrated’. The point was the blurring of information and physical reality. The first Apple Store opened in May the following year – and then something else happened which was absolutely unexpected and appalling.
I can’t begin to unpick the interplay of the iPod’s launch in October 2001 – a month after the 9/11 attacks – with the slow, painful retrenching of American self-perception as being on top o’ the world. It seems facile, in the face of the falling towers, to wonder whether a small white box full of music became a part of the climb back out of the pit. And yet, if not music, what else do you fight horror with? It may be nonsense, suggested by the simple proximity of dates, or it may be an important part of the US relationship with the iPod – and, hence, everyone else’s too. Apple’s decision to go ahead with the launch must have been an almost impossibly hard one to make, but it was, in hindsight, the right one. Digital music went from being another format which might not catch on – like the MiniDisc player – to being the default format for many, myself included. Apple’s gizmo ushered in a new era of technology that was hot and cool at the same time, and – probably not coincidentally – set the stage for the arrival of multi-purpose pocket devices such as the iPhone, which in turn make possible the degree of integration of physical and digital space we’re now seeing, while at the same time opening all of us up, in our homes and our persons, to the tide of information that so upsets some of us.
The rise of Apple, along with Google and Amazon – the latter two both begun in the 1990s but attaining titan status in the same decade – has brought us here, to a place where everything seemingly must be digitized, from libraries to shopping to medicine to streets and maps. The combination of functionality and cool has made each new advance – the latest being Apple’s Siri voice interface, which allows users to ask their phones questions in ordinary language and receive a spoken answer rather than engaging through a screen or keyboard – a must-have item, a consumer product and an identity statement as much as a simple tool. Some aspects of human life – a small number, but an increasing one – are now inaccessible without a smartphone. Our relationship with technology is no longer that of tool-user and tool; it is more complex and emotional. We replace things more often than we have to, and long before they are worn out, so as to be in possession of the latest thing, the cutting edge. (Although it’s fair to point out that our brains factor our habitual tools into our self-perception, so the connection between a craftsman and his awl has always been rather more profound than appearances might suggest.)
There is now such a thing as an ‘unboxing’ – indeed, on YouTube you can watch videos of people removing their new technological gear from its packaging. Writer Neal Stephenson describes one of his characters revealing a piece of kit in his novel Snow Crash; the experience is quasi-sexual. We have, in every sense, fetishized our technology.
We are also, as a culture – the Western world, from Berlin and Paris to Los Angeles and on to Sydney – somewhat addicted to notions of apocalypse. Perhaps it’s because we’re also prone to lock-in; a crisis brings the opportunity to change the rules, to impose resolution on issues that otherwise simply fester. Politicians know this: witness the Neo-Conservative advance planning for a crisis that would allow the Republican Party to reshape the United States’ political landscape, which was then perfectly enabled by the unforeseen horrors of 9/11. In an apocalyptic scenario, all the usual rules can be re-examined, often to the great advantage of political leaders from one camp or the other.
In the present digital moment – the pre-crisis, perhaps – the lock-in hasn’t set in across the board. There are still conflicting platforms for ebooks, for music; still conflicting operating systems, each representing a different philosophy and conferring power and responsibility on different groups. This is, obliquely, an extremely political situation. Governments and corporations are fighting it out with one another and with rebellious groups like Eben Moglen’s Freedom Box Foundation (which exists to bring uncensorable communication and government-proof encryption to the general population), and while various administrations in Europe and the US have arrogated to themselves the right to trawl through digital communication in search of various sorts of crime, those laws have not yet been thoroughly tested. It’s not clear who will own the technological landscape in different areas, although the time window is closing. We don’t yet need an apocalypse to change the rules, because the rules themselves are even now being defined, sometimes knowingly, sometimes not – by us. We are making the landscape, not watching it form.
It’s one of the most frustrating attitudes I see in my occasional side job as a commentator on the publishing industry’s conversion to the digital age: the natural tendency of large corporations appears to be to wait until the smoke clears and a leader emerges, then seek a deal with that person. The infuriating point is that publishing – like many other so-called ‘old’ industries – can’t afford to take this approach this time. It needs to have a hand in defining what happens, because otherwise it will likely be cut out.
The same is true of the rest of us: we can’t just sit back on this one and wait. The world is being made – or, rather, we, collectively, with our purchasing power and our unthinking decisions, are making it – and we have to choose to be part of that process, or else accept that what emerges from it may be a cold thing constructed not around human social life but around the imperatives of digitally organized corporate entities. It may not happen that way on purpose, but the combination of commercialization, government involvement, litigation and societal forces – and the trajectory of digital technologies themselves as a consequence of what’s already happened – suggests to me that what takes place over the next few years, and what is happening now, will be definitive of how we approach and understand this area for the foreseeable future. To explain what I mean by that, I’m going to have to make a brief detour into the relationship between science, technology, society and the individual.
Marshall McLuhan famously asserted that ‘the medium is the message’. His meaning in essence was that the content of a given medium was irrelevant; the technology itself carried with it consequences that could not be denied or altered by what was communicated.
McLuhan’s perception – aside from being the kind of sweeping statement beloved of the Enlightenment and its ultimate modern prophets – is true only as far as it goes. A technology does, of course, shape society around it, but it is also created by that society in the first place and the lessons taken from it are inevitably filtered by cultural perceptions and understanding. It’s not just a praxis, in which ideas become things, but an ongoing, reflexive process in which each generation on one side of the reification divide yields a new generation on the other.
More simply: technology is derived from science. Science is the underlying knowledge; technology is what you then go ahead and do with that knowledge. If you have the science for television, do you create and implement a surveillance nation of closed circuit TV cameras, broadcast soap opera, or improve medical endoscopy? Your cultural bias will decide. (We’ve chosen to do all three in the UK. With the exception of the last one, it’s doubtful this strategy has greatly improved our lot.) Society, of course, is then influenced by the technology it has created. In The Wealth and Poverty of Nations, David Landes discusses the impact of what he calls the first digital device – the mechanical clock.
The mechanical clock is obviously not digital in the sense of being electronic. Rather, it relies on a ‘regular … sequence of discrete actions’ to mark time in equal portions rather than following the flow of the natural day. Until it was developed, time, as experienced by humans, was fluid. In Europe, the churches marked the passing of time in each diurnal cycle with a sequence of masses, but the ‘hours’ were evenly distributed between day and night, no matter what the time of year. They therefore grew shorter as the winter came in and longer in high summer. Time was also centralized, up to a point: the local bells tolled the hours, rather than each individual person or household possessing the means to measure time. There was a time to wake, to trade, to sleep and so on, and all of them were announced by the tolling bells.
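To make that contrast concrete, here is a minimal sketch of the arithmetic (my own illustration, not drawn from Landes): medieval ‘temporal hours’ split whatever daylight there is into twelve equal parts, so their length swings with the season, whereas the clock’s hours are fixed at sixty minutes regardless of the sun. The daylight figures used below are hypothetical.

```python
# Illustrative sketch: 'temporal hours' divide the available daylight into
# twelve equal parts, so the length of an 'hour' varies with the season.
# The daylight durations below are hypothetical, chosen only to show the effect.

def temporal_hour_minutes(daylight_hours: float) -> float:
    """Length in minutes of one of the twelve 'hours' of daylight."""
    return daylight_hours * 60 / 12

for season, daylight in [("midwinter", 8.0), ("equinox", 12.0), ("midsummer", 16.0)]:
    print(f"{season:>10}: one daylight 'hour' lasts "
          f"{temporal_hour_minutes(daylight):.0f} minutes")

# A mechanical clock, by contrast, cuts every day into the same
# twenty-four sixty-minute portions, whatever the sun is doing.
```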
On the other hand, as Europe grew more populous and boundaries overlapped, time inevitably varied from place to place – from parish to parish – resulting in disputes. The invention of the mechanical clock, as with the arrival of mechanical printing, diminished the authority of the Church, allowing others to measure and set time. In effect, it also made possible the style of payment which for Karl Marx was typical of capitalism: payment by the amount of time worked, rather than for the product of labour. The mechanical clock, in displaying or creating time as we understand it today, has influenced our understanding of work, and of the length of our lives. In allowing calculation of the longitude it also facilitated the growing naval and mercantile power of Europe, and in cutting the day up into fragments, it paved the way for Newton, Einstein and the rest to examine space and time and uncover the connections between them.