Replay: The History of Video Games

Before the split, however, game companies saw the demoscene and its crackers as the enemy: law-breakers who smashed their expensive attempts to prevent illegal copying and gave away free copies of their games, cutting into sales and profits. “Piracy held the industry back,” said Bruce Everiss, who became the operations director of short-lived UK games publisher Imagine after selling off his Liverpool computer store Microdigital. “If no-one’s paying for stuff then stuff doesn’t get done. It’s that simple.” Not that the game industry’s dislike of the crackers resulted in any direct action. “Did game companies attempt to stop groups like 1001? Never,” said Hoing. “Police had other things to do than going after a bunch of kids cracking games.” The crackers were just one part of the widespread piracy of games in the 1980s. In schools across Europe children swapped games with abandon, aided by the ease of tape-to-tape copying. “Anyone who was at school at that time will remember the swapping of games,” said Everiss. “It came from nowhere. One year people weren’t swapping games, the next year they were. You would only sell so many games because once it was out and about everyone swapped it.”

For Everiss, the sudden rise of schoolyard pirating of games killed Imagine, a Liverpool firm that dominated the UK games industry during its brief two-year existence. Founded by Mark Butler and David Lawson, both former employees of Liverpool’s first game publisher Bug-Byte, Imagine achieved instant success with its debut release: a run-of-the-mill shoot ’em up called Arcadia. Arcadia became the best-selling Spectrum game of Christmas 1982 and turned Imagine from a start-up into one of the wealthiest game companies in Europe. But success was followed by excess. “It was really very, very heady,” said Everiss. “We were inventing the industry as we went along. Up until Imagine, the industry had been a kitchen-table industry. Imagine was the first UK company to have things like a sales team, marketing people. We were the first to do multi-lingual packaging. We put programmers into offices, which was a new thing, and then started using sound and graphics artists.”

Imagine became living proof of the dream of many bedroom programmers: that they could get filthy rich making video games. The company’s plush offices boasted a garage filled with fast sports cars. At the age of 23 Butler was a symbol of the 1980s yuppie dream: a young man who had become rich through his entrepreneurialism. The company formed its own advertising agency and started expanding across Europe. At one point Imagine tried to rent the disused revolving restaurant on top of Liverpool’s St John’s Beacon tower, only to be put off by the excessive rent demanded by its landlord, the local council. “That was typical of us,” said Everiss. “We thought it would make a good executive office being up in the air going round and round in circles.” Most excessive of all was Imagine’s decision to pour huge sums of money into developing Bandersnatch and Psyclapse, which it described as the first ‘mega-games’. These games would come with hardware add-ons that, Imagine claimed, would enhance the abilities of home computers such as the Spectrum and usher in a new era in video games. It was not to be. In July 1984 Imagine went bust, its money drained away by over-expansion, the slow progress on developing the mega-games and falling sales due, at least in part, to piracy. The implosion was captured blow-by-blow by a BBC TV documentary crew who had set out to tell the story of Imagine’s success, but instead recorded its very public demise.

Imagine weren’t the only company to bite the dust around that time. The number of UK companies publishing Spectrum games peaked at 474 in 1984. The following year just 281 remained and by 1988 the number had tumbled to just 101. The industry became increasingly polarised between big publishers such as Ocean Software, who built business empires on the back of games based on blockbuster movies and popular TV shows such as Robocop, Miami Vice and Knight Rider, and budget publishers such as Mastertronic, which sold games for as little as £1.99 compared to the usual £8.99. By 1987 around 60 per cent of games sold in the UK were thought to be budget games. “At £1.99 it was hardly worth copying the game, you could have the real thing,” said Everiss. The middle ground of companies that released full-price original games steadily shrank, unable to compete on price or recognition. “At one stage we tried to launch a mid-price range and were just stuck in the middle. It was difficult, you had to be in one camp or the other,” said David Darling, who founded Warwickshire-based budget game publisher Codemasters with his brother Richard in 1985.

The same was starting to happen in France. Infogrames, whose founders were laughed at by French venture capitalists when they asked for investment back in 1983, swallowed up Cobra Soft as well as Ere Informatique. Meanwhile, Guillemot Informatique, a leading distributor of computer equipment based in Montreuil, launched a game publishing business called Ubisoft in 1986 that quickly expanded across Europe. Both Infogrames and Ubisoft would go on to become multinational gaming giants. The wilder elements of Europe’s early games industry started to leave the business. Surrealist game maker Mel Croucher sold off his game company Automata UK for 10 pence in 1985, while Jean-Louis Le Breton quit games to become a journalist. The European industry was growing up. Companies merged, expanded, created marketing teams and professionalised. Soon the games business was dominated by companies such as Ocean, Infogrames and US Gold, a UK publisher that rose to prominence converting American games onto home computers that were popular in Europe.

Formed in Birmingham by Geoff Brown, a former teacher and singer in progressive rock band Galliard, US Gold was a triumph of business nous over creativity. Brown bought his first home computer, an Atari 800, just as home computers began to take off in the UK. “There weren’t many people in the UK owning an Atari, so those who did were enthusiasts and if you were an enthusiast you were prepared to look for the games,” he said. “I got hold of a US magazine called Compute! that had all these wonderful games I had never heard of. The screenshots looked brilliant, so I thought I’m going to get myself one of those.” The game he chose was Galactic Chase, a 1981 game from Stedek Software. It was a straightforward copy of the arcade game Galaxian, but its production quality was miles ahead of what was being developed in the UK. “A lot of the UK programmers were still writing in BASIC. These guys were writing totally in machine language,” said Brown. “It was light years ahead of anything the UK was doing.”

After making some money importing Galactic Chase to the UK, Brown bought an airplane ticket and headed to the US to sign up more of the games being made by the North American computer game business that had come to the fore after the spectacular collapse of Atari.

[1]. A revolutionary group of French artists, philosophers and academics that began as an artistic movement but evolved into a political movement led by Guy Debord, a French intellectual and war game enthusiast. Debord’s manifesto The Society of the Spectacle summed up the movement’s politics with its theory that people had become spectators in their own lives.

[2]. Adventure, humour, leftfield and ‘a willingness to make fun of anything’.

[3]. Thomson’s computers became France’s equivalent of the UK’s BBC Micro after the French government made them the basis of a national programme to put computers in every school.

[4]. The word Schriften in the watchdog’s original name referred to print or printed media, although the law that created the regulatory body never limited its role to this.

[5]. Crackers often sought to compress games into smaller amounts of memory so that they took less time to download from bulletin board systems or loaded more quickly.

Arcade action: A British teenager tries out Yu Suzuki’s 1989 coin-op Turbo Out Run. Paul Brown / Rex Features

11. Macintoshization

One afternoon in 1975 a Harvard University student decided to write a seven-year plan that would result in the birth of one of the world’s biggest game publishers. It may have been the days of Pong but Trip Hawkins, the student in question, was already electrified by the new world of video games. “From the moment I saw my first computer in 1972, I knew I wanted to make video games,” he said. “I had a strong feeling that people were meant to interact, not to sit passively like plants in front of the TV. I was already designing board games but saw instantly that a computer would allow me to put ‘real life in a box’.” Just before he wrote his plan, Hawkins had read about the opening of one of the first computer stores and the Intel microprocessor, a computer-on-a-chip. He knew then that video games would become a mainstream form of entertainment. Technology, however, was against him. The computers of the day were still too expensive and too primitive to allow Hawkins to realise his dreams.

So instead Hawkins decided to spend the next seven years preparing for 1982, the year he believed would be the moment when technology would have caught up with his dreams. “By then, I figured, there would be enough hardware in homes to support a game software company,” he said. He adhered to his plan with religious devotion. He tailored his degree in strategy and applied game theory so that he could learn how to make video games. He took an MBA course to get the business skills he needed to run his future company and carried out market research into the computer and games console business. In 1978 he joined Apple Computer where he honed his business skills and, thanks to the stock options he got when the company floated on the stock exchange in 1980, the funds he needed to start his game business. “I made enough in my four years at Apple to know I could completely fund the company if I wanted,” he said.

And as 1981 came to a close, Hawkins was finally ready, but by then the video game boom was already well under way. “I actually felt late,” he said. “Because of the success of Atari’s early hardware and a cottage industry of Apple II software companies, I counted 135 companies already making video games but I had a unique vision and thought I could compete and become one of the leaders. This is what happens to you after you hang around with Steve Jobs for a few years.”

Sticking rigidly to his plan, Hawkins quit Apple on New Year’s Day 1982 and set about forming Electronic Arts. Hawkins’ vision for Electronic Arts echoed the old Hollywood studio system that emerged in the 1920s, with its plan to control game development, publishing and distribution. Electronic Arts would make games on multiple platforms, package them in boxes not plastic bags, and distribute them direct to retailers. It would also promote its game designers as if they were movie directors – artistic visionaries of the new era of interactive entertainment. The company’s publicity materials set out its ‘games as art’ rhetoric: “We are an association of electronic artists united by a common goal. The work we publish will be work that appeals to the imagination as opposed to instincts for gratuitous destruction.” Other publicity materials asked “can a computer make you cry?” and promised games that would “blur the traditional distinctions between art and entertainment and education and fantasy”.[1]

But by the time Electronic Arts released its first games on 21st March 1983, the North American game business was going down the tubes. “Atari officially crashed in December 1982,” said Hawkins. “The media, retailers and consumers vacated the console market in 1983, leaving Electronic Arts in a void. Start-ups like Electronic Arts had to focus on the Apple II, Commodore 64, etc. But those markets never got very big because the computers were more expensive and harder to use. They were really a hobby market more than a consumer market.” The post-Atari world of the home computers was an inhospitable landscape for those hoping to make a livelihood out of video games. “It was a brutal time,” said Bing Gordon, Electronic Arts’ head of marketing and product development at the time. “We entered the dark ages of interactive entertainment. The five years between 1982 and 1987 were hard, hard, hard. Each Christmas, all the experts at leading newspapers reminded potential customers that the video game business had died with Atari and would never return.”

What market did exist was splintered; fragmented across myriad home computer systems each with different technology and capabilities. It was also a market riddled with piracy, unlike the cartridge-based consoles of old. “People would steal your game. They wouldn’t buy it, they would copy it,” said Rob Fulop, a game designer at Imagic, the former console starlet that tried unsuccessfully to survive the crash by making computer games. The differences between the hardware of computers and consoles, meanwhile, required game designers to rethink their work. Controls shifted from joysticks to keyboards. Games moved from being stored on microchips in cartridges to floppy disks. “You had long load times, a lot more memory and higher resolution visuals than you did on video game consoles,” said Don Daglow, who became a producer for Electronic Arts after Mattel abandoned the Intellivision console. “You had the ability to save a game on disk, so we could do games that could take longer because you could save. Floppy disks allowed us to be more ambitious.” But computers were also slower. “Game companies had been concentrating on action games for consoles and computers weren’t fast enough at that time to really do a good job with an action game,” said Michael Katz, who quit Coleco as the crash set in to become the president of San Francisco-based computer game specialists Epyx.

Home computer users were also a different type of consumer compared to the console owners game companies grew up with. They were older, more educated and more technically minded.[2] “The video games before the crash were all specifically directed at young people, while computer games were directed at an older audience,” said Chris Crawford, who became a freelance game designer after Atari’s implosion. The differences in hardware and consumer tastes led game designers to move away from action games towards more cerebral, complex and slower forms of game. “Games prior to the crash sought to appeal to the mass market, but post-crash games became increasingly geared towards dedicated game players who wanted complexity and this further alienated the non-hardcore audience,” said David Crane, co-founder of game publisher Activision.

Most of Electronic Arts’ debut games reflected this new era of complexity. Foremost among these games were M.U.L.E. and Pinball Construction Set. M.U.L.E. was a computerised multiplayer board game based on supply and demand economics that cast players as colonisers of a faraway planet, trying to scratch a living. Its transgender creator Dan Bunten, who later became Dani Bunten Berry after a sex change, drew inspiration from Monopoly and Robert Heinlein’s novel Time Enough for Love, a sci-fi retelling of the trials of America’s old west pioneers. In the game each of the four players commandeered plots of land to produce energy, grow food and mine ore in a bid to become the richest. But while Monopoly was about cut-throat competition, M.U.L.E. was tempered by the need for players to work together to ensure there was enough food and energy for all of them to survive. M.U.L.E. was a commercial failure, but its careful balance of player competition and co-operation made it a seminal example of multiplayer gaming.

Pinball Construction Set, on the other hand, used the memory and save features of computers to let people design and play their own pinball tables. Together with the same year’s Lode Runner, a platform game with a level-creation tool, it pioneered the idea of allowing players to create game content – a concept that would be taken further by games such as Quake and LittleBigPlanet. Pinball Construction Set’s creator Bill Budge came up with the idea after spending some time working for Apple: “The people at Apple liked to go and play pinball at lunch – it was a big fad at the time. The engineers would spend time perfecting their moves on these pinball machines – typical obsessive-compulsive programmer behaviour. I would go with them and watch. It occurred to me you could make a pinball game on the Apple II.” The result was 1981’s Raster Blaster, a pinball game based on a single table, which Budge released through his own company BudgeCo. He then figured that a pinball game that let people create new tables would be even better and, thanks to his time at Apple, he knew exactly how the table-creation element should work. “I was watching the Macintosh develop and I was really familiar with the Lisa. That introduced me to the graphical user interface and how cool all that was,” he said. “I thought you could do a lot of the same stuff on the Apple II.”

The Lisa, and its still-in-development successor the Macintosh, were Apple’s latest computers. Both used a new approach to computer interfaces: the graphical user interface or GUI. The concept of the GUI dated back to 1950 when electrical engineer Douglas Engelbart concluded that computers would be easier to use if people interacted with them via television screens rather than keyboards, punch cards or switches. But in an era where computers and television were still so new, his ideas were dismissed as bizarre and unrealistic. Then the Cold War intervened.

In August 1957 the Soviet Union launched the first successful intercontinental ballistic missile and on the 4th October that same year launched the world’s first artificial satellite Sputnik 1 into orbit. The next step was obvious: putting nuclear warheads on intercontinental ballistic missiles. The US government responded by forming the Advanced Research Projects Agency (ARPA) to bankroll research to help the US regain its technological superiority over its superpower rival. And in 1964 ARPA decided to fund Engelbart’s research to the tune of $1 million a year. Using the money Engelbart created the GUI, the basis of almost every computer since the mid-1990s. He invented the mouse, the idea of windows that users could reshape and move around the screen, designed the word processor, came up with the concept of cutting and pasting, and devised icons that could be pointed at and clicked on using the mouse. In short, he produced the template for modern GUIs such as Microsoft Windows and Mac OS.

In 1973 the Xerox PARC research institute in Palo Alto used Engelbart’s ideas to come up with the Alto, one of the earliest GUI computers. Xerox did little to turn the Alto into a commercial product, but when Apple co-founder Steve Jobs paid a visit to the facility he saw the potential of the GUI. Apple’s first attempt at a GUI-based computer, the Apple Lisa, went on sale in 1983. It introduced Engelbart’s concepts to a wider audience but its high price – $9,995 – meant it was a commercial failure. The following year, however, Apple tried again with the Apple Macintosh. Unlike the Lisa, the $1,995 Macintosh made an immediate and lasting impact. For those used to the unfriendly and intimidating computers of the late 1970s and early 1980s it was a liberating moment.

“The human interface of a computer as we know it today, with windows and a mouse, was new to the world of personal computers when the Lisa and Mac came out,” said Darin Adler, a programmer at Illinois game developers ICOM Simulations. The Macintosh also led a revolution in computer design as Apple’s rivals began to create GUIs for their next generation home computers.[3] The Macintosh was also a big influence on game designers, many of whom saw GUIs as a way to make more complex games easier to understand. Its influence was such that Computer Gaming World journalist Charles Ardai argued that video games were undergoing a process of ‘Macintoshization’. “GUIs served to regularise the interface and make it a bit more indirect,” said Crawford. “Most games had direct interfaces: push the joystick left and your player moved left. GUIs moved us a bit further towards abstraction by putting some of the verbs onscreen as buttons or menus. This in turn greatly expanded the size of the verb list that we could present to the player.”
