
EPILOGUE
More than a quarter of a century has passed since I wrote the previous page on an Apple III computer. In 1984, as the first edition of this book made its way to the press, I received several letters from the publisher—those being the days before email had become the universal telegraph system—expressing anxiety that Apple’s day in the sun might already have passed. The apprehension was understandable. The hullabaloo surrounding the introduction of the Macintosh—trumpeted with an Orwellian television commercial on Super Bowl Sunday 1984—had evaporated, and the notices had turned sour. IBM’s personal computer business was gaining strength. Compaq had reached $100 million in sales faster than any previous company, and Microsoft’s operating system, DOS, was winning licensees by the month. There were plenty of reasons to think that Apple was teetering.
Twenty-five years later, when people are as familiar with the names iPod, iPhone, or Macintosh as they are with Apple, it is hard, particularly for those reared on cell phones and social networks, to imagine a time when the company appeared to be just another technology firm that would be snuffed out or absorbed by a competitor. Since 1984, there have been plenty of technology companies that have faded to grey or gone to black, and it’s remarkably easy to come up with an alphabetical list of these casualties that runs from A to Z.
The letter “A” alone includes Aldus, Amiga, Ashton-Tate, AST and Atari. As for the rest of the alphabet, there’s always Borland, Cromemco, Digital Research, Everex, Farallon, Gavilan, Heathkit, Integrated Micro Solutions, Javelin Software, KayPro, Lotus Development, Mattel, North Star Computers, Osborne Computer, Pertec, Quarterdeck, Radius, Software Publishing, Tandy, Univel, Vector Graphic, Victor, WordPerfect, XyWrite and Zenith Data Systems. The large technology companies that have weathered these decades—IBM and HP—have done so in areas far removed from personal computing. IBM, once the company that others in the personal computer industry feared, has even surrendered its franchise to the Chinese company Lenovo.
The mortality rate makes Apple’s survival—let alone prosperity—even more remarkable. I’ve watched Apple, first as a journalist and later as an investor, for most of my adult life. Journalists suffer from the malady of not forgetting a topic that once interested them. I’m no different. But a couple of years after I finished writing this book, I found myself, thanks to some twists of fate, working at Sequoia Capital, the private investment partnership whose founder, Don Valentine, had helped assemble some of the formative blocks on which Apple was built. Since then, as an investor in young technology and growth-stage private companies in China, India, Israel and the U.S., I have developed a keener sense of the massive gulf that separates the few astonishing enterprises from the thousands that are lucky to scratch out an asterisk in the footnotes of history books.
In 1984, if most consumers had been asked to predict which company—SONY or Apple—would play a greater role in their lives, I wager most would have voted for the former. SONY’s success rested on two powerful forces: the restless drive of its founder, Akio Morita, and the miniaturization of electronics into products consumers yearned for. The Japanese company, which had been formed in 1946, had built up a following as a designer and maker of imaginative and reliable consumer electronic products: transistor radios, televisions, tape recorders and, in the 1970s and 1980s, video recorders, video cameras and the Walkman, the first portable device to make music available anywhere at any time of day. Like the iPod a generation later, the Walkman bore the stamp of the company’s founder. It was created in a few months during 1979, built its following largely by word of mouth and, in the two decades before the advent of MP3 players, sold over 250 million units. Now, as everyone knows, the tables have been turned, and some years ago a cruel joke circulated which spelled out the change in circumstances: “How do you spell SONY?” The answer: “A-P-P-L-E.”
This raises the question of how Apple came to outrun SONY, but the more interesting topic is how the company came to rattle the bones of mighty industries, forcing music impresarios, movie producers, cable television owners, newspaper proprietors, printers, telephone operators, yellow-page publishers and old-line retailers to quaver. None of this seemed possible in 1984, when Ronald Reagan was President; half of American households tuned in to the three television networks; U.S. morning newspaper circulation peaked at 63 million; LPs and cassette tapes outsold CDs by a margin of ninety to one; the Motorola DynaTAC 8000x cell phone weighed two pounds, offered thirty minutes of talk time and cost almost $4,000; Japan’s MITI was feared in the West; and the home of advanced manufacturing was Singapore.
Three mighty currents have flowed in Apple’s favor, but these waters were also navigable by other crews. The first swept electronics deeper into every nook and cranny of daily life, so that now there is almost no place on earth beyond the reach of a computer or the bewildering collection of phones and entertainment devices with which we are surrounded. The second has made it possible for companies born in the era of the personal computer to develop consumer products. It has been far easier for computer companies with refined software sensibilities to design consumer products than for those whose lineage was consumer electronics and whose expertise lay largely in hardware design and manufacturing prowess. It’s not a coincidence that some of the companies with the acutest envy of Apple have names like Samsung, Panasonic, LG, Dell, Motorola and, of course, SONY. The third current was “cloud computing”—the idea that much of the computation, storage and security associated with popular software sits in hundreds of thousands of machines in factory-sized data centers. This is the computer architecture that, in the mid-1990s, supported services such as Amazon, Yahoo!, eBay, Hotmail and Expedia and later came to underpin Google and the Apple services that light up Macs, iPods and iPhones. Today, for the first time, consumers—not businesses or governments—enjoy the fastest, most reliable and most secure computer services.
In 1984, more immediate and mundane challenges confronted Apple. Managing a fast-growing company in an increasingly competitive business meant that Apple’s Board of Directors faced the most important task that confronts any board: selecting a person to run the company. Mike Markkula, who had joined Jobs and Wozniak in 1976, had made no secret of the fact that he had little appetite for life as Apple’s long-term CEO. Thus the Board, which included Steve Jobs, had to decide what course to take. This decision—and three similar decisions over the ensuing thirteen years—shaped Apple’s future.
Only in retrospect have I come to understand the immense risk associated with hiring an outsider—let alone a person from a different industry—to run a company whose path has been heavily influenced by the determination and ferocity of its founder or founders. It is not an accident that most of the great companies of yesterday and today have, during their heydays, been run or controlled by the people who gave them life. The message is the same irrespective of industry, era or country, and the name can be Ford, Standard Oil, Chrysler, Kodak, Hewlett-Packard, Wal-Mart, FedEx, Intel, Microsoft, News Corp, Nike, Infosys, Disney, Oracle, IKEA, Amazon, Google, Baidu or Apple. The founder, acting with an owner’s instincts, will have the confidence, authority and skills to lead. Sometimes, when the founder’s instincts are wrong, this leads to ruin. But when they are right, nobody else comes close.
When corporate boards start to have misgivings about the condition of a company or the ability of the founder and have no plausible internal candidate, they will almost always make the wrong move. They usually need to make a decision about a CEO when a company is barreling towards a fall, emotions are raw, testosterone levels are running high and, particularly in a company as visible as Apple, every employee, analyst, smart-aleck and naysayer is ready to dispense advice. At Apple in 1983, the Board’s decision was complicated by the fact that there was no obvious successor within the company. Jobs was considered too young and immature, and, for his part, he knew that he needed help if Apple was to achieve the $10 billion sales level he had already started to dream about. The oppressive weight of conventional wisdom tilted the quest towards a résumé dripping with impressive-sounding titles and credentials. But experience—particularly when it has been acquired in a different industry—is of little use in a young, fast-growing company in a new business that has a different pulse and an unfamiliar rhythm. Experience is the safe choice, but it is often the wrong one.
After a lengthy search, Apple’s board announced that John Sculley would be the company’s new Chief Executive. Sculley was unknown in Silicon Valley, which was hardly surprising since he had spent his entire business career at PepsiCo where, in his final job, he had run its soft-drinks business, Pepsi-Cola. Sculley’s arrival in Cupertino was greeted with the demeaning commentary that Apple (and Jobs) “needed adult supervision.” This is the very last thing that rare and wonderful founders need. These rare sorts of people may require help; they will certainly benefit from assistance; and there may be plenty of things that are new or foreign to them. But the appearance of a boss, particularly one with little experience of technology and the brutish rough and tumble of a company in its formative years, will almost certainly end in misery.
At Apple, Sculley was greeted like an archangel and, for a time, could do no wrong. He and Jobs were quoted as saying that they could finish each other’s sentences. In hindsight, it is fairly easy to say that it would have been almost impossible for a man like Sculley, reared within the confines of an established East Coast company selling soft drinks and snacks, to flourish in a business where product life cycles are measured in quarters, if not months, and where bowing to convention marks the start of the death rattle. It is easier for a founder, particularly when surrounded by people with different experiences, to learn about management than for a manager from a large company to master the nuances and intricacies of an entirely new business—especially if that happens to be a technology company.
Within less than two years, familiarity began to breed contempt—a situation complicated by the fact that while Sculley bore the CEO title, Jobs was the Chairman of the company. Disagreements occurred. Sniping and backbiting broke out, and the dissension became so intense that in 1985 Sculley, disgruntled, displeased, exasperated and exhausted by Jobs, orchestrated the latter’s dismissal from the company. Sculley’s tenure at Apple lasted until 1993, and for part of that time the external reviews, at least as posted by Wall Street analysts, were favorable.
In the decade Sculley spent at Apple, sales grew from less than $1 billion a year to more than $8 billion a year. On the surface, this looks like a wonderful record. But the reality was far different. Sculley benefited from a powerful force—the massive demand for personal computers. This sort of market growth conceals all manner of shortcomings, and it is only when the rate of change slows or the economy contracts that the real cracks become visible.
During Sculley’s time at Apple, the company was outgunned by the brute force of IBM, then by the cunning maneuvering of the industry’s arms merchant, Microsoft, which made the operating system that it had licensed to IBM available to all comers. This led to a proliferation of what were labeled “IBM compatibles”—some made by startups like Compaq, others by established players like DEC and still more from cost-conscious Taiwanese companies such as Acer. These machines shared two traits: the hardware was built around microprocessors from Intel, and their operating systems were furnished by Microsoft. Apple, in the meantime, counted on chips from Motorola (and later IBM) and had to labor hard to convince programmers to write software for the Macintosh, whose market share dwindled as the years slipped by. Apple was fighting on two fronts with weak allies against the vast budget of Intel—in an industry where engineering and capital counted for a lot—and the legions of programmers who had discovered they could build a business atop Microsoft DOS and its successor operating system, Windows. Part of Sculley’s response was to gradually increase Apple’s prices in an effort to maintain profit margins—a ploy that propped up earnings for a while but eventually foundered.
While pesky newcomers attacked, inventiveness withered inside Apple. The company that had led the industry with color on the Apple II, a graphical user interface with the Macintosh, desktop publishing and laser printing, integrated networks, and stereo sound stopped leading. As Sculley departed, amidst a flurry of recrimination prompted by his affection for the limelight and dalliance with the national stage, the cupboard was bare. The spark of imagination, or, more particularly, the ability to transform a promising idea into an appealing product, had been extinguished. Apple introduced no meaningful new products in the decade Sculley spent at the helm. The computers that did appear bore sterile names such as Performa, Centris and Quadra. Computers with more memory, larger screens and bigger disk drives do not count for lifetime achievement awards. The Newton, a small digital organizer championed by Sculley in his self-appointed role as Apple’s Chief Technology Officer, amounted to little more than an expensive doorstop. In an autobiography published in 1987, Sculley—in what now seems like a very accurate assessment of the gulf between his capabilities and the founder he displaced—savaged Jobs’ ideas of the future by writing, “Apple was supposed to become a wonderful consumer products company. This was a lunatic plan. High tech could not be designed and sold as a consumer product.”
When Sculley was fired, Apple was in peril. Windows 3.0, introduced by Microsoft in 1990, was not as elegant as the Macintosh software, but it was good enough. As Sculley returned to the East Coast, Apple’s market share had eroded, its margins had collapsed, and the best young engineers were inclined to apply for openings at companies such as Microsoft, Silicon Graphics or Sun Microsystems.
