In the Beginning...Was the Command Line


The lesson most people are taking home from the twentieth century is that, in order for a large number of different cultures to coexist peacefully on the globe (or even in a neighborhood) it is necessary for people to suspend judgment in this way. Hence (I would argue) our suspicion of, and hostility toward, all authority figures in modern culture. As David Foster Wallace has explained in his essay “E Unibus Pluram,” this is the fundamental message of television; it is the message that people absorb, anyway, after they have steeped in our media long enough. It’s not expressed in these highfalutin terms, of course. It comes through as the presumption that all authority figures—teachers, generals, cops, ministers, politicians—are hypocritical buffoons, and that hip jaded coolness is the only way to be.

The problem is that once you have done away with the ability to make judgments as to right and wrong, true and false, etc., there’s no real culture left. All that remains is clog dancing and macrame. The ability to make judgments, to believe things, is the entire point of having a culture. I think this is why guys with machine guns sometimes pop up in places like Luxor and begin pumping bullets into Westerners. They perfectly understand the lesson of McCoy Air Force Base. When their sons come home wearing Chicago Bulls caps with the bills turned sideways, the dads go out of their minds.

The global anticulture that has been conveyed into every cranny of the world by television is a culture unto itself, and by the standards of great and ancient cultures like Islam and France, it seems grossly inferior, at least at first. The only good thing you can say about it is that it makes world wars and Holocausts less likely—and that is actually a pretty good thing!

The only real problem is that anyone who has no culture, other than this global monoculture, is completely screwed. Anyone who grows up watching TV, never sees any religion or philosophy, is raised in an atmosphere of moral relativism, learns about civics from watching bimbo eruptions on network TV news, and attends a university where postmodernists vie to outdo each other in demolishing traditional notions of truth and quality, is going to come out into the world as one pretty feckless human being. And—again—perhaps the goal of all this is to make us feckless so we won’t nuke each other.

On the other hand, if you are raised within some specific culture, you end up with a basic set of tools that you can use to think about and understand the world. You might use those tools to reject the culture you were raised in, but at least you’ve got some tools.

In this country, the people who run things—who populate major law firms and corporate boards—understand all of this at some level. They pay lip service to multiculturalism and diversity and nonjudgmentalness, but they don’t raise their own children that way. I have highly educated, technically sophisticated friends who have moved to small towns in Iowa to live and raise their children, and there are Hasidic Jewish enclaves in New York where large numbers of kids are being brought up according to traditional beliefs. Any suburban community might be thought of as a place where people who hold certain (mostly implicit) beliefs go to live among others who think the same way.

And not only do these people feel some responsibility to their own children, but to the country as a whole. Some of the upper class are vile and cynical, of course, but many spend at least part of their time fretting about what direction the country is going in and what responsibilities they have. And so issues that are important to book-reading intellectuals, such as global environmental collapse, eventually percolate through the porous buffer of mass culture and show up as ancient Hindu ruins in Orlando.

You may be asking: what the hell does all this have to do with operating systems? As I’ve explained, there is no way to explain the domination of the OS market by Apple/Microsoft without looking to cultural explanations, and so I can’t get anywhere, in this essay, without first letting you know where I’m coming from vis-à-vis contemporary culture.

Contemporary culture is a two-tiered system, like the Morlocks and the Eloi in H. G. Wells’s The Time Machine, except that it’s been turned upside down. In The Time Machine, the Eloi were an effete upper class, supported by lots of subterranean Morlocks who kept the technological wheels turning. But in our world it’s the other way round. The Morlocks are in the minority, and they are running the show, because they understand how everything works. The much more numerous Eloi learn everything they know from being steeped from birth in electronic media directed and controlled by book-reading Morlocks. That many ignorant people could be dangerous if they got pointed in the wrong direction, and so we’ve evolved a popular culture that is (a) almost unbelievably infectious, and (b) neuters every person who gets infected by it, by rendering them unwilling to make judgments and incapable of taking stands.

Morlocks, who have the energy and intelligence to comprehend details, go out and master complex subjects and produce Disney-like Sensorial Interfaces so that Eloi can get the gist without having to strain their minds or endure boredom. Those Morlocks will go to India and tediously explore a hundred ruins, then come home and build sanitary bug-free versions: highlight films, as it were. This costs a lot, because Morlocks insist on good coffee and first-class airline tickets, but that’s no problem, because Eloi like to be dazzled and will gladly pay for it all.

Now I realize that most of this probably sounds snide and bitter to the point of absurdity: your basic snotty intellectual throwing a tantrum about those unlettered philistines. As if I were a self-styled Moses, coming down from the mountain all alone, carrying the stone tablets bearing the Ten Commandments carved in immutable stone—the original command line interface—and blowing his stack at the weak, unenlightened Hebrews worshipping images. Not only that, but it sounds like I’m pumping some sort of conspiracy theory.

But that is not where I’m going with this. The situation I describe here could be bad, but doesn’t have to be bad and isn’t necessarily bad now.

It simply is the case that we are way too busy, nowadays, to comprehend everything in detail. And it’s better to comprehend it dimly, through an interface, than not at all. Better for ten million Eloi to go on the Kilimanjaro Safari at Disney World than for a thousand cardiovascular surgeons and mutual fund managers to go on “real” ones in Kenya. The boundary between these two classes is more porous than I’ve made it sound. I’m always running into regular dudes—construction workers, auto mechanics, taxi drivers, galoots in general—who were largely aliterate until something made it necessary for them to become readers and start actually thinking about things. Perhaps they had to come to grips with alcoholism, perhaps they got sent to jail, or came down with a disease, or suffered a crisis in religious faith, or simply got bored. Such people can get up to speed on particular subjects quite rapidly. Sometimes their lack of a broad education makes them overapt to go off on intellectual wild-goose chases, but hey, at least a wild-goose chase gives you some exercise. The spectre of a polity controlled by the fads and whims of voters who actually believe that there are significant differences between Bud Lite and Miller Lite, and who think that professional wrestling is for real, is naturally alarming to people who don’t. But then countries controlled via the command line interface, as it were, by double-domed intellectuals, be they religious or secular, are generally miserable places to live.

Sophisticated people deride Disneyesque entertainments as pat and saccharine, but if the result of that is to instill basically warm and sympathetic reflexes, at a preverbal level, into hundreds of millions of unlettered media-steepers, then how bad can it be? We killed a lobster in our kitchen last night and my daughter cried for an hour. The Japanese, who used to be just about the fiercest people on earth, have become infatuated with cuddly, adorable cartoon characters. My own family—the people I know best—is divided about evenly between people who will probably read this essay and people who almost certainly won’t, and I can’t say for sure that one group is necessarily warmer, happier, or better-adjusted than the other.

Back in the days of the command line interface, users were all Morlocks who had to convert their thoughts into alphanumeric symbols and type them in, a grindingly tedious process that stripped away all ambiguity, laid bare all hidden assumptions, and cruelly punished laziness and imprecision. Then the interface-makers went to work on their GUIs and introduced a new semiotic layer between people and machines. People who use such systems have abdicated the responsibility, and surrendered the power, of sending bits directly to the chip that’s doing the arithmetic, and handed that responsibility and power over to the OS. This is tempting, because giving clear instructions, to anyone or anything, is difficult. We cannot do it without thinking, and depending on the complexity of the situation, we may have to think hard about abstract things, and consider any number of ramifications, in order to do a good job of it. For most of us, this is hard work. We want things to be easier. How badly we want it can be measured by the size of Bill Gates’s fortune.

The OS has (therefore) become a sort of intellectual labor-saving device that tries to translate humans’ vaguely expressed intentions into bits. In effect we are asking our computers to shoulder responsibilities that have always been considered the province of human beings—we want them to understand our desires, to anticipate our needs, to foresee consequences, to make connections, to handle routine chores without being asked, to remind us of what we ought to be reminded of while filtering out noise.

At the upper (which is to say, closer to the user) levels, this is done through a set of conventions—menus, buttons, and so on. These work in the sense that analogies work: they help Eloi understand abstract or unfamiliar concepts by likening them to something known. But the loftier word “metaphor” is used.

The overarching concept of the MacOS was the “desktop metaphor,” and it subsumed any number of lesser (and frequently conflicting, or at least mixed) metaphors. Under a GUI, a file (frequently called “document”) is metaphrased as a window on the screen (which is called a “desktop”). The window is almost always too small to contain the document and so you “move around,” or, more pretentiously, “navigate” in the document by “clicking and dragging” the “thumb” on the “scroll bar.” When you “type” (using a keyboard) or “draw” (using a “mouse”) into the “window” or use pull-down “menus” and “dialog boxes” to manipulate its contents, the results of your labors get stored (at least in theory) in a “file,” and later you can pull the same information back up into another “window.” When you don’t want it anymore, you “drag” it into the “trash.”

There is massively promiscuous metaphor-mixing going on here, and I could deconstruct it till the cows come home, but I won’t. Consider only one word: “document.” When we document something in the real world, we make fixed, permanent, immutable records of it. But computer documents are volatile, ephemeral constellations of data. Sometimes (as when you’ve just opened or saved them) the document as portrayed in the window is identical to what is stored, under the same name, in a file on the disk, but other times (as when you have made changes without saving them) it is completely different. In any case, every time you hit “Save” you annihilate the previous version of the “document” and replace it with whatever happens to be in the window at the moment. So even the word “save” is being used in a sense that is grotesquely misleading—“destroy one version, save another” would be more accurate.
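
To make the point concrete, here is a minimal sketch in Python (my own illustration, not anything from the essay; the file name and strings are invented), showing exactly the “destroy one version, save another” behavior described above:

    from pathlib import Path

    doc = Path("essay.txt")                # a hypothetical "document" on disk
    doc.write_text("first draft")          # the version you thought was safe

    window_contents = "second draft"       # whatever happens to be in the window

    # "Save": the previous version is annihilated and replaced wholesale.
    doc.write_text(window_contents)

    print(doc.read_text())                 # prints "second draft"; the first draft is gone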

Anyone who uses a word processor for very long inevitably has the experience of putting hours of work into a long document and then losing it because the computer crashes or the power goes out. Until the moment that it disappears from the screen, the document seems every bit as solid and real as if it had been typed out in ink on paper. But in the next moment, without warning, it is completely and irretrievably gone, as if it had never existed. The user is left with a feeling of disorientation (to say nothing of annoyance) stemming from a kind of metaphor shear—you realize that you’ve been living and thinking inside of a metaphor that is essentially bogus.

So GUIs use metaphors to make computing easier, but they are bad metaphors. Learning to use them is essentially a word game, a process of learning new definitions of words such as “window” and “document” and “save” that are different from, and in many cases almost diametrically opposed to, the old. Somewhat improbably, this has worked very well, at least from a commercial standpoint, which is to say that Apple/Microsoft have made a lot of money off of it. All of the other modern operating systems have learned that in order to be accepted by users they must conceal their underlying gutwork beneath the same sort of spackle. This has some advantages: if you know how to use one GUI operating system, you can probably work out how to use any other in a few minutes. Everything works a little differently, like European plumbing—but with some fiddling around, you can type a memo or surf the web.

Most people who shop for OSes (if they bother to shop at all) are comparing not the underlying functions but the superficial look and feel. The average buyer of an OS is not really paying for, and is not especially interested in, the low-level code that allocates memory or writes bytes onto the disk. What we’re really buying is a system of metaphors. And—much more important—what we’re buying into is the underlying assumption that metaphors are a good way to deal with the world.

Recently a lot of new hardware has become available that gives computers numerous interesting ways of affecting the real world: making paper spew out of printers, causing words to appear on screens thousands of miles away, shooting beams of radiation through cancer patients, creating realistic moving pictures of the Titanic. Windows is now used as an OS for cash registers and bank tellers’ terminals. My satellite TV system uses a sort of GUI to change channels and show program guides. Modern cellular telephones have a crude GUI built into a tiny LCD screen. Even Legos now have a GUI: you can buy a Lego set called Mindstorms that enables you to build little Lego robots and program them through a GUI on your computer. So we are now asking the GUI to do a lot more than serve as a glorified typewriter. Now we want it to become a generalized tool for dealing with reality. This has become a bonanza for companies that make a living out of bringing new technology to the mass market.

Obviously you cannot sell a complicated technological system to people without some sort of interface that enables them to use it. The internal combustion engine was a technological marvel in its day, but useless as a consumer good until a clutch, transmission, steering wheel, and throttle were connected to it. That odd collection of gizmos, which survives to this day in every car on the road, made up what we would today call a user interface. But if cars had been invented after Macintoshes, carmakers would not have bothered to gin up all of these arcane devices. We would have a computer screen instead of a dashboard, and a mouse (or at best a joystick) instead of a steering wheel, and we’d shift gears by pulling down a menu:

 

PARK
REVERSE
NEUTRAL

3
2
1

Help…

A few lines of computer code can thus be made to substitute for any imaginable mechanical interface. The problem is that in many cases the substitute is a poor one. Driving a car through a GUI would be a miserable experience. Even if the GUI were perfectly bug-free, it would be incredibly dangerous, because menus and buttons simply can’t be as responsive as direct mechanical controls. My friend’s dad, the gentleman who was restoring the MGB, never would have bothered with it if it had been equipped with a GUI. It wouldn’t have been any fun.
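
To give a sense of just how few lines such a substitute takes (and how unserious a substitute it is), here is a purely hypothetical sketch in Python’s standard tkinter toolkit; the labels and layout are my own invention, but the whole “gearshift” really does collapse into one pull-down menu and a Help button:

    import tkinter as tk

    # A hypothetical GUI gearshift: the entire mechanical interface
    # becomes a pull-down menu bound to a single variable.
    root = tk.Tk()
    root.title("Transmission")

    gear = tk.StringVar(value="PARK")
    tk.OptionMenu(root, gear, "PARK", "REVERSE", "NEUTRAL", "3", "2", "1").pack(padx=20, pady=10)
    tk.Button(root, text="Help...", command=lambda: print("See owner's manual")).pack(pady=(0, 10))

    root.mainloop()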

The steering wheel and gearshift lever were invented during an era when the most complicated technology in most homes was a butter churn. Those early carmakers were simply lucky, in that they could dream up whatever interface was best suited to the task of driving an automobile, and people would learn it. Likewise with the dial telephone and the AM radio. By the time of the Second World War, most people knew several interfaces: they could not only churn butter but also drive a car, dial a telephone, turn on a radio, summon flame from a cigarette lighter, and change a lightbulb.

But now every little thing—wristwatches, VCRs, stoves—is jammed with features, and every feature is useless without an interface. If you are like me, and like most other consumers, you have never used ninety percent of the available features on your microwave oven, VCR, or cell phone. You don’t even know that these features exist. The small benefit they might bring you is outweighed by the sheer hassle of having to learn about them. This has got to be a big problem for makers of consumer goods, because they can’t compete without offering features.

It’s no longer acceptable for engineers to invent a wholly novel user interface for every new product, as they did in the case of the automobile, partly because it’s too expensive and partly because ordinary people can only learn so much. If the VCR had been invented a hundred years ago, it would have come with a thumb-wheel to adjust the tracking and a gearshift to change between forward and reverse, and a big cast-iron handle to load or to eject the cassettes. It would have had a big analog clock on the front of it, and you would have set the time by moving the hands around on the dial. But because the VCR was invented when it was—during a sort of awkward transitional period between the era of mechanical interfaces and GUIs—it just had a bunch of pushbuttons on the front, and in order to set the time you had to push the buttons in just the right way. This must have seemed reasonable enough to the engineers responsible for it, but to many users it was simply impossible. Thus the famous blinking 12:00 that appears on so many VCRs. Computer people call this “the blinking twelve problem.” When they talk about it, though, they usually aren’t talking about VCRs.

Modern VCRs usually have some kind of on-screen programming, which means that you can set the time and control other features through a sort of primitive GUI. GUIs have virtual pushbuttons too, of course, but they also have other types of virtual controls, like radio buttons, checkboxes, text entry boxes, dials, and scrollbars. Interfaces made out of these components seem to be a lot easier, for many people, than pushing those little buttons on the front of the machine, and so the blinking 12:00 itself is slowly disappearing from America’s living rooms. The blinking twelve problem has moved on to plague other technologies.
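
As a sketch of what such on-screen programming might look like (again hypothetical, again in Python’s tkinter; the particular widgets and labels are my own), the clock-setting chore that once belonged to front-panel pushbuttons can be redone with a couple of sliders, radio buttons, and a checkbox:

    import tkinter as tk

    root = tk.Tk()
    root.title("Set Clock")

    # Sliders stand in for dial-like controls; radio buttons pick AM or PM.
    hour = tk.Scale(root, from_=1, to=12, orient="horizontal", label="Hour")
    minute = tk.Scale(root, from_=0, to=59, orient="horizontal", label="Minute")
    hour.pack(fill="x")
    minute.pack(fill="x")

    ampm = tk.StringVar(value="AM")
    for period in ("AM", "PM"):
        tk.Radiobutton(root, text=period, value=period, variable=ampm).pack(anchor="w")

    blink = tk.BooleanVar(value=True)   # the famous blinking 12:00, reduced to a checkbox
    tk.Checkbutton(root, text="Blink until set", variable=blink).pack(anchor="w")

    tk.Button(root, text="Set",
              command=lambda: print(f"{hour.get()}:{minute.get():02d} {ampm.get()}")).pack(pady=8)

    root.mainloop()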

So the GUI has gone beyond being an interface to personal computers, and has become a sort of meta-interface that is pressed into service for every new piece of consumer technology. It is rarely an ideal fit, but having an ideal, or even a good, interface is no longer the priority; the important thing now is having some kind of interface that customers will actually use, so that manufacturers can claim, with a straight face, that they are offering new features.

We want GUIs largely because they are convenient and because they are easy—or at least the GUI makes it seem that way. Of course, nothing is really easy and simple, and putting a nice interface on top of it does not change that fact. A car controlled through a GUI would be easier to drive than one controlled through pedals and steering wheel, but it would be incredibly dangerous.

By using GUIs all the time we have insensibly bought into a premise that few people would have accepted if it were presented to them bluntly: namely, that hard things can be made easy, and complicated things simple, by putting the right interface on them. In order to understand how bizarre this is, imagine that book reviews were written according to the same values system that we apply to user interfaces: “The writing in this book is marvelously simple-minded and glib; the author glosses over complicated subjects and employs facile generalizations in almost every sentence. Readers rarely have to think, and are spared all of the difficulty and tedium typically involved in reading old-fashioned books.” As long as we stick to simple operations like setting the clocks on our VCRs, this is not so bad. But as we try to do more ambitious things with our technologies, we inevitably run into the problem of:
