Organiser: ORF Oberösterreich
 


GOLDEN NICA
In the Beginning was the Command Line (Excerpts)
Neal Stephenson


Computers do arithmetic on bits of information. Humans construe the bits as meaningful symbols. But this distinction is now being blurred, or at least complicated, by the advent of modern operating systems that use, and frequently abuse, the power of metaphor to make computers accessible to a larger audience ...
People who have only interacted with computers through graphical user interfaces like the Mac OS or Windows—which is to say, almost everyone who has ever used a computer—may have been startled, or at least bemused, to hear about the telegraph machine that I used to communicate with a computer in 1973. But there was, and is, a good reason for using this particular kind of technology. Human beings have various ways of communicating to each other, such as music, art, dance, and facial expressions, but some of these are more amenable than others to being expressed as strings of symbols. Written language is the easiest of all, because, of course, it consists of strings of symbols to begin with. If the symbols happen to belong to a phonetic alphabet (as opposed to, say, ideograms), converting them into bits is a trivial procedure, and one that was nailed, technologically, in the early nineteenth century, with the introduction of Morse code and other forms of telegraphy.
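
As a rough modern illustration of just how trivial that conversion is, the short Python sketch below (an illustrative addition, not part of Stephenson's essay; the helper name text_to_bits is arbitrary) maps a string of alphabetic symbols to the bit patterns a machine stores, using plain ASCII in place of Morse's dots and dashes:

# A minimal sketch: encoding a string of phonetic-alphabet symbols into
# bits, the same kind of trivial, mechanical substitution that Morse code
# and the telegraph performed in the nineteenth century.

def text_to_bits(text: str) -> str:
    """Return the 8-bit pattern of each character, space-separated."""
    return " ".join(f"{byte:08b}" for byte in text.encode("ascii"))

print(text_to_bits("HELLO"))
# -> 01001000 01000101 01001100 01001100 01001111
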
We had a human/computer interface a hundred years before we had computers. When computers came into being around the time of the Second World War, humans, quite naturally, communicated with them by simply grafting them on to the already-existing technologies for translating letters into bits and vice versa: teletypes and punch card machines.
When Ronald Reagan was a radio announcer, he used to call baseball games by reading the terse descriptions that trickled in over the telegraph wire and were printed out on a paper tape. He would sit there, all by himself in a padded room with a microphone, and the paper tape would eke out of the machine and crawl over the palm of his hand printed with cryptic abbreviations. If the count went to three and two, Reagan would describe the scene as he saw it in his mind’s eye: “The brawny left-hander steps out of the batter’s box to wipe the sweat from his brow. The umpire steps forward to sweep the dirt from home plate.” and so on. When the cryptogram on the paper tape announced a base hit, he would whack the edge of the table with a pencil, creating a little sound effect, and describe the arc of the ball as if he could actually see it. His listeners, many of whom presumably thought that Reagan was actually at the ballpark watching the game, would reconstruct the scene in their minds according to his descriptions.
This is exactly how the World Wide Web works: the HTML files are the pithy description on the paper tape, and your Web browser is Ronald Reagan. The same is true of Graphical User Interfaces in general.
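
To make the analogy concrete with a small sketch (an illustrative addition, not part of Stephenson's essay; example.com is simply a conventional placeholder address): what actually arrives over the wire is nothing but terse markup text, and the browser does all of the scene-painting.

# A small sketch of the "paper tape" side of the Web: fetch a page and
# print the terse markup that actually travels over the wire, before any
# browser reconstructs a vivid scene from it.
import urllib.request

def fetch_raw_markup(url: str, max_bytes: int = 300) -> str:
    """Return the first few hundred bytes of raw HTML, undecorated."""
    with urllib.request.urlopen(url) as response:
        return response.read(max_bytes).decode("utf-8", errors="replace")

print(fetch_raw_markup("http://example.com"))
# Prints something like: <!doctype html><html><head><title>Example Domain ...

Everything beyond that terse text, the fonts and colours and layout, is the browser's own reconstruction, just as Reagan's ballpark existed only in his listeners' minds.
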
So an OS is a stack of metaphors and abstractions that stands between you and the telegrams, and embodies various tricks the programmer used to convert the information you’re working with—be it images, e-mail messages, movies, or word processing documents—into the necklaces of bytes that are the only things computers know how to work with. When we used actual telegraph equipment (teletypes) or their higher-tech substitutes (“glass teletypes,” or the MS-DOS command line) to work with our computers, we were very close to the bottom of that stack. When we use most modern operating systems, though, our interaction with the machine is heavily mediated. Everything we do is interpreted and translated time and again as it works its way down through all of the metaphors and abstractions.
Hostility towards Microsoft is not difficult to find on the Net, and it blends two strains: resentful people who feel Microsoft is too powerful, and disdainful people who think it’s tacky. This is all strongly reminiscent of the heyday of Communism and Socialism, when the bourgeoisie were hated from both ends: by the proles, because they had all the money, and by the intelligentsia, because of their tendency to spend it on lawn ornaments. Microsoft is the very embodiment of modern high-tech prosperity—it is, in a word, bourgeois—and so it attracts all of the same gripes.
People who are inclined to feel poor and oppressed construe everything Microsoft does as some sinister Orwellian plot. People who like to think of themselves as intelligent and informed technology users are driven crazy by the clunkiness of Windows, the exposed rivets and the leaky gaskets.
It is a bit unsettling, at first, to think of Apple as a control freak, because it is completely at odds with their corporate image. Weren’t these the guys who aired the famous Super Bowl ads showing suited, blindfolded executives marching like lemmings off a cliff? Isn’t this the company that even now runs ads picturing the Dalai Lama (except in Hong Kong) and Einstein and other offbeat rebels?
It is indeed the same company, and the fact that they have been able to plant this image of themselves as creative and rebellious free-thinkers in the minds of so many intelligent and media-hardened skeptics really gives one pause. It is testimony to the insidious power of expensive slick ad campaigns and, perhaps, to a certain amount of wishful thinking in the minds of people who fall for them. It also raises the question of why Microsoft is so bad at PR, when the history of Apple demonstrates that, by writing large checks to good ad agencies, you can plant a corporate image in the minds of intelligent people that is completely at odds with reality. (The answer, for people who don’t like Damoclean questions, is that since Microsoft has won the hearts and minds of the silent majority—the bourgeoisie—they don’t give a damn about having a slick image, any more than Dick Nixon did. “I want to believe”—the mantra that Fox Mulder has pinned to his office wall in The X-Files—applies in different ways to these two companies; Mac partisans want to believe in the image of Apple purveyed in those Super Bowl ads, and in the notion that Macs are somehow fundamentally different from other computers, while Windows people want to believe that they are getting something for their money, engaging in a respectable business transaction.)
A few years ago I walked into a grocery store somewhere and was presented with the following tableau vivant: near the entrance a young couple were standing in front of a large cosmetics display. The man was stolidly holding a shopping basket between his hands while his mate yanked blister-packs of makeup off the display and threw them in. Since then I’ve always thought of that man as the personification of an interesting human tendency: not only are we not offended to be dazzled by manufactured images, but we like it. We practically insist on it. We are eager to be complicit in our own dazzlement: to pay money for a theme park ride, vote for a guy who’s obviously lying to us, or stand there holding the basket as it’s filled up with cosmetics.
I was in Disney World recently, specifically the part of it called the Magic Kingdom, walking up Main Street USA. This is a perfect gingerbready Victorian small town that culminates, bizarrely, in a Disney castle. It was very crowded; we shuffled rather than walked. Directly in front of me was a man with a camcorder. It was one of the new breed of camcorders where instead of peering through a viewfinder you gaze at a flat-panel color screen about the size of a playing card, which televises live coverage of whatever the camcorder is seeing. He was holding the appliance close to his face, so that it obstructed his view. Rather than go see a real small town for free, he had paid money to see a pretend one, and rather than see it with the naked eye he was watching it on television.
And rather than stay home and read a book, I was watching him.
If you are an intellectual type, a reader or writer of books, the nicest thing you can say about Disney World is that the execution is superb. But it’s easy to find the whole environment a little creepy, because something is missing: the translation of all its content into clear explicit written words, the attribution of the ideas to specific people. It seems as if a hell of a lot might be being glossed over, as if Disney World might be putting one over on us, and possibly getting away with all kinds of buried assumptions and muddled thinking.
But this is precisely the same as what is lost in the transition from the command-line interface to the GUI. Disney and Apple/Microsoft are in the same business: short-circuiting laborious, explicit verbal communication with expensively designed interfaces. Disney is a sort of user interface unto itself—and more than just graphical. Let’s call it a Sensorial Interface. It can be applied to anything in the world, real or imagined, albeit at staggering expense.
What is the source of our culture’s rejection of explicit word-based interfaces, and the embrace of graphical or sensorial interfaces—a trend that accounts for the success of both Apple/Microsoft and Disney?
Part of it is simply that the world is very complicated now—much more complicated than the hunter-gatherer world that our brains evolved to cope with—and we simply can’t handle all of the details. We have to delegate. We have no choice but to trust some nameless artist at Disney or programmer at Apple/Microsoft to make a few choices for us, close off some options, and give us a conveniently packaged executive summary.
But more importantly, it comes out of the fact that, during this century, intellectualism failed, and everyone knows it. We agreed to let go of traditional folkways, mores, and religion, and let the intellectuals run with the ball, and they screwed everything up and turned the century into an abattoir.
We Americans are the only ones who didn’t get creamed at some point during all of this. We are free and prosperous because we have inherited political and values systems fabricated by a particular set of eighteenth-century intellectuals who happened to get it right. But we have lost touch with those intellectuals, and with anything like intellectualism, even to the point of not reading books any more, though we are literate. We seem much more comfortable with propagating those values to future generations nonverbally, through a process of being steeped in media. Apparently this actually works to some degree, for police in many lands are now complaining that local arrestees are insisting on having their Miranda rights read to them as if they were perps in American TV cop shows. Starsky and Hutch reruns may turn out, in the long run, to be a greater force for human rights than the Declaration of Independence.
A huge, rich, nuclear-tipped culture that propagates its core values through media steepage seems like a bad idea. There is an obvious risk of running astray here. Words are the only immutable medium we have, which is why they are the vehicle of choice for extremely important concepts like the Ten Commandments, the Koran, and the Bill of Rights. Unless the messages conveyed by our media are somehow pegged to a fixed, written set of precepts, they can wander all over the place and possibly dump loads of crap into people’s minds.
Orlando used to have a military installation called McCoy Air Force Base, with long runways from which B-52s could take off and reach Cuba, or just about anywhere else, with loads of nukes. But now McCoy has been scrapped and repurposed. It has been absorbed into Orlando’s civilian airport. The long runways are being used to land 747-loads of tourists from Brazil, Italy, Russia and Japan, so that they can come to Disney World and steep in our media for a while.
To traditional cultures, especially word-based ones such as Islam, this is infinitely more threatening than the B-52s ever were. It is obvious, to everyone outside of the United States, that our arch-buzzwords, multiculturalism and diversity, are false fronts that are being used (in many cases unwittingly) to conceal a global trend to eradicate cultural differences. The basic tenet of multiculturalism (or “honoring diversity” or whatever you want to call it) is that people need to stop judging each other—to stop asserting (and, eventually, to stop believing) that this is right and that is wrong, this true and that false, one thing ugly and another thing beautiful, that God exists and has this or that set of qualities.
The lesson of the Twentieth Century is that, in order for a large number of different cultures to coexist peacefully on the globe (or even in a neighborhood) it is necessary for people to suspend judgment in this way. Hence (I would argue) our suspicion of, and hostility towards, all authority figures in modern culture. As David Foster Wallace has explained in his essay “E Unibus Pluram,” this is the basic message of television; it is the message that people take home, anyway, after they have steeped in our media long enough. It’s not expressed in these highfalutin terms, of course. It comes through as the presumption that all authority figures—teachers, generals, cops, ministers, politicians—are hypocritical buffoons, and that hip jaded coolness is the only way to be.
The problem is that once you have done away with the ability to make judgments as to right and wrong, true and false, etc., there’s no real culture left. All that remains is clog dancing and macrame. The ability to make judgments, to believe things, is the entire point of having a culture.

By using GUIs all the time we have insensibly bought into a premise that few people would have accepted if presented to them bluntly: namely, that hard things can be made easy, and complicated things simple, by putting the right interface on them.
What made old epics like Gilgamesh so powerful and so long-lived was that they were living bodies of narrative that many people knew by heart, and told over and over again—making their own personal embellishments whenever it struck their fancy. The bad embellishments were shouted down, the good ones picked up by others, polished, improved, and, over time, incorporated into the story. Likewise, Unix is known, loved, and understood by so many hackers that it can be re-created from scratch whenever someone needs it. This is very difficult to understand for people who are accustomed to thinking of OSes as things that absolutely have to be bought.
But many hackers have launched more or less successful re-implementations of the Unix ideal. Each one brings in new embellishments. Some of them die out quickly, some are merged with similar, parallel innovations created by different hackers attacking the same problem, others still are embraced, and adopted into the epic. Thus Unix has slowly accreted around a simple kernel and acquired a kind of complexity and asymmetry about it that is organic, like the roots of a tree, or the branchings of a coronary artery. Understanding it is more like anatomy than physics.
Credit for Linux generally goes to its human namesake, one Linus Torvalds, a Finn who got the whole thing rolling when he used some of the GNU tools to write the beginnings of a Unix kernel that could run on PC-compatible hardware. And indeed Torvalds deserves all the credit he has ever gotten, and a whole lot more. But he could not have made it happen by himself, any more than Richard Stallman could have. To write code at all, Torvalds had to have cheap but powerful development tools, and these he got from Stallman’s GNU project.
And he had to have cheap hardware on which to write that code. Cheap hardware is a much harder thing to arrange than cheap software; a single person (Stallman) can write software and put it up on the Net for free, but in order to make hardware it’s necessary to have a whole industrial infrastructure, which is not cheap by any stretch of the imagination. Really the only way to make hardware cheap is to punch out an incredible number of copies of it, so that the unit cost eventually drops. For reasons already explained, Apple had no desire to see the cost of hardware drop. The only reason Torvalds had cheap hardware was Microsoft.
Microsoft refused to go into the hardware business, insisted on making its software run on hardware that anyone could build, and thereby created the market conditions that allowed hardware prices to plummet. In trying to understand the Linux phenomenon, then, we have to look not to a single innovator but to a sort of bizarre Trinity: Linus Torvalds, Richard Stallman, and Bill Gates. Take away any of these three and Linux would not exist.


Excerpts from: Neal Stephenson, In the Beginning … Was the Command Line, Avon Books 1999.

###


http://www.well.com/user/neal/

In a postscript to his pathbreaking 1992 science fiction novel Snow Crash, author Neal Stephenson writes about how he reluctantly settled upon writing a traditional novel when he discovered that the interactive CD technology of the day was not yet powerful enough to support his vision of a new kind of cyber universe.
In the text-based novel that resulted, he named this computer- and network-generated world the "Metaverse". The idea gave William Gibson's original notion of cyberspace a geography, and although Stephenson had given up on his original plan to create an interactive computer novel, many of his readers were inspired by his vision.
The idea that there is a fabric to cyberspace, and that it is somehow more than the sum of the machines interconnected by IP addresses, is very much a reality today. VRML is a well-established protocol on the Web, and it has recently been complemented by a wide range of other three-dimensional navigational protocols.
While many writers arrive at a single vision and remain stuck there, Stephenson has continued to re-invent himself, exploring a succession of technologies and their impact on the world.