The Transformation of Memory to the Desired Form


Douglas Back

The world is a big, puzzling place. To be puzzled by something is generally considered a sign of intellectual inferiority. We have to construct filters to screen most of it out. If you can’t explain something, then it’s not credible. We exclude until we can form answers. Questions are for spiritualists, capable people provide answers on demand. The more confusing the world gets the narrower the filter becomes.

We are mistaken in believing that these filters (call them memes, fads, computers, theories, art, lifestyles, etc.) encompass reality. They are very exclusive. We are constantly in danger of only seeing those things which get through our filters and missing everything else. Art serves an important function in opening filters. Unlike a craftsman or programmer, an artist makes their materials do things not suited to their nature. The first person to look at a cold, hard, brittle rock and think, "I want to make this look like warm, soft, pliable flesh", had an important meme. If you want to make art with computers, you should learn enough about them to make them do things they aren’t suited to.

If there is a meme which I am sick of, it’s the one which makes people believe that "this new technology will totally transform the way you live, work and love".

Yes, the Internet is new. The Internet will change our world. The Internet will change our relationships with our bodies. But using the "Internet Is All" filter already has the technological positivists skipping over important developments, like the paperless office, and demanding the paperless bathroom.

I’m old enough to remember when Xerox machines were supposed to change … the standards of privacy, tumble information hierarchies, propagate masses of pornography, cause a horrible snarl in the copyright courts, make everyone their own publisher, give artists a venue outside of the corporate structure, and they did … sort of. But look at Xerox machines now. They sit in your home; they sit at the corner store and they are boring. Soon the Internet will sit in a corner and be boring. But don’t worry, something will quickly replace the Internet and we won’t need to live in the mundane present.

Art has been a successful meme [or is it a carrier?] for 40,000 years at least. To me, it still isn’t boring. I’m going to take the stance that memes are accepted by free will rather than being invasive, that memes are filters that open up new channels into the bandwidth when something new pops up in the environment. To examine something it must be allowed into memory. Putting your baseball cap on backwards alters perception. Memes, like genes, are just another way of dealing with puzzles.

As an artist and a programmer, memory is my material. I am most interested in what memory means to memetics.

First I propose that the site of art and programming is in memory. Memory requires some sort of container, some sort of interface to an external world and some method of retrieval and transformation.

Humans have devoted a lot of evolutionary bandwidth to language. Our manipulation of memory is directed by language. Language was the first virtual reality. It occurred when we broke the names of things away from the objects themselves and set them free to float around in a separate world, where you could combine them in surreal, abstract, unnatural expressions. It gave us the ability to see a bull’s head as the female reproductive system or a tool inside a rock. This type of memory transformation has to do with a kind of morphing or overlay. A cut and paste memory transformation allows us to catalog and impose order even where none exists. Cut and paste manipulation is systematic and serial. These two techniques define the two educated cultures in my society, science and arts.

Programming transforms memory.
Art works transform memory.
Memes transform memory.
Memory is where art exists, where programs exist, where memes exist, where sex exists, etc.

The externalization of memory, as [I think] Robert Adrian points out, is a stage in our evolution. Memes, programs and art require memory to survive, whether it is within human memory or electronic. The pivotal point is that memes cannot grow unless they have a method of modifying this memory somehow.

If the hundreds of terabytes of memory available on the Internet are a soup in which memes spontaneously generate, then memes must have some method to manipulate it. I have a lot of problems with memes in computers. I think memes have a better chance of surviving in human memory.

I am confused as to whether memes have physical bodies or not. In one sense I disagree with Hans-Cees Speel when he says, "Like viruses, memes do not build organisms, or are alive in the sense that they have metabolism …"

This year, due to budgetary constraints, my job description included "Head of the Computer Lab For the Art Division" of the college where I teach. This is something I’m not especially suited to. An influx of government money [for equipment] allowed for additional computers and software. We had formerly been using out-of-date equipment that only allowed for the teaching of cheap programming languages. These were taught in conjunction with electronics to build interactive sculpture.

The new influx of money allowed for machines powerful enough to teach 3D rendering and MultiMedia courses. This also meant that I was forced to spend several thousand dollars on upgrades to get a handle on the software/hardware to teach.

Part of my year-end report follows: "3D Studio and MacroMind Director will suck up all the available funding each year, in their single-minded drive to evolve themselves. The computer lab is starting to resemble a sea bed, littered with abandoned computers as these programs shuck off the old protective shells that they have outgrown and their clamouring voices become increasingly strident as their ability to possess the minds of the student population grows with each new release.

Each year we will need a large influx of monies for faster machines and more upgrades to keep these programs from devouring us." It is hard to blame the students; a prime tactic in marketing computers is Darwinian-based paranoia, the fear of obsolescence, losing the cutting edge, becoming "unfit", holding dead skills. On the other hand, I don’t feel memes could ever spontaneously spring to life in the body of the computer. Without trying to sound like a lecture from an Introduction to Computer Programming class, I would like to clarify what computers are capable of.

Computer programs are complex because they reduce processes that people think of as a single step – adding numbers, writing a poem, moving an image on the screen – into a long and complex series of incremental steps.
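A minimal sketch in C makes that decomposition visible (the function name and the 8-bit-per-pixel frame-buffer layout are illustrative, not taken from any real graphics library): what a viewer experiences as one step, "move the image to the right", becomes a long run of tiny, explicit steps.

    #include <stddef.h>

    /* "Move the image right by `shift` pixels" -- one step to a person,
       thousands of incremental steps to the machine. Assumes one byte
       per pixel, rows laid out one after the other in memory. */
    void move_image_right(unsigned char *screen, size_t width, size_t height,
                          size_t shift)
    {
        for (size_t row = 0; row < height; row++) {
            unsigned char *line = screen + row * width;

            /* Copy right-to-left, one pixel at a time, so nothing we
               still need gets overwritten. */
            for (size_t col = width; col-- > shift; )
                line[col] = line[col - shift];

            /* Blank the pixels the image vacated. */
            for (size_t col = 0; col < shift; col++)
                line[col] = 0;
        }
    }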

All computers have large, crudely formed copper tunnels in them that we hurl huge masses of barely controlled electrons back and forth in [really one electron should be enough]. A number of tunnels arranged beside each other are soldered to some of the legs on a microprocessor.

Every time you send a pattern of electrons roaring down these tunnels to smash into the transistors in the microprocessor, it reacts by performing a specific "something". Much like the rubber hammer your doctor uses to test your reflexes, the transistors kick over a particular line of dominoes that performs an operation. All to the beat of a master clock.

On an IBM-type microprocessor there are eight tunnels leading to the microprocessor. Sending the bit pattern 00010110:
no electrons down tunnel #1
no electrons down tunnel #2
no electrons down tunnel #3
electrons down tunnel #4
no electrons down tunnel #5
electrons down tunnel #6
electrons down tunnel #7
no electrons down tunnel #8
This forms an instruction; it is language; in this case it makes the computer subtract. The instruction is embedded into the transistors of the chip itself. It must subtract each and every time it gets smacked in this pattern or no one would buy it.
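Written the way programmers usually see it, the tunnel list above is just one byte. A small C sketch (reading tunnel #1 as the most significant bit, which is only one possible convention) prints the same pattern back out:

    #include <stdio.h>

    int main(void)
    {
        unsigned char instruction = 0x16;   /* binary 00010110 */

        /* One line per tunnel, matching the list above. */
        for (int tunnel = 1; tunnel <= 8; tunnel++) {
            int bit = (instruction >> (8 - tunnel)) & 1;
            printf("tunnel #%d: %s\n", tunnel, bit ? "electrons" : "no electrons");
        }
        return 0;
    }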

This bit-pattern language is the ONLY language a computer understands. Instructions like these define the ONLY things a computer can do and these few instructions define EVERYTHING that computers can do. Computers are knee-jerk devices. The computer spasms to this language [instruction set]. In an IBM-family computer, there are only 118 words in its vocabulary. Ultimately everything the computer does uses only these 118 words, and about 50 of these words are rarely used.

The newer PowerPCs, etc., are based on RISC technology: the base instruction set is reduced because, apparently, reduced choice means faster processing. Programming in C++, Java, etc. adds nothing to this basic set; in fact, they all remove you from some of the capabilities of the computer. High-level languages, such as those I have referred to, have many more words in them, but their function is to translate their more "English" vocabulary into long sentences of the 118 words. Out of this very limited vocabulary we get very complex, very limited systems. Much like how we explain the world to ourselves.
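A rough illustration of that translation (the expansion described in the comment is typical rather than any particular chip's actual output): one line of a high-level language dissolves into several of the machine's own words.

    /* One "English" word of a high-level language... */
    int difference(int a, int b)
    {
        return a - b;
        /* ...which a compiler expands into a short sentence of machine
           words, roughly: load `a` into a register, subtract `b` from it,
           and leave the result where the calling convention expects it.
           Every one of those words comes from the chip's small fixed
           vocabulary; the C adds nothing the chip could not already do. */
    }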

I know this because I invited a very powerful meme which I allowed to make me want a computer in the late ’70s. I built a computer with 8 switches on it and programmed the computer by setting up patterns [instructions] on the switches and feeding them into memory one at a time. Perhaps because I did this I find nothing mysterious in the workings of computers. The possibility of an alien life form springing out of this reduced set of instructions is very unlikely.

Even though the binary code, 1 and 0, can be analogously equated to the genetic code, CGAT, the "instruction set" of genetics is much, much larger than that of any computer. It will take many more years of concerted work by the artificial life programmers to decide if artificial life is possible. These are people that I have come to admire because they ask the question: what is life?

Perhaps we wonder if there is life inside our computers because no one person understands any longer how our computers or programs work. Probably the last computer that could be understood by one person was the Apple II. I felt an intimacy with this machine as if I was collaborating with one other person, which I don’t on any new machine. Now machines are a schizophrenic babble of voices and languages; I enjoy this also.

Computers and software are now built by teams; the teams work on individual components, which then get "glued" together when marketing decides that a product must be forthcoming. The gluing process is always messy and time-consuming and done at the last minute. Programs and logic circuits contain much of the personality of the person that made them. In programming and in circuit design, the most "logical" and methodical approach to an end is never the "right way". People creating these things have developed very personal tricks that tease the maximum speed and power from their materials and often utilize mistakes made by previous engineers. An unorthodox trick that skips one or two machine instructions is considered beautiful.
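One well-worn example of the kind of trick meant here (programmers' folklore, not attributed to any particular engineer): swapping two numbers with three XORs instead of the methodical temporary variable.

    /* The textbook, methodical swap. */
    void swap_plain(int *a, int *b)
    {
        int temp = *a;
        *a = *b;
        *b = temp;
    }

    /* The "beautiful" trick: no temporary at all. It quietly breaks if
       both pointers name the same variable, and on many machines it is
       no faster -- which is exactly the personal, unorthodox flavour
       being described. */
    void swap_xor(int *a, int *b)
    {
        *a ^= *b;
        *b ^= *a;
        *a ^= *b;
    }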

Even at the lowest level, hardware and instructions allow for choice. In our subtraction example you have to explain what to subtract from what, and the computer needs to know what to do with the result. The first question is handled by variations of the subtraction instruction, of which there are about seven [depending on what you regard as subtraction because each particular instruction tells the microprocessor to take numbers from memory and compute the difference in slightly different manners].

The numbers to be worked on can be located in three different places – in one of the microprocessor’s registers, in ordinary RAM memory, or in the code of the instruction itself. The result is always stored in a register. If the information to be worked on is on disk it must first be transferred to RAM. Other instructions tell the chip to put numbers in its registers to be worked on later and move information from a register to somewhere else, for example to the memory or to memory outside the computer, the screen, the Internet, etc. through an output port. The example instruction tells the microprocessor to subtract an immediate number from the accumulator, a particular microprocessor register that favours calculations.
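A toy model in C (not any real chip, just the shape of the idea) of the three places an operand can live, and of the accumulator the example instruction works on:

    #include <stdint.h>

    struct cpu {
        uint8_t accumulator;      /* the register that favours calculations */
        uint8_t b, c, d;          /* a few other registers */
        uint8_t ram[65536];       /* ordinary memory */
    };

    /* Operand embedded in the code of the instruction itself. */
    void sub_immediate(struct cpu *m, uint8_t value)
    {
        m->accumulator -= value;
    }

    /* Operand sitting in RAM, fetched by address. */
    void sub_from_memory(struct cpu *m, uint16_t address)
    {
        m->accumulator -= m->ram[address];
    }

    /* Operand already held in another register. */
    void sub_register(struct cpu *m, uint8_t reg_value)
    {
        m->accumulator -= reg_value;
    }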

Everything the microprocessor does consists of nothing more than a series of one-step-at-a-time instructions. Simple subtraction of two numbers may entail dozens of steps, and dozens of methods. The order and method are entirely up to the personal choice of the programmer. These personal choices are directed by the personal prejudices of the author.

Most people write in high-level languages, so the code that interprets your code is filled with the personal prejudices of the people who wrote it. The high-level language that you use to write code was written by teams of people. The teams work in their own discrete pockets of prejudice until their code gets "glued" together with the other code when marketing decides that a viable product must be forthcoming. Application software is written by teams of people using these high-level languages.

Computers are made in a similar fashion by teams of hardware people. There are as many different approaches to using a logic chip as there are to writing code. Add to this mix a huge operating system and you have a virtual matryoshka, with each layer interacting with the adjacent layers in untraceable and incomprehensible ways. Even our most advanced nerds know only how they don’t work.

As an experiment Carl Hamfelt and I attempted to cut and paste an art work from off-the-shelf, off-the-net materials. We purchased a PC amateur weather station and its "C" language source code and downloaded a "C" shareware communication package and purchased an industrial machine control board with source code. The weather data was gathered at my home and sent via modem over to the Kunstlerhaus in Graz, where a commercial data acquisition and control board, with the packaged "C" language software, was to drive a servomotor and a large powerful fan to duplicate the weather conditions at my home. What should have been a bit of code cut and paste became a series of gut, hack and start-from-scratch programming sessions into the wee hours of the morning, up to hours before the opening. The memes provided by the industry were meant for the industry, and not for people pushing beyond industrial limits.
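In outline, the glue we were chasing looked something like the sketch below (every name and constant here is hypothetical; the real piece stitched together three vendors' C sources): a weather figure arrives, gets scaled, and drives the fan.

    #include <stdio.h>

    #define FAN_MAX_DUTY  255     /* assumed output range of the control board */
    #define WIND_MAX_KMH  120.0   /* assumed top of the weather station's scale */

    /* Scale a wind-speed reading into a fan duty-cycle. */
    static int wind_to_duty(double wind_kmh)
    {
        if (wind_kmh < 0.0)          wind_kmh = 0.0;
        if (wind_kmh > WIND_MAX_KMH) wind_kmh = WIND_MAX_KMH;
        return (int)(wind_kmh / WIND_MAX_KMH * FAN_MAX_DUTY);
    }

    int main(void)
    {
        double wind_kmh;

        /* In the installation the readings arrived over the modem and the
           result was written to the control board's output port; here we
           simply read stdin and print. */
        while (scanf("%lf", &wind_kmh) == 1)
            printf("fan duty: %d/%d\n", wind_to_duty(wind_kmh), FAN_MAX_DUTY);

        return 0;
    }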

There were at least three voices in the code that were not available for comment. We had to get these mute workers to work together. Morphing three discrete thought processes into a working whole was very difficult. It wasn’t that the code wouldn’t mesh together but more a problem in uniting three different philosophies.

This is much like the other memory system I work with. Again, a cut and paste memory manipulation won’t work; as an artist, I use the morphing technique.

The great thing about the general public is that they all have brains. Their brains are wired to seek patterns in the world, compare them to old patterns in memory and make sense of them. If there is something new in the environment that memory and language can’t map onto, they become receptive to new patterns that may explain it. We become irritated by things which can’t be explained. Language seems to have overwritten our other pattern recognition processor (call it what you want: the unconscious, the spirit world, the brain of a preverbal child, dark Socratic demons or memes); our visceral language either always was, or has become, "fuzzy". To successfully make art you must recognize an irritant in the environment and supply some sort of explanation. Usually we rely on language, but when something new enters the environment we resort to visceralization.

It doesn’t matter which media you use. A poor musician thinks he is playing an instrument, a fair musician plays the eardrum and a good musician manipulates memory in the brain. But why bother with any intermediary media? Why not just tell people what it is you have to say? Surely that would be more cost-effective.

My latest theory is that artists have some kind of partial link to that part of the brain which has no language. This part wishes to communicate, but most communications channels are wired for words. To complete the communications circuit the illiterate brain must direct the artist to fabricate some sort of model in the real world. The communication comes out through the hands and once it is in front of us we understand … a little more.

This is a very slow process. We can only hope that the model makes sense to the mute brain parts in our audience. Eventually critics and theorists work up the language to gut the meme and render it literal. Perhaps the role of Post-Modernism is to by-pass this stage, or perhaps state funding, with its demand that artists justify their work in words, has put the "cut and paste" pressure on the arts and sciences. If it is not explainable in text, then it is flaky and self-centred.