Ars Electronica 1994
Festival-Program 1994
The Sound of One Hand


Jaron Lanier

I started on the design of the Virtual World for The Sound of One Hand, and on learning how to play it, only a month before the performance, so I had to become completely immersed in the creative process.

I had originally thought the piece would be an elaborate VR 'demo', or explication, with clear visual cues for the music, easy-to-use interfaces, and lots of funny Rube Goldberg tricks. But as I worked on the World, a mood, or an Essence, started to emerge, and it was true to my emotional and spiritual experience at the time. This was unexpected and exciting, even if the content was not cheerful. So I went with a darker and more intuitive process instead of falling in line with the familiar computer culture of clarity and light humor. There have only been rare occasions when I felt I was programming in an intuitive way, and this was one.

Don't expect the instruments to be immediately understandable, or imagine that they are easy to play. They emerged from a creative process I cannot fully explain, and I had to learn to play them. I don't think the two esthetics I'm distinguishing must be mutually exclusive, but the intuitive side of the equation can't reliably be willed into action. A synthesis of clarity and mood will come by grace, when it comes.

The first instrument is called the Rhythm Gimbal. A gimbal is a common mechanical construction; a hierarchy of rotating joints. The Rhythm Gimbal resembles a gyroscope. When it is still it is completely silent. When I pick it up and move it, it begins to emit sound. Rings rubbing against each other create the sound. They also change color at contact. Once set in motion, the Rhythm Gimbal will slow down, but will take a long time to stop completely. If I give the Rhythm Gimbal a good spin as I release it, it emits an extra set of noises which are more tinkly, and which slow down as the instrument winds down. Thus, unless I am careful to release it without any spin, it will continue to make sounds when I'm not looking at it. The 'background' sound heard while I am playing the other instruments comes from the Rhythm Gimbal.

The primary (non-tinkly) Rhythm Gimbal sound is a combination of a choir, an orchestra and some other stuff. The harmony is generated by the momentum with which internal parts of the instrument hit each other after it has been released by the hand: each ring transmits spin to the ring outside it, creating a complex motion, like pendulums hung on pendulums. The rings have beads on them. When the beads collide, they change color, and also cause a change in harmony. You know those old attractions at amusement-park arcades, where you hammer a target on the ground with a giant mallet and see how high you can send a puck on a big vertical ruler? The internal collisions of the Rhythm Gimbal fling virtual pucks around the circle of fifths, and then up the harmonic series, in much the same way. A note is added to the harmony when the two types of puck reach it at almost the same time. All the harmony and rhythmic texture come out of this process.
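The puck mechanism described above can be sketched as a toy simulation. Everything concrete here (the track lengths, the impulse model, the coincidence test) is an illustrative assumption of mine, not the actual Body Electric patch:

```python
# Toy sketch of the Rhythm Gimbal's harmony mechanism as described:
# ring collisions fling two kinds of "puck", one around the circle of
# fifths and one up the harmonic series, and a note joins the harmony
# when both pucks land on (almost) the same pitch at the same time.
# Names, constants, and the decay model are illustrative assumptions.

CIRCLE_OF_FIFTHS = [(i * 7) % 12 for i in range(12)]   # pitch classes: C G D A ...
HARMONIC_SERIES = [0, 12, 19, 24, 28, 31, 34, 36]      # semitones above a fundamental

def fling(puck_pos, impulse, track_len):
    """Advance a puck along its track by an impulse, with wraparound."""
    return (puck_pos + impulse) % track_len

def step(state, collision_impulse):
    """One ring collision: fling both pucks and test for a coincidence."""
    state["fifths"] = fling(state["fifths"], collision_impulse,
                            len(CIRCLE_OF_FIFTHS))
    state["harmonic"] = fling(state["harmonic"], collision_impulse // 2,
                              len(HARMONIC_SERIES))
    pc_fifths = CIRCLE_OF_FIFTHS[state["fifths"]]
    pc_harmonic = HARMONIC_SERIES[state["harmonic"]] % 12
    # "almost the same time": modeled here as a matching pitch class
    if pc_fifths == pc_harmonic:
        state["harmony"].add(pc_fifths)
    return state

state = {"fifths": 0, "harmonic": 0, "harmony": set()}
for impulse in [3, 5, 2, 7, 4, 6, 1]:   # momenta from simulated bead collisions
    state = step(state, impulse)
print(sorted(state["harmony"]))   # → [4]
```

The point of the sketch is only the shape of the process: harmony falls out of mechanical coincidences rather than from any stored chord progression.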

But the Gimbal can't properly be described as an algo-rhythmic music generator. For example, I don't think an explicit style of initialization could be used to find the right parameters to make it sing. There is a necessary element of intuitive performance in the weird harmonies of this curious instrument.

My hand movements generate every note of the piece, as they are transmitted through the virtual instrument: There are no predetermined sequences or groupings of notes; the musical content is entirely improvised, with the exception of the timbral range of the instrument. This does not mean that I can make any arbitrary music, any more than I could with any other musical instrument. I can't get a specific chord out of the Rhythm Gimbal reliably. But I can get a feel out of a chord progression, because I can influence when chords change and how radical the change will be. This does not feel like less control to me, but rather like a different kind of control. The test of an instrument is not what it can do, but: can you become infinitely more sensitive to it as you explore and learn? A piano is like this. A good instrument has a depth that the body can learn and the mind cannot. I believe it is entirely impossible for the mind to invent such instruments.

Hidden mechanisms in Virtual Reality are just invisible objects. While I was developing this World, I would make the harmonic structure visible – it looks like a bunch of notes crawling on rings and up a pole. But, as a visual design decision, I made it mostly invisible for the performance. One part is still visible, though: a large blue ring with tuning forks on it. Each of the tuning forks has a T-shaped thing on the base and rings on the arms. These objects store the current legal tonic and chords for progressions; you can see them moving as the harmony changes.

The CyberXylo is a mallet instrument. Its notes are taken from the tuning forks on the blue ring, so it is always harmonious with the Rhythm Gimbal. The mallet retains angular momentum, with some friction, when it is released. Thus it is possible to set it spinning so that it will continue to hit the keys of the CyberXylo on its own for a while. The spin is of poor mathematical quality: it increments rotations instead of using quaternions. This creates wild, unnatural spinning patterns. With practice, enthusiastic spins of the mallet close to the keys can be a source of remarkable rhythms.
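The "poor mathematical quality" of the spin is worth a sketch. Incrementing per-axis angles each frame and rebuilding the rotation in a fixed order is not the same as composing each frame's small rotation into the running orientation (which is what quaternion integration does); the two drift apart, which is the source of the wild, unnatural patterns. The code below is my own illustration, with matrices standing in for quaternions, and is not the original software:

```python
import math

def rot_x(a):
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def rot_y(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def apply(M, v):
    return [sum(M[i][k] * v[k] for k in range(3)) for i in range(3)]

IDENTITY = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
dx, dy = 0.20, 0.15          # per-frame spin increments (radians), assumed

# "Increments rotations": keep per-axis angle totals, rebuild the matrix
# from them in a fixed order at the end.
ax = ay = 0.0
# Proper integration: compose each frame's small rotation into the
# running orientation (what quaternions do, written with matrices here).
R = IDENTITY
for _ in range(60):
    ax += dx
    ay += dy
    R = mat_mul(rot_y(dy), mat_mul(rot_x(dx), R))

naive = mat_mul(rot_x(ax), rot_y(ay))
v = [0.0, 0.0, 1.0]          # say, the mallet head direction
drift = max(abs(a - b) for a, b in zip(apply(naive, v), apply(R, v)))
print(f"divergence after 60 frames: {drift:.3f}")
```

Because rotations about different axes don't commute, the two orientations diverge within a few frames; the naive scheme tumbles in ways no rigid body would.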

The Cybersax is the most ergonomically complex instrument. When the instrument is grabbed, it turns to gradually become held correctly by your hand and tries to avoid passing through fingers on the way. Once you are holding it, the positions of your virtual fingers continue to respond to your physical ones, but are adjusted and placed properly on the sax keys. This is an example of a "simulation of control" that is critical in the design of virtual hand tools, especially when force-feedback is not available.

Three musical registers – soprano, alto, and bass – are located along the main tube. Each register consists of a set of shiny sax-like keys. The notes played by the keys come from the current set of legal notes defined by the Rhythm Gimbal, so it will not clash with the other instruments. It is possible to slide between registers by jerking the hand toward a targeted register. The momentum of your slide helps determine which notes will be associated with the keys until you slide again (if, for example, you approach the soprano register from the alto with greater force you will choose a set of notes that are higher up in pitch). You can play freely without dropping the horn by mistake (this was a hard quality to achieve). In the upper register, it is possible to play two melodies at once by modulating with the thumb. The orientation of the horn in space controls the timbre, mix, and other properties of the sound. Other design elements include the obscene, wagging tail/mouthpiece and the throbbing bell.

The Cybersax sound and geometric constructions were partly inspired by a bizarre bamboo saxophone I have that came from Thailand. It is jointed at the top, just like the Cybersax's tail. Computer music must use instruments built out of concepts of what music is. This is a drastic departure from the "dumb" instruments of the past. A piano doesn't know what a note is, it just vibrates when struck. A sensitivity to, and a sense of awe at, the mystery that surrounds life is at the heart of both science and art; instruments with mandatory concepts built in can dull this sensitivity by providing an apparently non-mysterious setting for activity. This can lead to "nerdy" or bland art. It is interesting to hide one's self behind a piano, as opposed to a computer, but only because a piano is made of resonant materials, not of concepts. In order for computer art, or music, to work you have to be extra-careful to put people and human contact at the center of attention.

I was delighted to discover that The Sound of One Hand created an unusual status relationship between the performer, the audience, and technology. The usual use of rare and expensive high technology in performance is to create a spectacle that elevates the status of the performer. The performer is made relatively invulnerable, while the audience is supposed to be awestruck. This is what rock concerts and the Persian Gulf War have in common. The Sound of One Hand creates quite a different situation. The audience watches me contort myself as I navigate the space and handle the virtual instruments, but I am wearing EyePhones. Five thousand people are watching me, but I can't see them, or know what I look like to them. I was vulnerable, despite the technology. This created a more authentic setting for music. If you have played music, especially improvised music, in front of an audience, you know the kind of vulnerability I am talking about, the vulnerability that precedes an authentic performance.

About that contorting … I used point-flying in the performance. This is a technique of navigating where you point with your hand to where you want to go and this causes you to fly there. I dislike point-flying in industrial applications of VR because it requires skill and occupies your hand. I used it in this case because I did want an unconstrained, skilful type of navigation; it allowed me to choreograph a tour of the Virtual World's asteroid along with the performance. I was shocked and embarrassed when I got lost in my own World during one of the performances! Another human element of the piece is its physicality. The Sound of One Hand is in the tradition of the theremin in that the interface is primarily physical instead of mental. Although the instruments were made of information, the music was primarily made of gesture.
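Point-flying reduces to a very small per-frame rule: while the fly gesture is held, translate the viewpoint along the hand's pointing direction. A minimal sketch, where the speed and frame rate are assumed values, not VPL's:

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return [c / n for c in v]

def point_fly(position, pointing_dir, flying, speed=1.5, dt=1 / 60):
    """One tracker frame of point-flying: while the fly gesture is
    held, translate the viewpoint along the hand's pointing direction."""
    if not flying:
        return position
    d = normalize(pointing_dir)
    return [p + speed * dt * c for p, c in zip(position, d)]

# Fly "north-east and up" for two seconds of frames.
pos = [0.0, 0.0, 0.0]
for _ in range(120):
    pos = point_fly(pos, [1.0, 1.0, 0.5], flying=True)
print([round(c, 2) for c in pos])   # → [2.0, 2.0, 1.0]
```

The skill (and the hand occupancy Lanier complains about) comes from the fact that the same hand must steer, gesture, and play, all at once.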

The equipment I used was, for the most part, not state-of-the-art, but about a year out of date. The synthesizers and Head Mounted Display were '92 models, but the graphic engine, tracker and Data Glove were all older. I think you have to actively avoid using the latest gear in creating art, to avoid getting caught up in technology for its own sake.
The software was quite current, however. The piece was written entirely in Body Electric, a visual programming language for Virtual Reality. I am extremely fond of this working environment, which was designed primarily by Chuck Blanchard. You hook up visual diagrams to control what happens in the Virtual World and see the effect immediately. All the music and physics were done in Body Electric; I could never have made this thing in C.

Visually and sculpturally, the World took advantage of every trick then available for real-time rendering, including radiosity, fog, texture mapping, environment mapping, and morphing. The color "flaking" effect resulted from a bug seen when the color of hardware fog on the graphics engine was gradually changed (I set up a very slow-moving bouncing ball in the cylinder of red/green/blue color space as a chooser for the fog color). I sculpted all the parts of the World except for the illuminated skeletal hand that sprouts from the asteroid wall – which is from a Magnetic Resonance scan of a patient's hand taken at the Veterans Administration hospital in Palo Alto. It was originally used in research on surgical simulation. The asteroid is hollow, and about twelve feet in diameter, although the dense fog makes it look and feel immense. It has a big crack in the side, through which fireflies frolic, a big red ginger plant growing inside, and also a few spotlights. The instruments are generally kept inside. At one point in the performance I spun the Gimbal and flew out through the crack for a while to let the audience see how lonely the asteroid was, surrounded by absolute void.
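The fog-color chooser can be sketched as a slow ball bouncing inside a cylinder whose coordinates are mapped to a color. The essay doesn't specify the mapping or the dynamics, so everything below is only a plausible stand-in:

```python
import math

def step_ball(p, v, dt=1 / 30, radius=1.0, height=1.0):
    """Advance the ball; reflect off the cylinder wall and the end caps."""
    p = [p[i] + v[i] * dt for i in range(3)]
    if not 0.0 <= p[2] <= height:          # end caps (z = cylinder axis)
        v[2] = -v[2]
        p[2] = min(max(p[2], 0.0), height)
    r = math.hypot(p[0], p[1])
    if r > radius:                         # curved wall
        nx, ny = p[0] / r, p[1] / r        # outward wall normal
        dot = v[0] * nx + v[1] * ny
        v[0] -= 2 * dot * nx
        v[1] -= 2 * dot * ny
        p[0], p[1] = nx * radius, ny * radius
    return p, v

def to_rgb(p, radius=1.0, height=1.0):
    """One plausible cylinder-to-RGB mapping (the essay gives none):
    angle and radius pick the red/green mix, height picks blue."""
    angle = math.atan2(p[1], p[0])
    r = min(math.hypot(p[0], p[1]) / radius, 1.0)
    red = 0.5 + 0.5 * r * math.cos(angle)
    green = 0.5 + 0.5 * r * math.sin(angle)
    blue = p[2] / height
    return (red, green, blue)

p, v = [0.2, 0.0, 0.1], [0.37, 0.22, 0.05]   # very slow drift
for _ in range(300):                          # ten seconds at 30 fps
    p, v = step_ball(p, v)
fog = to_rgb(p)
print(tuple(round(c, 2) for c in fog))
```

Feeding each frame's color to the hardware fog register would then produce the slow, continuous color drift that exposed the "flaking" bug Lanier describes.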

The sounds were created on two sampler/synthesizers. I decided not to use the wonderful 3D sound capabilities of Virtual Reality, since they are intended primarily for headphone use, and I didn't want the audience to be straining to hear something.

In many ways, The Sound of One Hand was a bigger leap into the unknown than any of the weird "experimental" performances in which I had been involved in New York in the late seventies. I had no idea if the piece would take on a mood or meaning, or whether the audience would find the experience comprehensible. The performance turned out to be a cheerful, therapeutic event for me. It was a sort of technological blues, a bleak work that I could play happily. It was a chance to work on a purely creative project with the VPL family; a chance to treat all of VPL's stuff as a given set of (reliable!) raw materials instead of as work to do; a chance to practice what I preach about virtual tool design; a chance to use VR just for beauty, and a chance to be musical in front of my ridiculously political professional peer community. It was also a celebration of not having to run VPL anymore. The audiences were incredibly responsive, and I didn't hear anyone describe the piece as a demo. It was experienced as music.

World and Music: Jaron Lanier
System and Show Support: Dale McGrew
Synth. and Sound Engineer: Alfred "Shabda" Owens
Body Electric (world design and control tool): Chuck Blanchard, David Levitt
Isaac (real-time graphics software): Ethan Joffe, Chris Paulicka
World Test intern (Gimbal Spinner): Rolf Rando
EyePhones, DataGlove, development software: VPL Research, Inc.
Graphics Engine (440VGXT): Silicon Graphics, Inc.