Audiopad
James Patten / Ben Recht
Audiopad is a composition and performance instrument for electronic music that tracks the positions of objects on a tabletop surface and converts their motion into music. One can pull sounds from a giant set of samples, juxtapose archived recordings against warm synthetic melodies, cut between drum loops to create new beats, and apply digital processing all at the same time on the same table. Audiopad not only allows for spontaneous reinterpretation of musical compositions, but also creates a visual and tactile dialogue between itself, the performer, and the audience.
The performer uses three types of objects to make sound on the Audiopad table. The “tracks” represent different pieces of the musical composition, for example melody and percussion. Each of these has a set of associated musical samples. The “microphone” controls the volume of each track: tracks that are closer to the microphone are louder than those that are far away. This spatial mixing metaphor lets the performer control the volume of many different tracks at the same time in a way that is difficult with a bank of knobs or sliders, and impossible with a computer keyboard and mouse. The performer uses the third type of object, the “modifier,” to change how each of the tracks sounds. The modifier can control digital effects and select new samples for each track.
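To make the spatial mixing metaphor concrete, here is a minimal Python sketch (Python being the language most of the Audiopad software is written in) that maps each track's distance from the microphone object to a volume level. The function name, the linear falloff curve, and the distance constants are illustrative assumptions, not Audiopad's actual mapping.

```python
import math

# A minimal sketch of spatial mixing: each track's volume falls off with its
# distance from the "microphone" object on the table. The falloff curve and
# constants are illustrative, not Audiopad's actual mapping.

def track_volume(track_pos, mic_pos, max_distance=0.5):
    """Map a track's distance from the microphone to a volume in [0.0, 1.0]."""
    distance = math.hypot(track_pos[0] - mic_pos[0], track_pos[1] - mic_pos[1])
    # Linear falloff: full volume at the microphone, silent beyond max_distance.
    return max(0.0, 1.0 - distance / max_distance)

# Many tracks are mixed at once simply by where they sit relative to the mic,
# something that is awkward with a bank of knobs or sliders.
mic = (0.5, 0.5)
tracks = {"melody": (0.55, 0.50), "percussion": (0.70, 0.60), "bass": (0.95, 0.90)}
levels = {name: track_volume(pos, mic) for name, pos in tracks.items()}
```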
As the performer moves these objects, graphics on the table reflect what the performer is doing. For example, each track is surrounded by a spinning colored arc that changes to reflect the volume and tempo of the track. Samples related to the current sample (as determined by the composer) are displayed on an arc near the track. The performer can quickly switch between these related samples by moving the track between them. In this sense, Audiopad starts to become an embodiment of the composition being performed, rather than just an instrument with which to perform. The interaction between the graphics, the objects on the table, and the performer gives the audience a view into how the performance is realized.
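The sample-switching interaction can be pictured as a simple proximity test: when a track is moved onto the arc position of one of its related samples, that sample is selected. The sketch below assumes the related samples' arc positions are already known; the names and the selection threshold are placeholders rather than details of the actual system.

```python
import math

# A sketch of switching between related samples by proximity. It assumes the
# composer-defined related samples have already been laid out at known (x, y)
# positions on an arc around the track.

def nearest_related_sample(track_pos, related_positions, threshold=0.05):
    """related_positions: dict of sample name -> (x, y) arc position.
    Returns the sample the track was moved onto, or None if none is close."""
    best, best_dist = None, threshold
    for name, (x, y) in related_positions.items():
        d = math.hypot(track_pos[0] - x, track_pos[1] - y)
        if d < best_dist:
            best, best_dist = name, d
    return best
```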
Audiopad uses a matrix of specially shaped antenna elements embedded in a horizontal sensing surface built into a table. By measuring the resonance of these antenna elements, the system determines the positions of a group of LC tags. Each LC tag is a small coil of wire attached to a capacitor that resonates at a specific RF frequency. Software running on a PC translates the position information from the antenna elements into graphics on the tabletop and MIDI commands for various pieces of synthesizer software. The graphics are displayed using a video projector pointed down at the surface of the Audiopad table. The software that tracks the positions of the LC tags is written in C and C++. The remainder of the software is written in Python and uses OpenGL for graphics.
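As a rough illustration of this pipeline, the sketch below stands in for the Python layer that receives tag positions from the C/C++ tracker and emits MIDI to synthesizer software. The python-rtmidi package, the channel and controller assignments, and the callback shape are assumptions made for illustration; the sketch reuses the track_volume function from the earlier example.

```python
import rtmidi  # python-rtmidi; an assumed stand-in for the unspecified MIDI path

# Rough sketch of the Python layer: receive (tag id -> position) updates from
# the C/C++ tracker and translate them into MIDI for synthesizer software.
# Channel and controller numbers are illustrative, not Audiopad's actual setup.

midiout = rtmidi.MidiOut()
if midiout.get_ports():
    midiout.open_port(0)                      # use the first available MIDI port
else:
    midiout.open_virtual_port("Audiopad sketch")

def send_track_volume(channel, volume):
    """Send MIDI CC 7 (channel volume) with volume scaled from [0.0, 1.0]."""
    status = 0xB0 | (channel & 0x0F)          # control change status byte
    midiout.send_message([status, 7, int(max(0.0, min(1.0, volume)) * 127)])

def on_tag_update(tag_positions, mic_pos, tag_to_channel):
    """Called whenever the tracker reports fresh LC tag positions."""
    for tag_id, pos in tag_positions.items():
        if tag_id in tag_to_channel:
            send_track_volume(tag_to_channel[tag_id], track_volume(pos, mic_pos))
```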
In creating Audiopad, our aim was to combine the modularity of computer-based synthesis with some of the expressive power of traditional musical instruments. In addition, we wanted to create something through which audiences could begin to understand how electronic music can be performed. Finally, we aimed to make Audiopad reconfigurable. To prepare for each performance, we rewrite portions of the Audiopad software to experiment with new performance techniques and to tailor the software to the composition being performed. In the context of the Audiopad installation at Ars Electronica, some of these interactions are not immediately apparent to someone viewing the piece; rather, they reveal themselves gradually as one interacts with Audiopad.
Audiopad has been developed at the MIT Media Lab.