Ars Electronica 1992
Festival-Program 1992
HYENA DAYS


Michael Saup / Steina Vasulka

(BURST MODE VERSION)
MIDI-CONTROLLED AND SOUND-CONTROLLED PICTURE PRODUCTION
Involving: Steina Vasulka, violin
Michael Saup, guitar

By converting acoustic signals from analog to digital and interpreting the measured values, the musicians and their instruments can address picture-generating electronic equipment such as 3D graphics workstations, laserdisc players or hard-disk recorders, and can thus turn acoustic events into variable visual experiences.

Here, the acoustic output signals are analyzed during the A/D conversion and interpreted by a control computer, which then makes algorithmic decisions and controls further equipment:
  • by MIDI impulses, various sound-producing machines are controlled, such as samplers, effect processors and mixing desks.


  • by the parallel interface, a laserdisc player is supplied with control impulses, and a workstation via RS232, so that this equipment reacts to the interactions of the musicians.

The musicians thus have the possibility of using the widest variety of acoustic parameters, such as pitch or volume, as an output impulse for a visual choreography. The pictures produced in this way are then shown on video monitors via a video mixer.
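To make the signal path concrete, the short Python sketch below illustrates the kind of mapping such a control computer might perform: one analysis frame of pitch and amplitude is scaled into a MIDI note-on message, a command string for the laserdisc player and a text record for the workstation's serial line. All message formats and scalings here are illustrative assumptions, not the actual protocol of the performance setup.

    import math

    def features_to_controls(pitch_hz, amplitude):
        """Map one analysis frame to control messages for the three targets."""
        # MIDI: pitch becomes a note number, amplitude a velocity (0-127).
        note = max(0, min(127, int(round(69 + 12 * math.log2(pitch_hz / 440.0)))))
        velocity = max(0, min(127, int(amplitude * 127)))
        midi_msg = (0x90, note, velocity)                      # note-on, channel 1

        # Laserdisc player (parallel interface): pick a frame from the amplitude.
        laserdisc_cmd = f"SEARCH {int(amplitude * 54000):05d}"  # CAV disc, approx. 54000 frames

        # Workstation (RS232): send pitch and amplitude as a plain text record.
        rs232_cmd = f"P{pitch_hz:.1f} A{amplitude:.3f}\r\n"

        return midi_msg, laserdisc_cmd, rs232_cmd

    print(features_to_controls(440.0, 0.8))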
MANIPULATING EDITLISTS
My first step in controlling video pictures through music or audio signals was the computer program XTRA.TRAX (1989/90), which could transform Standard MIDI Files (a file format which can be imported and exported by most software sequencers) into editlists for a SONY 910 video editor. The frequency curves of the music could be simulated by Dynamic Motion Control of a Betacam SP video recorder, and the volumes by the saturation of the video picture via a video mixer. The program made it possible to convert a piece of music into video pictures rapidly and automatically, and could translate the widest variety of parameters such as picture selection, transitions, wipes, etc.
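The following sketch illustrates only the underlying idea, not the actual XTRA.TRAX code: note-on events from a parsed Standard MIDI File become numbered editlist entries whose playback speed (standing in for Dynamic Motion Control) and saturation are derived from the note velocity. The event list, timecode handling and editlist layout are assumptions.

    FPS = 25  # PAL frame rate

    def to_timecode(seconds):
        """Convert seconds to an hh:mm:ss:ff timecode string."""
        frames = int(round(seconds * FPS))
        h, rest = divmod(frames, 3600 * FPS)
        m, rest = divmod(rest, 60 * FPS)
        s, f = divmod(rest, FPS)
        return f"{h:02d}:{m:02d}:{s:02d}:{f:02d}"

    def midi_to_edl(events, clip_len=1.0):
        """events: list of (time_s, note, velocity) note-on events."""
        edl = []
        for i, (t, note, vel) in enumerate(events, start=1):
            speed = 25 + 175 * (vel / 127)        # playback speed in percent (DMC-like)
            saturation = int(100 * vel / 127)     # video saturation in percent
            edl.append(
                f"{i:03d}  SRC{note:03d}  V  C  "
                f"{to_timecode(t)} {to_timecode(t + clip_len)}  "
                f"SPEED {speed:5.1f}%  SAT {saturation:3d}%"
            )
        return edl

    demo = [(0.0, 36, 100), (0.5, 48, 64), (1.0, 60, 120)]
    print("\n".join(midi_to_edl(demo)))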

By translating individual tracks separately, e.g. bass drum, violin, piano, etc., the resulting track-cut sequences could afterwards be reassembled synchronously on a "Harry".
At the same time, the program converted the analyzed Standard MIDI Files into a 3D Wavefront model which could be animated in synchrony with the music.
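A rough sketch of such a conversion, assuming a simple ribbon-like geometry rather than the original converter: each note-on becomes a pair of vertices derived from time, pitch and velocity, written out in the Wavefront OBJ format and joined into faces so the file can be loaded as a model.

    def notes_to_obj(events, path="score.obj"):
        """events: list of (time_s, note, velocity); writes a Wavefront .obj file."""
        with open(path, "w") as f:
            for t, note, vel in events:
                # Base vertex on the time axis, displaced vertex scaled by pitch and velocity.
                f.write(f"v {t:.3f} 0.0 0.0\n")
                f.write(f"v {t:.3f} {note / 12.0:.3f} {vel / 127.0:.3f}\n")
            # Join successive vertex pairs into quads (OBJ indices are 1-based).
            for i in range(1, 2 * len(events) - 2, 2):
                f.write(f"f {i} {i+1} {i+3} {i+2}\n")

    notes_to_obj([(0.0, 36, 100), (0.5, 48, 64), (1.0, 60, 120)])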

The disadvantage of this system was the non-interactive processing of the Standard MIDI Files; the advantage was that its extremely fast production of complex music-picture structures could hardly be matched by a human editor.

The background recordings for "TILT IT" were produced by the process described above and were superimposed with a 3D computer animation. The animation is based on the video recording of a guitar solo. The A/D-converted tone of the guitar was reconverted into a 3D object by a C program and simulated on a Silicon Graphics Personal Iris with Wavefront software. In the process, all the original video frames were projected, frame by frame, onto the 3D object as a reflection map. The resulting computer animation is therefore a simultaneous representation of the original picture and the original sound. In the video post-production the rendered animation was synchronized again with the original sound. A future version of TILT IT will be realized in cooperation with Frank Zappa and Mark Dippe from Industrial Light & Magic, San Rafael.
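A simplified illustration of this frame-by-frame principle, with assumed file names and scalings: for each video frame, the corresponding window of guitar samples displaces a line of geometry, and a material file referencing that frame stands in for Wavefront's reflection mapping.

    import math

    def frame_geometry(samples, frame_index, sample_rate=44100, fps=25):
        """Return displaced (x, y) points for one video frame."""
        window = sample_rate // fps
        start = frame_index * window
        chunk = samples[start:start + window]
        # One point per audio sample: x along the window, y from the amplitude.
        return [(i / window, s) for i, s in enumerate(chunk)]

    def write_frame_obj(points, frame_index):
        with open(f"frame_{frame_index:04d}.obj", "w") as f:
            # Material file is assumed to reference the matching video frame image.
            f.write(f"mtllib frame_{frame_index:04d}.mtl\n")
            for x, y in points:
                f.write(f"v {x:.4f} {y:.4f} 0.0\n")

    fake_audio = [math.sin(2 * math.pi * 110 * n / 44100) for n in range(44100)]
    write_frame_obj(frame_geometry(fake_audio, 0), 0)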
INTERACTIVE INSTRUMENTS
The next idea was to construct an instrument which could generate sounds and pictures simultaneously. This is how the installation "Paula Chimes" came about; it was presented in 1991 at the 4th European Media Art Festival in Osnabrück. The movement of 16 steel tubes, initiated by touch or by wind, was measured by so-called extensometers from industrial measurement technology, amplified by a special amplifier stage, digitized by A/D conversion and interpreted by a computer. The data acquired in this way were scaled into MIDI impulses for an AKAI S1000 sampler and consequently into sound events.
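A minimal sketch of such a scaling step, with illustrative value ranges and note assignments rather than the installation's actual calibration: a raw A/D reading from one tube's extensometer is thresholded and scaled into a MIDI note-on message for the sampler.

    TUBE_NOTES = list(range(36, 36 + 16))   # one sample key per steel tube (assumed)

    def gauge_to_midi(tube_index, raw_value, threshold=200, max_value=4095):
        """Return a note-on tuple if the tube moved enough, else None."""
        if raw_value < threshold:
            return None                      # below threshold: no sound event
        velocity = min(127, int(127 * (raw_value - threshold) / (max_value - threshold)))
        return (0x90, TUBE_NOTES[tube_index], velocity)

    print(gauge_to_midi(3, 1800))   # tube 4 moderately excited
    print(gauge_to_midi(3, 50))     # below threshold -> None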

In parallel, a Silicon Graphics VGX computer produced a spatial distortion of a video still frame. The position of each steel tube had a corresponding XY coordinate on a monitor projection of a cell division (meiosis). Moving a tube triggered, by real-time texture mapping, a physical wave movement on the monitor screen at the correspondingly allocated XY coordinate. The acoustic and visual signals produced in this way always represented the state of the tubes in time. A second monitor showed this serial information-coding principle, which was also present in its original form as the meiosis projection, in the form of a mathematical score. The 3D wave program was realized by Bob O'Kane, a colleague from the Institute for New Media in Frankfurt. The amplifier interface was constructed by Dieter Sellin (IfNW) and the steel object by Stefan Karp and Kai Cuse.
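The wave behaviour can be illustrated with a small height-field simulation of the kind commonly used for such effects; grid size, damping and tube coordinates below are assumptions, and where the installation drove real-time texture mapping of the meiosis still, this sketch only prints a value.

    N = 32
    prev = [[0.0] * N for _ in range(N)]
    curr = [[0.0] * N for _ in range(N)]
    TUBE_XY = {0: (8, 8), 1: (16, 24)}        # tube index -> grid coordinate (assumed)

    def excite(tube, amount):
        """A tube movement injects a disturbance at its allocated XY coordinate."""
        x, y = TUBE_XY[tube]
        curr[y][x] += amount

    def step(damping=0.99):
        """Advance the height field by one time step of a discrete 2D wave equation."""
        global prev, curr
        nxt = [[0.0] * N for _ in range(N)]
        for y in range(1, N - 1):
            for x in range(1, N - 1):
                neighbours = curr[y-1][x] + curr[y+1][x] + curr[y][x-1] + curr[y][x+1]
                nxt[y][x] = damping * (neighbours / 2.0 - prev[y][x])
        prev, curr = curr, nxt

    excite(0, 1.0)
    for _ in range(10):
        step()
    print(max(max(row) for row in curr))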
OUTLOOK
All the techniques mentioned will be used in the performance of the interactive concert "HYENA DAYS". Special guitars are being developed for it which make a larger spectrum of signals available. Furthermore, I will in future integrate medical sensors such as blood-pressure monitors, EEG or eye-tracking systems into the existing structure, in order to create as much free space as possible for the musicians within the musical parameters.