"Grateful Dead - Infrared Roses Revisited" is an entirely computer generated music video created by XAOS for Anubis Films. It was created to accompany a composite of music from Grateful Dead's "Infrared Roses" which was taken from live moments during a portion of their concerts known as "drums and space". Director Justin Kreutzmann and producer Gillian Grisman of Anubis Films, along with sound designer Bob Bralove, wanted to express this music visually. The piece is the closing sequence of a thirty-five minute video entitled "Grateful Dead Backstage Pass", which musically traces Grateful Dead from its inception to the present.
The sequence begins with a set of soft dissolves between establishing shots, followed by a series of longer scenes which allow the journey to unfold. In meeting the assorted demands of the project, we took the opportunity to explore our favorite technical terrain, extensively intermingling two- and three-dimensional processes. All of the software used was developed in-house, allowing us to do certain types of things which only we would want to do.
All of the three-dimensional modelling in the piece is done using vertex-displacement mapping. This includes not only the terrain and creatures but also the skull-ship, which is a logo of the band. By using paint software to create highly detailed elevation information, we were able to create a model notably more complex than we might have achieved using traditional approaches. The eerie creatures in the opening shots started as 2D artwork done on a Mac by Jerry Garcia for the cover of the "Infrared Roses" CD. A fractal-driven vertex-displacement program gives the creatures their creepy skin movement. Every surface in the animation is textured, and many surfaces are mapped with moving imagery generated with our particle-system-driven paint software. The worms and bubble scenes were created in this way, and the erosion-explosion of the planet is also an example of two-dimensional animation carried into three dimensions. Furthermore, all of the rendered frames stored three-dimensional information in channels beyond RGBA, so that we could create the depth distortion seen in the first half of the trip. Of course it all comes out two-dimensionally in the end, and our overriding perspective is that the technology doesn't really matter except as a means to create the imagery we wish to see.
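To make the vertex-displacement idea concrete, the sketch below displaces a flat grid along one axis by the grayscale values of a painted elevation image, which is essentially how a 2D paint image can drive 3D shape. The grid size, image format, and scale factor are assumptions chosen for the example and do not represent XAOS's proprietary software.

#include <stdio.h>

#define GRID_W 64
#define GRID_H 64

typedef struct { float x, y, z; } Vertex;

/* Displace a regular grid along +Z by an 8-bit painted elevation image:
 * a paint value of 0 leaves the vertex flat, 255 raises it to max_height. */
static void displace_grid(Vertex grid[GRID_H][GRID_W],
                          const unsigned char elevation[GRID_H][GRID_W],
                          float max_height)
{
    int i, j;
    for (i = 0; i < GRID_H; i++) {
        for (j = 0; j < GRID_W; j++) {
            grid[i][j].x = (float)j;
            grid[i][j].y = (float)i;
            grid[i][j].z = (elevation[i][j] / 255.0f) * max_height;
        }
    }
}

int main(void)
{
    static Vertex grid[GRID_H][GRID_W];
    static unsigned char elevation[GRID_H][GRID_W]; /* would be read from the paint artwork */
    displace_grid(grid, elevation, 10.0f);
    printf("height at (0,0): %f\n", grid[0][0].z);
    return 0;
}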
Taking advantage of the audio technology built into the SGI 4D35s and Indigos on which we work, we loaded the music track directly from a DAT tape onto our hard disk storage. This gave us the temporal reference we needed to synchronize visual events with musical cues without going into post after each test render. According to Bralove, "viewers should be able to watch this video which is set to a unique soundtrack and end up with visual memories associated with sound. When they go back and listen to the album, they will have the video images in mind that are triggered by sound."
Involving the viewers in this way allows them to interact creatively, leaving them to fill in emotionally and mentally what is happening rather than having the experience dictated to them.
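As a minimal sketch of the synchronization arithmetic, the example below converts a cue point in the loaded audio track into a frame number so that a visual event can be lined up with a musical cue. The 44.1 kHz sample rate and 30 fps frame rate are assumptions chosen for illustration, not the production's actual settings.

#include <stdio.h>

/* Map an audio sample offset to the video frame on which it falls. */
static long cue_sample_to_frame(long cue_sample, double sample_rate, double fps)
{
    double seconds = cue_sample / sample_rate;
    return (long)(seconds * fps);
}

int main(void)
{
    long cue = 2646000;   /* a cue sixty seconds into the track at 44.1 kHz */
    printf("cue at sample %ld falls on frame %ld\n",
           cue, cue_sample_to_frame(cue, 44100.0, 30.0));
    return 0;
}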
Technical Background
HW: Silicon Graphics
SW: XAOS Proprietary
The Lawnmower Man
XAOS created several scenes for the 1992 feature film "The Lawnmower Man", including those which take the viewer inside the world of virtual reality. Many of the scenes which take place in virtual reality were entirely computer generated. In others, live-action images were digitally modified and combined with 2D and 3D effects to create a myriad of different looks. One such effect is the 'particulation' of characters, wherein they break apart into particles which swirl into a violent vortex and disperse into thin air. Ken Pearce devised an ingenious system specifically for the film to create this effect.
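As a generic illustration only, and not Pearce's actual system, the sketch below advances a single particle of such a particulation effect: it swirls about a vertical vortex axis, drifts outward and upward, and gradually fades out of view.

#include <math.h>
#include <stdio.h>

typedef struct {
    float x, y, z;   /* position; the vortex axis is the z axis */
    float alpha;     /* opacity: 1.0 is fully visible, 0.0 has dispersed */
} Particle;

/* Advance one particle by dt seconds: rotate about the vortex axis,
 * drift outward and upward, and fade toward transparency. */
static void step_particle(Particle *p, float dt)
{
    float angle = 2.0f * dt;                /* swirl rate in radians per second */
    float c = cosf(angle), s = sinf(angle);
    float x = p->x * c - p->y * s;
    float y = p->x * s + p->y * c;
    p->x = x * (1.0f + 0.3f * dt);          /* spiral outward      */
    p->y = y * (1.0f + 0.3f * dt);
    p->z += 0.5f * dt;                      /* rise along the axis */
    p->alpha -= 0.2f * dt;                  /* disperse            */
    if (p->alpha < 0.0f) p->alpha = 0.0f;
}

int main(void)
{
    Particle p = { 1.0f, 0.0f, 0.0f, 1.0f };
    int frame;
    for (frame = 0; frame < 90; frame++)    /* three seconds at 30 fps */
        step_particle(&p, 1.0f / 30.0f);
    printf("after 90 frames: (%f, %f, %f) alpha=%f\n", p.x, p.y, p.z, p.alpha);
    return 0;
}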