Ars Electronica 1990
Festival-Program 1990
What's New in Reality Built for Two


Chuck Blanchard / Ann Lasko-Harvill / Lou Ellyn Jones

INTRODUCTION
The year since June 7, 1989, when VPL Research first presented the RB2 Virtual Reality System, has been marked by significant progress in two areas: technological development, and the increasing understanding of the real-world applications of Virtual Reality. This paper deals with some of the most important developments in each of these categories: the incorporation of three-dimensionally localized sound, the continued development of VPL's software used to build and animate the virtual worlds, and the growth of networked Virtual Reality.
A BRIEF INTRODUCTION TO VIRTUAL REALITY AND THE RB2 SYSTEM
Perhaps the easiest way to understand virtual reality is to compare it to the physical world. Physical reality is the thing one finds on the other side of the sense organs: eyes, ears, and skin. A virtual world is perceived when computerized clothing is worn over the sense organs. When one talks to someone on the phone, he or she is not having an "artificial" experience, even though the other person's voice has been broken down into a signal and sent via wires or even satellites. It is a real conversation using telephone technology. Similarly, real work and real communication take place in Virtual Reality.

RB2 (Reality Built for Two) is so named because it is the first Virtual Reality system to accommodate more than one person at the same time. Consisting of hardware and software, RB2 is a full authoring system, allowing users to design and experience their own environments interactively.

The RB2 hardware consists of the DataGlove™ Model 2, a hand-motion input device; the EyePhone™, a head-mounted stereo visual and audio display; a control workstation; and two Silicon Graphics IRIS™ computers, which render the world in real-time, one for each eye. The Convolvotron, a special device which models 3D space acoustically, has recently been integrated into the RB2 System.

A key to the reality of Virtual Reality is real-time tracking of the user's movements. Both visual and audio displays are continuously updated to reflect the user's every move. Turning one's head, one sees and hears the world from a new perspective.
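
As a rough illustration of this update loop (a hypothetical Python sketch, not VPL's actual software), the view transform can be recomputed from the tracked head pose every frame; the function and variable names here are ours:

```python
import math

def view_from_head(head_pos, head_yaw, world_point):
    """Transform a world-space point into the user's head frame.

    Simplified to position plus yaw; a real tracker reports full
    six-degree-of-freedom pose (position, pitch, roll and yaw).
    """
    dx = world_point[0] - head_pos[0]
    dz = world_point[1] - head_pos[1]
    cos_y, sin_y = math.cos(-head_yaw), math.sin(-head_yaw)
    # Rotate the offset into head coordinates, so both the visual and
    # the audio display can be redrawn from the new perspective.
    return (dx * cos_y - dz * sin_y, dx * sin_y + dz * cos_y)

# Each frame: read the tracker, then re-render both displays.
head_pos, head_yaw = (0.0, 0.0), math.radians(90)  # user has turned left
print(view_from_head(head_pos, head_yaw, (0.0, 1.0)))  # object now off to one side
```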
THREE-DIMENSIONALLY LOCALIZED SOUND IN VIRTUAL REALITY
The RB2 System now incorporates 3D-localized, point-source sound. The same head tracking that keeps the visual display continuously updated with any movement of the head also operates to produce sounds as though they emanate from specific objects. The user can hear a voice behind them and turn around to find the speaker.

The hardware which creates the 3D sound effects in the RB2 System is the Convolvotron, which performs acoustical modeling of both the physical space and the shape of the ears (pinnae) themselves.

Sound data can be reproduced faithfully from recorded music, live voices, or sampled sources. Sampled sounds can be modified by Body Electric using MIDI (Musical Instrument Digital Interface). For example, MIDI is used to create a Doppler shift and a proximity (amplitude) effect in a race-car world, while the location of each car's sound relative to the virtual driver is produced by the Convolvotron.
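
A minimal sketch of those distance and Doppler cues (illustrative physics only; the Convolvotron's actual spatialization convolves each source with measured ear responses, which this does not attempt):

```python
import math

SPEED_OF_SOUND = 343.0  # metres per second

def doppler_and_amplitude(src_pos, src_vel, listener_pos, base_freq, base_amp):
    """Illustrative distance and Doppler cues for a moving sound source."""
    dx = [s - l for s, l in zip(src_pos, listener_pos)]
    dist = math.sqrt(sum(d * d for d in dx)) or 1e-6
    # Radial velocity: positive when the source is receding.
    v_radial = sum(v * d for v, d in zip(src_vel, dx)) / dist
    freq = base_freq * SPEED_OF_SOUND / (SPEED_OF_SOUND + v_radial)
    amp = base_amp / max(dist, 1.0)  # inverse-distance proximity effect
    return freq, amp

# A race car at 30 m/s, 10 m past the listener: pitch and level both drop.
print(doppler_and_amplitude((10.0, 0.0), (30.0, 0.0), (0.0, 0.0), 440.0, 1.0))
```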

Tying the audio and visual senses together creates a powerful conviction of reality, heightening both the user's sense of being in the space and their ability to function within it. Conference calls on conventional telephones are difficult to manage because every voice arrives from the same apparent point. In Virtual Reality, visual and audio spatial cues have an additive effect, helping to identify the speaker and making it much easier to maintain natural interaction.

Sound's high information density, and our sensitivity to its nuances (pitch, loudness, filtering, stereo placement, and so on), help to establish the relationship between the user and the environment. Sound can also provide important cues for task performance.
NETWORKING IN VIRTUAL REALITY
With the advent of networking (the ability to include more than one person in a virtual environment simultaneously), the scope of applications for Virtual Reality increases dramatically: Virtual Reality becomes a tool for communication, collaboration, and the pooling of resources.

Multiple participants, each wearing an EyePhone and DataGlove, experience themselves as being in the same environment together, whether they are physically in the same room or a continent apart. The system continuously monitors the movement of each participant and updates all sensory displays so that each shares the current state of the world moment by moment.
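
A toy sketch of such state sharing, assuming one UDP datagram per tracker update; the wire format and function names here are invented for illustration, not VPL's protocol:

```python
import socket
import struct

# Hypothetical wire format: participant id plus head position and yaw.
POSE_FMT = "!Ifff"  # id, x, y, yaw -- a stand-in, not VPL's actual format

def send_pose(sock, addr, pid, x, y, yaw):
    """Broadcast one participant's tracker state to another station."""
    sock.sendto(struct.pack(POSE_FMT, pid, x, y, yaw), addr)

def recv_pose(sock):
    """Receive a peer's pose and update the local copy of the world."""
    data, _ = sock.recvfrom(struct.calcsize(POSE_FMT))
    return struct.unpack(POSE_FMT, data)

# Loopback demo: in a real setup each station sends every frame.
rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("127.0.0.1", 0))
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_pose(tx, rx.getsockname(), 1, 0.5, 1.7, 0.0)
print(recv_pose(rx))  # id and pose round-trip (floats at 32-bit precision)
```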
IDENTITY AND MASK IN VIRTUAL REALITY
In the virtual world each participant is represented by a particular body image which others recognize and with which they interact. Self-representation as well as the representation of the surrounding environment is specific to the application. Tele-operations may be enhanced by projecting the user's identity directly onto the device controlled, while communications tasks may be shaped by expressive (or purposefully neutral) images assumed by participants.
NETWORKING APPLICATIONS
Once networked in Virtual Reality, individuals can contribute to a group effort, seeing and responding to the contributions of others as they are made. For example, scientists in fields where data is inherently 3D, such as chemistry, aerodynamics, and geology, can not only visualize their models but enter into and manipulate them as though they were physically present.

Engineers can develop a design together by modifying a virtual prototype. Architects, city planners and product designers could collaborate on designs by manipulating aspects of the virtual environment to show others the effect. In medicine, a surgeon can use a Virtual Reality model of the patient's body to help plan surgery with the surgical team. For additional background, a surgeon could preview a delicate operation in Virtual Reality with a specialist in that particular field. In cognitive science, researchers could quickly construct experiments in Virtual Reality for review and modification by colleagues.

Users at a distance can communicate ideas using speech or other auditory feedback, including sound effects and music. Virtual Reality also lends itself to gestural communication, including various codified or impromptu sign languages, and to formal graphic communications, including charts, graphs and their three-dimensional equivalents. Another type of communication is the manipulation of the virtual environment itself. An example would be moving a window in Virtual Reality to demonstrate an architectural idea.

Virtual Reality can itself be a distinct and powerful art form. It shares characteristics with performance art, graphics, and various other interactive media, yet it is also unique. Participants can create and perform in their own productions.

The application of Virtual Reality to video games is so natural it almost need not be mentioned. Thought should be given, however, to the intricacies of the complex team, collaborative, and competitive games which will be developed.

Participants with varying abilities can interact, work, play and create using individualized input and sensory display devices which give them equal abilities in the virtual environment.
NETWORKING TECHNOLOGIES
Many communications technologies may be employed in the RB2 Network System. Ethernet has been the mainstay of the system as currently configured. VPL has developed data-compression techniques and algorithms to utilize conventional telephone lines or, potentially, satellite links. The RB2 System has also been identified as a prime application for ISDN broadband network communications.
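
As an illustration of why compression makes telephone-line links plausible, here is a hypothetical delta coder for position data (the scheme and numbers are our assumptions, not VPL's published algorithms):

```python
import struct

def encode_delta(prev, curr, scale=1000.0):
    """Quantize a pose change to 16-bit integers (millimetre steps)."""
    deltas = [int(round((c - p) * scale)) for p, c in zip(prev, curr)]
    return struct.pack("!3h", *deltas)  # 6 bytes instead of 12 (3 floats)

def decode_delta(prev, payload, scale=1000.0):
    """Reconstruct the new pose from the previous one plus the deltas."""
    deltas = struct.unpack("!3h", payload)
    return [p + d / scale for p, d in zip(prev, deltas)]

prev, curr = [0.0, 1.7, 0.0], [0.012, 1.705, -0.003]
packet = encode_delta(prev, curr)
print(len(packet), decode_delta(prev, packet))  # 6 bytes; pose recovered to 1 mm
```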
BODY ELECTRIC: MAPPING BEHAVIOR IN RB2
Besides modeling appearance, the design of a virtual world involves mapping user action onto the model. In the simplest case this might mean mapping the output from a DataGlove sensor onto the equivalent finger of a virtual hand. A slightly more complex example might be direct operation of a robot end-effector with glove input, conditional on other limiting factors designed into the system.

Action may be exaggerated or minimized in the mapping process. For example, the same gesture may invoke a broad sweeping motion for crossing large distances, but may be modified to produce small motions when fine adjustment is needed. User input can be concatenated with input from other devices, including feedback from the physical environment. Logical processes and constraints can be modeled in the same way that motion is, so that the system can act on itself.
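
A minimal sketch of this gain-based mapping (the mode names and gain values are invented for illustration):

```python
def map_motion(raw_delta, mode):
    """Map a tracked hand movement onto the virtual hand.

    The same physical gesture covers large distances in 'travel' mode
    but produces only fine adjustments in 'precision' mode.
    """
    gains = {"travel": 20.0, "precision": 0.1, "direct": 1.0}
    return raw_delta * gains[mode]

print(map_motion(0.05, "travel"))     # 5 cm of hand motion -> 1 m in the world
print(map_motion(0.05, "precision"))  # the same gesture -> a 5 mm adjustment
```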

The software used to design and control the constant flow of data from the real-time hardware inputs to the living Virtual Reality model is called Body Electric. Always a graphical and interactive program, Body Electric has in the past year gained a new interface which lets users route connections in an even more intuitive and visual manner.
DESCRIPTION OF THE REALITY BUILT FOR TWO SYSTEM (THE RB2 SYSTEM)
The RB2 System is an integrated system of hardware and software: it consists of individual input and display devices (DataGloves, EyePhones, trackers, even DataSuits) coupled to a design and control workstation and powerful rendering engines. The interlocking software applications create and sustain the virtual world.

The EyePhone is a head-mounted device consisting of twin LCD (liquid crystal display) screens that completely cover the eyes. The screens are offset from each other by 6 degrees so that the user gets a binocular, i.e. 3D, view of the virtual world. The image on the screens is updated in real-time and corresponds to the movement of the user's head. For example, when the user moves his or her head to the right, the view changes to show what is to the right of the user in that virtual world. Also integrated into the EyePhone are earphones that deliver 3D sound to each ear.

Mounted on top of the EyePhone is a Polhemus sensor, a magnetic antenna that tracks the head's position and orientation in space relative to a fixed source of magnetic radiation located in the room. The Polhemus system tells the computer where the user is in 3D space and how he or she is moving (i.e. whether the head turns to the right or left, looks under a table or travels three feet away).
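
A simplified sketch of how the two eye viewpoints might be derived from the tracked head pose (the 0.064 m interocular distance and the yaw-only pose are assumptions for brevity; the real system also applies the 6-degree screen offset noted above):

```python
import math

IOD = 0.064  # assumed interocular distance in metres

def eye_positions(head_pos, head_yaw):
    """Offset the two virtual cameras from the tracked head pose."""
    # The "right" axis of the head in world coordinates.
    rx, rz = math.cos(head_yaw), -math.sin(head_yaw)
    half = IOD / 2.0
    left = (head_pos[0] - rx * half, head_pos[1] - rz * half)
    right = (head_pos[0] + rx * half, head_pos[1] + rz * half)
    return left, right  # one IRIS renders each viewpoint

print(eye_positions((0.0, 0.0), math.radians(0)))
```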

The VPL DataGlove lets the user pick up objects in the virtual world as if they were real. By holding the hand in front of the face, the user sees a virtual hand that mimics hand gestures performed in real life.

The DataGlove senses the hand's position and orientation in space as well as how much each finger flexes. Position and orientation (pitch, roll and yaw) are measured, as with the EyePhone, by a Polhemus sensor on the back of the hand. Fiber-optic sensors affixed to the finger and hand joints measure the amount of bend in each joint.
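
A hypothetical linear calibration for one such flex sensor, mapping a raw reading between captured open-hand and clenched-fist values to a joint angle (all numbers are invented):

```python
def calibrate(raw, raw_flat, raw_fist, max_angle=90.0):
    """Convert a raw fiber-optic reading to a joint angle in degrees."""
    t = (raw - raw_flat) / float(raw_fist - raw_flat)
    return max(0.0, min(1.0, t)) * max_angle  # clamp, then scale

print(calibrate(140, raw_flat=30, raw_fist=250))  # 45 degrees of bend
```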

VPL also makes a DataSuit for measuring motion of the human body. The DataSuit is a one-piece, full-body stretch garment with sensors for measuring arm, leg and trunk movements. The DataSuit uses a variety of sensors to measure joint bend and hip and shoulder rotation. Polhemus sensors are placed on the head, back and pelvis to measure rotation or bending at the spine as well as absolute position of the figure in space (i.e. whether the figure has moved forward, backward or crouched down). The DataSuit incorporates left and right DataGloves and sandals with pressure sensors in the heels which measure contact with the floor.

VPL currently uses Silicon Graphics IRIS rendering engines (one for each eye) to paint the image on the EyePhone screens in real-time. The Design/Control Workstation is an accelerated Macintosh II with an Ethernet card to interface with the IRIS computers.
SOFTWARE
The RB2 System uses three interlocking software applications to design and create virtual worlds.

RB2 Swivel is an extended version of Swivel 3D, VPL's color solid modeler for the Macintosh. Swivel is used to design and place objects in a 3D environment. Objects can be constrained in position and orientation, and given ranges for each degree of freedom. To have a natural-appearing virtual elbow, for instance, the user would constrain elbow bend to one degree of freedom and limit range of motion to about 160 degrees.
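
The elbow example above amounts to a clamp on a single degree of freedom, sketched here:

```python
def constrain_elbow(angle):
    """Clamp a joint to one degree of freedom with a limited range.

    Mirrors the elbow example: bend is the only free axis, limited
    to roughly 0-160 degrees.
    """
    return max(0.0, min(160.0, angle))

print(constrain_elbow(200.0))  # 160.0 -- the virtual elbow stops bending
print(constrain_elbow(-15.0))  # 0.0   -- and will not hyperextend
```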

Body Electric animates the artwork created in RB2 Swivel. Running on the Macintosh, Body Electric collects data from external sources (such as DataSuit, DataGlove and EyePhone sensors, MIDI, and other devices), massages the data, and broadcasts updates to the virtual world via Ethernet. Body Electric accepts RB2 Swivel files as 3D wireframe models which can then be edited and merged into an animated, complex environment. When a user wearing a DataGlove clenches the fist, Body Electric takes the raw data provided by the glove sensors and transforms it into a real-time animation of a clenched hand in the appropriate location in the world.
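
A toy rendering of this collect/massage/broadcast dataflow (all names and the 8-bit sensor range are stand-ins, not Body Electric's internals):

```python
def glove_source():
    """Stand-in for a raw DataGlove sensor packet."""
    return {"flex_index": 142, "flex_thumb": 90}

def massage(raw):
    """Condition raw readings (filtering, normalization) before use."""
    return {k: v / 255.0 for k, v in raw.items()}

def broadcast(model_update):
    """Stand-in for the Ethernet update sent to the rendering engines."""
    print("update ->", model_update)

# One tick of the hypothetical dataflow, input to living model:
broadcast(massage(glove_source()))
```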

Body Electric enables interactions with objects in Virtual Reality and defines the behavior of those objects. For instance, users can grasp virtual objects, or pull a virtual switch to sound a train whistle. Body Electric enables virtual clocks to tick and virtual birds to fly around a room. Users can change behavior or interactivity while a virtual world is running and immediately see the effect. Body Electric also enables the user to record and play back sequences. It operates as an intuitive visual programming language which does not require prior programming experience.
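
A minimal sketch of such behavior binding, with an invented grab-gesture recognizer and a behavior table of the kind that could be edited while the world runs:

```python
def is_grab(finger_angles, threshold=60.0):
    """Recognize a fist: every finger joint bent past the threshold."""
    return all(a > threshold for a in finger_angles)

# Hypothetical behavior table: world events bound to handlers.
behaviors = {"switch_pulled": lambda: print("train whistle!")}

# If the tracked hand closes on the virtual switch, fire its behavior.
if is_grab([72.0, 80.0, 65.0, 70.0]):
    behaviors["switch_pulled"]()
```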

ISAAC, running on the Silicon Graphics IRISes, renders the virtual world in real-time. ISAAC accepts models from RB2 Swivel through Ethernet and assembles a linked model of the virtual environment. It creates the virtual world in the form of Gouraud-shaded polygons, which it renders on the EyePhones or, for onlookers, on external monitors. ISAAC also defines the machine configuration, the scale of the virtual environment, the identity and perspective of the viewing objects, and the interocular distance for each EyePhone.
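
In miniature, Gouraud shading computes lighting only at the vertices and interpolates across the polygon, as sketched below (a generic textbook formulation, not ISAAC's code):

```python
import numpy as np

def gouraud_intensity(vertex_normals, light_dir, bary):
    """Gouraud shading in miniature: light each vertex, then interpolate.

    Lambertian intensity is computed per vertex and blended across the
    triangle with barycentric weights, rather than lit per pixel.
    """
    light = np.asarray(light_dir) / np.linalg.norm(light_dir)
    per_vertex = [max(0.0, float(np.dot(n, light))) for n in vertex_normals]
    return float(np.dot(bary, per_vertex))

normals = [(0, 0, 1), (0, 0.7071, 0.7071), (0.7071, 0, 0.7071)]
print(gouraud_intensity(normals, (0, 0, 1), (1/3, 1/3, 1/3)))  # ~0.80
```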
CONCLUSION
Significant new aspects of Virtual Reality at VPL are developing in many areas. Panoramic video backgrounds have recently been incorporated into the RB2 System, opening a new realm of realism and a range of graphic capabilities. Higher-resolution EyePhones are in development, based on new display technologies just becoming available. New rendering engines are making more complex virtual worlds possible. The next year will be a critical one not only for development at VPL but for the whole community of Virtual Reality researchers.
