Ars Electronica 1990
Festival-Program 1990
A Brief Introduction to the Art of Flight Simulation


Ron Reisman

INTRODUCTION
This essay presents a brief review of flight simulation technology. The thesis is advanced that flight simulation is the first and still the best form of virtual reality, and that the technology of flight simulation can be treated as "lessons learned" and applied to other virtual reality endeavors. Special attention is paid to techniques for inducing the sensations of acceleration.
EARLY HISTORY
The value of practicing aviation skills on the ground, where one could learn from mistakes that might otherwise have fatal consequences, was realized at the dawn of aviation. By 1910 there were several flight trainers actively in use, though these were a far cry from modern flight simulators. For instance, the Sanders Teacher was an actual aircraft which was linked to the ground. The Billings Trainer, on the other hand, was not a real aircraft; instead it was a non-flying device with wings, mounted on a column. A rudder bar enabled the machine to be rotated to face the wind, and the student operated control surfaces to maintain his equilibrium, somewhat like certain present-day surfboard trainers. In the Walters Trainer and the Antoinette "Apprenticeship Barrel" Trainer, flight instructors physically disturbed the machine by rocking it back and forth to simulate the motion of a plane in flight. From these primitive beginnings much was learned about the engineering and use of devices which simulate flight motion; such devices are called "motion bases" (Rolfe and Staples, 1986).

During World War I several motion-base trainers were invented, notably by Reid and Burton, Ruggles (1917), and Lender and Heidelberg (1917). Though each was ingenious in its own way, they were still fairly primitive. The postwar period spawned a number of early successes, with significant advances by Rougerie (1928), Roeder (1929), Johnson (1931), and Jenkins and Berlyn (1932). But the contributions of Edwin Link were by far the most significant. Link used the family business, the Link Piano and Organ Company of Binghamton, NY, to launch his aviation trainer company. He used pneumatic mechanisms from organs to create motion bases in the 1920s, and the design was successful enough to be employed for over twenty years. Link created the premier flight simulation enterprise, dominating the field until the 1970s.

WWII was the watershed of simulator development; during those years Link made over 10,000 flight trainers for the U.S. military and confronted a variety of challenges that advanced the technologies of both flight simulation and the embryonic computer sciences.

At this time all simulators employed analog computers. In an effort to apply digital techniques to simulation, the U.S. Navy/MIT Servomechanisms Laboratory developed the Airplane Stability and Control Analyzer (ASCA, 1943), which led to the Whirlwind computer. Subsequently the U.S. Navy/University of Pennsylvania/Sylvania Corporation developed the Universal Digital Operational Flight Trainer (UDOFT, 1950-1960) to demonstrate the feasibility of digital computers for flight simulation. These advances led to the Link Mark I computer (early 1960s), the first commercially successful real-time digital computer for flight simulation.

Modern motion-base technology made its appearance with Rediffusion's pitch motion system for a Comet IV simulator (1958), and the first use of real-time computer graphics for simulation was the GE Computer Image Generation systems (mid 1960s) for the U.S. space program. By the end of the 1960s the age of "modern" simulation was underway.
MODERN SIMULATION FUNDAMENTALS
It is beyond the scope of this essay to detail the variety of flight simulation implementations, but an attempt will be made to outline a few features of modern simulation technology.

Figure 1 (AC87-0271-18) shows the interior of a Singer-Link Phase II Boeing 727 flight simulator located at the Man-Vehicle Systems Research Facility (MVSRF) at NASA Ames Research Center (Nagel & Shiner, 1983; MVSRF, 1983). The term "Phase II" is a legal classification which signifies that time spent training in such a simulator is equivalent to time spent in an actual aircraft, and this time can be counted toward qualifying for a license. For instance, a pilot may spend 440-odd hours training in a Phase II simulator, perform three take-offs and landings to the satisfaction of a check-pilot, and be awarded a certification which allows him to pilot a commercial aircraft loaded with passengers. During the training time in the simulator the crew will encounter a wide variety of potentially hazardous situations which require execution of "abnormal checklist" procedures. For instance, the crew will typically encounter simulated electrical fires, single-engine disintegrations on take-off, wind-shears, etc. It is obvious that the cues encountered in such situations must be sufficiently realistic that the habits learned in the simulator can be usefully transferred when the crew is in an actual cockpit.

Figure 2 (AC84-0342-3) shows the interior of the Advanced Concepts Flight Simulator (ACFS), also located at the MVSRF. As opposed to the B-727 simulator, the ACFS simulates an aircraft which has never existed and will not be built in the future. Instead, the ACFS is a test-bed simulator for evaluating and exploring concepts associated with the automation features that will be found in future commercial transport aircraft. As designers put more microprocessors in the flight deck, the workload of the pilots may shift from actively piloting the aircraft to managing and monitoring the aircraft's systems. Fundamental differences in the nature of these tasks, and the conflicts between design and practice, may not always be clearly understood (Wiener & Nagel, 1988). For instance, the crash of an Airbus A320 with 130 passengers aboard (three of whom perished), on June 26, 1988 at the Mulhouse-Habsheim aerodrome, is thought to have been attributable in part to the pilots' misunderstanding of the capabilities and operation of the computerized fly-by-wire system (Aviation Week, 1990). The ACFS is designed to explore such issues, and hopefully uncover subtle problems, via full-mission simulation technology.

As these examples demonstrate, realism is of paramount importance. In designing realistic simulations, a primary guideline, to a degree rarely found in other technologies, is the dictum "Man is the measure of all things." Modern simulation technology employs human factors research and control theory to determine how best to train pilots for the situations they will encounter in flight. Since it is obviously impossible to duplicate the physical motions of aircraft with ground-based simulators, much simulation technique emphasizes perceptual rather than physical realism. The concept of the "cue" is paramount; we know the world via our senses, and perception of the world can be analyzed as the pickup of a collection of cues. For example, on the grossest scale: darkness is a visual cue for nighttime, and a visual display of a horizon line can be used as a cue of pitch (up/down motion) or roll (rotational motion). If a pilot is presented with unrealistic cues, however, he could develop inappropriate habits in the simulator that could actually endanger him when real-life emergencies occur. The goal of simulation, therefore, is to analyze real-life cues and simulate them in a manner that lets a pilot practice appropriate behavior.

One necessary property of simulations is that all events must occur in real-time. Real-time, in simulations, is usually an empirical definition which refers to the ability of the simulation to continually update the cues at a rate fast enough to meet or exceed the pilot's perception of changes in those cues. For instance, if a pilot attempts to land an aircraft based on cues provided by a visual simulation (i.e. an "out-the-window" display), the scene must usually be updated at a high enough rate to provide meaningful feedback. Typically this means that the out-the-window display must generate 30 frames per second (i.e. 30 Hz). If, for instance, the display updated at only 15 Hz, the lag between the pilot's input to the controls and the effect he would notice in the simulated world would eventually cause him to overcompensate, introducing oscillations into the control of the aircraft that would not arise in the real world. This phenomenon is called "pilot-induced oscillation", or PIO. A simulation that induces PIO is obviously not suitable for training critical responses.
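The PIO mechanism can be demonstrated with a toy closed-loop model. In the sketch below the aircraft dynamics, pilot gain, and lag values are invented for illustration; it merely shows that a pilot strategy which converges when the displayed cue is fresh begins to oscillate, and then diverges, as the same cue grows stale.

```python
# Sketch: display lag turning a stable piloting task into pilot-induced
# oscillation (PIO). The aircraft model, pilot gain, and lag values are
# illustrative assumptions only, not real handling-qualities data.
from collections import deque

def residual_error(lag_frames, gain=6.0, dt=1/30.0, steps=300):
    """Pilot nulls a pitch error, but sees it lag_frames frames late.
    Returns the largest error magnitude seen in the final two seconds."""
    pitch = 1.0                                  # initial 1-unit pitch error
    seen = deque([pitch] * (lag_frames + 1), maxlen=lag_frames + 1)
    tail = []
    for step in range(steps):
        stick = -gain * seen[0]                  # pilot reacts to the *delayed* cue
        pitch += stick * dt                      # simple rate-command aircraft
        seen.append(pitch)
        if step >= steps - 60:                   # record the last two seconds
            tail.append(abs(pitch))
    return max(tail)

for frames in (0, 4, 12):                        # 0 ms, ~133 ms, 400 ms of lag
    print(f"{frames:2d} frames of lag -> residual error {residual_error(frames):.2e}")
```

With no lag the error dies out; with moderate lag it decays slowly while oscillating; with 400 ms of lag the same pilot gain drives the loop unstable.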

For this reason, simulation computing is, above all, real-time computing. It is worth noting that in academia a great deal of time is typically spent studying techniques of structured programming, high-order languages, and sophisticated operating systems. While such software technologies may be boons for program "readability" and "software maintenance", they are often computationally expensive and thus engender slow, non-real-time performance. Simulation software engineers therefore commonly practice decidedly unstructured programming techniques and use low-level languages; they often avoid powerful, modern operating systems so that the code will be "closer" to the machine; and they refine techniques of "control-loading" and "real-time executives" (Harris, 1980), which are rarely taught in the majority of computer science curricula. It is worth noting that these simulation computational techniques seem to be rarely found in "virtual reality" systems, perhaps because VR systems are often linked with academic environments.
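To suggest the flavor of a "real-time executive": the toy scheduler below dispatches tasks in fixed "rate groups" from a 30 Hz major frame and reports missed deadlines. The task names, rates, and structure are illustrative assumptions, not the architecture of any actual simulator.

```python
# Toy "real-time executive": a fixed 30 Hz major frame dispatches rate groups.
# Task names, rates, and structure are illustrative assumptions only.
import time

FRAME_HZ = 30
FRAME_DT = 1.0 / FRAME_HZ

def flight_dynamics(): pass      # every frame (30 Hz): equations of motion
def control_loading(): pass      # every frame (30 Hz): stick force feedback
def instruments(): pass          # every 2nd frame (15 Hz): cockpit gauges
def navigation(): pass           # every 4th frame (7.5 Hz): nav housekeeping

RATE_GROUPS = [                  # (frame divisor, task)
    (1, flight_dynamics),
    (1, control_loading),
    (2, instruments),
    (4, navigation),
]

next_deadline = time.monotonic() + FRAME_DT
for frame in range(90):                      # run three simulated seconds
    for divisor, task in RATE_GROUPS:
        if frame % divisor == 0:
            task()
    slack = next_deadline - time.monotonic()
    if slack < 0:
        print(f"frame {frame}: overran the frame by {-slack * 1000:.1f} ms")
    else:
        time.sleep(slack)                    # sleep away the remaining budget
    next_deadline += FRAME_DT
```

The essential discipline is visible even in the toy: every task is budgeted against a hard frame deadline, and an overrun is an error, not a slowdown.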

Another facet of modern simulation is the reliance on various exotic optical (as opposed to computational) technologies. The variety of optical systems for simulation is astounding (LaRussa, 1980). Almost all of these systems employ the technique commonly called "infinity optics": when presenting an out-the-window display, the computer-generated image passes through a set of collimating optics so that the virtual focus of the image is at a distance greater than 10 meters.
Infinity optics enables the viewer's eye to effectively focus at infinity, even though the actual image source may be only centimeters away.
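The principle can be shown with the thin-lens equation; a minimal worked case, assuming an ideal collimating lens of focal length $f$:

\[
\frac{1}{f} = \frac{1}{d_o} + \frac{1}{d_i}
\quad\Longrightarrow\quad
d_i = \frac{f\,d_o}{d_o - f}
\]

Placing the display exactly at the focal plane ($d_o = f$) drives the virtual image distance $d_i$ to infinity. In practice the display can sit slightly inside the focal plane: for example (values assumed for illustration), $d_o = 0.99f$ with $f = 15$ cm yields a virtual image at $|d_i| = 99f \approx 15$ meters, comfortably beyond the 10-meter rule of thumb above.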

The aerospace industry has a long history with infinity optics. World War II fighter aircraft employed early infinity optics in their gunsights so that the pilot could keep both the adversary aircraft and the gunsight reticle in focus at the same time. This technology evolved into the modern head-up display, which in turn evolved into the modern head-mounted display. An example of this technology is the Honeywell Integrated Helmet and Display Sighting System (IHADSS), currently standard equipment on the Apache helicopter (Honeywell, 1988). In addition to providing a helmet-mounted display, IHADSS employs infrared real-time helmet-tracking devices so that the display can be reliably coordinated with the attitude of the pilot's head, even under the harshest conditions. Similar devices have been implemented for a variety of applications: visually-coupled sensor systems, head-directed weapons targeting, head-mounted instrument displays, and, of course, head-mounted displays for flight simulation (Cook, 1988; Haworth, et al., 1988).
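The geometric heart of a head-slaved display is compact enough to sketch. The code below rotates the display's forward view vector by the tracked head attitude; the yaw-pitch-roll convention is an assumption for illustration and is not the actual IHADSS processing chain.

```python
# Sketch of head-slaved viewing: rotate the view direction by the tracked
# head attitude. The ZYX (yaw-pitch-roll) Euler convention is an assumed,
# simplified model, not the real IHADSS algorithm.
import math

def head_to_view(yaw, pitch, roll, forward=(1.0, 0.0, 0.0)):
    """Return the world-frame view vector for a head attitude in radians."""
    cy, sy = math.cos(yaw), math.sin(yaw)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cr, sr = math.cos(roll), math.sin(roll)
    # Rows of the ZYX rotation matrix R = Rz(yaw) @ Ry(pitch) @ Rx(roll).
    R = [
        [cy*cp, cy*sp*sr - sy*cr, cy*sp*cr + sy*sr],
        [sy*cp, sy*sp*sr + cy*cr, sy*sp*cr - cy*sr],
        [-sp,   cp*sr,            cp*cr],
    ]
    x, y, z = forward
    return tuple(R[i][0]*x + R[i][1]*y + R[i][2]*z for i in range(3))

# Head turned 30 degrees left and pitched 10 degrees down:
print(head_to_view(math.radians(30), math.radians(-10), 0.0))
```

In a real system this transform must be applied, and a fresh image rendered, within the frame budget; any slip between head motion and scene motion is itself a false cue.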

CAE currently markets the Fiber-Optic Helmet-Mounted Display (FOHMD). The FOHMD installation at the Army Aviation Crew Station Research and Development Facility (CSRDF) at NASA Ames Research Center is pictured in Figure 3 (AC86-0089-2). The CSRDF has extremely impressive capabilities. For instance, the CAE FOHMD provides the pilot with an instantaneous field of view of 125° horizontal by 64° vertical, with an unlimited total field of view. The primary computer graphics device at the CSRDF is the General Electric CompuScene IV, which can generate 4,000 anti-aliased, texture-mapped polygons for each eye, for each frame, at a rate of 60 frames per second, with a resolution of 1024 x 1024 pixels. Silicon Graphics workstations provide additional imagery, which is displayed in a binocular high-resolution inset. The FOHMD uses four General Electric light-valve projectors as image sources. These transmit the images via relay-combining optics to the input of the fiber-optic cables. The fiber-optic cables connect to the light-weight helmet display, which is composed of an array of infra-red LEDs for optical head-tracking, an accelerometer package for compensation of angular motion, and the semi-transparent wide-field reflective infinity display optics which provide overlapping, full-color images to each eye. The optics use half-silvered mirrors as optical combiners, and sophisticated 3-inch-diameter eyepieces.

The CSRDF's performance far exceeds that of most other head-mounted display systems. Whereas many Virtual Reality projects are compromised by limitations of budget and/or mass-market technology, the CSRDF has a mission which requires state-of-the-art helicopter simulator performance. For instance, when flying a helicopter at low altitude, the pilot must often whip his head around, rely on peripheral vision, and may use cues such as the motions of grass leaves in reaction to the wash of the helicopter blades. The CSRDF attempts to provide a simulator in which pilots can practice in a realistic fashion. Unacceptable compromises in simulation performance could conceivably contribute to the loss of billions of dollars, battles, national security, and human lives. For this reason, the CSRDF budget is measured in millions of dollars, and the objective is to accept only those compromises that the state of the art makes unavoidable. For instance, each of the two eyepieces is valued at approximately 250,000 dollars.

Most VR systems currently use comparatively low-resolution LCD head-mounted displays and cannot achieve real-time performance (i.e. 30 frames per second). The continuing rapid progress of the electronics industries, and the resulting benefits of economies of scale, may lead within a few years to low-cost head-mounted display technology comparable to the FOHMD system. In the meantime, simulation researchers in laboratories such as the CSRDF are addressing issues which require real-time, wide field-of-view, high-resolution displays. Hopefully their simulation techniques will be transferable to future VR systems.

As an example of how the uncompromising spirit of flight simulation has pushed the state of the art, let us consider methods for inducing the sensation of acceleration. As noted above, simulation design guidelines place more emphasis on perceptual than on physical realism. In the VR community there is currently a great deal of speculation about force-feedback devices. Such technology is comparatively mature in flight simulation.

Part of this technology involves definitions and analysis of sensory mechanism modeling (Borah, et al., 1979; Martin, 1980). Let us define perception as the pickup of cues (information) from the outside environment. Proprioception is the pickup of information from within the body. Vection is the perception of motion. Kinesthesia is the awareness of the orientation and rates of movement of different parts of the body. The three principal perceptual systems which relate to motion and force are the visual, the vestibular, and the haptic. The eyes are the input apparatus of the visual system. The apparatus of the vestibular system consists of the complex structures, composed of two sacs and three semicircular canals, found behind the auditory portion of each ear. The apparatus of the haptic system includes the combined inputs of the skin and joints: Pacinian corpuscles, Merkel's discs, Ruffini's end organs, muscle spindles, Golgi tendon organs, and other organs which make up "muscle sense".

The primary source of the perception of steady motion is the visual system. A good out-the-window display, with no other cues, is capable of inducing a prolonged perception of linear motion (linear vection) and self-rotation (circular vection). To be effective, two guidelines must usually be followed: (1) the peripheral, rather than the central, visual fields must be stimulated; (2) background stimulation is more important than foreground stimulation.

Usually, however, there is a delay of 5 to 10 seconds from the onset of the visual stimulation to the perception of motion if no other perceptual systems are stimulated (this is highly variable). This delay can be shortened (sometimes almost eliminated) if vestibular stimulation is coordinated with the visual cues. Brief vestibular cues in the appropriate direction hasten the onset of vection, and vestibular cues in the conflicting direction delay it (Young, et al., 1973).

The organs of the vestibular system can be thought of as the body's accelerometers. Vestibular stimulation can invoke sensations of angular velocity; if an appropriate visual display system is available, it is usually only necessary to "start" the vestibular cues; the visual system can take over from there.
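In fact the semicircular canals behave much like high-pass filters on head angular velocity. A frequently quoted first-order approximation (the time constant below is a textbook order of magnitude, assumed here for illustration) is

\[
\frac{\hat{\omega}(s)}{\omega(s)} \;\approx\; \frac{\tau s}{\tau s + 1}, \qquad \tau \approx 10\ \mathrm{s}
\]

A step in actual angular velocity is sensed strongly at its onset, but the sensed rotation decays toward zero as the rotation is sustained. This is why a brief vestibular "start", handed off to sustained visual vection, can be perceptually convincing.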

The haptic system is important for signaling the continuous states of deformation of the skin and the deeper tissues and for recognizing the orientation and rates of movement of the different parts of the body.

Modern simulation designs are often directed at and by the above perceptual models.

For instance, computer-generated imagery systems are often designed to provide slightly unrealistic peripheral and background cues in order to create a more realistic impression of movement. (Note that if a visual system is not operating in real-time (30 Hz) it will almost never be effective for creating vection.) It is usually not desirable to tolerate the 5 to 10 second delay associated with visually-induced vection; this is one of the reasons most simulators go to great lengths and expense to create exotic motion systems. As noted above, there is no hope that ground-based motion systems can duplicate the actual motion of a craft in flight. Instead, the motion systems are often used to stimulate the vestibular system in coordination with visual stimulation. The motion systems introduce sharp "jerks" in a given direction to display the higher-frequency "onset" portions of the acceleration profiles, then rapidly slow as the motion base approaches the limit of its travel. The cockpit is then gently brought back to a "neutral" position so the motion system will be able to display the next "onset" cue effectively. This process is known as washing-out the motion cues, and there is an entire literature devoted to wash-out techniques and wash-out equations (Jex, 1978; Dorman and Riedel, 1979; Parrish, et al., 1973).
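The flavor of a wash-out equation can be conveyed with a single high-pass filter per axis. The sketch below is a deliberate caricature in Python, with an assumed break frequency; the cited wash-out literature develops far more elaborate, coordinated, adaptive schemes.

```python
# Sketch of a classical wash-out: high-pass filter the commanded acceleration
# so the motion base reproduces onsets, then drifts back toward neutral.
# The 2-second time constant is an illustrative assumption, not a value
# taken from the wash-out literature cited above.

def washout(accel_cmd, dt=1/30.0, tau=2.0):
    """First-order high-pass filter: y[n] = a*(y[n-1] + u[n] - u[n-1])."""
    a = tau / (tau + dt)
    y, u_prev, out = 0.0, 0.0, []
    for u in accel_cmd:
        y = a * (y + u - u_prev)
        u_prev = u
        out.append(y)
    return out

# A sustained 2 m/s^2 acceleration, held from t = 1 s to t = 9 s:
dt = 1/30.0
cmd = [2.0 if 1.0 <= k * dt < 9.0 else 0.0 for k in range(360)]
base = washout(cmd, dt)
# The base shows a sharp onset cue, washes out toward neutral while the real
# acceleration persists, then cues the opposite "onset" when it ends.
for t in (0.9, 1.1, 3.0, 8.0, 9.1):
    k = int(t / dt)
    print(f"t={t:4.1f}s  commanded={cmd[k]:+.2f}  motion base={base[k]:+.2f}")
```

Even this caricature exhibits the signature of wash-out: the platform command tracks the onset, then decays toward neutral slowly enough (in a real design, below perceptual thresholds) to recover travel for the next cue.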

It is beyond the scope of this paper to do justice to the technology of motion systems and the variety of motion-base hardware, such as synergistic platforms, or "six-post" motion bases, or suspended motion platforms, or beam-type motion simulators. Consider, for example, Figure 4 (AC89-0234312), a retouched photo showing the relative size of the Vertical Motion Simulator (VMS) at NASA Ames Research Center. The VMS is a cascaded-platform motion simulator, i.e., one moveable platform is stacked upon another. In the case of the VMS, a four-degree-of-freedom motion base supports the "cab" (which contains the simulated flight deck and pilots). This motion base rides along a massive girder, which in turn can be rapidly elevated and "dropped" through the height of the ten-storey-high enclosure. Figure 5 (AC79-0126-1) is an unretouched photo of the inside of the VMS. Note the man peeking out of the window closest to the cab for an indication of scale.
Finally, refer to Figure 6 (the six line drawings) (Kron, 1980; Cardullo, 1980; Kron et al., 1980). These are six devices used in simulators of high-performance tactical aircraft, in which the normal flight regime includes very high, sustained acceleration levels (also known as "pulling Gs"). The principle of using a motion base to "wash out" motion means that only the leading-edge onset accelerations can be simulated; obviously such a technique is inadequate when attempting to simulate high-G flight. As a result, a variety of techniques are used to complement the motion bases and to impart a perception of sustained acceleration. These devices primarily stimulate the haptic sensory elements, such as the deep-flesh pressure sensors, the cutaneous sensors, and the skeletal-attitude and muscle sensors.
For instance, in the upper-left-hand corner is the Link G Seat. This product of Link Flight Simulation simulates salient characteristics of high-G flight by applying pressure to the seat pan, the back rest, the thigh panels and the lap belt. As with the other devices in this figure, the G Seat responds to a set of control laws that take inputs from the airframe and engine simulation models, which in turn take input from the flight-control models, which in turn are controlled by the pilot. As the aircraft "pulls Gs" the pilot is pulled into the seat and the seat presses the pilot in the appropriate places. Another variation can be seen in the upper-right corner of the figure: the Seat Shaker, which can be used to simulate touchdown, bumps, stall buffet at high G or high angle of attack, landing gear up/down/lock, runway imperfections, flap vibrations, landing gear rumble, afterburner and engine vibration, speed brake, gun firing (limited to 20 Hz), and (of course) crash.
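A caricature of such a control law, mapping simulated normal load factor to seat-pan bladder pressure, is sketched below. Every constant, name, and the linear mapping itself are invented for illustration; they are not Link's actual control laws.

```python
# Caricature of a G-seat control law: map the simulated normal load factor
# (from the airframe model) to a seat-pan bladder pressure command.
# All constants, names, and the linear mapping are illustrative assumptions.

def seat_pan_pressure(load_factor_g, baseline_kpa=20.0,
                      gain_kpa_per_g=15.0, max_kpa=80.0):
    """Pressure command for the seat-pan bladders, clamped to actuator limits."""
    delta_g = load_factor_g - 1.0          # 1 g = straight-and-level reference
    cmd = baseline_kpa + gain_kpa_per_g * delta_g
    return min(max(cmd, 0.0), max_kpa)     # respect bladder pressure limits

for g in (1.0, 2.5, 5.0, -0.5):            # level flight, turns, a push-over
    print(f"{g:+.1f} g -> seat pan {seat_pan_pressure(g):5.1f} kPa")
```

The point of the sketch is the signal path the text describes: pilot input drives the flight-control and airframe models, whose load factor then drives the haptic actuators.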
The middle-left side of the figure shows a mechanism for pulling down the face mask. The mechanism employs a miniature torque motor and windlass harness housed in a modified G-suit. The middle-right side of the figure shows a mechanism for inducing torque at the shoulders and elbows and appropriately loading the upper and lower arms. This device can simulate the loss of control the pilot would feel under high Gs; it is fitted with several safety precautions to decouple it in case the simulation becomes dangerously realistic.

The lower left corner of the figure illustrates how a firmness bladder can be placed in a flight boot to simulate increased G loading in one area of the body, in isolation from other areas.
Finally, the lower-right corner of the figure shows a cable boom and drogue (CBD) head-jerker. The cables are driven by helmet-mounted torque motors and are attached to seat back and harness strapping. The CBD is composed of a helmet with an active drogue pylon and ultrasonic transmitters, a lariat for capturing the drogue, and a drive boom capable of jerking the pilot's head forward, backward, right, and left.
CONCLUSION
This article has presented a brief introduction to flight simulation for a nontechnical audience interested in Virtual Reality. It is hoped that those with aspirations for constructing VR systems may find some of the concepts useful, and may in fact delve deeper into simulation technology to avoid "re-inventing the wheel".

Among aerospace engineers, flight simulation is often regarded as a "black art", relying more on technique, mirrors, and smoke than on traditional science. For that reason, if for no other, an awareness and appreciation of flight simulation technique should be part and parcel of the repertoire of artists interested in VR.

REFERENCES

Aviation Week & Space Technology, "Report of the French Ministry of Planning, Housing, Transport and Maritime Affairs Investigation Commission, Part 6." 23 July, 1990, 90-93.
Borah, J., Young, L.R., Curry, R.E.: Sensory Mechanism Modeling (Final Report), AFHRL-TR-78-83, Air Force Human Resources Laboratory, Brooks AFB, TX; February 1979.
Cardullo, F.M., "High-G-Simulation", Flight Simulation Short Course, 11-14 March, 1980.
Cook, A.M.: "The Helmet-Mounted Visual System in Flight Simulation". Presented to the 1988 Royal Aeronautical Society Conference, London, England, 12-13 April, 1988.
Furness, T.A., Kocian, D.F., "Putting Humans into Virtual Space", Armstrong Aerospace Medical Research Laboratory, Wright Patterson Air Force Base, OH.
Harris, E., "Simulator Systems Engineering: The Computer Allocations Process", Singer-Link Flight Simulation, 1980.
Haworth, L., Bucher, N., and Henessey, R.: "Wide Field of View Helmet Mounted Display Systems for Helicopter Simulation", AIAA Flight Simulation Technologies Conference, 7-9 September, 1988, pp. 1-9.
Dorman, L.G., Riedel, S.A.: Manned Engineering Flight Simulation Validation, Part I: Simulation Requirements and Simulator Motion System Performance, AFFDL-TR-78-192, Part 1, Air Force Flight Dynamics Laboratory, Wright-Patterson AFB, OH; February 1979.
Honeywell Military Avionics Division: "IHADSS: Integrated Helmet and Display Sighting System – Familiarization Guide", 14 September, 1988.
Kron, G.J., Cardullo, F.M., Young, L.R.: "Study and Design of High G Augmentation Devices for Flight Simulators", Report F33615-77C-0055, January 1980. Prepared for the Human Resources Laboratory, Wright-Patterson Air Force Base, Ohio.
Kron, G.: "G Seat, G Suit, and Shaker Simulation", Flight Simulation Short Course, 11-14 March, 1980.
LaRussa, J.: "Visual Display Systems", Farrand Optical, 1980.
Lypczewski, P.A., Jones, A.D., and Voorhees, J.W.: "Simulation in Support of Advanced Cockpit Development", AIAA Flight Simulation Technologies Conference, 17-19 August, 1987, pp. 134-145.
Man-Vehicle Systems Research Division: "Man-Vehicle Systems Research Facility: Design and Operating Characteristics", NASA TM-84372, May, 1983, NASA Ames Research Center.
Martin, E.A.: "Motion and Force Simulation Systems", Flight Simulation Short Course, 11-14 March, 1980.
Nagel, D.C., and Shiner, R.J.: "The Man-Vehicle Systems Research Facility: A New NASA Aeronautical R & D Facility", AIAA Flight Simulation Technologies Conference, 13-15 June, 1983, pp. 130-139.
Parrish, R.V., Dieudonne, J.E., Bowles, R.L., Martin, D.J.: "Coordinated Adaptive Washout for Motion Simulators", AIAA, 1973.
Rolfe, J. M. and Staples, K. J., Flight Simulation (Cambridge University Press: Cambridge, 1986).
Wiener, E.L., and Nagel, D.C.: Human Factors in Aviation (New York: Academic Press, 1988).
Young, L.R., Dichgans, J.M., Murphy, R., Brandt, T.: "Interactions of Optokinetic and Vestibular Stimulation in Motion Perception", Acta Otolaryngologica, 76, pp. 24-31, 1973.