Computer Vision refers to computerized image recognition and image processing. Essentially, this means that the computer can perceive and register what’s going on around it and react appropriately. A machine’s capacity to see and recognize objects differs greatly from a human being’s sensory perception and comprehension abilities. Computer Vision is a foundational technology for numerous applications in the field of augmented reality and human-machine interaction. A familiar example is a turnpike toll plaza set up with computer equipment that can read the license plates of vehicles with a paid-up annual pass.
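How such a system picks out a plate is, of course, far more involved than can be shown here, but the first step — isolating a high-contrast region of interest in a camera frame — can be sketched in a few lines. The following is a minimal illustration, not the actual toll-plaza software: a synthetic grayscale frame stands in for the camera image, and a simple brightness threshold locates the bounding box of a plate-like region.

```python
import numpy as np

def bright_region_bbox(gray, thresh=128):
    """Return (top, left, bottom, right) of pixels brighter than thresh, or None."""
    ys, xs = np.nonzero(gray > thresh)
    if ys.size == 0:
        return None
    return int(ys.min()), int(xs.min()), int(ys.max()), int(xs.max())

# Synthetic 100x100 "camera frame" with a bright rectangle standing in for a plate.
frame = np.zeros((100, 100), dtype=np.uint8)
frame[40:55, 20:80] = 200

print(bright_region_bbox(frame))  # (40, 20, 54, 79)
```

A real plate reader would follow this localization step with character segmentation and recognition; the point here is only that "seeing" begins with reducing a pixel array to a region the software can reason about.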
Interactive installations by artists and presentations on the 1st Upper Level highlight the possibilities that digital technologies open up to expand the radius of human action. Throughout the Ars Electronica Center’s three levels, visitors can experience and experiment with innovative ways to creatively assert control over a physical space.
For example, images, gestures and movements in a certain defined area are made visible and enlarged in real time by synthetic graphics. The visitor or performer experiences a reciprocal interplay between real actions and the staged graphics and/or sounds. The result is a sort of reflection and expansion of perception via interaction—for instance, when virtual worlds are enriched with real-world imagery (augmented reality). This interpretation of real movements in connection with software that analyzes emotions is also used in numerous surveillance systems.
Constantly increasing traffic and a burgeoning network of streets make driving more and more difficult, especially in unfamiliar cities or foreign countries. Navigation systems offer a solution to this problem.
Now, Ars Electronica Futurelab engineers have taken this technology to the next level with INSTAR (Information and Navigation Systems through Augmented Reality), which features augmented reality, a hybrid form of reality and virtuality that goes beyond the capabilities of two-dimensional visualization methods. A display on the driver’s-side dashboard shows a real-time video image of the driver’s current view of the road overlaid with a graphic depiction of the suggested route to the specified destination. INSTAR thus offers driving assistance that’s more precise and, above all, more intuitive.
Dietmar Offenhuber, Horst Hörtner, Andreas Jalsovec, Christopher Lindinger, Robert Praxmarer, Robert Abt, Wolfgang Ziegler, Reinhold Bidner
Dance is one of humankind’s oldest forms of expression. “Apparition” is the outcome of dance’s pairing with state-of-the-art media technology.
This extraordinary dance and media performance came about through the collaboration of theatrical producers, choreographers and developers of cutting-edge creative technologies. The dancers are outfitted with interactive sensors and tracking technology that let them interact with and influence the surrounding visuals and music.
Klaus Obermaier, Scott deLahunta, Hirokazu Kato, Christopher Lindinger, Peter Brandl, Jing He
Interactive motion-based feedback for children with special needs
Children with handicaps involving their central nervous system get support from an innovative project entitled Motionscapes that aids and encourages them to creatively interact with their surroundings.
Motionscapes is based on real-time gesture recognition, which means that bodily movements are captured and audiovisually interpreted. Young patients suffering from impairment of their faculties of perception are thereby provided with a possibility of self-reflection. In an interactive environment, they discover their own mode of interaction with their surroundings and learn to intentionally influence them. The children’s movements are recorded by a camera and interpreted according to patterns of change in sounds and images. There are five Motionscape scenarios with different variations of forms and colors as well as modes of interpreting the movements: Twirl, Fluid, Loops, Stripes and Bounce.
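The core idea — capturing bodily movement with a camera and translating it into changes in sound and image — can be sketched with simple frame differencing. The snippet below is a hedged illustration, not the Motionscapes implementation: the function names (`motion_energy`, `energy_to_pitch`) and the linear pitch mapping are assumptions chosen for clarity. Two consecutive grayscale frames are compared, the fraction of changed pixels is measured, and that "motion energy" is mapped to an audible parameter, so that even small movements produce a perceptible response.

```python
import numpy as np

def motion_energy(prev, curr, thresh=30):
    """Fraction of pixels whose brightness changed by more than thresh
    between two consecutive grayscale frames."""
    diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
    return float((diff > thresh).mean())

def energy_to_pitch(energy, lo=220.0, hi=880.0):
    """Map motion energy in [0, 1] linearly to a frequency in Hz
    (an illustrative audio feedback parameter, not the actual mapping)."""
    return lo + (hi - lo) * energy

# Two synthetic 120x160 frames: a moving limb lights up a 30x40 patch.
prev = np.zeros((120, 160), dtype=np.uint8)
curr = prev.copy()
curr[30:60, 40:80] = 255

e = motion_energy(prev, curr)
print(round(e, 3))               # 0.062  (1200 of 19200 pixels changed)
print(energy_to_pitch(e))        # pitch rises with the amount of movement
```

Because the measure is a simple fraction of changed pixels, even the most subtle motion yields a nonzero response — the property the text highlights for children with a limited range of motion.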
Motionscapes allows caregivers to quietly stand by without becoming a part of the system; thus, the patients aren’t unsettled by the absence of someone they trust, which is especially important in the case of patients with profound and multiple learning disabilities. Moreover, even the most subtle motions are registered, so the system is also suitable for children with a limited range of motion.
The possibilities and resources of future users were taken into account in determining the system’s technical requirements; accordingly, standard PC components and Windows input methods can be used. The user-friendly interface allows the caregiver to customize the system to a wide range of therapy situations.
The “Motionscapes: Interactive Motion-based Feedback for Children with Special Needs” project was conceived and executed by the Ars Electronica Futurelab and Zachary Lieberman. It was commissioned by Land Design Studio (UK) and subsidized by the NESTA Learning Award.
The story of this project began with the astounding ways in which autistic kids reacted to “Kaleidoscope,” a 2001 media art installation in the Playzone designed by Ars Electronica for London’s Millennium Dome. In 2001, with the support of Land Design Studio and the BBC, “Kaleidoscope” was installed at Chadsgrove School, a facility specializing in the education of physically handicapped children, where comprehensive evaluations confirmed the success of this approach.
Zachary Lieberman, Golan Levin, Ars Electronica Futurelab