DOPPELLAB: EXPLORING SENSOR NETWORKS THROUGH VISUALIZATION AND ANIMATION

The DoppelLab software was developed at the MIT Media Lab in Boston (US). It works dynamically with real-time data delivered by sensors such as those installed in a private home. DoppelLab uses these data to create virtual worlds that enable users not only to get an overview of what’s happening in the house but also to directly intervene in those events.

The conceptual point of departure is the fact that, wherever we go, we always leave behind traces—data that can be gathered, evaluated and stored. The open question is what form such data must take for us to really be able to understand it, and why we are using this information in the first place.

To achieve this objective, the project team’s building on the MIT campus was equipped with numerous sensors that register every movement in the facility and visualize the constant comings and goings in the halls, offices and lobby as moving points of light.
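The mapping from raw sensor readings to points of light can be sketched roughly as follows. This is a purely illustrative assumption about the data flow—the function name, data shapes and parameters here are invented for the sketch and are not the Media Lab’s actual implementation:

```python
import random

def room_light_points(motion_counts, room_bounds, points_per_event=3, seed=0):
    """Map per-room motion-sensor event counts to 2-D 'points of light'.

    motion_counts: {room: number of motion events in the last interval}
    room_bounds:   {room: (x_min, y_min, x_max, y_max)} floor footprint

    Busier rooms receive denser clusters of points, so a viewer can read
    occupancy at a glance--the basic idea behind the DoppelLab display.
    """
    rng = random.Random(seed)
    points = {}
    for room, count in motion_counts.items():
        x0, y0, x1, y1 = room_bounds[room]
        # one small cluster of points per detected motion event
        points[room] = [(rng.uniform(x0, x1), rng.uniform(y0, y1))
                        for _ in range(count * points_per_event)]
    return points

# Hypothetical snapshot: a busy lobby, a quiet office, an empty hall.
counts = {"lobby": 5, "office": 1, "hall": 0}
bounds = {"lobby": (0, 0, 10, 10), "office": (10, 0, 15, 5), "hall": (0, 10, 15, 12)}
clusters = room_light_points(counts, bounds)
```

In a real system these clusters would be re-rendered continuously as fresh sensor data streams in, producing the moving points of light described above.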

This 3D model of the MIT building will be on display in Deep Space. More or less concentrated groupings of points of light provide precise information about which rooms people are currently using or moving through.
An interactive installation at the Ars Electronica Center makes it possible to step out of the role of non-participant observer, to be visible at MIT and, while still in Linz, to interact with MIT personnel.

MIT Media Lab – Responsive Environments Group (Gershon Dublon, Laurel S. Pardue, Brian D. Mayton, Noah Swartz, Patrick Hurst, Nicholas D. Joliat, Joseph A. Paradiso / US)
