Radical Atoms: Beyond Tangible Bits

Prof. Hiroshi Ishii, Dr. Jean-Baptiste Labrune, Dr. Hayes Raffle, Dr. Amanda Parkes, Leonardo Bonanni, Cati Vaucelle, Jamie Zigelbaum, Adam Kumpf, Keywon Chung, Daniel Leithinger, Marcelo Coelho and Peter Schmitt (MIT Media Laboratory)

http://tangible.media.mit.edu/project.php?recid=129

Radical Atoms is our new vision-driven design research on interactions with futuristic physical material that can 1) conform to structural constraints, 2) transform structure & behavior, and 3) inform new abilities. Tangible Bits was our vision introduced in 1997 for seamless coupling of bits and atoms by giving physical form to digital information and computation. This Tangible Bits vision guided our series of Tangible User Interface designs such as inTouch, curlybot, topobo, musicBottle, Urp, Audiopad, SandScape, PingPongPlus, Triangles, PegBlocks, and I/O Brush that were exhibited in the Ars Electronica Center in 2001-2004 and other venues. However, we realized a fundamental limitation of Tangible Bits.

Bits in the form of pixels on a screen are malleable, programmable, and dynamic, while atoms in the form of physical matter are static and extremely rigid. This fundamental gap between bits and atoms makes it very difficult to synchronize the digital state and physical state of information we want to represent and control.

Radical Atoms addresses this limitation of Tangible Bits, proposing interaction designs with a to-be-invented physical material that we can “forge” to conform, transform, and inform. This video presents early design sketches of Radical Atoms that go beyond Tangible Bits and make atoms dance as bits (pixels) can today.

CREDITS: Thanks to Neal Stephenson, Chris Bangle, Banny Banerjee, Chris Woebken, Jinha Lee and Xiao Xiao for their discussion and contribution. Thanks also to the Tangible Interfaces class taught in Fall 2008 at the MIT Media Laboratory.


Drawdio: Turn Almost Anything into a “Theremin”

Jay Silver

http://drawdio.com

Imagine you could draw musical instruments on normal paper with any pencil (a cheap circuit thumb-tacked on) and then play them with your finger. The Drawdio circuit-craft lets you MacGyver your everyday objects into musical instruments: paintbrushes, macaroni, trees, grandpa, even the kitchen sink…

Drawdio brings to life the everyday interconnections between people and their environment, encouraging you to use your sense of touch, and letting you hear otherwise invisible electrical connections by creating, remixing, and playing. Make one on your own, order a kit online, or order the fully assembled circuit and use it to invent and perform.

Supported by the LifeLong Kindergarten of MIT’s Media Lab. Help from lots of people, but a special thanks to Eric Rosenbaum, Jodi Silver, and the great universe itself.


PicTouch: a Tangible Interface for Art Restoration

Leonardo Bonanni, Maurizio Seracini, Xiao Xiao, Matthew Hockenberry, Bianca Cheng Costanzo, Andrew Shum, Antony Speranza, Romain Teil, Hiroshi Ishii.

http://pictouch.org/

We rarely get the chance to experience a precious work of art as a restorer does: touching it, repairing it, and occasionally leaving a trace. PicTouch is an exploration of the painting as a living, material thing, not just a static image. Using large touch-sensitive displays, it becomes possible not only to gaze at an image but to engage directly with its canvas: stretching it, scraping away at the layers of paint, and occasionally even tearing it. PicTouch was inspired by the practice of art diagnosticians using medical imaging equipment to photograph paintings at invisible wavelengths of light. These infrared, ultraviolet, and x-ray photographs reveal initial sketches, the order in which paint was deposited, as well as varnish and subsequent changes or restoration attempts. PicTouch gives users intuitive access to these multiple layers through physical metaphors on a large touch display. This interface is being developed as a way to make modern diagnostic imaging of priceless artwork available to the museum-going public, and online through a new generation of touch-based interfaces.


g-stalt: gestural interaction, telekinesis, the hand, and cartoons

Jamie Zigelbaum, Daniel Leithinger, Alan Browning, Olivier Bau, Adam Kumpf, Kyle Buza, and Hiroshi Ishii

http://zig.media.mit.edu/Work/G-stalt

“g-stalt” is a whole-handed, gestural interface for media navigation inspired by the promise of technologically enabled telekinesis. Built for Oblong Industries’ g-speak spatial operating environment, g-stalt allows users to view photographic and videographic media collaboratively and concurrently — operating the system with their hands, fingers, and gestures in three-dimensional space. Working with Oblong, we have created a vocabulary of over 20 body-hand-finger configurations for g-stalt, which together form the initial seed of an embodied, physical language for computational interaction. In the present, this grammar enables g-stalt users to search Flickr or rapidly navigate and play a library of Tex Avery’s cartoons from MGM (1942-1955). In the future this grammar could do much, much more.


Relief

Daniel Leithinger & Adam Kumpf

Relief is an actuated tabletop display able to render and animate three-dimensional shapes with a malleable surface. It allows users to experience and shape digital models, such as geographical terrain, in an intuitive manner. For example, Relief can provide a better understanding of our environmental impact by rendering time-lapse models of erosion and the effects of climate change. The tabletop surface is actuated by an array of motorized pins, which are controlled with a platform built upon open-source hardware and software tools. Each pin can be addressed individually and senses user input such as pulling and pushing. The hardware configuration can easily be extended and adapted to form factors other than a tabletop display. Future versions could include expressive digital sculpting tools and music controllers.
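
The sketch below is a rough illustration of the control model described above, written in Arduino-style C++; it is not Relief’s actual firmware, and the pin count, wiring, and thresholds are assumptions. It addresses each motorized pin individually, drives it toward a commanded height, and, once the pin has settled, treats any large deviation in the sensed height as the user pulling or pushing that pin.

    // Illustrative sketch of an individually addressed pin array with push/pull
    // sensing. Pin counts, pin assignments, and thresholds are hypothetical.
    const int NUM_PINS = 4;
    const int motorPwm[NUM_PINS] = {3, 5, 6, 9};      // PWM outputs, one motor per pin
    const int heightIn[NUM_PINS] = {A0, A1, A2, A3};  // analog height feedback per pin
    const int DEADBAND = 10;   // counts considered "close enough" to the target
    const int TOUCH    = 60;   // deviation treated as a user push or pull
    int  target[NUM_PINS];     // commanded height (0-1023) for each pin
    bool settled[NUM_PINS];    // true once a pin has reached its target

    void setup() {
      Serial.begin(9600);
      for (int i = 0; i < NUM_PINS; i++) {
        pinMode(motorPwm[i], OUTPUT);
        target[i]  = analogRead(heightIn[i]);  // start from wherever the pin sits
        settled[i] = true;
      }
    }

    void loop() {
      // A host application would animate the surface by writing new values into
      // target[] and clearing settled[]; that part is omitted here.
      for (int i = 0; i < NUM_PINS; i++) {
        int height = analogRead(heightIn[i]);
        int error  = target[i] - height;
        if (settled[i]) {
          analogWrite(motorPwm[i], 0);             // motor idle
          if (abs(error) > TOUCH) {                // the user moved this pin by hand
            target[i] = height;                    // adopt the new height
            Serial.print("pin "); Serial.print(i); Serial.println(" moved by hand");
          }
        } else {
          analogWrite(motorPwm[i], constrain(abs(error), 0, 255));  // drive toward target
          if (abs(error) < DEADBAND) settled[i] = true;
          // Direction control (e.g., via an H-bridge) is omitted for brevity.
        }
      }
      delay(20);
    }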


Piezing

Amanda Parkes & Adam Kumpf

http://web.media.mit.edu/~amanda/piezing

Piezing is a garment that harnesses energy from the natural gestures of the human body in motion. Around the joints of the elbows and hips, the garment is embedded with piezoelectric elements that generate an electric potential in response to applied mechanical stress. The resulting charge is stored in a small centralized battery and can later be discharged into a device. As a concept, Piezing explores a decentralized, self-reliant energy model for embedded interaction, pushing forward possibilities for mobility.
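
As a back-of-the-envelope illustration of the conversion the garment relies on (generic piezoelectric relations, not Piezing’s measured figures): a piezoelectric element driven along its poling axis yields a charge roughly proportional to the applied force, and the energy banked in the storage element grows with the square of its voltage,

    Q \approx d_{33}\,F, \qquad E_{\mathrm{stored}} = \tfrac{1}{2}\,C\,V^{2},

where d_{33} is the material’s piezoelectric charge constant, F is the force exerted at the joint, and C and V are the capacitance and voltage of the storage element.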


Interactive Wallpaper

Leah Buechley

This project examines the intersection of arts-and-crafts-inspired decoration, electronics, and computation. We are investigating how a single surface (wallpaper) can be statically lovely, discreetly interactive, and strikingly dynamic.


Computational Sketchbook

Leah Buechley

What interfaces might we build if we could sketch functional systems directly on paper? What will circuits look like when they are painted or drawn instead of etched or machined? This project explores the potential of paper-based computing.


LilyPad Arduino

Leah Buechley

LilyPad Arduino is a construction kit that enables people to design and build their own electronic textiles, or “e-textiles”. It consists of a set of sewable electronic modules and a spool of conductive thread. E-textiles are constructed by sewing the modules to cloth with conductive thread, which provides both the physical and electrical connections between pieces.
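
A sewn circuit is programmed exactly like any other Arduino board. The minimal sketch below is a hedged example rather than kit documentation: it assumes an LED module sewn to pin 13 and a button module sewn to pin 2, both connected to the LilyPad with conductive thread, and lights the LED while the button is pressed.

    // Minimal LilyPad Arduino sketch; the pin assignments are assumptions.
    const int ledPin    = 13;  // LED module sewn to petal 13
    const int buttonPin = 2;   // button module sewn to petal 2

    void setup() {
      pinMode(ledPin, OUTPUT);
      pinMode(buttonPin, INPUT_PULLUP);  // button pulls the pin LOW when pressed
    }

    void loop() {
      // Light the LED whenever the sewn button is held down.
      digitalWrite(ledPin, digitalRead(buttonPin) == LOW ? HIGH : LOW);
    }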


Ubiquitous Sensor Network Navigator

Alexander Reben, Mat Laibowitz, Nan-Wei Gong, and Joseph Paradiso

The Ubiquitous Sensor Network Navigator provides a directable portal through which an observer can interact with a sensor-rich environment at the Media Lab. Data is rendered intuitively, affording the observer a configurable and dynamic view into what is occurring. Sensor data is rendered at various scales of resolution, from high-level overviews to audio/video feeds, and from real-time streaming to cached and temporally compressed data. In this way, visitors in Linz can be linked intimately into the professional pulse and daily life of our Laboratory.

The aural gateway integrates sound sampled throughout the Media Lab into a visual, aural, and tactile experience. Real-time audio data drives the experience, allowing the observer to sense the overall activity of the Media Lab. Sound is streamed from 45 nodes distributed throughout the Lab and concentrated into the gateway. The observer can interact with the piece, feeling the rhythm of the Media Lab. Projected onto the surface is content gleaned from various sensors distributed throughout the Lab. The piece reflects the essence of the Media Lab as a holistic entity.


littleBits

Ayah Bdeir

http://www.littlebits.cc

littleBits is a growing library of preassembled circuit boards, made easy by tiny magnets. All logic and circuitry is pre-engineered, so you can play with electronics without knowing electronics. Tiny magnets act as connectors and enforce polarity, so you can’t put things in the wrong way. littleBits has a vision: to end the mysticism around engineering and electronics, to counter the black-box product ideology of consumer electronics, and to fuel an explosion of creativity and innovation among artists, designers, kids, and hobbyists when it comes to technology. Crave creativity? Make something! Light it, push it, turn it, twist it, bend it, buzz it, blink it, shake it!

Supported by the Eyebeam Center for Art and Technology

Design by Luma Shihab Eldin, www.lumaeldin.com

Interns: Jie Qi, Axel Esquite, Youngjin Chung, Paul Rothman

Thanks: Robert Vlacich, Spencer Russel, Smart Design


Les Années Lumière

Ayah Bdeir

http://www.ayahbdeir.com/work/installations/les-annees-lumiere/Text

From the assassination of Prime Minister Hariri on February 14, 2005, until May 2008, Lebanon underwent over 133 days of intense violence. Seen from above, each explosion was an ephemeral but recurrent bright light across the land. Les Années Lumière is a bird’s-eye view of a little over three years of violence, strife, and very bright lights rocking Lebanon, remembered and proportionally replayed in 45 minutes. A thin gold mesh is carved with a relief representing Lebanon, and tens of LEDs distributed across the landscape are controlled by a microcontroller circuit that serves as our memory of the violence. As electronics are layered into and integrated with the canvas, a fourth dimension, time, is added to a medium that is typically static. In its title, Les Années Lumière asks: are the years approaching 2009 our enlightenment? Could capturing our recent history in a tangible way make our memories less painful? Could a visualization of our memories make our realities harder to replicate?

In collaboration with Rouba Khalil. Thank you: Axel Esquite, Eyebeam Center for Art and Technology


Death and the Powers

Tod Machover (composer and creative director), Robert Pinsky (librettist), Diane Paulus (director), Alex McDowell (designer), with a team from the MIT Media Lab including Peter Torpey, Elly Jessop, Andy Cavatorta, Wei Dong, Noah Feehan, Bob Hsiung, et al.

http://opera.media.mit.edu/projects/deathandthepowers/

Death and the Powers is a new opera by composer Tod Machover under development at the MIT Media Lab. It is a one-act, full evening work scored for an ensemble of specially designed Hyperinstruments, and will include a robotic, animatronic stage – the first of its kind – that will gradually “come alive” as a main character in the drama. The original story is about Simon Powers, a rich, powerful and successful man who wants to go beyond the bounds of humanity. He is the founder of the System, a human organism material experiment which investigates the transduction of human existence into other forms. Reaching the end of his life, Powers faces the question of his legacy: “When I die, what remains? What will I leave behind? What can I control? What can I perpetuate?” As he enters the System, his family must decide whether to join him, and world leaders must decide what to do about the havoc he has left in his wake. A new technique of Disembodied Performance is employed to translate Simon’s offstage performance into an expressively animated stage, consisting of Simon’s library as well as various objects including an imposing Musical Chandelier. The story is framed and complemented by a chorus of “rolling, lurching, and gliding” robots – or Operabots – that attempt to understand the meaning of death. Death and the Powers will be presented in workshop form at Harvard’s A.R.T. Theater in September 2009, and the world premiere will take place at the Monte-Carlo Opera in September 2010.

Under the High Patronage of Prince Albert II of Monaco, with sponsorship from Opera Futurum Ltd.


Hyperinstruments

Tod Machover, with Adam Boulanger, Mary Farbood, Gili Weinberg, Rob Aimi, Diana Young, Mike Fabio, Marc Downie, et al.

http://opera.media.mit.edu/

The Hyperinstruments Group at the MIT Media Lab was founded by Tod Machover in 1986 and has been devoted to augmenting musical expression for everyone: from high-level virtuosi like Yo-Yo Ma and Prince, to the general public (as in Machover’s Brain Opera, which received its European premiere at Ars Electronica in 1996), to children and families, and to promoting health and wellness. An overview of recent Hyperinstrument work will be presented, including Music Toys (Beatbug and Shaper) from the Toy Symphony project (2001-2004), a violin Hyperbow designed for Joshua Bell, and an interactive version of the Hyperscore composing software environment, which lets anyone compose sophisticated music using lines and colors. An overview video will feature Hyperinstrument highlights from the past ten years, with an emphasis on work in Music, Mind and Health, and a special showing will be made available of Tod Machover’s 2005 Jeux Deux for hyperpiano and symphony orchestra, featuring the Boston Pops and interactive graphics by Marc Downie, MIT Media Lab PhD and winner of a Prix Ars Electronica in 2002.

This work was generously supported by, among others, the SEGA and CSK Corporations (Japan), the Boston Symphony Orchestra, Tewksbury State Hospital and the Massachusetts Cultural Council, and Harmony Line Inc.


Living Window

Daniel Saakes

We introduce a new material for dynamic storytelling: living windows. Our material acts as a pixel-addressable display that responds to environmental lighting. Placed in a window, it conveys rich stories based on the time of day or the season. Our material is completely passive and can be used in a similar fashion to stained glass windows.


Social Garden

John Kestner

http://eco.media.mit.edu/socialgarden/

The Internet supports many great tools for communicating at a distance in order to maintain personal relationships and build social networks. However, these tools rarely help us realize which relationships are strained by lack of attention. Social Garden explores using virtual plants as a metaphor for relationships, encouraging us to tend to our social connections as we do our gardens. By tracking and analyzing communications through email, instant messaging, social websites, SMS, and phone, Social Garden proposes to give feedback on how our relationships are flourishing or wilting, and to organize our social circles. We also explore the garden metaphor as a practical interface for browsing and managing conversations and contacts.


Proverbial Wallets

John Kestner & Daniel Leithinger

http://eco.media.mit.edu/proverbialwallets/

We have trouble controlling our consumer impulses, and there’s a gap between our decisions and their consequences. When we pull a product off the shelf, do we know our bank account balance, or whether we’re over budget for the month? Our existing senses are inadequate to warn us. The Proverbial Wallet fosters an ambient financial sense by embodying our electronically tracked assets. We provide tactile feedback reflecting account balances, spending goals, and transactions as a visceral aid to responsible decision making.
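
One way to read the tactile feedback described above is as a simple mapping from electronically tracked balances to the intensity of a haptic actuator. The Arduino-style sketch below is a hedged illustration, not the wallet’s firmware: the single vibration motor, the hard-coded account figures, and the thresholds are all assumptions.

    // Map budget consumption to the duty cycle of a small vibration motor.
    const int motorPin = 9;  // PWM pin driving the (hypothetical) vibration motor

    // Quiet while comfortably under budget, ramping up to full strength at or
    // beyond the monthly budget.
    int hapticLevel(float spent, float budget) {
      float used = spent / budget;
      if (used < 0.5) return 0;
      float level = (used - 0.5) / 0.5;
      if (level > 1.0) level = 1.0;
      return (int)(level * 255);
    }

    void setup() { pinMode(motorPin, OUTPUT); }

    void loop() {
      // In a real device these figures would arrive from the user's bank over a
      // wireless link; here they are hard-coded for illustration.
      float spentThisMonth = 430.0;
      float monthlyBudget  = 500.0;
      analogWrite(motorPin, hapticLevel(spentThisMonth, monthlyBudget));
      delay(1000);
    }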


Proximeter: An ambient social navigation instrument

John Kestner

http://eco.media.mit.edu/proximeter/

Would you know if a dear, but seldom seen, friend happened to be on the same train as you? The Proximeter is both an agent that tracks the past and future proximity of one’s social cloud and an instrument that charts this in an ambient display. By reading others’ existing calendar and social network feeds, and abstracting these into a glanceable pattern of paths, we hope to nurture in users a social proprioception and nudge them toward more face-to-face interactions when opportunities arise.


Daydar

John Kestner & Richard The

We all use systems for organizing our cluttered schedules, from the day planner to Getting Things Done. One time-honored method, if messy, is writing to-do lists. Daydar is a framework that makes this process social: Can you learn from the working styles of others? Can you collaboratively create an environment of healthy competition by being aware of your friends’ daily accomplishments? Can this help you to find a better balance between work and play? Within this framework we are experimenting with various systems, both physical and digital, that allow you to monitor your own and others’ productivity, help you to get motivated, and enable you to document and visualize the process of accomplishing whole projects.


Giving Character to Characters

Richard The

http://web.media.mit.edu/~rthe/type/

In most applications using digital typography today (e.g., animation or screen display), designers rely on existing typefaces. The possibilities for altering and transforming these typefaces are exploited in many ways. What is currently not explored is another large field of typography: the dynamic, flexible, and organic appearance of handwritten typography or calligraphy. This kind of typography is only brought into the digital domain through processes such as scanning; this is because current file formats for type describe the outlines of the individual letters, not the essence/skeleton/model of a letter that we use when writing by hand. This project explores the possibilities of computational and generative processes to improve and change the visual appearance of typography.


E14½

Agnes Chang, Richard The, Jeffrey Warren & the Students of MAS 960

To celebrate the construction of MIT Building E14 that will be the new home of the Media Lab, E14½ is a public installation and participatory performance projected on the facade of the new building. The project seeks to capture the Media Lab community’s vision for our new space and our future directions, taking into consideration the Lab’s role within the Department of Architecture and Planning as well as MIT, and how the new community of E14-E15 will be the center of learning, collaboration, and innovation across all disciplines.


Cartagen

Jeffrey Warren

Cartagen is a set of tools for mapping, enabling users to view and configure live streams of geographic data in a dynamic, personally relevant way. Today’s mapping software is largely based on static data sets, and neither incorporates the time dimension in its display nor provides for real-time data streams. Cartagen helps users to analyze and view collected and shared geographic and temporal data from multiple sources. While we live in an environment of real-time and temporally situated information, the mapping tools we have are not adequate for viewing, composing, or using this data. Cartagen uses vector-based, context-sensitive drawing methods to describe data, not merely in terms of lines and polygons, but also with adaptive use of color, movement, and projection. Applications include mapping real-time air pollution, citizen reporting, and disaster response.
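
As a sketch of what context-sensitive drawing can mean in practice (a generic illustration in C++, not Cartagen’s actual stylesheet or API), a renderer can derive a feature’s color and opacity from its data value and from how stale its last reading is, so that fresh, high readings stand out while old ones fade away:

    #include <algorithm>
    #include <cstdint>

    // Hypothetical reading and style records; not Cartagen's real data model.
    struct Reading { double value; double ageSeconds; };  // e.g. an air-quality sample
    struct Style   { uint8_t r, g, b, alpha; };

    // The value sets the hue from green to red; older readings fade out.
    Style styleFor(const Reading& s, double maxValue, double maxAgeSeconds) {
      double v         = std::min(std::max(s.value / maxValue, 0.0), 1.0);
      double freshness = 1.0 - std::min(s.ageSeconds / maxAgeSeconds, 1.0);
      return Style{
          static_cast<uint8_t>(255 * v),          // more red as the value rises
          static_cast<uint8_t>(255 * (1.0 - v)),  // more green as it falls
          0,
          static_cast<uint8_t>(255 * freshness)   // fade as the sample ages
      };
    }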


The Loom

Richard The, Agnes Chang, & Jeffrey Warren

The Loom is a participatory public installation: a digital tapestry of the virtual and real-world events in which it is situated. Visitors can “cut out” visual content from the web and collage it onto a public display wall. The Loom encourages collaboration among visitors, who may share a physical space or only a connection through the web. Participants’ contributions will be deposited on top of one another and will accumulate over time, leaving an archaeological timeline or cyclogram of unfolding events. As the content travels to the bottom of the ‘fresco’, it reaches the printer which continuously transfers the composition onto a large roll of paper. The printed scroll is perforated in sections and is draped across the floor so that visitors may take pieces of it home.


MIDE

Agnes Chang

MIDE is a programming environment that proposes a new way for artists and designers to visualize, edit, and manipulate interactive software. It enables the user to define a conceptual visual representation to complement the traditional text-based representation of the users’ code. The interface is designed to facilitate fluid transitions between different modes of computational representation and different levels of representational detail, with the goal of making software a more accessible and expressive medium for designers and artists.


Public Anemone

Personal Robots Group

http://robotic.media.mit.edu/projects/robots/anemone/overview/overview.html

Inspired by primitive life, Public Anemone is a robotic creature with an organic appearance and natural quality of movement. By day, Public Anemone is awake and interacts with the waterfall, pond, and other aspects of its surroundings. It interacts with the audience by orienting to their movements using a stereo machine vision system. But if you get too close, it recoils like a rattlesnake.


Cyberflora

Personal Robots Group

http://robotic.media.mit.edu/projects/robots/cyberflora/overview/overview.html

This robotic flower garden comprises four species of cyberflora. Each combines animal-like behavior and flower-like characteristics into a robotic instantiation that senses and responds to people in a lifelike and distinct manner. Delicate and graceful, Cyberflora communicates a future vision of robots that will intrigue us intellectually and touch us emotionally. The installation explores a style of human-robot interaction that is fluid, dynamic, and harmonious.


The Huggable

Personal Robots Group

http://robotic.media.mit.edu/projects/robots/huggable/overview/overview.html

The Huggable™ is a new type of robotic companion being developed at the MIT Media Lab for healthcare, education, and social communication applications. The Huggable™ is designed to be much more than a fun interactive robotic companion; it is designed to function as an essential member of a triadic interaction. The Huggable™ is therefore not designed to replace any particular person in a social network, but rather to enhance that human social network.


Autom

Personal Robots Group

http://robotic.media.mit.edu/projects/robots/autom/overview/overview.html

Autom™ is a robotic weight-loss coach designed to help people who are trying to lose weight or keep it off. Autom™ helps by encouraging you to stick with your diet long enough to create long-term change and keep extra pounds off. Fifteen robots were used in a controlled clinical trial in the Boston area in the summer and fall of 2007. Results showed that people who worked with Autom™ stayed with their diet twice as long as those using a computer or paper-based diet log.


3D Printed Clock

Peter Schmitt & Robert Swartz

The 3D Printed Clock builds on the idea of ready-assembled, 3D-printed computational mechanisms and relates to research in the fields of rapid prototyping and digital fabrication. The clock was modeled in CAD software after an existing clock that uses a weight and a pendulum to keep track of time. The CAD model was created according to the specifications of the 3D printer, ensuring sufficient gaps and clearances for the different parts to remain separate. In addition, drainage channels and perforations were added so that excess support material could be removed after printing.

The 3D Printed Clock is intended to demonstrate the superior capabilities of 3D printing as a fabrication process. It should contribute toward a future in which 3D printers replace injection molding and expensive tooling, allowing for on-demand, customized, and “greener” consumer products.


CityCar, RoboScooter, and GreenWheel

Smart Cities Group

The models in this exhibit represent recent work from the Smart Cities group. All were produced by creating 3D digital models and then using deposition printers to generate physical models.

Several of the models are of lightweight intelligent electric vehicles – the CityCar developed in collaboration with General Motors, the RoboScooter developed in collaboration with SYM and ITRI, and the GreenWheel electric-assist bicycle. These vehicles are clean, green, and energy-efficient, and they enable new kinds of urban personal mobility systems.

Working models of mechanical devices no longer need to be printed as separate components and then assembled. Using the latest deposition printers, as Peter Schmitt’s fully functional printed clock illustrates, they can now be printed in their entirety.


Chameleon Guitar: Physical Heart in a Digital Instrument

Amit Zoran, Marco Coppiardi, Pattie Maes

Credits: Video by Paula Aguilera; electronics layout design by Nan-Wei Gong

http://ambient.media.mit.edu/projects.php?action=details&id=58

Natural wood, with its unique grain patterns, is what gives traditional acoustic instruments warm and distinctive sounds, while the power of modern electronic processing provides an unlimited degree of control to manipulate the characteristics of an instrument’s sound. Now a guitar built by a student at MIT’s Media Lab promises to provide the best of both worlds.

The Chameleon Guitar — so named for its ability to mimic different instruments — is an electric guitar whose body has a separate central section that is removable. This inserted section, the soundboard, can be switched with one made of a different kind of wood, or with a different structural support system, or with one made of a different material altogether. Then, the sound generated by the electronic pickups on that board can be manipulated by a computer to produce the effect of a different size or shape of the resonating chamber.

Several resonators were made, using techniques similar to those of the guitar body, to demonstrate the acoustic possibilities, from wooden acoustic resonators to experimental ones.
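
One common way to produce the effect of a different resonating chamber in software is to convolve the dry pickup signal with the impulse response of the desired body; the C++ sketch below shows that general technique and is an assumption rather than a description of the Chameleon Guitar’s actual processing.

    #include <cstddef>
    #include <vector>

    // Convolve a dry pickup signal with a body impulse response to simulate a
    // different resonating chamber. Plain O(N*M) FIR convolution; a real-time
    // system would use FFT-based or partitioned convolution instead.
    std::vector<float> applyBody(const std::vector<float>& pickup,
                                 const std::vector<float>& impulseResponse) {
      if (pickup.empty() || impulseResponse.empty()) return {};
      std::vector<float> out(pickup.size() + impulseResponse.size() - 1, 0.0f);
      for (std::size_t n = 0; n < pickup.size(); ++n)
        for (std::size_t k = 0; k < impulseResponse.size(); ++k)
          out[n + k] += pickup[n] * impulseResponse[k];
      return out;
    }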


Shutters

Marcelo Coelho

Credits: Steve Helsing, Analisa Russo, Elly Jessop and Josh Kopin

http://ambient.media.mit.edu/projects.php?action=details&id=43

Shutters is a soft kinetic membrane for environmental control and communication. It is composed of actuated louvers (or shutters) that can be individually addressed for precise control of ventilation, daylight incidence, and information display. By combining smart materials, textiles, and computation, Shutters builds upon other façade systems to create living environments and work spaces that are more energy efficient, while being aesthetically pleasing and considerate of their inhabitants’ activities.


Moving Portrait

Orit Zuckerman & Sajid Sadi

http://ambient.media.mit.edu/projects.php?action=details&id=21

The Moving Portrait is based on a set of black-and-white portraits comprising a rich library of photographic sequences. The portrait resides in a picture frame and interacts with its viewers using a variety of sensing techniques (vision, ultrasonic, RFID, etc.). The sensing architecture enables the portrait to be aware of viewers’ presence, identity, distance, speed, and body movements. The cognitive architecture controls the portrait’s reaction, taking into account the viewer’s behavior, the portrait’s mood, and memory of previous interactions, all of which contributes to complex, believable behavior. By adding interaction, dynamics, and memory to a familiar portrait, we create a different and more engaging relationship between the viewer and the portrait. The viewer gets to know more about the subject, and the portrait’s responses adapt to the viewer’s behavior and to prior interactions with current and former viewers. Just as in real life, where every person reveals a different side of his or her personality to different people and situations, the evocative portrait reveals different sides of its own personality.


Siftables

David Merrill and Jeevan Kalanithi

http://ambient.media.mit.edu/projects.php?action=details&id=35

Siftables aims to enable people to interact with information and media in a physical, natural manner that approaches interactions with physical objects in our everyday lives. As an interaction platform, Siftables applies technology and methodology from wireless sensor networks to tangible user interfaces. Siftables are independent, compact devices with sensing, graphical display, and wireless communication capabilities. They can be physically manipulated as a group to interact with digital information and media. Siftables can be used to implement any number of gestural interaction languages and HCI applications.


Blossom

Sajid Sadi

http://consciousanima.net/projects/blossom/

Blossom is a multi-person awareness system that connects distant friends and family. Unlike existing communication channels, it focuses on background awareness rather than direct communication, and on an implicit asynchrony that breaks down the expectations about reply timeframes built into current communication technologies. Digital communication grew out of a need for formal contact; as such, much of it is based on back-and-forth exchange, setting expectations about replies and timeframes. In a shared household, however, communication is not always this explicit: there is a level of communication that comes solely from momentary shared experiences.
Blossom provides an awareness medium that does not rely on attention- and reciprocity-demanding interfaces such as mobile phones, SMS, and email. Combining touch-based input with visual and motile feedback, Blossoms are created as pairs that communicate over the network, echoing each other’s condition and forming an implicit, always-there link that physically expresses awareness, while retaining the instantaneous and long-distance capabilities that define digital communication.


Sixth Sense

Pranav Mistry

http://www.pranavmistry.com/projects/sixthsense/

SixthSense is a wearable gestural interface that augments the physical world around us with digital information and lets us use natural hand gestures to interact with that information. Using a camera and a tiny projector mounted in a pendant-like wearable device, SixthSense sees what we see and visually augments the surfaces or objects we are interacting with. It projects information onto surfaces, walls, and physical objects around us, and lets us interact with the projected information through natural hand gestures, arm movements, or our interaction with the object itself. SixthSense attempts to free information from its confines by seamlessly integrating it with reality, making the entire world your computer.


Cherry Blossoms

Alyssa Wright

Cherry Blossoms bears witness to the tragedy of war. The project starts with a backpack outfitted with a small microcontroller and a GPS unit. Recent news of bombings in Iraq is downloaded to the unit every night, and their locations relative to the city center are superimposed onto a map of Boston. If the wearer walks into a space in Boston that corresponds to a site of violence in Baghdad, the backpack detonates. A compressed cloud of confetti engulfs the scene, looking like a mixture of smoke, shrapnel, and the white blossoms of a cherry tree. Each piece of confetti is inscribed with the name of a civilian who died in the war and the circumstances of their death. With Cherry Blossoms, human loss resonates beyond the boundary of conflict.


Sourcemap

Leonardo Bonanni, Matthew Hockenberry, David Zwarg, Hiroshi Ishii

http://www.sourcemap.org/

Sourcemap is an open platform for sharing supply chain information through the web. Building on environmental impact databases, it allows business owners, designers and consumers to visualize the provenance of products together with their carbon footprint. Users can create and share the supply chains behind products, events, or even meals. The Sourcemap project is open-source, and actively seeking developers, contributors and companies interested in using this system to understand and optimize the way we make things.
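
The sketch below illustrates the kind of roll-up calculation Sourcemap supports; the part names, origins, and emission factors are invented for illustration and are not drawn from Sourcemap’s databases.

    #include <iostream>
    #include <string>
    #include <vector>

    // Illustrative supply-chain model: each part carries an embodied-carbon
    // figure plus an estimate of the emissions from shipping it to assembly.
    struct Part {
      std::string name;
      std::string origin;
      double embodiedKgCO2;
      double transportKgCO2;
    };

    double totalFootprint(const std::vector<Part>& parts) {
      double total = 0.0;
      for (const Part& p : parts) total += p.embodiedKgCO2 + p.transportKgCO2;
      return total;
    }

    int main() {
      // Hypothetical bill of materials for a simple chair.
      std::vector<Part> chair = {
          {"beech frame", "Germany", 4.2, 0.8},
          {"steel screws", "China", 0.6, 0.4},
          {"wool cushion", "New Zealand", 2.1, 1.5},
      };
      std::cout << "Total footprint: " << totalFootprint(chair) << " kg CO2e\n";
      return 0;
    }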