F.o.G.—Face on Globe https://ars.electronica.art/ai/en/face-on-globe/ Fri, 18 Aug 2017 12:21:25 +0000 https://ars.electronica.art/ai/?p=1731

Daisuke Iizawa (JP), Shunji Yamanaka (JP), Prototyping & Design Laboratory, University of Tokyo (JP)

F.o.G.—Face on Globe is a concept for studying interactions between humans and artifacts. Most interactive robots are designed to resemble humans in order to make their interactions with people feel more natural. However, if the quality of the conversation does not match the expectations raised by the robot’s appearance, the effect is in fact the opposite.

There is a psychological phenomenon called pareidolia, whereby people tend to see faces in inanimate objects. To counter this bias, the artists hypothesized that the sphere is the shape that least resembles a human. They built a prototype to explore the question: how can design balance a robot’s appearance and behavior against the user’s expectations? Their robot is spherical but can shape-shift to give a more or less human-like impression. By controlling its shape, they can observe how people’s social behavior changes depending on the robot’s form.

Credits

Supported by the Shunji Yamanaka Laboratory, University of Tokyo, Japan, and Mitsubishi Electric Corporation

Deltu https://ars.electronica.art/ai/en/deltu/ Fri, 18 Aug 2017 07:48:57 +0000 https://ars.electronica.art/ai/?p=1718

Alexia Lechot (CH)

Deltu is a delta robot with a strong personality that interacts with humans through two iPads. Depending on its mood, it plays games with its human counterparts.

But if they make too many mistakes, Deltu might just get upset and decide to ignore them. Frustrated, Deltu will abandon the game and take some selfies to post on Instagram.
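Behavior like this is naturally modeled as a small mood state machine. The sketch below is a speculative reconstruction; the states, thresholds and actions are invented for illustration and are not Deltu’s actual code.

```python
# Speculative sketch of Deltu-style mood logic; states, thresholds and
# actions are invented for illustration, not taken from the real robot.
class DeltuMood:
    MISTAKE_LIMIT = 3  # assumption: how much imperfection Deltu tolerates

    def __init__(self):
        self.state = "playing"
        self.mistakes = 0

    def on_player_mistake(self):
        """Each mistake nudges Deltu toward being upset."""
        self.mistakes += 1
        if self.mistakes > self.MISTAKE_LIMIT:
            self.state = "upset"

    def act(self):
        if self.state == "playing":
            return "mirror the player's gestures on the second iPad"
        # Upset: ignore the player and sulk with the front-facing camera.
        self.state = "sulking"
        return "take a selfie and post it to Instagram"
```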

Credits

Supported by ECAL, École cantonale d’art de Lausanne

The Wandering Artist https://ars.electronica.art/ai/en/wandering-artist/ Wed, 16 Aug 2017 15:14:11 +0000 https://ars.electronica.art/ai/?p=1420

Sarah Petkus (US)

The Wandering Artist is a meditation, conducted at the European Space Agency, on the role that creativity and human expression play in the context of space exploration. A robotic entity was equipped to interact with its environment in personally expressive ways, as a catalyst to encourage scientists and engineers to reflect on the purpose and identity of space-faring technology.

NoodleFeet is the functioning robotic manifestation of an illustrated character, built from light metal, 3D-printed parts and found objects. Noodle’s mechanical and electronic systems allow him to exhibit behaviors when stimulated by objects in his environment. His purpose is to exist freely in the world while reacting to situational encounters with self-defined forms of personal expression. Where most technology has a practical or utilitarian application meant to enhance our lives, Noodle is a unique entity who functions without regard for anyone’s perception of his purpose or usefulness. The artist’s goal is to provoke consideration of the motivations behind humanity’s current innovations. She hopes that those who interact with Noodle will witness in him a meaningful sense of self that encourages them to reflect on the value of their own relationship to everyday technology.

Credits

This project is presented in the framework of the European Digital Art and Science Network and co-funded by the Creative Europe program of the European Union.

BR41N.IO Hackathon https://ars.electronica.art/ai/en/br41n-io/ Wed, 16 Aug 2017 14:13:22 +0000 https://ars.electronica.art/ai/?p=1395

The Brain-Computer Interface Designers Hackathon

g.tec medical engineering GmbH (AT)

The BR41N.IO Hackathon brings together engineers, programmers, physicians, designers, artists and fashionistas to collaborate intensively as interdisciplinary teams. They plan and produce their own fully functional EEG-based brain-computer-interface headpieces that use motor imagery to control a drone, a Sphero or e-puck robot, or an orthosis.

FRI Sept. 8, 2017

10:00 AM – 10:30 AM Welcome of BR41N.IO hackers and introduction
10:30 AM – 11:00 AM Intro: Current and Future Applications of Brain-Computer Interfaces
11:00 AM – 11:30 AM Intro: Agent Unicorn – A Fashionable BrainTech, Anouk Wipprecht (NL)
11:30 AM – 12 noon Intro: Steps to Run a BCI, Christoph Guger (AT)
12 noon – 12:30 PM Hackathon groups and mentoring
1:00 PM Start hacking

SAT Sept. 9, 2017

11:00 AM End of 24-hour hacking
11:00 AM – 2:00 PM Hackathon project presentations
2:00 PM – 2:30 PM Evaluation of projects by the Hackathon jury
3:00 PM – 4:00 PM BR41N.IO Hackathon ceremony to award the best projects, presented by Christoph Guger and Deputy Governor Michael Strugl

Whenever the hackers think of a right-arm movement, their device performs a defined action. The programmers create interfaces that allow them to control robots and other devices with their thoughts alone. The artists among the hackers paint pictures or post and tweet status updates. Hackers who are enthusiasts of tailoring or 3D printing give their BCI headpieces artistic, unique designs. And finally, kids realize their very own ideas for interactive head accessories inspired by animals, mythical creatures or their imagination.

Inspired by the unique Agent Unicorn headpiece by fashion-technology artist Anouk Wipprecht, the BR41N.IO Brain-Computer Interface Designers Hackathon challenges young geeks to design and build unique, playful and wearable brain-computer-interface (BCI) headpieces. A BCI measures brain activity and enables users to control a robot or smart device, to communicate or to paint using just their thoughts.

Twenty years ago, brain-computer interfaces could only move computer cursors. Today, machine learning is a core component of BCIs, which are being applied in many fields of neuroscience, such as motor rehabilitation of stroke patients, assessment of and communication with coma patients, control of assistive devices for disabled people, cognitive training and neuromarketing. BR41N.IO shows these current and future developments and the unlimited possibilities of brain-computer interfaces in creative and scientific fields, and how artificial intelligence, life science, art and technology come together to produce innovative and exceptional BCI headpieces.
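As a rough illustration of the classification step behind such a motor-imagery BCI, the sketch below band-pass filters EEG epochs around the mu and beta rhythms, extracts log band-power features, and trains a linear classifier whose output maps to a device command. The channel count, sampling rate and action mapping are invented for illustration; this is a generic textbook pipeline, not g.tec’s actual software.

```python
# Minimal motor-imagery classification sketch (illustrative only).
# Assumes epoched EEG shaped (trials, channels, samples).
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

FS = 256  # sampling rate in Hz (assumption)

def bandpass(epochs, lo=8.0, hi=30.0, fs=FS):
    """Band-pass covering the mu (8-12 Hz) and beta (13-30 Hz) bands."""
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, epochs, axis=-1)

def log_bandpower(epochs):
    """Log variance per channel approximates band power after filtering."""
    return np.log(np.var(epochs, axis=-1))

# Hypothetical training data: 100 trials, 2 channels (e.g. C3/C4), 2 s each.
rng = np.random.default_rng(0)
X_raw = rng.standard_normal((100, 2, 2 * FS))
y = rng.integers(0, 2, 100)  # 0 = left-hand imagery, 1 = right-hand imagery

features = log_bandpower(bandpass(X_raw))
clf = LinearDiscriminantAnalysis().fit(features, y)

# At runtime, each classified epoch triggers a defined device action,
# e.g. a drone or Sphero command. The mapping below is illustrative.
ACTIONS = {0: "turn_left", 1: "turn_right"}
new_epoch = rng.standard_normal((1, 2, 2 * FS))
print(ACTIONS[int(clf.predict(log_bandpower(bandpass(new_epoch)))[0])])
```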

Credits

BR41N.IO is organized by g.tec medical engineering GmbH | Schiedlberg | Austria

A3 K3 https://ars.electronica.art/ai/en/a3k3/ Wed, 16 Aug 2017 13:53:32 +0000 https://ars.electronica.art/ai/?p=1386

Intermedia/trans-technological performance/installation

Dragan Ilić (RS/AU/US)

A3 K3 is a unique interactive experience in which artworks are created by machine technology and audience participation. Dragan Ilić uses an elaborate brain-computer interface (BCI) system to control a hi-tech robot with his brain.

Members of the audience are invited to try out the BCI technology. The artist and the audience draw and paint on a vertical and a horizontal canvas with the assistance of the robot, whose arm is fitted with DI drawing devices that clamp, hold and manipulate various artistic media. They can then create striking, large-format artworks. Ilić thus provides a context in which people can enhance and augment their abilities in making art. Thanks to the support of g.tec, Dragan Ilić will undertake further research into AI-system/human interaction in the process of making art.
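One common way to bridge a BCI’s discrete output and a robot’s continuous motion is a dispatch table of motion primitives. The sketch below is a generic illustration of that idea; the primitives, class labels and coordinates are invented, and this is not Ilić’s actual control software.

```python
# Generic illustration of dispatching BCI classifier output to robot
# drawing primitives; not the actual A3 K3 control software.
from typing import Callable

def stroke_up(pos):    return (pos[0], pos[1] + 10)
def stroke_down(pos):  return (pos[0], pos[1] - 10)
def stroke_left(pos):  return (pos[0] - 10, pos[1])
def stroke_right(pos): return (pos[0] + 10, pos[1])

# Each BCI class (e.g. one imagined movement) maps to one primitive.
PRIMITIVES: dict[int, Callable] = {
    0: stroke_up, 1: stroke_down, 2: stroke_left, 3: stroke_right,
}

position = (0, 0)  # pen position on the canvas, in millimetres
for bci_class in [3, 3, 0, 2]:  # hypothetical classifier output stream
    position = PRIMITIVES[bci_class](position)
    print("move pen to", position)
```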

Credits

This program is supported by g.tec and GV Art London.

Mimus: Coming face-to-face with our companion species https://ars.electronica.art/ai/en/mimus-companion-species/ Tue, 15 Aug 2017 16:58:09 +0000 https://ars.electronica.art/ai/?p=1288

Madeline Gannon (US)

Mimus is a giant industrial robot that’s curious about the world around her. Unlike traditional industrial robots, Mimus has no pre-planned movements: she is programmed with the autonomy to roam about her enclosure. Mimus has no eyes; instead, she uses sensors embedded in the ceiling to see everyone around her simultaneously. If she finds you interesting, Mimus may come over for a closer look and follow you around. But her attention span is limited: if you stay still for too long, she will get bored and seek out someone else to investigate.
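Mimus’s curiosity reads like a compact attention loop: score each person the ceiling sensors track, lock on to whoever moves the most, and drop them once they stay still too long. The sketch below is an outside guess at such a loop; the sensor interface, thresholds and data layout are all assumptions, not ATONATON’s software.

```python
# Outside guess at Mimus-style attention logic; interface names and
# thresholds are invented, not taken from the real installation.
import time

BOREDOM_SECONDS = 10.0    # assumption: how long a still person stays interesting
MOVEMENT_THRESHOLD = 0.2  # assumption: metres moved per update to count as active

def attention_loop(read_ceiling_sensors, move_towards):
    """read_ceiling_sensors() -> {person_id: {"position": (x, y), "movement": m}}"""
    target_id, still_since = None, None
    while True:
        people = read_ceiling_sensors()  # hypothetical tracker interface
        if target_id not in people:
            # Pick whoever is moving the most: motion reads as "interesting".
            target_id = max(people, key=lambda i: people[i]["movement"], default=None)
            still_since = None
        if target_id is None:
            continue
        person = people[target_id]
        if person["movement"] < MOVEMENT_THRESHOLD:
            still_since = still_since or time.monotonic()
            if time.monotonic() - still_since > BOREDOM_SECONDS:
                target_id = None  # bored: seek out someone else next frame
                continue
        else:
            still_since = None
        move_towards(person["position"])  # come over for a closer look
```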

Our interactive installation responds to a commonly cited social fear–robots taking over work from humans. The World Economic Forum predicts that robots will take five million jobs over the next five years. However, we believe in a more optimistic future, where robots do not replace humanity, but instead enhance and complement it. Ordinarily, robots like Mimus are completely segregated from humans as they do highly repetitive tasks on a production line. With Mimus, we illustrate how wrapping clever software around industry-standard hardware can completely reconfigure our relationship to these complex, and often dangerous, machines. Rather than viewing robots as human adversaries, we show a future where autonomous machines like Mimus might be companions that peacefully co-exist with us on this planet.

Industrial robots are the foundation of our robotic infrastructure, and have remained relatively unchanged over the past 50 years. With Mimus, we highlight an untapped potential for this old industrial technology to work with people, not against them. Our software illustrates how small, strategic changes to an automation system can take a one-ton beast-of-a-machine from spot welding car chassis in a factory, to curiously following a child around a museum like an excited puppy. We hope to show that despite our collective anxieties about robotics, there is potential for empathy and companionship between humans and machines.

Credits

Madeline Gannon is the founder and principal researcher of ATONATON

Development Team: Madeline Gannon, Julian Sandoval, Kevyn McPhail, Ben Snell

Supported by: Autodesk, ABB Robotics, and the Studio for Creative Inquiry

About the artist

Madeline Gannon (US) is a multidisciplinary designer working at the intersection of art and technology. She leads ATONATON, a research studio inventing better ways to communicate with machines. In her research, Gannon designs and implements cutting-edge tools that explore the future of digital making. Her work blends disciplinary knowledge from design, robotics, and human-computer interaction to innovate at the edges of digital creativity. Gannon is currently completing a PhD in Computational Design at Carnegie Mellon University, where she is developing techniques for digitally designing and fabricating wearables on and around the body.

Read more at: starts-prize.aec.at.

This project is presented in the framework of the STARTS Prize 2017. STARTS Prize received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 732019.

Hanging Drawbot https://ars.electronica.art/ai/en/hanging-drawbot/ Mon, 14 Aug 2017 12:14:26 +0000 https://ars.electronica.art/ai/?p=1110

Markus Gütlien (DE)

Hanging Drawbot is a drawing robot that sketches lines autonomously and algorithmically. The spectator can interact with the machine, affecting its movements. Ideally, a symbiosis of chance, machine and human being creates art through cooperation.
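A hanging drawbot of this kind typically suspends its pen from two motor-driven strings, so moving to a Cartesian point reduces to computing two string lengths. The snippet below shows that standard geometry; the mount width and coordinates are assumptions about a generic build, not necessarily Gütlien’s machine.

```python
# Standard geometry for a two-string hanging plotter; dimensions are
# assumptions about a generic build, not this specific machine.
import math

MOUNT_WIDTH = 1.0  # metres between the two string anchor points (assumption)

def string_lengths(x, y):
    """Pen at (x, y), measured from the left anchor, with y pointing down."""
    left = math.hypot(x, y)
    right = math.hypot(MOUNT_WIDTH - x, y)
    return left, right

# Stepping these lengths via the two motors traces a line to the target.
print(string_lengths(0.5, 0.7))  # pen centred, 0.7 m below the anchors
```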

Hexapod https://ars.electronica.art/ai/en/hexapod/ Tue, 08 Aug 2017 20:28:13 +0000 https://ars.electronica.art/ai/?p=2380

The Hexapod is a spider-like robot. Its research objective is to demonstrate that biomechanical processes can be trained in neural networks.

Using various deep-learning frameworks, thousands of simulations and training runs for reinforcement learning are executed on the in-house servers of the Spengergasse technical school and the Technical University of Vienna, then tested on the Hexapod itself. To get a clearer picture of its physical surroundings, the Hexapod draws on a stereo camera: thanks to object-recognition capabilities, the robot can navigate independently in its field of deployment and execute tasks.
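The loop described here, thousands of simulated episodes whose result is then transferred to the physical robot, can be sketched in miniature. The following reinforcement-learning skeleton uses a stub environment and a plain REINFORCE update as stand-ins; the project’s real frameworks, reward design and simulators are not documented here, so every name and number below is an assumption.

```python
# Minimal simulate-then-train sketch; the environment, reward and policy
# are stand-ins, not the school's actual setup.
import numpy as np

class StubHexapodEnv:
    """Stand-in for a physics simulation of an 18-joint hexapod."""
    OBS_DIM, ACT_DIM = 12, 18

    def reset(self):
        self.t = 0
        return np.zeros(self.OBS_DIM)

    def step(self, action):
        self.t += 1
        obs = np.random.randn(self.OBS_DIM) * 0.1
        reward = -np.sum(action ** 2)  # placeholder: penalize joint effort
        return obs, reward, self.t >= 50

# Linear Gaussian policy trained with REINFORCE (one of many RL options).
rng = np.random.default_rng(0)
W = np.zeros((StubHexapodEnv.ACT_DIM, StubHexapodEnv.OBS_DIM))
env, lr, sigma = StubHexapodEnv(), 1e-3, 0.1

for episode in range(1000):  # thousands of simulated episodes
    obs, done, trajectory = env.reset(), False, []
    while not done:
        noise = rng.standard_normal(StubHexapodEnv.ACT_DIM) * sigma
        action = W @ obs + noise          # noisy action for exploration
        next_obs, reward, done = env.step(action)
        trajectory.append((obs, noise, reward))
        obs = next_obs
    ret = sum(r for _, _, r in trajectory)  # episode return
    for o, n, _ in trajectory:              # policy-gradient update
        W += lr * ret * np.outer(n, o) / sigma ** 2
```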

End of Life Care Machine https://ars.electronica.art/ai/en/end-of-life-care-machine/ Tue, 08 Aug 2017 19:51:47 +0000 https://ars.electronica.art/ai/?p=3573

Dan Chen (TW/US)

End of Life Care Machine is an interactive installation consisting of an empty room, a seating area and a reception desk. Signs, medical bracelets, health information forms and other related medical products are used to transform the space into a hospital-like environment, where people go for their final rite of passage.

In this empty room, lit by a single fluorescent light, are a hospital bed and, at the bedside, the Last Moment Robot. The robot is constructed as a medical device with a padded, caressing arm and a customized recording device designed to guide and comfort the dying patient. The whole event is carefully scripted.

Viewers of this installation are invited to enter the room one at a time, accompanied by an individual dressed in a doctor’s coat. After the patient lies down beside the robot, the doctor asks permission to place the patient’s arm under the caressing mechanism. The device is activated, and an LED screen reads “Detecting end of life.” At this point the doctor exits the room, leaving the patient alone. Within moments the LED reads “End of life detected” and the robotic arm begins its caressing action, moving back and forth to simulate a sense of comfort during the dying process. The Last Moment Robot takes the idea of human replacement to a more extreme scale, inviting a re-evaluation of robotic intimacy technology.
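Since the whole event is carefully scripted, the interaction amounts to a short, fixed sequence of device states. The sketch below replays that script; the display and arm interfaces are invented placeholders, not Chen’s actual hardware API.

```python
# Replays the scripted sequence described above; the display and arm
# interfaces are invented placeholders, not the actual hardware API.
import time

def run_last_moment_script(display, arm, detect_delay=5.0):
    display.show("Detecting end of life")
    time.sleep(detect_delay)          # the doctor exits; the patient is alone
    display.show("End of life detected")
    while arm.patient_present():      # caress until the patient gets up
        arm.stroke_forward()
        arm.stroke_back()

class ConsoleDisplay:
    def show(self, text): print(f"[LED] {text}")

class StubArm:
    def __init__(self): self.cycles = 0
    def patient_present(self):
        self.cycles += 1
        return self.cycles <= 3       # stub: stop after three caress cycles
    def stroke_forward(self): print("arm strokes forward")
    def stroke_back(self): print("arm strokes back")

run_last_moment_script(ConsoleDisplay(), StubArm(), detect_delay=0.1)
```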

The work shown here is in prototype form.

1:1 https://ars.electronica.art/ai/en/11/ Tue, 08 Aug 2017 14:10:07 +0000 https://ars.electronica.art/ai/?p=2617

An Ars Electronica Futurelab Academy @ QUT Project

Jacob Watton (AU), Briony Law (AU), Jaymis Loveday (AU), Charles Hendon (AU)

The 1:1 project is about the relationship between a human and a robot camera: how the two grow able to imagine each other in complex ways, seeing each other on a 1:1 scale. Incorporating elements of dance, theater and new technology, this work resides at an interstice between human and Other.

The Ars Electronica Futurelab Academy was created to support students and educators from international partner institutions in transdisciplinary practice. Since 2012, the platform has enabled collaborations between Ars Electronica and universities across the world.

Performance Times

THU Sept. 7
7:00 PM – 7:20 PM
8:45 PM – 9:05 PM

FRI Sept. 8
6:00 PM – 6:20 PM

SAT Sept. 9
11:00 AM – 11:30 AM micro performance + artist & team Q&A
6:00 PM – 6:20 PM

SUN Sept. 10
11:00 AM – 11:30 AM micro performance + artist & team Q&A
6:00 PM – 6:20 PM

MON Sept. 11
12 noon – 12:30 PM performance + artist & team Q&A
2:00 PM – 2:30 PM performance + artist & team Q&A
4:00 PM – 4:30 PM performance + artist & team Q&A
6:00 PM – 6:30 PM performance + artist & team Q&A

Credits

Producer: Lincoln Savage (AU); Assistant Producer: Quinty Pinxit-Gregg (AU); Researcher: Nicole Robinson (AU/UK); Dramaturg: Kathryn Kelly (AU); Roboticists: Marisa Bucolo (AU), David Hedger (AU) and Paco Sanchez-Ray (AU); Project Consultant (dramaturgy and choreographic development): Dr. Stephanie Hutchinson (AU)

The 1:1 project was realized through the generous support of QUT Creative Lab, Robotronica and QUT Robotics Lab.

This project has been assisted by the Australian Government through the Australia Council, its arts funding and advisory body.
