ERROR – The Art of Imperfection

At what point does an error become a mistake, a failure, and what makes it the celebrated source of unprecedented ideas and inventions? When is an error an oversight, and when is it intentional deception, a fake?

An error is a departure from what we expect, a deviation from the norm … but what is the norm, and who establishes it? An error doesn’t have to be a mistake; it can be an opportunity!

But how much tolerance can we summon for such deviations, and is it enough to grant the leeway and latitude needed to unleash their inherent productive power for social and economic innovation? Or will we allow ourselves to be misled by the populist rhetoric of fear and social scoring? Observing the current situation, one very quickly gets the impression that something has gone terribly wrong with the Digital Revolution and the 21st century. Millions of people feel that they have been defrauded of their sovereignty over their data and their privacy. Deception and fakery have become realities of everyday life and influence public sentiment and the process of forming public opinion. And hovering above it all is a diffuse anxiety about being left behind by the swift dynamics of development. Was the dream of a beautiful digital world an error, and how can we rescue this dream?

This day and age is characterized by a compulsion to achieve perfection and a seemingly unwavering faith in technology. And amid this drive to optimize, to increase efficiency and raise productivity, and, even more often, simply to enjoy the possibilities that digital technologies and social media place at our fingertips, we put ourselves at the mercy of machinery that does its utmost to turn us into lemmings of digital consumption.

Big Data surveillance takes preventive action upon detecting any deviation from our habitual ways. And it is said that in the future, social scoring will do an even better job of optimizing our behavior and attuning it to social norms and standards. The more the technologies deployed for this purpose are perfected and made more efficient, the less room we have to maneuver. Whoever doesn’t fit in sticks out and gets cut.

But it is precisely this imperfection that offers the greatest potential for new solutions. Our objective should not be optimization, since optimization is merely the best possible adaptation to what we can currently think and deem correct. It leaves no leeway for the unanticipated, and thus no latitude to recognize and rectify genuinely undesirable developments or to come up with better ideas with which to set forth on alternative courses.

Dealing effectively with errors, tolerating risk, and thinking creatively are perhaps the most important skills for our future.

How many errors in the genetic sequences of living creatures did evolution have to make over the roughly 3.5 billion years between LUCA (the last universal common ancestor) and Homo sapiens? And how many errors did Homo sapiens need to learn from in order to reach our current state of development? And how much poorer in experiences and insights would humankind now be if there had only ever been “normal” people and the statistical mean … no outsiders, people who think differently, people of different colors, or those with other beliefs?

To err is human, it’s said. Could that be why we are incessantly striving for perfection and steadfastly believe we can attain it through technology and science, even though there is nothing we fear more than being eliminated by a world of machinery that functions perfectly well without us?

How can we rethink our deeply ambivalent relationship to technology as the driving force shaping our future, and which errors should we try not to repeat in the process?

The call for social intelligence now stands alongside our enthusiasm for the digital world and artificial intelligence. We advocate the courage to embrace imperfection, since that may well be what will always set us apart from the machines.


Gerfried Stocker (AT) is a media artist and an electronic engineer. Since 1995 he has been managing director and artistic director of Ars Electronica. In 1995/1996 he developed the groundbreaking exhibition strategies of the Ars Electronica Center with a small team of artists and technicians, and was responsible for setting up and establishing Ars Electronica’s own R&D facility, the Ars Electronica Futurelab. Since 2004 he has been in charge of developing Ars Electronica’s program of international exhibition tours. From 2005 on he planned the expansion of the Ars Electronica Center and oversaw the complete overhaul of its exhibits. Stocker is a guest speaker at many international conferences, a Visiting Professor at Osaka University of Arts, and a guest lecturer at Deusto University Bilbao. He also advises many international companies on creativity and innovation management.

Since 1979, Christine Schöpf (AT) has been a driving force behind Ars Electronica’s development. Between 1987 and 2003, she played a key role in conceiving and organizing the Prix Ars Electronica. Since 1996, she and Gerfried Stocker have shared responsibility for the artistic direction of the Ars Electronica Festival. Christine Schöpf studied German & Romance languages and literature and then worked as a radio and TV journalist. From 1981 to 2008, she was in charge of cultural and scientific reporting at the ORF – Austrian Broadcasting Company’s Upper Austria Regional Studio. In 2009, Linz Art University bestowed the title of honorary professor on Christine Schöpf.