New Images of Mankind
LifeScience, the theme of Ars Electronica 99, addresses not only the industrial research and science sectors that have grown up around modern genetic engineering and biotechnology, but also the wide-ranging cultural and metaphysical aspects associated with the science of life. At the center of attention, though, are the latest procedures of genetic engineers, and the insights and possibilities to which they give rise.
The field of genetics with which we are familiar—that is, essentially, molecular genetics—is a product of the 20th century. Since Thomas H. Morgan successfully proved the linear arrangement of genes on chromosomes in 1910, this field has opened up in a comparatively short time a body of knowledge and an effective domain whose scope and depth are comparable only to those of digital technology.
But right from the start, a process of ideological co-option began in the sense of Francis Galton's eugenics, whose serviceable potential very quickly bore its most dreadful fruit in the form of the "Law for the Prevention of Genetically Defective Offspring" enacted by Hitler's Germany in 1933. Handicapped persons underwent forced sterilization in Sweden until 1976, and in Japan up to 1995.
Molecular genetic engineering advanced just as rapidly as a vehicle of military and—even more strongly—racist interests. A story has been making the rounds again recently that Israeli scientists are conducting research on an abstruse "Arab gene" in order to develop a genocidal weapon: a genetic bomb which releases substances that would kill only human beings of a certain genetic make-up and leave all others unharmed. The South African apartheid regime is also said to have been doing work in this area.
Modern molecular biotechnology, which provides profound insights into mechanisms by which cells develop and differentiate, emerged in the 1960s as a branch of bacterial and viral genetics. Its latest offshoot, bioinformatics, has not only brought about a tremendous acceleration of research, but also constitutes the basis of its industrial application.
Up to now, this has been for the most part a matter of glimpses; scientists still do not understand the complex fundamentals of genetics—for example, cell interaction. And in comparison to the precise, predictable logic of digital machines, the extent to which trial and error remains among the most important methods of generating results in molecular biology strikes one as bizarre.
With the Human Genome Project, US scientists succeeded for the first time during the mid-’80s in loosening the subsidy purse strings and getting the cash flowing to traditionally-underfunded biological research in amounts that had previously been awarded only to the likes of high-energy physics. One dollar per genetic building block, or a total of $3 billion, was the estimate made then of the financing required to sequence the 3,000,000,000 DNA base pairs making up the human genome.
The motivations were many and various, although the fact that the money came from the US Department of Energy—also the agency usually responsible for developing new weapons systems—has continued to this day to additionally fuel the skepticism expressed toward this project. (For a number of DOE research facilities that lacked a mission in the wake of the Cold War, this project represented a new lease on life.)
Another controversial aspect of this project has been the issue of precisely whose genome was to be sequenced. Indeed, all human beings have 99.9% of their genetic information in common; nevertheless, the fact that 99% of it is identical to that of a chimpanzee gives an idea of the tremendous significance of this 0.1% variation. Within this range are to be found the differences among all human beings on this planet, and it is also a widely-known fact that the quantitative differences between individuals are greater than those between races.
The Human Genome Diversity Project, initiated in 1991 to conduct research into the diversity of the human genetic make-up, proceeds according to a process of "informed consent," whereby the subject community, group or individual is given disclosure of pertinent facts before agreeing to the research conducted on them. It has not only received much lower funding, but has also been confronted by charges of carrying on a modern form of colonialism that strives to further exploit indigenous peoples for the benefit of the industrialized world. Nor is this an unfounded suspicion in light of the massive efforts being undertaken to patent genes and other biological material.
Since many states and peoples have come out vehemently against patenting, the financial commitment of the biotech industry has been decidedly restrained, and the project has increasingly been broken up into individual undertakings, many of which are being conducted by private firms outside the guidelines of "informed consent."
Meanwhile, the ”search for the Holy Grail,” as Nobel prize winner Walter Gilbert euphorically characterized it, has assumed industrial proportions. In gigantic facilities, robots sequence fragments of the human genome on a ‘round-the-clock basis.
It is, above all, laboratories in the private sector that are already making fabulous profits. For example, the US firm Genome Therapeutics was able to sell the entire genome sequence of the Helicobacter pylori bacterium (responsible for gastric and duodenal ulcers) to a Swedish pharmaceuticals company for $22 million.
Mention ought to be made of the work of Bernard Barataud, who, as president of the muscular dystrophy self-help group in France, is pursuing genome research as a chance to find a cure for this disease—in opposition to the generally-accepted medical view that it is absolutely terminal. Together with Daniel Cohen, he founded an Institute for Research into Muscular Diseases which houses not only a laboratory for genome sequencing and analysis, but also treatment facilities as well as a space to facilitate encounters of scientists with patients who refuse to go quietly to their fate.
This initiative is remarkable for its enlightened and democratic way of dealing with science and technology because, beyond merely paying lip service to the confrontation between genetic determinism and the social emancipation of the individual, it advances the emancipation of those affected by this science through the appropriation and utilization of its methods and findings.
Intentional interventions in and specific modification to not only the phenotype—that is, the morphological aspect of the body—but also the genotype—the fundamental genetic constitution of an organism which is inherited and subsequently passed on—are the current focal points of fears and hopes.
Even though the scientific boundaries of this development cannot be precisely located even by experts at the moment, the question that arises is not whether, or even when, these steps should be taken, but rather how to deal with their consequences.
If we adopt the view held by many experts in human genetics and medicine who do not regard a cloned embryo as a human life, but rather as an organism for the production of life-saving tissue, and, accordingly, propose the concept of organ banks as a convenient source of human replacement parts, then the question that suggests itself concerns our rights and obligations. If a human being owns and exploits his own clone—even if only in the partial form of replacement organs—then this is above all a matter of the acquisition of completely new value systems. The questions and doubts about whether we should "do it" are, in light of the fact that it is only a matter of time until it will be done, merely mandatory exercises for Sunday sermons.
The violation of the Church's ban on dissecting corpses drove the soul out of the body centuries ago, and modern imaging technology has since failed to discover a place of refuge for the self in the brain; the formation of our self-awareness will therefore be increasingly externalized and shifted onto the level of social relationships. Expelled from the confines of the "metabolic machine" with which the—premature?—interpretations of the evolutionary psychologists want to leave us, we define ourselves not by means of divine providence, fate or karma, but rather in the process of social interaction with our environment.
A parallel development is that the body—long ago declared to be a more or less complex electrochemical reaction chamber—is now stylized as a computerized data bank. That our appearance, our talents, and our physical and mental capacities are pre-formed to an incredible extent at the moment of our conception seems just as indisputable as it is unacceptable. (Perhaps a numerical example can offer some solace: with 3×10⁹ nucleotides and approximately 100,000 genes, only a fraction—at least with respect to quantity—of the development of our brain with its hundred billion neurons can be determined in this way.)
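The solace offered by that parenthetical remark can be made concrete with a back-of-envelope calculation. All figures below are the rough, order-of-magnitude values cited in the text, plus an assumed synapses-per-neuron count; none of them are measurements.

```python
# Back-of-envelope: raw information in the genome vs. wiring in the brain.
# Figures are order-of-magnitude assumptions taken from the essay.

nucleotides = 3 * 10**9      # base pairs in the human genome
bits_per_base = 2            # four possible bases -> log2(4) = 2 bits
genome_bits = nucleotides * bits_per_base

neurons = 10**11             # neurons in the human brain (as cited)
synapses_per_neuron = 10**3  # an assumed, conservative figure
synapse_count = neurons * synapses_per_neuron

print(f"genome: ~{genome_bits / 8 / 1e6:.0f} MB of raw sequence")
print(f"brain:  ~{synapse_count:.0e} synaptic connections")

# Even at one bit per synapse, the wiring diagram would exceed the
# genome's raw capacity by a factor of more than ten thousand,
# so the brain cannot be specified base pair by base pair.
print(synapse_count / genome_bits)
```

Whatever the exact figures, the conclusion is robust: the genome can at best encode growth rules, not the finished wiring.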
And it is also no longer the "brain computer"—as the leitmotif of bionic visions in the golden age of AI research—which is said to be capable of finally liberating human beings from their purportedly so limited physical existence; rather, it is finally dawning on proponents of such ideas that, actually, the body itself could be pure information—a high-performance computer whose 3×10⁹ DNA base pairs only have to be properly programmed in order for it to be able to continuously renew and reconfigure its hardware consisting of 20 amino acids.
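The "genome as program" metaphor can be made concrete: DNA is read in three-letter codons, each naming one of the 20 amino acids or a stop signal. The sketch below translates a DNA string the way a ribosome reads it; the tiny codon table is only an illustrative subset of the real 64-entry genetic code.

```python
# A fragment of the standard genetic code: codon -> amino acid.
# Only five of the 64 codons are listed here, for brevity.
CODON_TABLE = {
    "ATG": "Met",   # methionine, also the start codon
    "TGG": "Trp",   # tryptophan
    "GGC": "Gly",   # glycine
    "AAA": "Lys",   # lysine
    "TAA": "STOP",  # one of the three stop codons
}

def translate(dna: str) -> list[str]:
    """Read a DNA string codon by codon until a stop codon."""
    protein = []
    for i in range(0, len(dna) - 2, 3):
        amino = CODON_TABLE.get(dna[i:i + 3], "?")
        if amino == "STOP":
            break
        protein.append(amino)
    return protein

print(translate("ATGTGGGGCAAATAA"))  # -> ['Met', 'Trp', 'Gly', 'Lys']
```

The metaphor's limit is visible even here: translation is only the first step, and the folding and interaction of the resulting proteins obey no such lookup table.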
Beguiled by the simple, linear logic of the sequential von Neumann computer model—to which, in any case, not only the atom bomb but also the PC, the Internet, the Y2K bug, and a new economic-political order owe their existence—we ascribe to genes the attribute "code of life," which in turn feeds the hope of finally mastering the complex processes and reciprocal interactions of life.
Indeed, it is certainly to be assumed that this recourse to a rationalistic, mechanical conception is made all too casually, simply in order to more easily ignore the unpleasant concerns regarding the circumstances of life that we carbon-based entities face. Perhaps due to the familiarity of this machine-world imagery, it was simply forgotten that this is a metaphor. We no longer believe, as Descartes did, that the world functions like a clockwork; we believe that it is a clock. The idea of the human being as a flip-flop circuit (having two stable states and remaining in one until a signal causes it to switch to the other) corresponds to a traditional world view with which biology perpetuates a concept laid to rest long ago and ignores insights such as those propagated in this century by particle physics.
Nevertheless, the information model turns into a revolutionary trope—totally in the sense of the idea of the information revolution, whereby software is said to be better suited to competition than hardware, and the information model as a construction blueprint more important than the building itself. It would thus follow that it is no longer the atoms but the bits that matter and, correspondingly, that the structural information of the nucleotides is of greater importance than the amino acids they encode.
The distinction between body and machine with respect to the disappearance of clear boundaries is a problem that is inseparably connected with technological revolutions.
In the transition from the Industrial to the Information Age, it was initially the cybernetic-bionic connection with iron, steel and, later, high-tech metallurgy which, it was said, would enhance or even totally replace the biological components of the body. Finally, it is the nano-robots which are said to be in a position to fulfill this task on the molecular level. Many examples from works of literature and science fiction are familiar to us primarily from related theories having to do with digital culture. But the history of medicine offers comparable narratives as well.
In his biography, the famous surgeon Ferdinand Sauerbruch quotes Pliny the Elder's account of Marcus Sergius, a Roman who had lost his right hand in the Second Punic War and had a metal one made as a replacement, though the author is silent on how it functioned and how it was attached. It was not until the middle of the 19th century that Dr. Ballif, a respected surgical technician and dentist in Berlin, constructed the first hand which, through a rather complex arrangement of straps, wires and belts, could be manipulated at will by the wearer.
But it was not until the First World War that Ferdinand Sauerbruch, in collaboration with Prof. Aurel Stodola of the Technical Institute in Zurich, succeeded in employing the patient's remaining muscles as a source of power for the movement of the mechanical elements of an artificial hand. Thereafter, thousands of invalids were outfitted with such prostheses. Thus, World War I was not only the father of radio, but of the first industrial cyborg as well.
The cyborg metaphor constructed around the permeation of machine components into organic structures has, in its informational aspects, remained arrested in the thinking of the Industrial Age, and, in any case, is no longer adequate to describe what is currently transpiring. This concept already began to become obsolete with the emergence of a digitally-networked reality and the associated ideas of disembodiment and the virtualization of the spaces in which life is played out and social interactions occur. The frequently-cited vision of Hans Moravec constituted, for many people, a spectacular attempt—though not one that achieved lasting success—to bring disembodiment to a radical point and propagate the separation of the mind from the limitations of the body through a total mental core-dump—that is, downloading the human mind into silicon-based computer storage media and networks (see: Hans Moravec, Ars Electronica 1990). Indeed, it did not take very long for the first polemicists to point out that perhaps it was not such a great idea after all to entrust one's existence to Microsoft-supported networks.
The first step toward an approach based upon the universal calculability of 0 and 1 is the concept of the avatar, the most distinguishing feature of which is its identity that can be freely selected, changed at any time, and, if need be, duplicated at will. It was not until the emergence of this concept that the realities of digital information technology and culture were taken into account through a new definition of the individual. It is no longer an outcome of fate (regardless of whether this was determined by nature or nurture), but rather an aspect of emancipation and the freedom of choice.
The post-human movement's pipe dream of soon being able to transform human beings into nomadic software entities takes an interesting turn in the emerging interpretation of the human body (or rather, of all biological organisms) as a wetware platform. Instead of the transformation of the human spirit into 0s and 1s, it is now the gigantic operation of the Human Genome Project that is hoped to deliver the decoding of the genome as the human operating system. And because we have learned so well over the long years of the digital information revolution to dress up everything in computer parlance, terms like the "software of life" and references to the human genome as the "human being's operating system" are already solidly established.
Thus, research and theorizing on the digital revolution have concentrated on the application of mechanical components to and into the biological body—and this with considerable success when one considers the reality of modern medicine and prosthetics (on this issue, see Ars Electronica 98, ”FleshFactor”)—whereas a different mode of going about these tasks has emerged as a result of progress in biology. With ever-increasing frequency, success is achieved by introducing biological components into the mechanical environment. Bioinformatics—in current slang: ”wet computing”—has become an important bearer of hopes in the field of computer technology.
The spectrum extends from innovative bacteriological data storage media with enormous storage capacity and access speed (also see Birge, Robert R., Bioelectronics and Protein-Based Devices, ibid.) all the way to Affymetrix' DNA chips, by means of which it is already possible to simultaneously analyze and digitally process over 40,000 genes and expressed sequence tags.
Such developments also force us to rethink the characterization of things as natural or mechanical. The phenomenological distinction drawn by so-called common sense—between that which grows from within itself and that which is intentionally constructed according to plan—no longer holds. To take a very simple comparison: software which evolves within a computer with the help of genetic algorithms would be more natural (because it has not been programmed, but has actually grown) than the genetically manipulated tomato, which is a machine (because its fundamental make-up has been constructed according to a plan). Moreover, this conventional distinction is, in the meantime, in danger of becoming a synonym for blindly holding on to hierarchies and systems of power handed down from the past. This is a danger to which no small number of the proponents of a radical sociobiological interpretation of modern genetics expose themselves when they emphasize the genetic determination of society but fail to acknowledge the converse implication: once genetic pre-formation becomes accessible to change, the purportedly natural-law justification for existing social conditions loses its foundation.
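The "grown rather than programmed" software mentioned above can be sketched with a minimal genetic algorithm: no one writes the solution, it emerges through selection, crossover, and mutation. The task here (maximize the number of 1-bits in a string) and all parameters are arbitrary choices for illustration.

```python
import random

random.seed(1)  # fixed seed so the run is reproducible
LENGTH, POP, GENERATIONS = 20, 30, 60

def fitness(bits):
    """Count the 1-bits: the (arbitrary) quantity being maximized."""
    return sum(bits)

# start from entirely random "organisms"
population = [[random.randint(0, 1) for _ in range(LENGTH)]
              for _ in range(POP)]

for _ in range(GENERATIONS):
    # selection: the fitter half survives and reproduces
    population.sort(key=fitness, reverse=True)
    parents = population[:POP // 2]
    children = []
    while len(children) < POP - len(parents):
        a, b = random.sample(parents, 2)
        cut = random.randrange(1, LENGTH)   # one-point crossover
        child = a[:cut] + b[cut:]
        child[random.randrange(LENGTH)] ^= 1  # point mutation
        children.append(child)
    population = parents + children

best = max(population, key=fitness)
print(fitness(best))  # best fitness found; the optimum here is 20
```

Nothing in the loop "knows" what a good bitstring looks like; the solution accumulates the way the essay says grown things do, which is precisely why the grown/constructed distinction wavers.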
The dividing line between the subjective, socially constructed phenomenological body and the objective, physiological one based on "hard facts" is also being shifted into position between nature and art. In both instances, though, such a separation is already without any basis. In the age of the life sciences, this dividing line might well become the protracted site of a bitter conflict—one taking place, at the very latest, when the discourse surrounding the consequences of what have been mankind's most wide-ranging interventions to date into its environment—or, better said, in-vironment—is the next item on the agenda.
Facing such developmental vectors, it is incumbent upon leading edge art to reconsider positions and modes of creativity which it primarily represents through commentary and interpretation.
Contemporary artists ought long ago to have taken final leave of the idealization of nature as a refuge in which wildness attests to authenticity and truth. But in light of the way artists—in numbers that are practically inflationary—are turning to themes like the body and nature (whereby old reliable views are, as a rule, taken out for yet one more spin around the block), or even to a "new epoch of garden art," it rather seems that the spirit of comfort is at work.
In the replacement of the distanced perspective of modernism with a new subjectivity, one looks forward to idyllic times. The rose-colored stereo glasses of virtual 3-D worlds have just as important a place among the tools of escape as the brave rejection of reality inherent in dispensing with imitation and likeness in the work of art. Such a work ultimately represents the very thing its creator wanted to do without: an object affirming the existence of a firmly-established order and intact boundaries in the face of complexity, however alarming it may become—as if the environment were an official authority independent of thought and recognition.
Confronted with these new developmental vectors, the art of the immediate future, however, may well see its significance as being dependent to a very great extent on how well it succeeds today in marking transitions with reference experiences in the process of coming to terms with the increasing obsolescence of boundaries.
This perspective is one that it shares with the human being who exists in a state of confrontation with his new images.