The Noise of the Observer


Peter Weibel

I. INFORMATION AND ENTROPY IN PHYSICAL SYSTEMS
Modern statistical information theory has its roots in thermodynamics. The relation between information and entropy, understood as "missing information" (L. Boltzmann, 1894), begins with Maxwell's famous demon. In "Theory of Heat" (1871) J. C. Maxwell writes:
"One of the best-established facts in thermodynamics is that it is impossible in a system enclosed in an envelope which permits neither change of volume nor passage of heat, and in which both the temperature and the pressure are everywhere the some, to produce any inequality of temperature or pressure without the expenditure of work. This is the second law of thermodynamics, and it is undoubtedly true as long as we can deal with bodies only in mass, and have no power of perceiving or handling the separate molecules of which they are made up. But if we conceive a being whose faculties are so sharpened that he can follow every molecule in its course, such a being, whose attributes are still as essentially finite as our own, would be able to do what is at present impossible to us. For we have seen that the molecules in a vessel full of air at uniform temperature are moving with velocities by no means uniform, though the mean velocity of any great number of them, arbitrarily selected, is almost exactly uniform. Now let us suppose that such a vessel is divided into two portions, A and B by a division in which there is a small hole, and that a being, who can see the individual molecules, opens and closes this hole, so as to allow only the swifter molecules to pass from A to B, and only the slower ones to pass from B to A. He will thus, without expenditure of work, raise the temperature of B and lower that of A, in contradiction to the second low of thermodynamics." (1)
Maxwell offered no definitive refutation of his demon. In 1912 M. von Smoluchowski offered a partial solution to the problem. He introduced an improved version of the demon, a simple automatic apparatus such as a trap door, which would be hindered by its own Brownian motion from acting as an effective demon: "As far as our current knowledge goes there is then, in spite of molecular fluctuation, no automatic, continuously active perpetuum mobile, but such a device might well function regularly if it were operated by intelligent beings in an appropriate way …" (2) From then on, hypothetical intelligent beings were referred to as demons, and L. Szilard was soon to investigate their function more closely. They apparently have the ability to defy the second law of thermodynamics. The question was: do such beings obey the same laws as all other material systems?

Paul Ehrenfest investigated this question more closely by comparing these intelligent beings with humans, as Smoluchowski had done previously. Ehrenfest, in a letter of 1927, compared Albert Einstein and his attempt to find a loophole in the consistency of quantum mechanics to "a little devil in the box" who wanted to play "at a perpetuum mobile of a second order", "in order to break through the uncertainty relation". (3) The intelligent beings were thus being identified as internal observers.

The decisive identification, however, had already originated with Ludwig Boltzmann, whose work on statistical physics of 1894 made him the first to relate the concept of information to entropy, and to define entropy as "missing information", which one might measure as the number of alternatives still open to a physical system after all the macroscopically observable information relating to it has been recorded. This already anticipates Claude Shannon's definition of information as the logarithm of the number of available choices. A situation with two possible choices contains, as we know, one "bit", or binary digit, of information. Sixteen alternative messages carry four bits of information, since 16 = 2^4.
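To make the logarithmic measure of choice concrete, here is a minimal Python sketch (not part of the original essay; the function name and sample values are illustrative only): the information needed to single out one of n equally likely alternatives is log2(n) bits.

```python
import math

def bits_of_information(num_alternatives: int) -> float:
    """Bits needed to single out one of n equally likely alternatives."""
    return math.log2(num_alternatives)

# Two alternatives yield 1 bit; sixteen alternatives yield 4 bits, since 16 = 2^4.
for n in (2, 16, 256):
    print(f"{n:>3} alternatives -> {bits_of_information(n):.0f} bits")
```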

The relation between information and entropy was first formulated explicitly in 1929 in Leo Szilard's renowned paper "On the Decrease of Entropy in a Thermodynamic System by the Intervention of Intelligent Beings". (4) In this treatise Szilard defined that quantity which, since Claude Shannon, has become known as information, as the amount of free energy used when an observer learns through an experiment which of two seemingly equal alternatives is being realised. One bit of information is equivalent to k ln 2 units of entropy. From this Claude Shannon was able, in 1948, to derive his famous formula for measuring information, expressed in terms of entropy: H = – Σ pi log pi, where the pi are the probabilities of the possible choices. (5) The thermodynamic cost of a measurement and of the information gain thus seemed clear. Around 1950 it was considered proven that each act of observation dissipated at least kT ln 2 of energy. John von Neumann and Brillouin assumed that in each act of information processing a minimum of kT ln 2 of energy was being used. (6) Thus, for instance, in a 1949 address, von Neumann said that "a computer operating at temperature T must dissipate at least kT ln 2 of energy per elementary act of information, that is, per elementary decision of a two-way alternative and per elementary transmittal of one unit of information". (7)
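As a hedged illustration of the two quantities just named, the following Python sketch (not from the original text; the helper names are mine) computes Shannon's entropy H = – Σ pi log pi in bits and the Szilard/von Neumann figure kT ln 2 in joules.

```python
import math

K_BOLTZMANN = 1.380649e-23  # Boltzmann constant k, in joules per kelvin

def shannon_entropy(probabilities):
    """H = -sum(p * log2(p)) in bits; terms with p = 0 contribute nothing."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

def kT_ln2(temperature_kelvin: float) -> float:
    """The energy kT ln 2 associated with one bit at temperature T."""
    return K_BOLTZMANN * temperature_kelvin * math.log(2)

print(shannon_entropy([0.5, 0.5]))      # 1.0 bit for two equal alternatives
print(shannon_entropy([1 / 16] * 16))   # 4.0 bits for sixteen equal alternatives
print(kT_ln2(300.0))                    # roughly 2.9e-21 joules at room temperature
```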

Yet this conception of energy use and information proved naive and, in part, incorrect. In 1961 Rolf Landauer was able to show that the process which in reality incurs a minimal but unavoidable expenditure of energy is the destruction of information. (8) Only in the destruction of information do irreversible thermodynamic costs arise (as opposed to the reversible costs of information gain). Nor does the transmission of information, e.g. of a bit from one place to another, require kT ln 2 of energy. On the contrary, Landauer was able to show that here, too, the thermodynamic cost of transmission, if it is done slowly, involves an arbitrarily minute, i.e., negligible energy dissipation. (9) To rescue the second law of thermodynamics, the minimal and unavoidable dissipation of energy is required not in the gain of information or in its transfer from the object being observed, but in the reconstitution of the observer's condition after the transfer, i.e., when the information is destroyed. The accent of the thermodynamic cost is shifted, after Landauer, from observation and measurement to the re-establishment of the pre-measurement situation, that is, to the cost of erasing information, and with it, history.

Precisely at this point another epoch-making paper appeared in the form of Charles H. Bennett's "Logical Reversibility of Computation" of 1973. (10)

Bennett constructed an "enzymatic Turing machine" in which every computation could be transformed into a reversible format by accumulating the history of all the information that would ordinarily be thrown away, and then ridding oneself of this history in a process that is the reverse of the one that created it. The computation was transformed into a series of steps, each of which was logically reversible, which in turn permitted its physical reversibility. Computation could thus occur with an arbitrarily small dissipation of energy.
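The following Python sketch, a toy reconstruction of my own and not Bennett's construction itself, mimics the compute-copy-uncompute pattern described above: each step is a reversible (invertible) operation, a history of how to undo the steps is accumulated, the answer is copied out, and the history is then unwound so that the machine returns to its initial state.

```python
# Each step is a pair (forward, inverse) of invertible operations, so every
# move can be undone exactly (logical reversibility in Bennett's sense).
STEPS = [
    (lambda v: v + 3, lambda v: v - 3),
    (lambda v: v * 2, lambda v: v // 2),
]

def bennett_style_compute(x: int) -> int:
    """Run forward keeping a history, copy the answer, then uncompute."""
    value, history = x, []
    for forward, inverse in STEPS:       # forward phase: accumulate the history
        history.append(inverse)
        value = forward(value)
    answer = value                       # copy the output into a separate register
    for inverse in reversed(history):    # reverse phase: undo every step in turn
        value = inverse(value)
    assert value == x                    # the machine is back in its initial state
    return answer

print(bennett_style_compute(4))          # (4 + 3) * 2 = 14
```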
In 1982, Edward Fredkin developed a billiard-ball model of computation as an example of a reversible computer. (11) The collisions of billiard balls can simulate any logical function, and hence also any digital computation.

This billiard-ball collision realises a 2-input, 4-output logical function: A AND B, B AND NOT A, A AND NOT B, A AND B. The values 1 and 0 are represented by the presence or absence, respectively, of a billiard ball on a given trajectory. With reversible cellular automata of this kind Fredkin described the first explicit model universe capable of being simulated on a computer. This universe consists exclusively of information. As soon as it has been realised in some concrete form (various kinds of hardware are conceivable), its properties are fully established. It begins to produce autonomous "material" properties internally, e.g. collections of hundreds of black pixels which stabilise at a certain size and then attract one another like elementary particles, following a well-defined law somewhat like Coulomb's. It is Fredkin's hope that some day all natural laws as we know them will emerge as implications of a single such reversible cellular automaton rule. The only requirement is that one must have the luck to hit upon the correct reversible local rule.
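Here is a minimal sketch of the interaction gate just described, written as a Python truth table rather than as colliding billiard balls (the code and names are illustrative, not Fredkin's own formulation); it also checks that the four outputs determine the two inputs uniquely, i.e. that the gate is reversible.

```python
from itertools import product

def interaction_gate(a: int, b: int):
    """Billiard-ball interaction gate: a ball on a path is 1, no ball is 0.
    Outputs, as in the text: (A AND B, B AND NOT A, A AND NOT B, A AND B)."""
    return (a & b, b & (1 - a), a & (1 - b), a & b)

seen = {}
for a, b in product((0, 1), repeat=2):
    out = interaction_gate(a, b)
    assert out not in seen, "two input pairs map to the same output"
    seen[out] = (a, b)                   # inputs are recoverable: the gate is reversible
    print((a, b), "->", out)
```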
P. Benioff succeeded at about the same time (1981-1982) in setting up a reversible quantum-mechanical model of computation and information, thereby combining a Hamiltonian model with a Turing machine. (12) In his work "Maxwell's Demon, Szilard's Engine and Quantum Measurements" (13), W. H. Zurek summed up the results and transferred Szilard's thought experiment to quantum mechanics. The measuring device then becomes the demon, increasing the entropy. This entropy can be passed on by the demon (the internal observer) to the environment. The environment, then, pays the entropic cost of the measurement. The information gained by the observation or the measurement has to be balanced by an increase in the entropy of the measuring device.

The relations between the entropic cost of information and the environment are also addressed in the work "Entropy Cost of Information" by Paul N. Fahn. (14) The second law of thermodynamics, then, is a theorem of entropy balance which states that, if entropy decreases within one system, it must increase in another, linked system. As a thought experiment, Maxwell's Demon has exposed some paradoxical problems in this theorem. It was Szilard's one-molecule demon that brought the term "information" into the debate. Since then, a theory of the entropy cost of information has been developed, a theory of the correlation between information and entropy, which in recent times has been expanded by Landauer and Bennett into a theory of computation. (15) Benioff, Feynman, Zurek and Rössler have brought quantum physics and chaos theory into play, as did their predecessor, J. v. Neumann. (16)

The job of the demon is to transform entropy into information, while the information-erasing operation changes information back into entropy. These are the two sides of an interaction between an information-processing machine (the demon) and a classical thermodynamic system. (17) It was R. Landauer's idea to define the demon as an information-processing machine, or, in other words, as a computer. (9) Maxwell's Demon became a computer-controlled device that interacted with the gas. The Boltzmann entropy of the gas was reduced at the expense of the entropic enrichment of the demon's own informational content. Erasure of a bit of information requires a minimum amount (kT ln 2) of heat dissipation into the environment. Thus entropy reduction only lasts as long as the demon continues to gorge itself with more and more information. The question of the entropic cost of information, first raised by Szilard, was investigated more precisely by Brillouin, Landauer and Bennett. To Brillouin and Bennett, measurement and erasure, the accumulation and annihilation of information, are thermodynamically expensive operations, since they increase entropy. Landauer cited the relationship to the environment as a way out of this dilemma. Paul N. Fahn (14), too, took this third path in his calculation of the entropic cost of information. To him, neither measurement nor erasure is, in principle, an expensive thermodynamic operation. But the de-correlation of the system from the information increases entropy in the system-cum-information, thereby increasing entropy in the universe, unless the information is used to reduce entropy elsewhere before the correlation disintegrates. The thermodynamic cost of information rises to the degree that it is not used to obtain work from the observed and measured system. De-correlation between information and system is, therefore, the actual entropy-producing event.

Modern communication theory does not derive solely from thermodynamics and statistical mechanics, however; its origins can also be found in the field of electrical communication, in the transmission of signals through electric currents. After F. B. Morse's invention of telegraphy in 1832, which involved the transmission of messages through the presence, or the longer or shorter absence, of an electric current, questions immediately arose relating to the limits of the speed and precision of signal transmission. External currents are always present, which interfere with and disturb the signal being transmitted, thus impeding the differentiation between alternative signals. The disturbances caused by such currents, which were called "noise", clearly needed to be reduced as far as possible. Harry Nyquist published some of the first important mathematical contributions towards modern communication theory, "Certain Factors Affecting Telegraph Speed" (1924) and "Certain Topics in Telegraph Transmission Theory" (1928), in which he showed how the speed of signal transmission could be increased and also introduced the logarithmic function as the comparative measure of information. R. V. L. Hartley, in his "Transmission of Information" (1928), gave a first formal definition of information, which he viewed as a sequence of symbols: H = n log s, where H represents the information of the message, i.e. the logarithm of the number of possible symbol sequences, n stands for the number of symbols chosen and s equals the number of symbols available.
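Hartley's measure can be illustrated with a few lines of Python (a sketch of my own, using log base 2; Hartley left the base of the logarithm open): H = n log s is the logarithm of the number s^n of possible sequences of n symbols drawn from an alphabet of s symbols.

```python
import math

def hartley_information(n_symbols: int, alphabet_size: int) -> float:
    """H = n log s, here with log base 2, so the result is in bits."""
    return n_symbols * math.log2(alphabet_size)

print(hartley_information(8, 2))    # 8 binary symbols carry 8 bits
print(hartley_information(3, 26))   # 3 letters from a 26-letter alphabet: about 14.1 bits
```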

During the war the subject of noise became more pressing than in peacetime, as it became necessary, for example, to correctly interpret "noisy" radar data. Devices were sought that could filter out the noise signals. A. N. Kolmogorov and Norbert Wiener provided the solutions to these problems. In the same year (1948) in which Wiener published his book "Cybernetics", Claude E. Shannon published his famous article "A Mathematical Theory of Communication" in the Bell System Technical Journal (the same journal, incidentally, in which Nyquist and Hartley had also published, a fact Wiener referred to in his introduction). Shannon placed particular emphasis on the effect of noise in the communication system and in the information channel, as one may glean from his famous diagram of the communication system. The reason for this was that the semantic aspect of communication is irrelevant to the engineer's view of communication, since the latter's fundamental problem of communication simply consists in how a message selected at one point can be reproduced exactly, or in as close an approximation as possible, at another point.

It was the afore-mentioned Nyquist who named the electrical fluctuations caused by heat "Johnson noise" or "thermal noise", after their discoverer, J. B. Johnson. This "noise" is a particularly simple, universal and unavoidable noise which sets natural limits on signal transmission systems. This noise is added to every signal. Each message is disturbed by noise, be it during transmission or at the receiving end: once the signal has been received there always remains an undesirable uncertainty, i.e., noise, regarding what the message sent really was. Shannon introduced additional observers who would correct the deviations, caused in one way or another by noise, between the data sent and the data received.

Shannon developed a number of methods to define the capacity of a "noisy channel", which simply has its limits in entropy or statistical uncertainty. H = – Σ pi log pi is the entropy of the set of probabilities p1, …, pn. Shannon's observer, who sees both that which is sent and that which has been received in distorted form on account of the errors caused by noise, notes the errors and transmits the data via a "corrective channel" or an "error-correcting code" to the receiver, who will then correct the errors. If Hy(x) is the amount of additional information required per second to correct the message received, then we can define a limit of channel capacity for channels with noise. Let a discrete channel have the channel capacity C and a discrete source the entropy H per second. If H < C, then there exists a code such that the output of the source can be transmitted over the channel with an arbitrarily small frequency of errors. It is thus assumed that there may be an ideal observer who could correct the errors as well as the noise of the information source or the information channel. By reducing our uncertainty about the condition of the system, the message reduces the thermodynamic entropy of the system. The reduction of entropy, however, augments the system's free energy, and this increase in free energy is proportional to the minimum energy required to transmit the message that produced it. The price one has to pay for information about one's own system, information which reduces the (thermodynamic, statistical) entropy of the system, is proportional to the (information-theoretic) entropy of the signal source that produces the information. The price is always just as high as it needs to be to avoid a perpetuum mobile of the second order, so as not to contravene the second law of thermodynamics.
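As a concrete, hedged example of the limit H < C, the binary symmetric channel, a standard textbook case not discussed in the essay, has capacity C = 1 – H(p), where p is the probability that a transmitted bit is flipped by noise; the Python sketch below (function names are mine) computes it.

```python
import math

def binary_entropy(p: float) -> float:
    """H(p) = -p log2 p - (1 - p) log2 (1 - p): the uncertainty of a biased coin."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(flip_probability: float) -> float:
    """Capacity C = 1 - H(p) of a binary symmetric channel; any source with
    entropy H < C can be transmitted with an arbitrarily small error frequency."""
    return 1.0 - binary_entropy(flip_probability)

for p in (0.0, 0.01, 0.11, 0.5):
    print(f"flip probability {p:.2f}: capacity {bsc_capacity(p):.3f} bits per channel use")
```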

Entropy is a measure of chance and of disintegration. The tendency of physical systems to become ever less organised and increasingly to fall apart is associated with entropy. The arrow of time, the irreversibility of time, comes about as a result of entropy. Within a communication theory based on information theory, information is defined as the number of available choices. If a situation is highly organised, there are few available choices, the degree of chance is low and the system is pre-determined; hence there is little information. A chaotic (deterministic, non-linear) system, therefore, in contrast to a purely deterministic system, contains more information, since it has more degrees of freedom, available choices and uncertainties. Thus freedom of choice, entropy and information, defined as the logarithm of the number of available choices, converge as concepts. The greater the freedom of choice, the greater the information, and the greater also the uncertainty. Noise, however, equally means increased uncertainty, so that one might mistakenly assume that increased noise means heightened uncertainty and hence increased freedom of choice, i.e., information. This is, of course, paradoxical. One therefore needs a process that distinguishes desirable uncertainty (information) from undesirable uncertainty (noise).

This task should be carried out by the channel capacity or by the ideal observer.
Noise, therefore, threatens information in several ways. The classic communication theory of information theory and cybernetics firstly simplified the problem of noise by excluding semantic problems, and secondly viewed it naively, for example by interpreting the observer not as a source of errors but as a corrector of errors. In a way, it represents a partial retrogression to the time before the thermodynamic theory of entropy. The approaches of quantum physics and chaos theory to information and entropy, as derived from thermodynamics, appear to me the most promising for neutralising the paradoxes and aporias of the theories of entropy and information, as exemplified by Maxwell's Demon, Szilard's machines etc., because they place the problem of the observer at the centre of attention. The noise of classical communication theory is more or less the noise of one's own signal, where the observer acts to correct errors. The noise in quantum physics is the noise of the observer, who unavoidably and necessarily produces errors.

Gödel's 1931 work "On Formally Undecidable Propositions of Principia Mathematica and Related Systems I" (18) was the first proof of the unavoidable incompleteness or uncertainty of a system, of the information about a system, of the self-assertions of a system about itself or about its own condition, when viewed from the inside. In elementary number theory there are, according to Gödel, propositions that are true but cannot be formally proved. In the universe of numbers there will always be things we won't know. Gregory Chaitin universalised Gödel's result of 1931 and Turing's halting problem of 1936 (19) by proposing a thermodynamic, statistical-mechanical approach to mathematics, which claims a chance structure for some areas of arithmetic. (20) From Gödel's result and Boltzmann's statistical mechanics he developed an algorithmic theory of information or "thermodynamic theory of knowledge", which does not exclude uncertainty and chance, because there are areas of arithmetic where problems cannot be solved by drawing logical conclusions, since these areas are governed by chance. Uncertainty, the lack of predictability and information, and chance are thus omnipresent principles not only in pure mathematics, but also in classical physics and in quantum mechanics. Shortly after Gödel had introduced his famous proof of the incompleteness of arithmetic (when regarded from the inside), his friend von Neumann began to investigate the question of whether quantum mechanics might present a similar limitation, this time within a physical context. Fortunately, von Neumann was able to show that, if quantum mechanics is accepted as the basic theory of physics which comprises all other possible theories as special cases, then there is indeed no reason to worry. For the structure of quantum mechanics guarantees that "the informedness of the observer regarding his own condition" is excluded from the formalism. (16) The question of the observer, or rather of the noise of the observer who both generates information and at the same time erases it, was introduced by quantum physics in order to calculate the entropic cost of observations or information. Otto Rössler's endophysics, developed from about 1980 onwards as a possible explanation of quantum mechanics, heightens the problem of the observer by its distinction between an internal observer, to whom only certain aspects of the world are accessible and for whom the rest of the world is distorted in a manner incorrigible and unrecognisable to him, and an external observer, who, as a kind of super-observer, can only be construed within the confines of model worlds. The world is only ever defined at the interface between the observer and the rest of the world. Thus the observer's position is a regulator that can be moved along a scale between paradise (information) and hell (error). Information is therefore unavoidably observer-relative. Of necessity the observer creates noise. He can escape this noise of observation only by himself becoming a part of the information model. Just as in the theorem proposed in 1964 by John Bell, on remote effects across arbitrary distances and the existence of non-locality and indeterminism, where information becomes accessible to us via (statistical) correlations, so too the noise of the observer can only be resolved by remote correlations.

Observation by an observer is, therefore, no longer sufficient to increase information; rather, what is required is an increased correlation and co-variance of observers and observations. It is questionable, however, whether we can grasp these correlations.
II. INFORMATION AND ENTROPY IN SOCIAL SYSTEMS
If we consider the natural sciences, as we have done up to now, to be the key science of the modern world, it may perhaps be permissible to transfer the problems of information, entropy and noise from physical systems to social systems, and there, too, to inquire after the relative relationships between information, entropy and observation. Information-theoretical communication theory has neglected this question of energy dissipation and the problem of the observer. Quantum physics has acquainted us with the fact that, in observing systems and objects, we must not dismiss the role of the observer. Niels Bohr promulgated the famous theory that the act of observation in turn influences the very object of our observation. John Archibald Wheeler went even further by saying that a phenomenon is a phenomenon only if it is also an observable phenomenon. Here, the informedness of the observer is of central importance. A condition noted by the internal observer is different from that which "objectively exists" and can be observed from the outside. The quantum demon therefore describes the problem of the noise-generating observer within information systems.

What quantum theory has described for physical systems can also be applied to social systems. Here, too, the deciding factors are the informedness of the observer, his knowledge of his own condition and a distinction as to whether he is an internal observer who is part of the system under observation, or an external observer outside the system he is observing. The theories of quantum physics on the dependency of a system's information level on the observer are also valid for social and cultural systems. A quantum theory of cultural theory is sorely needed. We must part with the traditional historical notion that there is a pure and objective description of the occurrences in the world of the mind, in which the observer's contribution to the phenomena under observation can be disregarded or subtracted. We must take leave of this cliché and this illusion. For, on the contrary, in the world of the media in particular, Wheeler's theorem applies: only an observed phenomenon is a true phenomenon. Only what is represented in the media also exists, and the form in which it exists in the data space equally depends on the position of the observer. Thus the critic and the theoretician of culture act, willy-nilly, as real-life observers. The observed object's own signal becomes inseparably mingled with the observer's own signal or noise.

In a nutshell, this would be an information theory based on quantum physics, which might be more appropriate to the practice of a trade in information and works, to their placement and displacement, their publication and suppression in the post-industrial, information-based capitalist society, than the classic idealist theory, in which the influence of the observer (critic, curator, theoretician, editor) on the matter being observed, and on the information, which is only actually constructed and codified by means of the act of observation, has been denied or neglected. Information and the observer can no longer be divided. The noise of the observer, the indeterminacy relation between information and observer, is not arbitrarily reducible. In the present world, in which, from medicine to economics, access to information and the spread of information are gaining an ever more fundamental and central importance worldwide, the above-mentioned limitations are particularly noteworthy, since quite obviously there is a danger, firstly, of mistaking noise for information and, secondly, of not eliminating this noise as the amount of information increases, but of increasing it, in accordance with the theorem of quantum physics and endophysics, where the internal observer does not know that he is an observer and takes his own noise for information from the situation under observation.

The attempts of sociologists, from Harold Lasswell to Walter Lippmann, (21) to analyse the origins of information in social systems have not been very successful. Not until the theses of Noam Chomsky and Niklas Luhmann, in which the noise of the observer plays a constitutive role, do tentative explanations arise of the conceivable impoverishment of the information dimension which currently dominates the public domain, where information concerning cultural and political transactions remains inaccessible. The "manufacture of consent" (22) is what is generated today by the noise of the observers and by the current level of information handed out by the mass media. The entropic cost regulates and dominates the information market of the Western world. Entropy is the measure of the mass media. Will the digital data highways become a part of this information bottleneck, of this entropy, or will they form further correlations and covariances between the observers, as would be necessary? Will the telematic society at last try to counteract its suppressions of information through the correlations and co-variances of the observers via worldwide networks? Postmodern society is information-based. It is no longer mechanical machines that support the social servicing system, but information machines, such as computers, that do the job. The dogmas of the information society are: there is more information than ever; information is generally more easily accessible than ever; information is being exchanged more than ever. Do these bytes for the soundbite generation make humanity any cleverer or any more knowledgeable? Does the information intake of each individual person actually increase, or does, rather, more information than ever get lost? Is, in fact, the exchange of information decreasing? Don't people, experts aside, know less about one another than ever? Isn't an information implosion and an information bottleneck developing in the digital data nets? In the age of multiple media, infotainment, knowledge software, edutainment and the data highways, Ars Electronica 1995 puts critical questions to the myths and dogmas of the postmodern information-oriented society.

Postmodern society consists of very complex, dynamic social systems within which the idea of information plays a central role. The exchange of data in the network of information machines supports the social servicing system, from medicine to tourism, from the running of the economy to leisure-time activities. The theory of information has become a key science. The spread of information through the mass media can, however, also become a part of the arsenal of repressive and optimising strategies of those in power. The exchange of data can flare up into a data war.

Just as centuries ago, with the aid of atlases and meridians, new territories were measured and devised, discovered and construed, so too the global data networks represent a new, if virtual, geography. The discourse of cyber-culture has expanded onto the data highways. We no longer merely inhabit streets and buildings, but also cable channels, telegraph wires, e-mail boxes and thus a global digital dependency. It was in 1969 that, for the first time, four computers were linked in a network system referred to as the ARPANET (Advanced Research Projects Agency NET). The name INTERNET is used to describe any linking of computers that communicate with one another via a protocol such as TCP (Transmission Control Protocol) or IP (Internet Protocol). In 1972 this project of the American Department of Defense was presented to the public, and many universities and other research institutions joined the net. In 1990 the Internet consisted of some 3,000 local networks with more than 20,000 computers. By 1994 their number had reached 2.5 million. Bill Gates expects some 20 million network-linked households and institutions by the year 2000. The information system World Wide Web (WWW) was developed at the European CERN laboratory (by Tim Berners-Lee) and, due to its hypertext links (so-called hyperlinks), represents the most flexible tool within the Internet.

The global data nets must not be understood merely as multi-media data banks and communication channels in which texts, images and sounds are transported and processed. These data networks also permit new forms of communication and new communication partners, such as communication with software agents equipped with artificial intelligence, prototypes, as it were, of subjects without a physical body. Forms of communication become possible between real people in virtual spaces and between virtual people in real spaces. The noise of the observer and of the communication partner can be employed constructively and can alter the structure, and thus also the message, of communication at each individual location. One-dimensional communication between two partners with two different interpretational worlds is broken up into multi-dimensional communication with multiple means of interpretation. This loss of mutual control and definitiveness can be experienced as a form of liberty.

There will come a time in the realm of the public media of giant cataclysms of exploding errors, of accelerated wanderings of galaxies made up of prematurely incinerated information dust. Our concepts of information, communication and observation will alter radically and will also affect the social systems as we know them. Our political systems will be subjected to radical transformations on the basis of democracy, or will become accomplices of the monopolies and totalitarian systems ("wired democracy", computer democracy, telecracy, videocracy). The artistic net projects of Ars Electronica 95 in particular will lift the curtain worldwide for the first time to allow a glimpse of this horizon of the digital data highways, which until now have been presented to us more or less as phantoms of the media. The financial and human costs as well as the strategies of a networked society will be critically questioned. How will the inhabitants of the net live in this wired world? What will be the price of information and communication within these network worlds? Who will be the hitch-hikers and hi-jackers on the superhighways of information? Diving through digital conduits and net-surfing on a sea of data will bring about new forms of social contact, ranging from the telematic reconstruction of the body to the individual acquisition of data monopolies. Ars Electronica 95 gives critical and euphoric experts an opportunity to appraise this brave new networked world.

Thanks to Otto E. Rössler for his inspiration


(1)
J. C. Maxwell, Theory of Heat. 4th ed., Longmans, Green and Co., London 1875, S. 328-329.

(2)
M. v. Smoluchowski, Experimentell nachweisbare, der üblichen Thermodynamik widersprechende Molekularphänomene. Physik. Z. 13, S. 1068-1080, 1912.

(3)
Paul Ehrenfest, letter to Samuel Goudsmit, George Uhlenbeck and Gerhard Dieke, Nov. 1927. In: Niels Bohr. Vieweg, Braunschweig, 1958, S. 152.

(4)
Leo Szilard, Über die Entropieverminderung in einem thermodynamischen System bei Eingriffen intelligenter Wesen. Z. f. Physik 53, S. 840-856, 1929.

(5)
Claude E. Shannon, A Mathematical Theory of Communication. Bell System Technical Journal, July and October 1948.

(6)
L. Brillouin, Maxwell's demon cannot operate: Information and entropy. I. J. Appl. Phys. 22, S. 334-337, 1951. L. Brillouin, Science and Information Theory. 1956.

(7)
John von Neumann, lecture in 1949. In: Theory of Self-Reproducing Automata. Arthur Burks (ed.), Univ. of Illinois Press, Urbana, 1966, S. 66.

(8)
Rolf Landauer, Irreversibility and Heat Generation in the Computing Process. IBM J. Res. Dev., vol. 5, 1961, S. 183-191. Reprinted in: H. S. Leff / A. F. Rex (eds.), Maxwell's Demon. Princeton Univ. Press, 1990, S. 188-196.

(9)
Rolf Landauer, Information is Physical. Phys. Today, vol. 44, May 1991, S. 23-29.
Rolf Landauer, Computation, Measurement, Communication and Energy Dissipation. In: Selected Topics in Signal Processing. S. Haykin (ed.), Englewood Cliffs: Prentice Hall, S. 188-196, 1989.

(10)
Charles H. Bennett, Logical Reversibility of Computation. IBM J. Res. Dev., vol. 17, S. 525-532, 1973.
C. H. Bennett, Demons, Engines and the Second Law. Scientific American, 257, S. 108-116, 1987.
C. H. Bennett, The Thermodynamics of Computation. Int. J. Theor. Phys. 21 (12), S. 905-940, 1982.

(11)
E. Fredkin and T. Toffoli, Int. J. Theor. Phys., vol. 21, S. 219-233, 1982.
E. Fredkin, Digital Information Mechanics. Preprint, 1983. Digital Mechanics. Physica D 45, S. 254-270, 1990.

(12)
P. Benioff, The Computer as a Physical System: A Microscopic Quantum Mechanical Hamiltonian Model of Computers as Represented by Turing Machines. J. Stat. Phys., vol. 22, S. 563-591, 1980.
P. Benioff, Quantum Mechanical Models of Turing Machines that Dissipate No Energy. Phys. Rev. Lett., vol. 48, S. 1581-158, 1982.

(13)
W. H. Zurek, Maxwell's Demon, Szilard's Engine and Quantum Measurements. In: Maxwell's Demon: Entropy, Information, Computing. H. S. Leff / A. F. Rex (eds.), Princeton 1990, S. 249-259.

(14)
Paul N. Fahn, Entropy Cost of Information. In: Proceedings of the Workshop on Physics and Computation, PhysComp '94. IEEE Computer Society Press, Los Alamitos, Cal. 1994, S. 217-226.

(15)
Harvey S. Leff, Andrew F. Rex (eds.), Maxwell's Demon: Entropy, Information, Computing. Princeton University Press 1990.

(16)
John v. Neumann, Mathematische Grundlagen der Quantenmechanik. Berlin 1932, Chapter 5.
R. Feynman, Quantum Mechanical Computers. Opt. News, vol. 11 (2), S. 11-20, 1985.
Otto E. Rössler, Endophysics. In: J. L. Casti, A. Karlqvist (eds.), Real Brains, Artificial Minds. North Holland, N.Y. 1987, S. 25-46.

(17)
W. H. Zurek, Algorithmic randomness and physical entropy. Physical Review A 40 (8), S. 4731-4751, 1989.

(18)
Kurt Gödel, Über formal unentscheidbare Sätze der Principia Mathematica und verwandter Systeme I. Monatshefte für Mathematik und Physik, 38, S. 173-198, 1931.

(19)
Alan Turing, On Computable Numbers, with an Application to the Entscheidungsproblem. Proc. of the London Math. Soc., Series 2, vol. 42, S. 230-265, 1936-1937.

(20)
Gregory J. Chaitin, Information, Randomness and Incompleteness. World Scientific, Singapore, 1990.

(21)
Walter Lippmann, Public Opinion. 1922.

(22)
Noam Chomsky, Edward S. Herman, Manufacturing Consent. Pantheon Books, N.Y. 1988.