The Construction of the Network Commons


Armin Medosch

In recent years, there have been numerous initiatives to set up free wireless citizens’ networks. These so-called freenetworks (1) use wireless network technology to construct their own independent network infrastructure. A conceptual model and overall approach to a network commons can be derived from the principles and methods utilized in going about it. This is closely related to and constitutes a special case of the knowledge commons. The discussion surrounding the knowledge commons arose as a reaction to an increasingly repressive climate in the late 90s. The purported abuse of freedoms in the Internet through practices like file-sharing led the copyright industries to resort to drastic measures that have ranged from lobbying for draconian legislation all the way to criminalizing users. Additional pressure has arisen from the state's cravings for increased powers of surveillance and a number of other public- and private-sector motives for “taming” the Internet. This raises the danger that, as a result of excessive control, society’s essential interest in the dissemination of knowledge will suffer collateral damage in the copyright wars.

The successes of the free software and open source software scenes in creating a commons dedicated to freely usable software served as inspiration for a growing international community to get actively involved in bringing about a society in which knowledge is free. The centerpiece of the effort is the conflict surrounding intellectual property, free and democratic access to knowledge and to the means of production and dissemination of cultural artifacts. The introduction of the term network commons is meant to bring more depth and breadth into a discussion that has focused until now on licenses to commercially exploit property. In this article, I will show that networks cannot be understood solely as carriers of information; they are also aggregators of options for human actions and activities in a much more comprehensive sense.

The term “commons” (German: Allmende) originally referred to a resource that belonged to the village community as a whole, typically a tract of land upon which any villager’s livestock could graze. Allmende stems from Middle High German and was hardly used in common parlance prior to its revival by the digital debate. In Great Britain and the US, the commons debate has been overshadowed by the orthodox interpretation of the “Tragedy of the Commons”: the use of resources by those acting in their own interest is said to lead automatically to their destruction, as Garrett Hardin maintained in his influential 1968 essay. (2) Since then, Anglo-American discussions have been characterized by these overtones of an inevitable tragedy, in that Hardin’s position is used as an end-of-discussion argument against all forms of emerging collective self-organization. (3) To this can be added the fact that both Great Britain and the US experienced the trauma of “enclosure,” the fencing in (i.e. privatization) of commonly used land during the course of industrialization. Other cultures were spared such a thoroughgoing process of privatization, so that forms of common property or the community usage of property could be maintained considerably longer under certain circumstances, and in some cases even to the present day. Thus, the very choice of the concept of “commons” to frame a debate means getting involved with elements that are to a certain degree specific to Anglo-American culture. In full cognizance of this, the concept was chosen anyway for lack of a better alternative.

Just like all analogies drawn between computers and the real world, the conception of the Allmende as village green has only very limited validity. One crucial difference is that any number of copies can be produced from a computer file without thereby destroying the original. The relevant core significance thus lies in the abstraction “common property” or “jointly used resource.” This article will elaborate on the conditions of existence of the network commons using the example of the practices of freenetworks.

Groups like Consume and Free2air in London, Freifunk.net in Berlin and Funkfeuer.at in Vienna propose a decentralized, self-organizing network model. The elementary units of this network are the individual (wireless) nodes. A node could be, for example, someone with an ADSL connection and a WLAN access point, or any local user community with a permanent Internet connection and a local (wireless) network. When these individuals or groups make arrangements with one another and interlink their nodes, they create a larger wireless network: a free data cloud. This network is the result of the joint actions of formally independent participants. All of the physical components of a node are managed by their owners and users themselves. In their internal relationships, these nodes are not dependent upon commercial network infrastructure because they can use a license-free portion of the spectrum for transmission. Within this free wireless network, users enjoy the luxury of relatively good transmission rates. The arrangement of communications and the design of applications are done by the users themselves, as is the formulation of fundamental principles or across-the-board conventions.

This free data cloud can also be described as the Intranet of a grass-roots democratic network cooperative, whereby this Intranet must have at its disposal at least one gateway to the Internet. So then: are the participants motivated purely by altruism, or do they also expect some advantage from their actions? The minimalist answer is that by sharing, the bandwidth available to everyone increases while the price per person declines commensurately. The maximalist answer is that this is a proposed model of how the world could find its way to another mode of dealing with telecommunications: by turning it into community property so that it ceases to be a commercial commodity. The Internet itself is the best evidence that 1) the minimalist variant works, and 2) it is even possible to consider a maximalist variant.
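
To make the minimalist argument tangible, here is a back-of-the-envelope sketch with invented figures (eight households, 30-euro ADSL lines of 1 Mbit/s, all numbers chosen purely for illustration): pooling a few uplinks over a shared wireless network lowers the monthly cost per household and, since usage peaks rarely coincide, raises the bandwidth each node can burst to.

```python
# Back-of-the-envelope sketch of the "minimalist" argument, with invented
# figures: a handful of neighbours pool their DSL uplinks over a shared
# wireless network instead of each paying for a private line.

NEIGHBOURS = 8             # hypothetical number of participating households
DSL_COST_EUR = 30.0        # hypothetical monthly price of one ADSL line
DSL_DOWNSTREAM_MBIT = 1.0  # hypothetical capacity of one ADSL line

# Scenario A: every household buys its own line.
solo_cost = DSL_COST_EUR
solo_peak = DSL_DOWNSTREAM_MBIT

# Scenario B: only three households keep a line and share it over the
# free wireless network; because usage peaks rarely coincide, each node
# can burst well above the capacity of a single line.
shared_lines = 3
shared_cost = shared_lines * DSL_COST_EUR / NEIGHBOURS
shared_peak = shared_lines * DSL_DOWNSTREAM_MBIT

print(f"solo:   {solo_cost:.2f} EUR/month, peak {solo_peak} Mbit/s")
print(f"shared: {shared_cost:.2f} EUR/month, peak {shared_peak} Mbit/s")
```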

The Internet Protocol Suite, also known as TCP/IP, which has been developed since 1972 as part of a DARPA research project at American universities, has turned into a de facto universal network standard. In line with the principle that research results produced with public funding should also be made available to the public, the documentation of the individual development steps of the Internet protocols was published from the very beginning in the form of so-called “Requests for Comments.” As open and publicly accessible standards, the TCP/IP Internet protocols favored the “organic” growth of the Internet, because any device can be hooked up to it as long as it abides by these conventions. The Internet did not originate as a centrally planned network but rather as a “network of networks” in which many networks joined together and used the network-spanning protocols. The rapid proliferation of TCP/IP since the 1970s to its present status as a universal network standard profited from the integration of the protocol stack in Unix and from the possibility that any device and application could be developed on the basis of these standards.

The circumstance that the Internet protocols are free and publicly accessible can be established as the prime condition of the network commons. Additional conditions can be derived from the qualities of these standards. They are based on the assumption of a flexible network in which data does not follow fixed, predetermined routes; instead, it is broken down into individual data packets that are forwarded from node to node until they reach their destination. Thus, the very fact that this arrangement can be referred to as a network at all is a function of the intelligence of these local nodes and of the maintenance of shared protocols. The qualities of the Internet protocols make possible a highly dispersed, mesh-like network topology. Theoretically, no single computer within this network is in a privileged position, and every node fulfills the fundamental function of forwarding data packets for other nodes. Every node is thus in principle always a transmitter-receiver, whether on the simple connection level of the Internet or on the level of applications that make interpersonal communication possible. Ideally, the speed with which data is transmitted in this way should be the same in both directions (symmetrical communication). This evokes comparisons to Brecht’s radio theory, but one must simultaneously call to mind the character of the computer as a universal symbol-manipulation machine, so that this two-way principle can be applied not only to one medium but expanded to include any conceivable medium: so to speak, radio theory plus convergence, squared.
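
Purely as an illustration of this hop-by-hop principle (the node names, topology and forwarding tables below are invented, not taken from any real network), a few lines of Python can sketch how no single node needs to know the whole route: each node consults only its own small table and hands the packet to a neighbour, acting as endpoint and forwarder at the same time.

```python
# Minimal, purely illustrative picture of hop-by-hop forwarding in a mesh.
# Per-node forwarding tables: destination -> next hop.
forwarding_tables = {
    "balcony-12":   {"gateway-dsl": "rooftop-cafe"},
    "rooftop-cafe": {"gateway-dsl": "church-tower", "balcony-12": "balcony-12"},
    "church-tower": {"gateway-dsl": "gateway-dsl", "balcony-12": "rooftop-cafe"},
    "gateway-dsl":  {"balcony-12": "church-tower"},
}

def forward(packet, node):
    """Hand a packet from node to node until it reaches its destination."""
    hops = [node]
    while node != packet["dst"]:
        node = forwarding_tables[node][packet["dst"]]  # consult only the local table
        hops.append(node)
    return hops

packet = {"src": "balcony-12", "dst": "gateway-dsl", "payload": "one packet of many"}
print(" -> ".join(forward(packet, packet["src"])))
# balcony-12 -> rooftop-cafe -> church-tower -> gateway-dsl
```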

During the phase of Internet utopianism in the ’90s, this quality of the Internet protocols that makes possible a highly dispersed network topology often gave rise to speculation drawing analogies between the technical decentralization of the Internet and a non-hierarchical, grass-roots democratic social order. Such a direct superposition of social and technical qualities has, however, proven to be erroneous technological determinism that often went hand in hand with the fetishizing of technical communications media, whose influence upon political and economic conditions was overestimated. With the collapse of the New Economy around the turn of the millennium, many of these techno-utopian castles in the air went up in smoke too.

In contrast, the significance of the digital commons has clearly expanded from approximately 1994 until today. A key building block of the digital commons is the existence of free software and a licensing system that protects it. The General Public License permits the free use of software, the examination and modification of its source code, and the passing on of the software to third parties, on condition that its licensing provisions are adhered to. The viral character of the GPL has led to the creation of a growing pool of free software ranging from operating systems to various network services to applications. Many key applications in the Internet can be provided without any need for proprietary software. In addition to open standards, free software is thus also a condition for the sustainable existence of the digital commons.

Others, inspired by the GPL, have developed additional “copyleft” licenses, which protect not only programs but also specific content such as images, texts and pieces of music. Examples are the Open Content License and the Creative Commons licenses. A growing number of authors use these licenses to place their creative products at the disposal of the public. An important aspect of this is the fact that both free software and free content break out of the producer-consumer straitjacket: every reader is a potential writer. The network commons constitutes a special case of the digital commons. On the one hand, it is based on the previously mentioned components of open standards and free software. Beyond hardware, however, networks also need a transmission medium. The wireless networks set up according to the WLAN standard take advantage of a loophole in the frequency regulations. The governmental regulatory agencies divide the electromagnetic spectrum into bands, and the use of each of them is reserved for specific wireless technologies and certain users, for example public TV stations, emergency services and the military. Proprietors of these exclusive rights of usage have a strong economic interest in never surrendering them, which is why it currently looks as though there is hardly any “room” left in the spectrum.

A special case is the so-called ISM (industrial, scientific and medical) band, part of which is the range between 2.4 and 2.5 GHz used by WLAN technology. The governments of most countries have rules in effect that free this band from licensing requirements and make it available for use by all. The upshot is that there is no guarantee of quality: nobody on this band has any special rights, which can lead to overuse and thus to interference. But it also means that no one needs to ask permission, and the band can be used free of charge. Meanwhile, the experiment of opening up the ISM band to unrestricted general use has been deemed a success. In the US, a lobby has formed under the “Open Spectrum” banner and is demanding the lifting of controls over the entire spectrum. Progress in the areas of spread-spectrum transmission and “cognitive” wireless technologies would make conventional frequency regulation obsolete and allow spectrum regulation to be left up to the devices themselves, according to American proponents of the open spectrum idea.

In contrast to free software, which, once produced, can be copied and distributed at very low cost, free networks require permanent maintenance. This involves, on the one hand, the acquisition, operation and upkeep of the equipment used in a network commons; on the other hand, there is also an investment in social self-organization. To even be able to speak of a network, there has to be more than one node, i.e. it is necessary to establish connections. This process includes finding partners who are willing to get connected and scouting out the terrain, since visual (line-of-sight) contact between the node locations (or rather, their antennas) is necessary. Furthermore, participants have to come up with rules for the joint use of the network, which is a matter of striking a balance between individual freedoms and needs and the sustainable functionality of the network. The dangers and pitfalls are many. Unbridled file-sharing can bring the best wireless network to its knees. Moreover, the increasingly restrictive legal situation with respect to file-sharing raises the question of responsibility for the actions of the network’s participants. Although there have so far (thankfully) been hardly any cases to serve as precedents in dealing with this problem, the question of responsibility and of the definition of the boundaries to the WWW in general (and not only with respect to file-sharing) is by no means trivial.

The freenetwork community is responding to these challenges in a number of ways. One segment is putting its money on technological means like authentication (e.g. using the free software Nocat) and bandwidth shaping (assigning a maximum of usable bandwidth to each individual user). Dynamic routing protocols also enjoy great popularity among the technical freenetwork community; they enable wireless networks to configure themselves by recognizing new nodes or the breakdown of nodes, and thus almost fully automate the optimal routing of data packets. The developers of hardware-software solutions like Locustworld in London and MeshCube in Hamburg and Berlin have made significant contributions by having analyzed the needs of the freenetwork scene and developed devices that facilitate the procedure of dynamic linking even for non-experts. Despite tremendous progress in this area, it will still take some time to achieve stable, large-scale, dynamic, mobile, ad hoc mesh networking. Others are going with maximal openness. In their opinion, technological solutions like these will never be completely sufficient and an element of social networking will always play a role.
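
The following toy sketch is not OLSR, AODV or any other routing protocol actually deployed in mesh networks; it merely illustrates, with an invented topology, the job such protocols automate: when a relay node disappears (or a new one appears), routes are recomputed and traffic flows around the gap without any manual reconfiguration.

```python
from collections import deque

# Toy sketch of what a dynamic routing protocol automates (not a real
# mesh protocol): recomputing paths whenever the set of reachable nodes
# and links changes.

def shortest_path(topology, src, dst):
    """Breadth-first search over the currently known links."""
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in topology.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

def without(topology, dead_node):
    """Topology after one node has failed (its links vanish with it)."""
    return {n: [m for m in peers if m != dead_node]
            for n, peers in topology.items() if n != dead_node}

# Invented example topology with two possible relays towards the gateway.
mesh = {
    "balcony-12":   ["rooftop-cafe", "schoolyard"],
    "rooftop-cafe": ["balcony-12", "gateway-dsl"],
    "schoolyard":   ["balcony-12", "gateway-dsl"],
    "gateway-dsl":  ["rooftop-cafe", "schoolyard"],
}

print(shortest_path(mesh, "balcony-12", "gateway-dsl"))
# ['balcony-12', 'rooftop-cafe', 'gateway-dsl']

# "rooftop-cafe" goes offline; the routing layer notices and reroutes.
print(shortest_path(without(mesh, "rooftop-cafe"), "balcony-12", "gateway-dsl"))
# ['balcony-12', 'schoolyard', 'gateway-dsl']
```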

In 2002, a group of networkers began developing a framework agreement designed to put into place fundamental conventions for the exchange of data in free networks: the Pico Peering Agreement. (4) Consideration was given to the question of what actually constitutes the core of this resource called a “free network,” and the formulators came to the conclusion that it is the willingness to grant others free transit: you can cut across my “virtual real estate” and I, in turn, can traverse yours. (As previously mentioned, this parcel-of-land metaphor is only partially tenable, but it suffices in this context.) The Pico Peering Agreement regulates the fundamental principles of free data transfer and implicitly describes what is “free” about free networks (in contrast to a sponsored, free-of-charge network). Like the General Public License for free software, the Pico Peering Agreement for free networks is meant to serve as a sort of Good Housekeeping Seal of Approval. It is an initial approach to establishing a constitution for the network commons, a declaration of basic rights and responsibilities.

As preconditions, the network commons requires the existence of open standards, free software, a freely usable transmission medium (open spectrum) and a self-determined set of rules (the Pico Peering Agreement or an equivalent thereof). An aspect of superordinate importance is the implementation of a network as the result of a process of decentralized self-organization. In contrast to the thinking typical of ’90s Internet utopianism, one no longer proceeds from the assumption that decentralized self-organization is a quasi-automatic function of the nature of the technology. Self-organization is conceived as an active process whereby economically and legally unencumbered participants voluntarily enter into collaborative relationships. This active, willful expenditure of personal energy, time and labor is made on the basis of a joint striving to achieve a larger whole that is more than the sum of its parts: the network commons. This social aspect of a dispersed network can be described with reference to theoretical models such as the one documented in “Freie Kooperation” (Spehr 2001). However, such a highly theoretical formalization of the network commons must be put off to the future for the time being.

Two essential motivational strands can be identified at present. On one hand, there is the desire to actively oppose the over-commercialization of the Internet. ISPs and telecommunications firms (especially those in the UMTS cell-phone business), acting as guards at the portal to the Net, have fashioned Internet access in a way that diametrically opposes efforts to achieve free and egalitarian communication. Inherent in the structure of their offerings, both technically and financially, is the conception of “consumers” who download information from the Internet and contribute little or nothing of their own. By selling asymmetrical Internet access and billing for it according to time spent online or quantity of data transferred, they relegate users to their place as consumers who buy access from a provider that owns the network and centrally administers it. There is no place for consumers in the concept of the network commons. The value of the network is not diminished by additional users; instead, it grows, in that these users come aboard as nodes in good standing in symmetrical two-way communication. The second motivational strand is nourished by the desire to set up a network on the basis of free cooperation and self-made rules. This expression of personal freedom of will through the medium of technological and social networking is understood as a value in and of itself. Furthermore, many observers assume that such networks, borne by a collective longing for a setting of free, self-determined communication, are necessary over the long term in order to safeguard freedom of expression and ensure free media. A by no means inconsequential side effect is the fact that participative and collaborative action is a way to actively test new and, it is hoped, sustainable ways of dealing with technology. As initial approaches already indicate, these processes can result in the development of alternative conceptions of the future of communications technology that originate “on the street” (5) instead of in the R&D labs of international conglomerates.

For the time being, however, the network commons is more utopia than reality. The functioning examples of network commons are relatively few (compared to the current total of approximately 600 million Internet users), and they are widely dispersed geographically. They seem to function best in places where locals have to compensate for a severe lack of infrastructure, for example in regions where the private sector has neglected to provide suitable connections. The largest known free wireless network in Europe is in Djursland at the northern tip of Denmark, an economically underdeveloped region that the communications industry has practically written off. Such initiatives are also booming in areas of East Berlin affected by the OPAL problem (fiber-optic local loops over which DSL cannot be offered) and in rural regions in the north of England.

As a rule, though, consumer thinking prevails, and people seem to prefer 30-euro-a-month ADSL access to chatting with their neighbors. Society’s atomization into individual consumer cells is quite advanced, both in reality and in the minds of the individuals concerned. This means there is no avoiding the question of whether the network commons has long-term chances of growth and survival if it is conceived as a sort of island amidst a system that is otherwise capitalist through and through, or whether it would not actually be necessary to revamp the entire system. Judging by personal experiences at conferences and festivals during recent years and months, most members of the freenetwork community are not willing to get involved in a discussion within such a large thematic framework, and are restricting themselves for the time being to implementing tangible, doable utopias using the means at hand. Setting aside large-scale revolutionary schemes can certainly have a thoroughly positive effect over the short term, though it might later lead to a theoretical deficit. This is why I suggest offering a setting here for these discussions of the conceptual and substantive construction of the network commons.

Translated from German by Mel Greenwald

(1)
A detailed and comprehensive description of the practice, culture and politics of freenetworks can be found in: Armin Medosch, Freie Netze, Heise Verlag, 2003.

(2)
Hardin, Garrett, “The Tragedy of the Commons,” Science, 1968.

(3)
Recent research in political theory (Ostrom, 1999) shows that the tragic destruction of community property occurs only under certain circumstances, and that the long-term cultivation of community property is very much possible as long as certain behavioral and self-regulatory mechanisms are in place.

(4)
The currently valid version of the Pico Peering Agreement, as well as an account of how it came about, is available at http://picopeer.net

(5)
In the sense of the cyberpunk dictum that “the street finds its own uses for things” (William Gibson).