Protocol and Control in Distributed Networks
The Internet is a global distributed computer network underpinned by protocol, ‘a set of technical procedures for defining, managing, modulating, and distributing information throughout a flexible yet robust delivery infrastructure.’ [1] Although not designed specifically for warfare, the Internet emerges from American military technology of the 1950s and 1960s. Its origins can be traced to the Advanced Research Projects Agency (ARPA), set up by the United States Department of Defense in 1958 with the aim of securing technological military superiority over the Soviet Union in response to the launch of Sputnik, the first artificial satellite. In the early 1960s, amid concerns about the ability of the United States’ telecommunications systems to withstand nuclear attack, Paul Baran at the RAND Corporation (and, independently of Baran’s work, Donald Davies at the British National Physical Laboratory) developed packet switching, a revolutionary new communications transmission technology. Baran proposed a communication network that would allow several hundred major communications stations to ‘intercommunicate with one another’ and ‘operate as a coherent entity’ after an enemy attack. [2] Having surveyed a wide variety of networks, Baran concluded that they all factored ‘into two components: centralized (or star) and distributed (or grid or mesh).’ [3] For Baran, the significant difference between centralised and distributed networks lay in the extent to which each could maintain viable communication channels following a targeted assault on military telecommunications infrastructure. Centralised networks are configured around a single central node that hierarchically commands and controls all activity, and as such are the most vulnerable under attack: ‘destruction of the central node destroys intercommunication between the end stations’. [4]
Perhaps somewhat surprisingly, a decentralised network is also a hierarchical structure, being merely ‘a multiplication of the centralized network’ [5] in which the destruction of just a small number of nodes can destroy communication. [4] The distributed network differs from both the centralised and the decentralised network in having ‘no central hubs and no radial nodes. Instead each entity in the distributed network is an autonomous agent.’ [6] This distributed communications network is independent of central command and control and can remain operational even after a number of its components have been destroyed.
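Baran’s claim about survivability can be made concrete with a small simulation. The following sketch is illustrative only (it is not Baran’s own model, and the nine-node graphs are invented for the example): it counts how many pairs of stations can still intercommunicate after the same node is destroyed in a centralised star and in a distributed mesh.

```python
from collections import deque

def connected_pairs(nodes, edges, removed):
    """Count pairs of surviving nodes that can still reach each other (BFS)."""
    alive = [n for n in nodes if n not in removed]
    adj = {n: set() for n in alive}
    for a, b in edges:
        if a in adj and b in adj:
            adj[a].add(b)
            adj[b].add(a)
    pairs = 0
    for start in alive:
        seen, queue = {start}, deque([start])
        while queue:
            for nxt in adj[queue.popleft()]:
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        pairs += len(seen) - 1
    return pairs // 2  # each reachable pair was counted from both ends

nodes = list(range(9))
# Centralised (star): every end station communicates via central node 0.
star = [(0, n) for n in range(1, 9)]
# Distributed (mesh): a ring with chord links, so each node has several routes.
mesh = [(n, (n + 1) % 9) for n in range(9)] + [(n, (n + 3) % 9) for n in range(9)]

# Destroying the central node severs all communication in the star...
print(connected_pairs(nodes, star, removed={0}))   # 0
# ...while the mesh keeps every surviving pair in contact.
print(connected_pairs(nodes, mesh, removed={0}))   # 28
```

The star loses all 28 surviving pairs with one deletion; the mesh loses none, which is precisely the property Baran sought.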
This research was later incorporated into the development of an interactive computer network known as ARPANET, built by one of ARPA’s smaller departments, the Information Processing Techniques Office (IPTO). By 1973 a number of these distributed networks were in development, and scientists began to explore the possibility of linking them together into a network of networks. For computer networks to interact with one another, standardised communication protocols are needed. The ultimate goal of Internet protocols is totality: to accept everything, no matter the source, sender, or destination. [7] This was achieved with the development and refinement of the transmission control protocol (TCP) and the inter-network protocol (IP), together forming TCP/IP, upon which the Internet still operates today. In the years that followed, networks proliferated, linked together through gateways. In 1985 five supercomputer centres were built in the USA, and the National Science Foundation established a ‘backbone’ network to connect them; with the later involvement of networks in Europe and Asia, a global network of networks was established. Following the Defense Department’s decision in the 1980s to make Internet technology commercially available, TCP/IP was widely adopted, and in the early 1990s the Internet was privatised. However, as Manuel Castells suggests, ‘[t]he current shape of the Internet is also the outcome of a grassroots tradition of computer networking’ [9]. The sharing of UNIX source code and the communities that developed around it contributed to the emergence of the open source movement: ‘a deliberate attempt to keep access to all information about software systems open’. [10]
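The packet-switching principle that TCP/IP builds on can be sketched in a few lines. This is a simplified illustration, not the actual protocol: a message is split into numbered packets, the network may deliver them out of order along different routes, and the receiver restores the original by sequence number, indifferent to what the packets contain.

```python
import random

def packetize(message: bytes, size: int) -> list:
    """Split a message into (sequence_number, payload) packets."""
    count = (len(message) + size - 1) // size
    return [(i, message[i * size:(i + 1) * size]) for i in range(count)]

def reassemble(packets: list) -> bytes:
    """Restore the original byte order, whatever order packets arrived in."""
    return b"".join(payload for _, payload in sorted(packets))

message = b"operate as a coherent entity"
packets = packetize(message, size=5)
random.shuffle(packets)  # packets take different routes and arrive out of order
assert reassemble(packets) == message
```

The receiver never inspects the payloads; like protocol itself, it regulates only the form of exchange, not the content.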
The World Wide Web, developed in 1990 by Tim Berners-Lee at CERN, extended this architecture through the HTTP protocol, together with HTML and the URI addressing scheme, enabling users to retrieve and contribute information across a consistent interface. The Internet is thus a centreless structural form that ‘resembles a web or a meshwork’ [11] and follows a ‘contrary organizational design’ [12] to the bureaucracy and hierarchy of centralised structures. Independent of central control, it is nonetheless underpinned by protocol, which ‘functions largely without relying on hierarchical, pyramidal or centralized mechanisms; it is flat and smooth; it is universal, flexible and robust.’ [14] Protocol is not concerned with the content of communication, but with the facilitation and maintenance of exchange between nodes. A distributed network operates without central hubs or radial nodes to organise communication; it is ‘characterised by equity between nodes… and a general lack of internal hierarchy.’ [16]
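The standardisation that protocol enforces is visible in the textual form HTTP prescribes. The sketch below composes and parses a minimal HTTP/1.1 GET request; it is illustrative only (the host and path are examples, and real clients use established libraries), but any client and server honouring this format can interoperate regardless of what the resource contains.

```python
def build_request(host: str, path: str) -> str:
    """Compose a minimal HTTP/1.1 GET request for the given URI path."""
    return (f"GET {path} HTTP/1.1\r\n"
            f"Host: {host}\r\n"
            "Connection: close\r\n"
            "\r\n")  # blank line terminates the header section

def parse_request_line(request: str) -> tuple:
    """Recover (method, path, version) from the first line of a request."""
    method, path, version = request.split("\r\n", 1)[0].split(" ")
    return method, path, version

req = build_request("info.cern.ch", "/hypertext/WWW/TheProject.html")
print(parse_request_line(req))
# ('GET', '/hypertext/WWW/TheProject.html', 'HTTP/1.1')
```

The request line and headers are pure form: protocol governs how the exchange is framed, while the content retrieved remains entirely open.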
The Internet promotes cooperation, collaboration, participation and sharing, whilst being rigidly controlled by protocol. As Alexander R. Galloway suggests, there is an explicit tension between freedom and control: the openness of network architecture and the ethos of open source culture coexist with highly standardised systems of regulation. For protocol to enable distributed communication between autonomous entities, it must enforce homogeneity and standardisation. It must be, in a certain sense, anti-diversity. [19] What appears as openness is therefore inseparable from the constraints that enable it.
This logic of control can be further specified through the work of Michel Foucault and Gilles Deleuze. For Foucault, power is not located in a central authority but operates through dispersed arrangements of practices and knowledges that structure the field of possible action. Protocol can be understood in precisely these terms: not as an external imposition on the network, but as an immanent set of technical conditions that render communication possible while simultaneously regulating it. To participate in the network is already to conform to protocol, which governs not by dictating content but by delimiting the forms through which content may circulate.
Deleuze extends this analysis by identifying a shift from disciplinary societies, organised around enclosed institutions, to societies of control, characterised by continuous modulation across open systems. In such a context, control is no longer exercised through fixed hierarchies but is distributed, adaptive, and infrastructural. The distributed network exemplifies this condition. Its architecture removes central command, yet embeds control within its operational logic through protocol. Control is thus not opposed to distribution; it is realised through and internal to it.
Many artists and theorists, conflating hierarchical structure with authority as such, initially assumed that distributed networks might resist control. However, as Galloway argues, distributed networks ‘produce an entirely new system of organization and control’ [24] which, while incompatible with pyramidal power, is equally effective in regulating behaviour. Protocol is ‘how control exists after decentralization.’ [26] It is synonymous with the network itself: a relational system that prescribes the parameters of exchange.
If protocol is both the condition of possibility for networked communication and the primary mechanism of control within it, then resistance cannot take the form of simple opposition. As Galloway suggests, opposing protocol directly is akin to opposing gravity: it is not impossible, but ultimately ineffective. [29] Instead, the question becomes how protocol might be engaged, redirected, or reconfigured from within.
For Galloway and Thacker, this suggests the possibility of ‘counter-protocol’: practices that do not reject the network, but operate immanently within its logic, exploiting its vulnerabilities, asymmetries, and points of indeterminacy. Protocol derives its authority not from a transcendental source but from its implementation: how it is written, adopted, and enacted. This shifts the site of political and aesthetic intervention from the level of representation to that of infrastructure.
For artists, this presents a distinct challenge. If there is no outside to protocol, then artistic practice cannot simply use the network as a neutral medium, nor can it claim resistance through participation alone. Instead, it must engage with the conditions of networked exchange themselves. This may involve designing alternative protocols, subverting existing ones, or deliberately misusing them in order to expose their limits. Such practices do not escape control, but render its operations visible, contingent, manipulable, and open to further iteration.
In this sense, the politics of distributed networks lies not in their structure alone, but in the ongoing negotiation of the rules that govern them. To ‘become a better enemy’, as Galloway and Thacker suggest, is not to stand outside the network, but to inhabit its logic more strategically: to intervene at the level where control is exercised most effectively, within protocol itself.
Notes
1. Thacker, “Foreword: Protocol is as Protocol Does,” xv.
2. Baran, “On Distributed Communications Networks,” 2.
3. Ibid., 3.
4. Ibid., 3.
5. Galloway, Protocol: How control exists after decentralization, 31.
6. Baran, “On Distributed Communications Networks,” 3.
7. Galloway, Protocol: How control exists after decentralization, 29.
8. Ibid., 42.
9. Castells, The Internet Galaxy: Reflections on the Internet, Business and Society, 12.
10. Ibid., 14.
11. Galloway, Protocol: How control exists after decentralization, 5.
12. Ibid., 3.
13. Thacker, “Foreword: Protocol is as Protocol Does,” xv.
14. Galloway, Protocol: How control exists after decentralization, 317.
15. Ibid., 33.
16. Ibid., 317.
17. Ibid., 29.
18. Ibid., 2.
19. Ibid., 142.
20. Hardie, “Post Fordist TV,” posting to Nettime mailing list, 24 May 2005
https://www.mail-archive.com/nettime-l@bbs.thing.net/msg02746.html (last retrieved 5 July 2020)
21. Lovink and Schneider, “Notes on the State of Networking,” posting to Nettime mailing list, 29 February 2004
https://www.nettime.org/Lists-Archives/nettime-l-0402/msg00099.html (last retrieved 5th July 2020)
22. Galloway and Thacker, “The Limits of Networking,” posting to Nettime mailing list, 24 March 2004
https://www.nettime.org/Lists-Archives/nettime-l-0403/msg00090.html (last retrieved 5th July 2020)
23. Ibid.
24. Galloway, Protocol: How control exists after decentralization, 318.
25. Ibid.
26. Ibid.
27. Galloway, Protocol: How control exists after decentralization, 85, 147.
28. Ibid., 142.
29. Galloway, Protocol: How control exists after decentralization, 142, 121.