The Question of Documents

by Annet Dekker

— Short intro: the following is a draft version and part of the last chapter of my thesis, hence some obvious information or links may be missing. In my presentation I will concentrate on the analysis of several case studies that underlie this more theoretical part. Also note that it has not been proofed for English. —

In her seminal text Qu’est-ce que la documentation? (1951), Suzanne Briet expanded the notion of the document to also include natural objects and works of art. Documents were regarded as examples or groupings of things that derive meaning from their context. This approach is still valid today, but it needs to be redefined and clarified, because what happens when the context, for example a distributed network, is the work? Are software and algorithms also documents? Is something immaterial, a process, or a network a document – and if they are not, then what are they?

Similar to Briet, Lev Manovich argues that it is not enough to examine the ‘final’ presentation in order to understand contemporary media; social, historical and technological contexts should be taken into consideration when talking about or identifying documents. However, Manovich uses the term ‘software performances’ instead of documents because ‘it is software which defines the options for navigating, editing, and sharing the document, rather than the document itself’ (2013:34), thereby stressing how software constructs experiences. The discussion of whether the term document is still useful in a digital age is also taken up by David Levy (1994 and 2001) and others like Michael Buckland (1998). Although they do not arrive at a solution, both argue for following the path of the earlier documentalists (among others Paul Otlet and Suzanne Briet) by defining a document in terms of function rather than physical format. Although it is striking that Manovich does not refer to documentalist practices, his descriptions and analyses follow a similar approach of trying to answer the question of what constitutes a ‘document’, or in Manovich’s terms, to understand media software. So, in what way is the notion of ‘software performances’ useful, and should it replace the term document? What does performance mean in relation to software? Which aspects perform? For what purpose? For whom?

One of the main characteristics that I use to describe net art is its performative quality. Net art can be understood as performative in terms of the meanings ascribed to it, as well as in terms of the effects of its performance on the movements of data and information in communication networks. The verb ‘perform’ means to act, to carry out an action or pattern of behaviour.[i] In the context of art, ‘perform’, or the noun ‘performance’, is mostly associated with Performance art. Although Performance art is a contested concept (Carlson 1996), in general it refers to a performance presented to an audience in which the performer(s) do not present a conventional theatrical play or a formal linear narrative. Philip Auslander emphasises that in traditional terms it may be problematic to see bots (or technical tools in general) as performers, because such definitions generally emphasise the performer as someone who executes and, in that process, makes interpretations that lead to specific aesthetic effects.[ii] To make his argument he distinguishes between technical and interpretive skills. Analysing the installation Listening Post (2002–) by Mark Hansen and Ben Rubin, Auslander argues that the installation is an example of technical performativity, because it ‘constructs its performances by sampling [live] conversations on the Internet’ (2005:8). Auslander continues that ‘the particular technical skills possessed by Listening Post could not be found in a human performer, for no human being could scour the Internet, gather data, sort it, and display it in real time with the speed and accuracy of the machine’, thereby stressing the speed and accuracy of the technical skills of the computer. The use of digital artworks as examples in performance art and performance studies is becoming more common.[iii] However, the distinction between technical and interpretive skills is maintained in most cases.
Although computers are unarguably incapable of human interpretation in the sense of reading between the lines or making assumptions, I would like to argue that software programmes, especially in algorithmic processes, can perform in complex ways that go beyond the technical narrative emphasised by Auslander. Such ‘performativity’ enacts what it represents or describes. Furthermore, connecting performativity with ‘cultures of circulation’, as discussed by Benjamin Lee and Edward LiPuma (2002), opens the discussion to seeing software performances as creators of the acts they refer to. Finally, these arguments will challenge the meaning of the term document.

The term performativity derives from the British philosopher of language J.L. Austin. In How to Do Things With Words (1962) he describes performative utterances as statements that perform an action: a Speech Act. Rather than describe or report what is being done, they do (1962:5).[iv] At first sight, Austin’s Speech Act theory fits the model of computation, which generally breaks down into three stages: input, processing and output.[v] An input into the system does something, physically in the voltages and mechanisms of the machine, and computationally in the abstract mathematics of processing.[vi] But, just as Austin’s theory simplifies the context of language and meaning by regarding it as a ‘total situation’ (1962:52), as also emphasised by Derrida (1988[1972]), there is always uncertainty and ambiguity present in processing. As Arnold Michelson and Allen Levesque argue, ‘It is clear from the outset that with any real communication system we cannot expect to receive exactly what is transmitted. At the very least, we can expect noise to be added to the transmission, causing random errors’ (1985:4).[vii] Moreover, leaning on Claude Shannon’s communication model, Susan Ballard explains that information cannot occur when there is no noise in the process (2007).[viii] This means that performativity always has a certain level of unpredictability, uncertainty and ambiguity; in other words, input and output are not necessarily coherent.[ix]
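
Michelson and Levesque’s point that no real channel delivers exactly what was transmitted can be sketched as a minimal simulation of a noisy binary channel (a simplified reading of Shannon’s model; the function name and the error rate below are illustrative assumptions, not drawn from the cited sources):

```python
import random

def transmit(bits, error_rate=0.01, seed=42):
    """Simulate a binary symmetric channel: each transmitted bit is
    flipped independently with probability `error_rate` (the noise)."""
    rng = random.Random(seed)
    return [b ^ 1 if rng.random() < error_rate else b for b in bits]

message = [1, 0, 1, 1, 0, 0, 1, 0] * 125   # 1000 bits of input
received = transmit(message, error_rate=0.05)
errors = sum(a != b for a, b in zip(message, received))
print(f"{errors} of {len(message)} bits were corrupted in transit")
```

Even at a small error rate, input and output are no longer identical; real systems add redundancy and error correction precisely to manage, never to abolish, this gap.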

Performativity is used by many artists, either actively, as when the English performance group Blast Theory makes failing hardware and/or software part of the overall performance, or as an artifact of historical instances, as when Martine Neddam holds on to some errors in her website (1997–) instead of fixing them.[x] Such performativity of code means that code is not one-to-one reversible, nor can it be seen as pre-set instructions for execution. Performativity of code indicates that execution takes place by thinking through the material. As such, the challenge lies more in the question of whether code or software performances also create the ‘act they refer to’. The ‘act’ of Neddam’s work is often associated with identity play, Blast Theory with game adventures, and Naked on Pluto with addressing privacy issues, but in what way do they also act beyond these meaning-making narratives? How do they engage in and facilitate circulation, one of the main characteristics of net art? Would a focus on circulation and process offer a means to critically address the performativity of net art? In their article ‘Cultures of Circulation: The Imaginations of Modernity’ (2002), Lee and LiPuma propose an alternative version of the concept of performativity. They see performativity as an aspect of circulation, rather than as a central concept of meaning-making: ‘Performativity has been considered a quintessentially cultural phenomenon that is tied to the creation of meaning, whereas circulation and exchange have been seen as processes that transmit meanings, rather than as constitutive acts in themselves’ (2002:192).
They continue: ‘Cultures of circulation are created and animated by the cultural forms that circulate through them, including—critically—the abstract nature of the forms that underwrite and propel the process of circulation itself’ (2002:193).[xi] In net art such circulation can be traced by looking at how movement performs in the code, in the interaction between code, programmer and context, and how this shapes visitors’ experiences. Likewise, the circulation and exchange of code involved in the infrastructure of communication may reveal specific power structures.[xii]

Document as process, or process as document

How could the concepts of performative circulation and process help with the conservation of net art? My emphasis on the processual dimensions of materiality suggests that what something is has to be understood in terms of what it does, how it (historically) works within machinic, systemic, social and cultural domains. In order to understand and critically reflect on the evolution and the political dimensions inherent in computation, it is important to study these processes, their behaviour, how they function, and how they are embedded in and influenced by social and technical contexts. Such an approach will also guide conservators in answering what the material is and what the intention of the artist(s) was, and in finding ways to capture, restore or document net art.

With processes being the work, or the work being seen as a process, Renee van de Vall suggests speaking of a third paradigm in conservation, the first being centred around scientific conservation (or the autographic paradigm) and the second, leaning on Pip Laurenson (2006), around performance and performative behaviour.[xiii] In her view this third, processual paradigm can be characterised by artworks that follow ‘rules of the game’, are open-ended and in continuous development, and whose development is partly outsourced (either to technical or natural processes, or to participants). Unlike performative works, these artworks are not predefined by instructions or notations, and as such Van de Vall makes an analogy with improvised music. She stresses that it is not a matter of one paradigm substituting for the other, but that these approaches can be seen to work in parallel and even, at times, to intermingle. My findings seem to support the division between performative and processual artworks. However, it remains to be seen whether such a clear separation is necessary when discussing conservation or documentation strategies. For example, are ‘rules of the game’ (sending something out into the world and letting it evolve) the same as a ‘set of instructions’ (there is a margin of variability but not everything goes)?[xiv] In most cases there will always be some kind of restriction,[xv] for instance through the set-up of the artwork: in the case of Neddam’s work, most parts are still linked or kept together by the main website, and participants are encouraged to remain within the domain. The game-engine of Naked on Pluto acts in such a way that it is processual, because the game-engine is generative, but the game itself is only partly so: the rules of the game are fairly fixed and not everything goes. The performances of Slub World are probably closest to the characteristics of a processual paradigm, but even here it could be argued that it is not only generative.
Even though it is based on algorithmic processes, and that process is the narrative, human input is very important.[xvi] As McLean describes:

‘In live coding the performance is the process of software development, rather than its outcome. The work is not generated by a finished program, but through its journey of development from nothing to a complex algorithm, generating continuously changing musical or visual form along the way’ (McLean 2011:130).
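
McLean’s description of live coding – the work generated by a program’s ongoing development rather than by its finished state – can be loosely sketched as follows; this is only an illustrative toy, not one of the actual environments used by Slub, and all names are invented for the example:

```python
import itertools

# A running "performance": a stream whose rule can be redefined while it runs.
rule = lambda n: n * 2           # the initial algorithm

def stream():
    for n in itertools.count():
        yield rule(n)            # looks up `rule` afresh at every step

s = stream()
first = [next(s) for _ in range(3)]   # [0, 2, 4] under the initial rule
rule = lambda n: n * n                # a "live" edit: swap the algorithm mid-run
after = [next(s) for _ in range(3)]   # [9, 16, 25] under the new rule
```

The stream never restarts; its output changes because the code was rewritten while the process kept running, which is the temporal collapse of tool and product that McLean points to.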

And what about the degradation of fabrics or other biological materials: are these works also processual? Such works evolve, but there is no sense of (outsourced) participation, and with some variations they can be brought back to their ‘original’ state. To sum up, the logics of most works can be distinguished analytically, which is important for understanding and analysing a work, but most artworks have both performative and processual elements. So, what could be the consequences for conservation: will these two paradigms need different approaches or strategies? It seems obvious that with processual works conservation in the strict sense will not be possible, but the same could be argued for many performative artworks. In both cases documentation will likely play a more important role than the reconstruction of the artwork.



Auslander, Philip (2005) At the Listening Post, or, do machines perform? International Journal of Performance Arts and Digital Media, Vol.1, No.1, pp. 5-10.

Ballard, Susan (2007) Information, Noise and et al. M/C Journal, Vol.10, Issue 5.

Briet, Suzanne (2006, originally published 1951) What is documentation? (Qu’est-ce que la documentation?) Translated and edited by Ronald E. Day and Laurent Martinet with Hermina G.B. Anghelescu. Lanham, MD: Scarecrow Press.

Buckland, Michael (1998) What is a ‘Digital Document’? Document Numérique (Paris) Vol.2, No.2, pp. 221-230.

Carlson, Marvin (1996) Performance: A Critical Introduction. London and New York: Routledge.

Derrida, Jacques (1988, English translation [1977] from French [1972]) Limited Inc. Evanston, IL: Northwestern University Press.

Laurenson, Pip (2006) Authenticity, change and loss in the conservation of time-based media installations. Tate Papers, Issue 6.

Lee, Benjamin and Edward LiPuma (2002) Cultures of Circulation: The Imaginations of Modernity. Public Culture. Vol.14, No.1, pp. 191-213.

Levy, David M. (1994) Fixed or Fluid? Document Stability and New Media. European Conference on Hypertext Technology 1994 Proceedings. New York: Association for Computing Machinery, pp. 24-31.

Mackenzie, Adrian (2005) The Performativity of Code: Software and Cultures of Circulation. Theory, Culture & Society, Vol.22, No.1, pp. 71-92.

Manovich, Lev (2013) Software Takes Command. Cambridge, MA: The MIT Press.

McLean, Alex (2011) Artist-Programmers and Programming Languages for the Arts. Ph.D. thesis, Department of Computing, Goldsmiths, University of London.

Michelson, Arnold M. and Allen H. Levesque (1985) Error-Control Techniques for Digital Communication. Michigan: The University of Michigan Press.


[ii] Auslander bases his argument on a quote from philosopher Stan Godlovitch, who discusses musical performance: ‘interpretive skills involve aesthetic effects for which no obvious quantitative measure exists, and typically emphasize “expression” …’ (Godlovitch 1998:54, in Auslander 2005:6).

[iii] See, among others, Bay-Cheng (2010), Giannachi (2012) and Bleeker (2013).

[iv] Austin further distinguishes between an ‘illocutionary act’, which is concerned with what someone/something is doing when saying something, and a ‘perlocutionary act’, which involves the consequence(s) of an utterance. The utterance and the consequences of that utterance do not occur at the same time. According to Austin, certain conditions need to be met in order for the illocutionary act to be successful. However, as pointed out by Derrida (1977[1972]), neither the meaning nor the context of a text can be defined in its entirety – a performative utterance is always intertwined with structures of power.

[v] See among others Charles Petzold (2000) and Irv Englander (1996).

[vi] It may be good to stress that I am referring here to formal executions; this is not a social performance based on human conventions (as in Austin’s theory). Technologies, in and of themselves, do not bring about cultural or social change.

[vii] In modern computers many processes and redundancies are built in to reduce the effects of noise, making it unlikely that a computational error will occur. Nevertheless, the more complex processes become, the more noise enters, which can lead to unexpected or unnoticed events. Even in ‘simple’ systems, like CRT and LED monitors, a ‘single transmitted voltage might simultaneously perform the one or zero of binary code, disrupt adjacent data with its electromagnetic noise, and be received as radio waves by an external antenna’ (Van Orden 2010), a process known as Van Eck phreaking. See also Van Eck (1985) and Kuhn (2004).

[viii] Many artists have used these errors (also referred to as glitches) to make artworks; for more information see, among others, Goriunova and Shulgin (2008) and Menkman (2011).

[ix] Live coders explore these characteristics of programming in their live performances. ‘Live coding is the activity of writing (parts of) a program while it runs. It thus deeply connects algorithmic causality with the perceived outcome and by deconstructing the idea of the temporal dichotomy of tool and product it allows code to be brought into play as an artistic process’ (Alexander 2004:243-244). See Yuill (2008) for a historical contextualisation of code practices, referencing the Scratch Orchestra of the 1960s.

[x] There are also other examples where ambiguity through performativity takes place, for instance in the aforementioned use of identity in Neddam’s work. However, such performativity refers more to the meaning ascribed to performativity.

[xi] Network theories explore the new forms of access, understanding and engagement that circulatory networks afford (Benkler 2007; Castells 1996; Wittel 2001), but little attention has been paid to the dynamics of circulation itself as a force of change.

[xii] I’m leaning here on the article ‘The Performativity of Code: Software and Cultures of Circulation’ by Mackenzie (2005) in which he asserts that ‘if we accept that information and communication constitute a central venue for the performativity of some important contemporary forms of power, then the circulation and exchange of software and code involved in the infrastructure of communication could well be analysed in performative terms’.

[xiii] Renee van de Vall (2013) ‘Documenting Dilemmas. On the Relevance of Ethically Ambiguous Cases’, keynote lecture at Performing Documentation in the Conservation of Contemporary Art, Lisbon 20-21 June.

[xiv] The term ‘instructions’ is used by Laurenson to describe performative artworks. Following Stephen Davies, she argues that a ‘notation has the function of specifying works. A score is intended as instructions to potential performers and “it is by following these instructions that players generate instances of the work”’ (2006).

[xv] Generative artworks can be seen as the exception, for more information see (accessed 9 August 2013).

[xvi] What exactly defines generative art is still being discussed; most of these discussions centre around the human influence on the programme. See among others Galanter (2003) and McLean (2011:16-17, 115-127).


Ten Theses on Cassette Tapes, History, and Interface Criticism

By Christian Ulrik Andersen & Søren Bro Pold

Have we reached an end point in the cultural history of computing? When Apple marketed its first Macintosh in 1984, it did so with Ridley Scott’s famous commercial. Allegedly, the Macintosh would save civilization from the totalitarian Big Brother state of George Orwell’s novel 1984, incarnated in the commercial by mainframe systems and IBM. Three decades later, the tables are turning. According to a leaked NSA presentation, it is now Apple who is Big Brother, and enthusiastic iPhone customers who are the zombies living in a surveillance state (citation needed). Apple’s iOS and other cultural interfaces for smartphones and tablets are designed as alternatives to the open space of DIY, sharing and remix that otherwise characterizes cultural computing. However, in order to control the way people consume and produce cultural software, the platforms’ licensing systems also involve heavy monitoring and censorship, or ‘controlled consumption’ (as we have labelled it elsewhere). In other words, the promise of a digital revolution also implies a reaction in which dominant actors remain loyal to the institutions of intellectual property, as Stuart Moulthrop noted already in 1991. In this way, participatory network culture has been subsumed under a strictly monopolizing business model that in several cases has also been caught delivering surveillance data to military, state and industry intelligence. The computer, which was originally developed as a military technology but (e.g., by Apple) redefined as liberatory, emancipatory and revolutionary, is now back where it began: as a military intelligence technology.

No doubt, the computer interface has redefined cultural consumption, communication and the arts, and has grown to become a primary medium for culture and society. This has changed both the scope and depth of the computer interface, and today it is evidently not just a pragmatic tool but also integrates individual expressions and cultural tastes. However, what strategies of resistance and critique are left in this contemporary totalitarian digital culture? In a “post-digital” era of reaction (rather than revolution), the Jurassic technologies left behind possess a new kind of fascination. In the following, we want to discuss and question a current interest in materiality and history (a ‘media archaeological’ investment in vinyl records, floppy disks, pneumatic tubes, etc.), and ask what alternatives materiality may hold. How are we to perceive the re-investment in historical and otherwise lost materials and platforms?

In the summer of 2013, The Consortium for the Preservation of Cassette Tape presented CASSETTE MEMORIES, “a media archaeological excavation of the cassette tape and its use – from a human and tape perspective” (a workshop at the leading North European rock festival Roskilde Festival, initiated by Andrew Prior, Morten Riis and Søren Pold in collaboration with Roskilde Libraries). As the invitation to the event reminds us, cassette tapes are deeply associated with more than a generation’s childhood memories, and with first experiences of recording voices, listening to music and creating mixtapes. But the cassette tape does not only represent our past when found in an old drawer and brought to the workshop to be tampered with, cut up, and looped in new ways; it also appears as a material. The cassette tape consists of magnetic polyester coated with ferric or chromium oxide, pressed against a playback head. It is characterized by relatively poor signals and advanced systems for noise reduction, which would create very different listening experiences depending on the quality of the tape and the listening device, be it a Walkman, a mono portable or an advanced hi-fi cassette deck.

So what are we to make of the nostalgia for cassette tapes? Is it a wistful longing for a simpler life and faded childhood memories? Is it a hipster-like search for ‘authenticity’ and ways of differentiating oneself? Is it an aesthetic and sonic search for the grainy ‘lo-fi’ sound quality? Rather than beginning by discussing whether it is nostalgia for the music and voices or for the materiality of tapes, we suggest illuminating the relation between sign and signal (“the interface”, cf. Andersen & Pold, Interface Criticism) by asking how we should consider history in this tale.


In his essay “On the Concept of History” Walter Benjamin writes: “To articulate the past historically does not mean to recognize it ‘the way it really was.’ It means to seize hold of a memory as it flashes up at a moment of danger” (Thesis VI). It seems clear that Benjamin criticizes historicism: we cannot seize hold of the past merely by describing a level that pre-determines a logical course of events. History as ‘the way it really was’ is more ambiguous (as Benjamin’s use of quotation marks may also indicate). In the theses, Benjamin explicitly addresses historical materialism, and in continuation of this we propose to think of the revival of cassette tapes as something other than a revelation of material and technological determination. This implies that it is not merely the productive forces (our tools, instruments, technology, knowledge, etc.) that define our history as a changing mode of production (tribal, feudal, capitalist, etc.) in a simple one-way – techno-deterministic – direction. In other words, cassette memories are not just revelations of how social relations are most fundamentally production relations, or of the essential role of the cassette tape in the making of a ‘pro-sumer capitalist system’ (or whatever one chooses to call it). Technology and the processing of magnetic signals did not make history and did not define our language and social relations in new ways, nor did any other technology. Technology and material production levels are always met with specific cultural interpretations and practices. Likewise, cassette tapes are used through a myriad of practices that still carry potentials.

But then what is a magnetic cassette tape? Along with other productive forces and technologies, cassette tapes must be seen as part of the same realm as language, in the sense that language, too, turns out to be material (as on a cassette tape), and this material is in itself a speech act (at the workshop, people talked about sending their voices to loved ones across the Atlantic and about the investment and gesture of recording and giving away a mixtape). A qualitative separation of material signal processing from media representation is therefore futile. In every way, the material of the cassette tape (the playback head, the noise reduction system, etc.) is as much a social and linguistic construct (including DIN- and IEC-defined standards and protocols for equalization) as it is the physical manifestation of a representation (of a memory, a voice, a recording). This ambiguous double nature allows for a different kind of critique than the mere restatement of how a participatory technology of reproduction brings about a new mode of production that predetermines our social relations (as product relations in the digital economy’s immaterial labor system).

Benjamin also reminds us what a belief in technological and material determinism may lead to: “Nothing has corrupted the German working class so much as the notion that it was moving with the current. It regarded technological developments as the fall of the stream with which it thought it was moving. From there it was but a step to the illusion that the factory work which was supposed to tend toward technological progress constituted a political achievement.” (Thesis XI) History, as the case of the German working class illustrates, is re-conceptualized: where the worker’s history was once that of a slave, it is (in Nazi Germany in 1940) that of liberated grandchildren. This re-conceptualized view is not a revelation of history and ‘what it really was’ but just another manifestation of material determinism dressed up by political ambitions. The historical materialist must thus address history differently, as Benjamin puts it: “There has never been a document of culture, which is not simultaneously one of barbarism. And just as it is itself not free from barbarism, neither is it free from the process of transmission, in which it falls from one set of hands into another. The historical materialist thus moves as far away from this as measurably possible. He regards it as his task to brush history against the grain.” (Thesis VII)

How can one ‘brush history against the grain’ in a post-digital era? Instead of a deterministic historicism (in a certain way also potentially present in historical materialism), Benjamin encourages us to think of the cassette tape as something that flashes up in a moment of danger. In the light of history, his use of danger (by a Jew experiencing Nazi Germany) is perhaps even clearer today than at the time. Benjamin uses Paul Klee’s painting ‘Angelus Novus’ as an allegory of history, an angel looking at the past, fixed with open mouth in contemplation: “Where we perceive a chain of events, he sees one single catastrophe.” (Thesis IX) Like Klee’s angel, the role of The Consortium for the Preservation of Cassette Tape is not to enlighten us about a chain of events, but to save the past as critical reflection on a present crisis. It reminds us of the cassette tape as a mode of production before the catastrophe (before it became dressed up by corporate ambitions resembling totalitarian control).

As such, it is not of particular interest that the productive force of a participatory tool, the cassette tape as a tool for reproduction, has defined a mode of production and a social reality of product relations (participatory labor). What is interesting is the discourse and myths around the technologies that have led us to believe that the employment of technology represents a god-given chain of events leading to increased efficiency, and that the maxims of the technology (producing, sharing, mixing, etc.) can create individual freedom and mastery when navigating through social reality (an idea not unlike Georgios Papadopoulos’ critique of a totalizing market (21f.)). Such constructs cannot be addressed as material determinism, but The Consortium for the Preservation of Cassette Tape can lead us to challenge these myths by exploring a past discourse in the present – as a potential criticism. In this way, the return to old media does not hold an essence. The material turn is realist in the sense that it expresses an awareness not of how materials are more real than signs, but of how our technologies are also signs, and our signs technological, and of how the coupling of signs and material in technology also incorporates a form of control. In this sense, the cassette tape does not hold a truth but is an allegory.

As an allegory, the cassette tape and The Consortium for the Preservation of Cassette Tape’s workshop seize “hold of a memory as it flashes up at a moment of danger.” They establish an imaginary correspondence with another historical moment, but not as a yearning for a lost time (to paraphrase a notion of history present in the writings of Marcel Proust). The “post-digital” return to old media is a resistance to the commodification and subsumption of participatory culture by four mobile, social and networked media corporations (Google, Apple, Amazon and Facebook), as also described by Florian Cramer. To Cramer, the corporate templates of a shared space are now challenged by phenomena such as self-made books and zines that become “a form of social networking that is not controlled or data-mined by those companies.” (237)

To return to our point of departure: mobile media like smartphones and tablets are examples of what can be characterized as the fifth phase of the user interface. The first user interfaces were the technical control panels and switches used to operate the first computers; often the agenda was related to automation, and the computer was used for batch processing. Gradually, textual interfaces developed, including command-line interfaces such as DOS and UNIX, and real-time interaction became possible. The Macintosh and 1984 mark the stage where the graphical user interface came out of the labs (where it had been developed through the 1960s and 1970s by Ivan Sutherland, Douglas Engelbart, Alan Kay, Adele Goldberg and many more) and became an integral part of the personal computer, leading to disciplines such as interface design and Human-Computer Interaction. With the Web, and especially Web 2.0, the interface added a communicative, networked, social dimension, and we are now at the threshold of what we will here call the urban interface, where urban space (also extending beyond the geographical confines of cities) becomes progressively mediated by mobile, embedded and ubiquitous interfaces. Simultaneously, all these interfaces collect data and feed the myths of big data analysis and smart city development.

Unfortunately, the urban interface is currently characterized by the surveillance of a controlled consumption interface, coupled with a “war on general-purpose computing”, as argued by Cory Doctorow. According to Doctorow, the battle surrounding copyright is extended into locked-down platforms or “IT appliances”, which can only run authorized programs in protected sandbox environments, hiding essential parts of the functionality from the user. He sees this as a major problem for privacy and transparency. As a consequence, software culture becomes limited in its potential for developing innovative ways of using and understanding the computer, and ultimately for developing new forms of software. Worse, much of the big data collection and use remains hidden and privatized instead of being discussed in the public sphere and used for common goods.

As an alternative to an interface culture of controlled consumption, and as a post-digital response to a corporate subsumption of a digital revolution, we ask whether there are new ways of reconfiguring the fifth interface. If so, this would incorporate an awareness – just as The Consortium for the Preservation of Cassette Tape demonstrates – of how representation is always also material, and of how materials become part of signification processes. In other words, a fifth-generation interface criticism.

Works cited:

Andersen, Christian Ulrik & Søren Bro Pold. “Introduction.” Interface Criticism – Aesthetics Beyond Buttons. Aarhus: Aarhus University Press, 2011. Print.

— . “Controlled Consumption Culture.” The Imaginary App. Eds. Paul D. Miller and Svitlana Matviyenko. Cambridge, Massachusetts: The MIT Press, forthcoming. Print.

—. “Controlled Consumption Interfaces.” A Peer-reviewed Journal About 1.2 (2013). Web <>

Benjamin, Walter. On the Concept of History. Trans. Dennis Redmond, 2005. Web <>

Cramer, Florian. “Post Digital Writing.” Electronic Book Review (2012). Web <>

Doctorow, Cory. “The Coming War on General Purpose Computing.” 28th Chaos Communications Congress, Berlin 2012. Keynote. Web <>

Moulthrop, Stuart. “You Say You Want a Revolution? Hypertext and the Laws of Media.” The New Media Reader. Eds. N. Wardrip-Fruin & N. Montfort. Cambridge, Massachusetts & London, England: The MIT Press, 2003. 691-704. Print.

Papadopoulos, Georgios. Notes Towards a Critique of Money. Maastricht: Jan van Eyck Academie, 2011. Print.

Trash Versionality for Post-Digital Culture




Following a 14-day visit to the UK, the United Nations’ special rapporteur on adequate housing, Raquel Rolnik, issued an end-of-mission press statement[1]. The statement included recommendations for the UK’s social housing welfare reform (known to opponents as the ‘Bedroom Tax’). Although researched and submitted according to UN protocol (Guardian, 2013a), the advice was vehemently rejected by the UK government. The rapporteur’s personal and professional credibility was then attacked in the media and elsewhere[2].



However quickly attention has switched away from this episode, it offers a snapshot of the media landscape in which trash, in the form of dispensable news and information, is merging with opinion and political rhetoric. In today’s changing topologies, new methods for delivering data are also conduits, returning us to sites which disperse products in exchange for money and – potentially – free labour. The Formamat[3], by Kripe, Schraffenberger and Terpstra, investigates the value individuals place on data they have stored on their mobile devices. It is a vending machine “…which returns candy in exchange for the deletion of [an individual's] digital data”. The authors “…invite people to experience the joy of deletion in a public space and encourage them to think about the value and (in-)dispensability of their files while also researching the subject in a broader sense by storing and analysing their deletion-behavior.” (, 2013)



After the passage of only two or three years, The Formamat can be seen to capture an ambiguity in our evolving relationship with data; already an unexpected revision can be seen in a new question: not which, but whose files are going to be deleted? Taken together with the Internet’s long memory (from the Wayback Machine[4] to playfully macabre, assisted Facebook-identity suicides[5]), this observation underlines the attention now being given to choice and ownership of data. As the pervasive and (in)dispensable capacities of data are recognized, we increasingly see verification, trust and identity as matters of concern to us all.


Version Control

These are also issues in Git, the source code management (SCM) software written by Linus Torvalds and designed to address technical and social issues arising in the Linux kernel project. Git is a ‘directory content manager’ (Ubuntu, 2013) which offers revision tracking of changes, so that a project can, at any time, be rolled back to a prior state. Git uses a distributed model, in which network access is not a necessity. Version control can record discrete visual objects, data constellations and visual images of developer activity (i.e. a project’s organizational underpinning and the inter-relations that go with it)[6]. Issues of governance are also dealt with in creative projects which utilize and discuss version control. Simon Yuill’s Social Versioning System[7] and Matthew Fuller and Usman Haque’s Urban Versioning System 1.0[8] concern the relevance of Free Software principles to consensus and co-operation in design practice.




Ethics and etiquette

Whilst in adoptions of Free Software studied by Shaikh (2012), no particular ‘openness’ can be assumed, Free Speech values have been intrinsic to the development of Free-Libre Open Source Software (Turner, 2006). These influences are evident in the Debian Linux Project’s protocols and in conduct embedded in Wikipedia (Reagle Jr., 2011). Kelty’s ‘recursive geek publics’ (2008) and Warner’s ‘counterpublics’ (1999) may be helpful in understanding the cohesion that exists within these communities. In Wikipedia the interdependence of social and technical apparatus is clear as disagreement is managed across multiple versions in software. Where the Unassailable Viewpoint comes into relation with the Neutral Point of View, ‘archiving evolution’ and ‘adaptive ethics’ systematically aim to prevent damage and encourage discussion. (Heath Cull, 2011)


Beneath the Street, the Network

Such reflexive activities bring images of social cohesion and community integrity into focus. More overtly political formations (such as Anonymous and Occupy Wall Street) envision identities beyond those of public friend or foe. In the encounter between these and primarily online communities, a firmament of politics and identity shows Free Speech and Anonymity connected. Meanwhile, revelations about state surveillance demonstrate that anonymity is not an essential aspect of digital networks. Moreover, networks are typically submerged. Cloud computing and Software as a Service are geared to providing easy to use interfaces for collaboration and sharing, in a way which belies the real availability of data. Ted Nelson’s invocation, “you can and must understand computers now!” (Nelson, 1974) is renewed by under-reporting in the media (Guardian, 2013b).


Abundance and Modification

New platforms allow recursive representations of existing creative forms. Re-versioned political slogans and insider nods to Situationist imagery issue from anonymous channels, are stored at ‘deviant’ locations[9] and are then absorbed into the melee involving Internet memes and personalities[10]. The knowing winks imply this is a party not everyone is invited to (though of course, everyone wants to play!)[11]. The impact of social media – and its reflective potential – receives further validation through public acquisition of artworks (such as The Cybraphon[12]), through Wikimedia outreach projects[13] and in software which measures the public mood via twitter and the blogosphere[14].


Reduction and Overloading

Across diverse networks algorithmic interpretations gather meaning from the mess of communication, using keywords and metrics on an industrial scale:


“…linear texts come into being as the result of a gesture called ‘grasping’. Grasping involves a translation from representations into concepts… The result is a conceptual universe of texts, calculations, narratives and explanations, projections of an activity that is not magical.” (Flusser, 2011:9)


By contrast, Anonymous forms do not lend themselves to analysis, their direction being to circumvent and override as much as possible. What the associated memes and 4chan interactions do present us with, however, are collaboratively made, creative network entities. In the changing dynamic by which these materials appear we see new conventions being worked out. The overloading of standards of taste and acceptability is stimulating alternatives to the ordinary narratives of conflict and resolution.




Easeful Media

Overload gives rise to easeful interactions which go against any supposed disconnection between the Internet and Real Life. In TPB AFK, Pirate Bay founder Peter Sunde explains assuredly to a Swedish courtroom, “We don’t use the expression IRL. We say ‘Away from Keyboard’. We think the Internet is for real.” (TPB AFK, 2013). In this documentary, which follows legal action taken in Swedish courts against the Pirate Bay founders, prosecutors denounce the ‘criminal organization’ they say is dedicated to amassing vast private wealth. Whilst the motivations and affiliations of the Pirate Bay trio have remained opaque to prosecuting authorities, in the film the question which achieves vital significance is “Who do you trust?”. This may be one point around which ‘easeful’ interactions revolve.


Disruptive Convergence

In these overloaded forms of representation which we see entering mainstream narratives, a kind of collective and competitive vandalism is esteemed. The multiplicity of voices – for which the expanding net has become more lightning conductor than conduit – increasingly provides its own self-fulfilling cycle of news, serving 24-hour comment and analysis for comment and analysis.


A re-writing is under way in which messages combining text and images produce networks within networks. They become the mutable containers of doubt and disinformation, of ill intent and ignorance.



Social media has refreshed the status of the Internet troll, but the nuanced subterfuge of spreading Fear, Uncertainty and Doubt[15] now looks like the preserve of some long-gone gentle sport. Flames, defamation and libel have become the norm. The specialized rules of email etiquette have evaporated. In the merging of media, products and social interaction, trolling itself has gone viral; self-validating intercourse has been upstaged by social-media-sanctioning broadcast-media discourse – tomorrow, your network may be the target[16]. In proceedings against the Pirate Bay a continuing game of cat and mouse has been played. Plaintiffs and defendants become complicit partners in a mystifying game of hide and seek. In this play, data appears to transfer seamlessly across frontiers, before reaching new data housing facilities (the implacable fortresses of this age).[17]



The fixation on data and hardware objects, and the advance of our litigious cultures: it may be that these elements contribute to conditions in which bullying can be blended into the landscape of human interactions. As much as hardware and new platforms may enable discourse, they also become sites for abuse, where the difference between trolling and bullying easily blurs.


During 2013 in the UK, a number of women in the public eye (among them MPs, campaigners and journalists) have become the target of insults and threats intended to silence their contribution to public discussion (Guardian, 2013c). Often these communications have been sent via twitter. In probably the most high-profile case so far, this came after a successful campaign[18] to have the Bank of England print – for the first time – a female historical figure on its banknotes. Online, the equivocal status of networks is evident where ‘trash-talk’ in gaming turns to harassment and ‘gamified misogyny’ (New York Times, 2012). In the competition for kudos, questions about the liberating potential of the Internet abound.





This identity fetishism promises certainty in a moment of profound uncertainty and harks back to a time in which physical media appeared more present than today; it is a moment where, in many ways, absence may be more desirable than presence. The contradiction in interfaces is that, at the moment they renounce claims on materiality (Co.Design, 2013), they retain the ability to expose us to actual and perceived threats[19]. Trolls revel in their ability to circumvent blocks, adopting new identities or labelling messages in ways that reach targets indirectly[20]. In the face of this melding together of anonymity and Free Speech, campaigner Caroline Criado-Perez ultimately chose not to observe the old advice of ‘do not feed the trolls’, but to delete her twitter identity (Guardian, 2013d).



In a broad sense, and in different domains, we are now seeing truth and responsibility increasingly under review: in the push to deliver up-to-the-minute news, the sources and verifiability of content are an ever more present consideration. In the concern for information ethics, in public and private domains, questions of accountability and trust – the veracity of versions – are paramount.


“People can and do trust works produced by people they don’t know. The real world is still trying to figure out how Wikipedia works. A fantastic resource. Open source is produced by people that you can’t track down, but you can trust it in very deep ways. People can trust works by people they don’t know in this low cost communication environment.” (, 2005)


Version as Method

Though project forking has seen an upturn in its reputation in software developer communities, in some respects the proliferation of new cultural versions is problematic. The controversies around WikiLeaks’ internal governance – far from being positive examples of innovation and overflow – reflect transgressions of protocol which have led to a breakdown of trust. This has also been seen in disagreements between WikiLeaks and The Guardian newspaper over journalistic principle, if also over different versions[21].


Post-irony for a Post-Digital Present

Participation has become one of the new watchwords for the web. As much as it also applies to images (subject to dissection and distributed storage across the Internet), the metaphor of the network is now universally applied to human collaboration:


“Anonymous is a series of relationships. Hundreds and hundreds of people who are very active in it – who have varying skillsets, and who have varying issues they want to advance – these people are collaborating in different ways each day.” (BBC, 2012)


But exchange in networks also produces trust (and meaning) as repositories for doubt.


“…since images are two-dimensional the representations in them form a circle, that is, one draws its meaning from the other, which in turn lends its meaning to the next. Such a relationship of exchangeable meanings is magical.” (Flusser, 2011:9)


Agent-memes now turn to the street:

[images: hs_meme1, hs_meme2]




[1] OHCHR (2013) Press Statement by the United Nations Special Rapporteur on adequate housing: End mission to the United Kingdom of Great Britain and Northern Ireland, 29 August to 11 September 2013 [WWW Document], (accessed 9.29.13).

[2] Mail Online Outrage as “loopy” UN inspector lectures Britain: She’s from violent, slum-ridden Brazil, yet still attacks us on housing and human rights [WWW Document], URL (accessed 9.29.13).

[3] About – Formamat [WWW Document], URL (accessed 9.29.13).

[4] Wayback Machine [WWW Document], URL (accessed 9.29.13).

[5] Welcome to Seppukoo / Assisting your virtual suicide [WWW Document], URL (accessed 9.29.13).

[6] gource – software version control visualization [WWW Document], URL (accessed 9.29.13).

[7] SVS [WWW Document], URL (accessed 9.29.13).

[8] Urban Versioning System 1.0 [WWW Document], URL (accessed 9.29.13).

[9] OpGraffiti’s deviantART Gallery [WWW Document], URL (accessed 9.29.13).

[10] Know Your Meme (2013) Ed Balls [WWW Document], URL (accessed 9.29.13).

[11] Twitter / WhiteHouse: Bo, stop trying to make fetch … [WWW Document], URL (accessed 9.29.13).

[12] Cybraphon [WWW Document], URL (accessed 9.9.13).

[13] GLAM – Outreach Wiki [WWW Document], n.d. URL (accessed 9.29.13).

[14] Guardian (2013b) New computer program analyses Twitter to map public sentiment [WWW Document], URL (accessed 9.9.13).

[15] Wikipedia Fear, uncertainty and doubt [WWW Document], URL,_uncertainty_and_doubt (accessed 9.29.13).

[16] The Register Feds seize Indymedia servers [WWW Document], URL (accessed 9.29.13).

[17] see for example trailers for InRealLife Official UK Film Site [WWW Document], URL (accessed 9.29.13) and TPB AFK (2013) The Pirate Bay:Away From Keyboard [WWW Document], URL (accessed 9.29.13).

[18] The Women’s Room Bank Notes [WWW Document], URL (accessed 9.29.13).

[19] “I felt like the chat box could see me through the computer screen.” The UnSlut Project: how sexual bullying ruined my childhood [WWW Document], URL (accessed 9.29.13).

[20] “…online abusers continued to find “new and imaginative ways” to contact her, through her blog” Caroline Criado-Perez says culture must change as rape threats continue [WWW Document], URL (accessed 9.29.13).

[21] Gibney, A We Steal Secrets: The Story of WikiLeaks (Documentary) 2013




BBC (2012) Barrett Brown in How Hackers Changed the World – We Are Legion (BBC Documentary)
Co.Design (2013) Irony: Just As 3-D Interfaces Are Getting Good, Apple’s UI Is Going Flat [WWW Document], URL (accessed 9.29.13).

 (2005) Ward Cunningham on the Crucible of Creativity. Many-to-Many: [WWW Document], URL (accessed 9.29.13). (2013) About – Formamat [WWW Document], URL (accessed 9.29.13).


Flusser, V (2011) Into the Universe of Technical Images Minneapolis : University of Minnesota Press


Guardian (2013a) UN housing expert’s call to axe bedroom tax “a disgrace” – senior Tory [WWW Document], URL (accessed 9.29.13).


Guardian (2013b) I fear the chilling effect of NSA surveillance on the open internet [WWW Document], URL (accessed 9.29.13).


Guardian (2013c) Mary Beard suffers “truly vile” online abuse after Question Time [WWW Document], URL (accessed 9.29.13).


Guardian (2013d) Caroline Criado-Perez deletes Twitter account after new rape threats [WWW Document], URL (accessed 9.29.13).


Heath Cull, D (2011) The Ethics of Emerging Media. Information, Social Norms, and New Media Technology London : Bloomsbury Publishing


Kelty, C (2008) Two Bits: The Cultural Significance of Free Software and the Internet. Durham : Duke University Press.


Nelson, T (1974) Computer Lib/Dream Machines. Self-published.


NYTimes (2012) Anita Sarkeesian quoted in Game Theory: Making Room for the Women [WWW Document], URL (accessed 9.29.13).
Reagle Jr., J (2011) Good Faith Collaboration: The Culture of Wikipedia. Cambridge MA : MIT Press


Shaikh, M (2012) Mutability and becoming : materializing of public sector adoption of open source software. In: IFIP WG 8.2, Working Conference, Tampa, Florida, U.S.A, 13-14 Dec 2012. Published in: Shaping the Future of ICT Research. Methods and Approaches, Volume 389 pp. 123-140.


TPB AFK (2013) The Pirate Bay:Away From Keyboard [WWW Document], URL (accessed 9.29.13).


Turner, F (2006) From Counterculture to Cyberculture: Stewart Brand, the Whole Earth Network, and the Rise of Digital Utopianism. Chicago : University of Chicago Press
Warner, M (1999) Publics and Counterpublics: The Trouble with Normal New York : The Free Press


Ubuntu (2013) “git” package [WWW Document], URL (accessed 9.29.13).


Propaganda and Decision Ecologies

How might future research into digital culture approach a “post-digital” age?

One of the many problems comes from the discourse of ‘the digital’ itself: a moniker which points towards units of arbitrary Base-2 configuration, impersonal architectures of code, massive extensions of modern communication and ruptures in post-modern identity. Yet it would be quite difficult to envisage a ‘post-computational’ break from these discourses – and with good reason: the actual structures upon which computational experimentation arises are never really discussed at length. I’d like to consider the notion that before we ever entered a digital age, we already lived in an ecology of decisions, or better yet, decisional ecologies.


First, a note on the context. My thesis attempts to address how computational aesthetics is possible when understood through the conjunction of two frameworks: the history of meta-mathematics and a realist ontology. I’ve probably lost everyone right there, but put differently, the research only seeks to incorporate the mathematical story of function, logic and proof (and the failure thereof) back into the agency of aesthetic expression. Not just as the basis of computation, but also of its aesthetic. This might account for how artists use computation politically or sensually, but also for why aesthetics should be present at all in what is, essentially, an automated vehicle for proving theorems.

It is widely understood that the theoretical basis of computation, derived from Alan Turing, is to create a formal system of logic which, when automated, would solve particular mathematical problems put into function. What is not necessarily understood is the mathematical context of that basis: the foundations of mathematics were already precarious well before Turing published his landmark 1936 paper, On Computable Numbers, with an Application to the Entscheidungsproblem. It is a precariousness which has been built into computation from its very inception.

The key word of that paper, its key focus, was the Entscheidungsproblem, or decision problem. Originating from David Hilbert’s mathematical school of formalism, ‘decision’ means something more rigorous than the sorts of decisions in daily life. It really means a ‘proof theory’. [1]

Decision is what happens when a formal system of function is constructed in a sufficiently complex way that an algorithm can always ‘decide’ a binary, yes or no answer to a mathematical problem, when given an arbitrary input, in a finite amount of time. It does not require ingenuity, intuition or heuristic gambles, just a combination of simple rules and a careful avoidance of contradiction.
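A decision procedure in this formalist sense can be made concrete with a hedged, minimal sketch (the example and function name are mine, not from Hilbert or Turing): deciding whether a string of brackets is well balanced requires no ingenuity, only simple rules, and it always halts with a definite yes or no for any input.

```python
# A minimal sketch of an 'effective procedure': for every possible input
# it halts in finite time with a binary verdict, using only simple rules.
def decides_balanced(s: str) -> bool:
    depth = 0
    for ch in s:
        if ch == "(":
            depth += 1
        elif ch == ")":
            depth -= 1
            if depth < 0:       # a closing bracket with no partner: decide 'no'
                return False
    return depth == 0           # decide 'yes' only if everything is matched

print(decides_balanced("(()())"))  # True
print(decides_balanced("(()"))     # False
```

Every query receives an answer; nothing is left open. This totality, rather than mere correctness, is what distinguishes a decision procedure.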

The two key words here are ‘always’ and ‘decide’. The progressive end-game of 20th-century mathematicians like Hilbert, who were addicted to the buzz of demonstrating proofs, was one simple totalising conceptual system to decide every query, silence any dissidence, and work towards absolute knowledge. All Turing had to do was make explicit formalism’s implicit computational treatment of formal rules. Later on Turing would call this an ‘effective’ or ‘systematic procedure’.

Effective procedures decide problems; they resolve puzzles and provide winning positions in the game of functional rules and formal symbols. In Turing’s words, “a systematic procedure is just a puzzle in which there is never more than one possible move in any of the positions which arise and in which some significance is attached to the final result.” [2]

Already in 1936, Turing showed how machinic decisions as mathematical ideas could model and replace human ones, and how, given sufficient complexity, certain effective procedures (like Universal Turing Machines) could simulate the functional decisions of other effective procedures. Ten years later, Turing and John von Neumann would independently show how automated physical machines and general-purpose computers offered the same thing. From that moment on, decisions manifested themselves in materials. Programs were simply proofs. Code is function. Before there was Shannon’s information theory and the encoded logic of messages, we had Hilbert and Turing’s computational structuring of information in an underlying form of decision.

Yet, on a meta-level, Turing was fascinated by what decisions couldn’t do just as much as by their ability to automate proofs and provide flexible universal function. Unlike Hilbert, Turing was not interested in using computation to solve every problem, but in it as a curious endeavour for surprising, intuitive behaviour. Most important of all, Turing’s halting, or printing, problem was influential for him precisely because it was an undecidable decision problem.

We can all picture the halting problem, even obliquely. Picture the frustrated programmer or mathematician staring at her screen, waiting to know whether an algorithm will either halt and spit out a result, or provide no answer. The computer itself contains all the totalising knowledge; the programmer just has to know when to give up.

But this is a myth, inherited with a bias towards human knowledge and a demented understanding of machines as infinite calculating engines rather than concrete entities of decision. Turing didn’t understand the halting problem in this way; instead he understood it as a contradictory example of decisions failing to decide on each other, on the grounds that there can never be one totalising decision or effective procedure. There is no guaranteed effective procedure to decide on all the others, and any attempt to build one should be regarded with suspicion. Turing suggested that ‘propaganda’ was more appropriate to this meta-level than simple proof. [3]

Programs might be proof-deciders, but there is no general decision-procedure for solving all problems. Undecidability, then, is what happens when formal systems and decisions (whether conceptual or physically embedded in computation) can never ultimately decide on a solution, in the absence of a general systematic procedure. Decisions, including human ones, are doomed to decide, not to know.
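The asymmetry can be sketched in a few lines of hedged, illustrative code (the programs and the step bound are invented for the illustration): we can positively observe that a program has halted, but a watcher that gives up after n steps cannot distinguish ‘still running’ from ‘runs forever’, and Turing’s result says no cleverer general watcher exists.

```python
# Programs are modelled as generators; each next() is one step of execution.
def halting_program():
    yield from range(3)         # finishes after three steps

def looping_program():
    while True:
        yield                   # never finishes

def observed_to_halt(program, max_steps):
    """True if the program halts within max_steps; otherwise None --
    'no answer', the undecidable remainder a bounded observer is left with."""
    it = program()
    for _ in range(max_steps):
        try:
            next(it)
        except StopIteration:
            return True         # halting is positively observable
    return None                 # undecided: not 'False', merely unknown

print(observed_to_halt(halting_program, 100))  # True
print(observed_to_halt(looping_program, 100))  # None
```

The point of the sketch is the return value None: the bounded procedure decides nothing about the looping program, and no general effective procedure can close that gap.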


Now there’s a lot in that, but what does any of this mean for aesthetics? One could point towards a couple of moves. One of the potential insights gained from the conflicting consequences of decision is that it re-ignites something about the philosophical remit of aesthetics, descended from Immanuel Kant: aesthetics is not in the business of solving problems. Moreover, an aesthetic effect is rule-based, yet paradoxically its power, politically and sensually, lies in having no explicit rules to follow. Aesthetics might be in the remit of applying and endorsing decision problems.

Yet we should also be wary of sticking with Kant’s functional basis for grounding aesthetics in transcendental thought alone. This is where the realist ontological aspect of the thesis kicks in, because whilst the meta-mathematical basis for computation is important for understanding the origin of computation as decision, it doesn’t grasp what sort of critical effects these decisions have within culture, contemporary arts practice and big business. It doesn’t grasp the level of adaptability required for negotiating different types of decisions in computational culture, nor how the binary logic of ‘provable’ and ‘non-provable’ simultaneously offers the means of subversive expression whilst condemning us to hegemonic means of control. It does not address the ecological reality of decision that takes place.

What is clear is that our world is no longer simply accountable to human decision alone. ‘Culture’ is no longer simply guided by a collective whole of human decisions, nor is it reducible to one ‘natural’ collective decision. Rather, the collective world is composed of an ecology of decisions: a collection of decision-centric autonomies which surround and often presuppose any individual or collective decisions that get underway, both social and technological. Before there was ever the networked protocol, there was the computational decision: an effective procedure to decide in the first place. Decision ecologies already take place before we enter the world, implicitly coterminous with our lives: explicitly determining a landscape upon which an individual has limited manoeuvrability.

Decisions are everywhere and in everything. Look around. We are constantly told by governments and states that they are making tough decisions. CEOs and directors make tough decisions for the future of their companies. ‘Great’ leaders are revered for being ‘great decisive leaders’, not just making decisions quickly and effectively, but also settling issues and producing definite results. Even the word ‘decide’ comes from the Latin ‘decidere’, which means to determine something and ‘to cut off’. Algorithms in financial trading know not of value, but of decision: whether something is profit or loss. Drones know not of human ambiguity, but can only decide between kill and ignore. Making a system which decides between two or more routes means cutting off and excluding all other options, leaving a final result at the end of the procedure. Making a decision, or building a system to decide a particular ideal or judgement, must force other alternatives outside of it. Decisions are always-already embedded into the framework of action, always already deciding what is to be done, how it can be done or what is threatening to be done. It would make little sense to suggest these entities ‘make decisions’ or ‘have decisions’; it would be better to say that they are decisions: they decide what choices can be made within them.

The question is no longer simply ‘who decides?’, but now ‘what decides?’ Is it the cafe menu board, the dinner party etiquette, the NASDAQ share price, the new Google Search algorithm, railway network delays, unmanned combat drones [4], the newspaper crossword, the JavaScript regular expression or the differential calculus? It’s not quite right to say that algorithms rule the world, whether in algo-trading or in data capture, but there is the uncomfortable realisation that real entities are built to determine provable outcomes time and time again, in the midst of automating questionable procedures: most notably ones of profit, revenue and ideology. One of the stronger examples is George Dantzig’s simplex algorithm (whose origins lie in multidimensional geometry), which always decides solutions for large-scale optimisation problems. Its proliferation and effectiveness have been critical to 30 to 40 years of successful capitalist production, deciding almost everything from bus timetables and work shift patterns to share trades. According to optimisation specialist Jacek Gondzio, the simplex algorithm runs at “tens, probably hundreds of thousands of calls every minute.”
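What ‘always decides solutions’ means can be shown with a hedged toy sketch. This is not Dantzig’s simplex method itself, and the objective and constraints are invented for illustration; but it rests on the same geometric fact the simplex method exploits: the optimum of a linear programme lies at a vertex of the feasible region, so for a tiny problem we can enumerate the vertices and decide the single best outcome outright.

```python
# Toy linear programme, decided by vertex enumeration (illustrative numbers):
# maximise 3x + 2y  subject to  x + y <= 4,  x <= 2,  x >= 0,  y >= 0.
from itertools import combinations

constraints = [((1, 1), 4), ((1, 0), 2), ((-1, 0), 0), ((0, -1), 0)]

def feasible(x, y):
    return all(a * x + b * y <= rhs + 1e-9 for (a, b), rhs in constraints)

def vertices():
    # Intersect each pair of constraint boundaries a1*x + b1*y = r1, a2*x + b2*y = r2.
    for ((a1, b1), r1), ((a2, b2), r2) in combinations(constraints, 2):
        det = a1 * b2 - a2 * b1
        if det:                                  # skip parallel boundaries
            yield ((r1 * b2 - r2 * b1) / det,
                   (a1 * r2 - a2 * r1) / det)

best = max((p for p in vertices() if feasible(*p)),
           key=lambda p: 3 * p[0] + 2 * p[1])
print(best)  # (2.0, 2.0) -- one optimal outcome, every alternative cut off
```

Exactly one result survives the procedure; everything else in the feasible region is excluded, which is the sense in which such systems ‘are’ decisions rather than merely making them.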

Turing was almost prophetic in calling it propaganda, despite writing about mathematics rather than aesthetics or politics. For what is propaganda if not an effective method of disseminating and structuring information through mass ideological control, decision and persuasion? Political propaganda, as Frederic Charles Bartlett knew [5], was primarily a decisive method of suggestion, designed not simply to control psychological behaviour, but to acquire specific, effective results through purposeful action.

Propaganda operates as if it can produce idealised solutions to problems, but in its operation must hide the uncomfortable paradoxes which allow its communication to occur in the first place. Perhaps a post-digital realisation of culture might address newer forms of propaganda emergent in computational culture: not posters, pamphlets, zines and broadcasts, but videogames, gamification, devices, spy-ware, apparatuses, services and subscriptions: each one only allowing certain outcomes to be realised, each one already deciding (or propagating) a limited number of routes, which users mistake for their own freedom. If there is one thing Silicon Valley would love to solve, it’s how to tell whether a problem always has a solution: and whenever they come up with one, it usually has a market to satisfy.

But I’ve stopped writing about aesthetics. Or have I? Remember when the far Left used to be good at propaganda? If we were never post-ideological, then we have never been post-propagative either. Decision ecologies present numerous problems, primarily because they decide what can be done and what can be negotiated in crafting anything whatsoever. What is required of aesthetics nowadays is not an explicit rejection of decision, but an implicit affirmation of a decision problem. Whenever artistic practice seeks to undermine decision and open up a space for discussion, it must suddenly realise that purposeful action is required: which is to say that aesthetic practice might present better propaganda. Even open source software is propaganda of sorts: a principled ideal embedded in computation, where free use as a precondition of modification is already decided on account of its dissemination.

This sounds absurd, but why not? Decisional ecologies have no teleology, yet they are all we have, and aesthetics has a historical prevalence here, not from burrowing towards the formal truth of rules, but from addressing why rule-based judgements and decisions have no rules to follow. Exploitative hacking, for example, is precisely the sort of purposeful action which undermines decision in a given system, whilst simultaneously demonstrating new automated proofs which were previously undecided. [6]

Decisional ecologies are grounded in social and technical entities, which lets certain analogous practices stretch across disparate media. The problem with computation is that the skill set required to make regulated systems undecidable is set quite high. But one can learn from other areas.

For example, one can cite Julius von Bismarck’s famous Image Fulgurator (2007-present) series of interventions. Von Bismarck’s invention projects a hidden image (usually at press events) which is only detectable after the event, once it has been triggered by a flash snapshot. The Fulgurator itself is a modified SLR camera fitted with a reverse-trigger flash sensor, set off by the flashes of other cameras. The device stealthily projects its image the moment a flash goes off, effortlessly infecting every possible snapshot without anyone knowing.

Crucially, von Bismarck’s interventions actually take place in the events themselves: a cross on Obama’s podium in Berlin, or a giant ‘NO’ beamed above the Pope. It gives such technical structures something that they cannot decide on (to prevent the technology falling into commercial hands, von Bismarck even took the decision to patent it). In events where decisions are tightly controlled and contingencies are calculated, von Bismarck demonstrates an exploitative method which undermines that particular concrete environment of decisions and exposes a certain undecidability: a hole or gap within the procedure previously unnoticed.

Because decisional ecologies are never total, it is always possible to undermine totalising systems of propaganda, no matter how extensive. But doing so is not pre-given, and requires not totalising knowledge but a negotiation of what present decisions can currently afford. Such is the focus of a decisional ecology: it can only spit out more decisions. What matters are the ideals and the future conflicts that are embedded in its place within this ecology.


1 David Hilbert, ‘Probleme der Grundlegung der Mathematik’ [Problems Concerning the Foundation of Mathematics], Mathematische Annalen, trans. Elisabeth Norcliffe 102 (1930), p. 3, cf. 1-9.

2 Alan Turing, “Solvable and Unsolvable Problems,” Science News 31 (1954), p. 18, cf. 7-23. Citations here are taken from Jack Copeland (ed.), The Essential Turing: Seminal Writings in Computing, Logic, Philosophy, Artificial Intelligence, and Artificial Life: Plus Secrets of Enigma (Oxford: Oxford University Press, 2004), p. 590, cf. 582-595.

3 Quoted from Turing, “Solvable and Unsolvable Problems,” p. 588 (emphasis added):

“This statement is still somewhat lacking in definiteness, and will remain so [...] The statement is moreover one which one does not attempt to prove. Propaganda is more appropriate to it than proof, for its status is something between a theorem and a definition. In so far as we know a priori what is a puzzle and what is not, the statement is a theorem. In so far as we do not know what puzzles are, the statement is a definition which tells us something about what they are.”

4 See Daniel Suarez’s science-fictional (but self-researched) account of robots and drones automating military killing decisions in Daniel Suarez, Kill Decision (New York: Penguin Books, 2012).

5 F. C. Bartlett, Political Propaganda, (Cambridge, Cambridge University Press, 1940).

6 See Sergey Bratus, Michael E. Locasto, Meredith L. Patterson, Len Sassaman and Anna Shubina, “Security Applications of Formal Language Theory”, Technical Report TR2011-709, Dartmouth Computer Science, 2011. See also Sergey Bratus, Michael E. Locasto, Meredith L. Patterson, Len Sassaman and Anna Shubina, “Exploit Programming: From Buffer Overflows to ‘Weird Machines’ and Theory of Computation”, ;login:, December 2011; and, for a more detailed analysis of fundamental X.509 security protocols, Dan Kaminsky, Len Sassaman and Meredith Patterson, “PKI Layer Cake: New Collision Attacks Against the Global X.509 CA Infrastructure”, Black Hat USA, August 2009, <> last accessed September 1st, 2013.

Augmented Browsing: The post-digital aesthetics of data intervention

To hack is to refuse representation, to make matters express themselves otherwise. To hack is always to produce a difference, if only a minute difference, in the production of information.  - McKenzie Wark

Since the introduction of the World Wide Web in the 1990s, hacking has been used by network artists as a strategy to explore the potential and possibility of alternate art forms. Artists have abandoned the formal representation of their artworks to intentionally present something often regarded as nonsense or chaos. Perhaps Jodi’s famous works, such as %20Wrong, shown in Fig. 1 and Fig. 2, best illustrate this idea of intentional chaos, which is what White describes as “the aesthetics of failure” (2002) in network art. However, White only discusses intentional, representational failure; as this article argues, unintentional failure should also be taken into consideration as part of the aesthetics of failure.

Figure 1. Screen capture of, by Jodi

Figure 2. Screen capture of %20Wrong by Jodi

The notion of intentional failure can be traced in earlier artistic practices including Dada and Surrealism (ibid.), in which representation, particularly of visual objects and composition, is distorted, disoriented and misplaced in an unusual manner, producing a sense of nonsense, chaos and instability (see the example in Fig. 3) and challenging the way an audience approaches and understands an artwork.

Figure 3#. La persistencia de la memoria (The Persistence of Memory, 1931) by Salvador Dalí

From the earlier browser art of the 1990s to the browser add-on development of the late 2000s, network artists have used the browser and developed customised software to intervene in the usual understanding of the underlying structure and representational form of the World Wide Web. With the proliferation of Internet technology, in particular Web 2.0, participatory platforms such as social media applications are increasingly embedded in the practices of everyday life; artists have tended to shift their focus from exploring the specificity of the medium to critiquing network culture and outlining the politics of the platforms through strategies of intervention, subverting how a website should function or be represented. What is important in a website is data, in the form of text, numbers, images and videos. This concern with data is noted by Julian Stallabrass as the “most fundamental characteristic” of network art, which “can be thought of as a variety of database forms” (Stallabrass, 2003, p.26). The data that exists on participatory platforms is structured and organised through databases and computation. These data are economically valuable, publicly engaging and artistically playful: something this article describes as peculiar data. This article explores how peculiar data is manipulated in network art, with a particular focus on browser art. It also introduces a collaborative network art practice, The likes of Brother Cream Cat (Pritchard & Soon, 2013), a mediatized browser add-on which draws peculiar data from Facebook’s metadata and generates an augmented browsing experience. The work highlights the notion of failure through a mix of intentional and unintentional motifs, and the controllable and uncontrollable aspects of peculiar data, exploring the forces behind and beyond Facebook’s interface.

A browser is a piece of software that allows data to be presented in a readable format. Browsers designed to run on personal computers can display Internet data in multimedia forms (text, images and video); the browser becomes an interface between the user and the Web developer, translating markup and scripting languages into multimedia objects. It is an important milestone in human-computer history, as it allows data to be displayed in graphical and colorful forms, accessible to anyone with an Internet connection. This enhances readability and facilitates information sharing. The browser has since become a pervasive and ubiquitous medium, included in operating systems such as Windows, Mac and Linux, as well as Android and iPhone OS. The first widely adopted graphical browser, Mosaic, was introduced in 1993, followed by Netscape Navigator (1994) and Internet Explorer (1995). Since then, other browsers such as Chrome, Firefox and Opera have appeared in the market. Conceivably, the browser has become an important medium and interface; it has the capacity and power to allow data to be translated, transported and transformed from one location to another, and from one format to another.

According to Christiane Paul, browser art is about “the creation of alternative browsers to navigate and present Web data” (Paul, 2003); for Alan Liu, it is “art that recognizes the authority of the Web browser” (Liu, 2004). Many network art projects fall into this group, notably alternative web browsers including, but not limited to, Web Stalker (1994) produced by I/O/D and RIOT (2000) produced by Mark Napier (see Fig 4 and Fig 5). These artworks challenge the conventional role of a browser and subvert the notion of web content through an intentional play of failure. Failure here means that a browser does not exhibit its normal or expected behaviors; what is left to audiences and users is a chaotic interface and a malfunctioning browser. However, these browsers are not actually impaired; rather, the web content is intentionally deconstructed by the artists’ custom-made software, and audiences are urged to look at the matter, the latent and connected structure, behind the representation of web data and hyperlinks. As White suggests, these disruptions “can encourage computer spectators to read Internet technologies differently” (2002, p.173).

Figure 4. WebStalker (1994) by I/O/D, image source:

Figure 5. Screen capture of RIOT (2000) by Mark Napier

Working with web data has been a keen interest of network artists. With the advancement of technology and the more open architecture of browsers like Firefox and Google Chrome, browser add-ons have gained currency in recent years. An add-on is a small application that runs on top of a browser, providing additional features mainly for data manipulation, such as automatic form filling or the disabling of web advertisement images. It changes the way a web page is displayed and usually offers only a small set of functions so as to remain a small application. The browser add-on can also be seen as a tactical media art form that runs interventions in the media landscape. Facebook Demetricator (see Fig 6) is a browser add-on that hides all the numbers on the Facebook interface. Benjamin Grosser, the artist, critiques the social value of numbers, measures or metrics in Facebook, for instance the number of likes, friends and comments, which affect users’ emotions. These numbers are not simply data; they are peculiar data that carry economic value for advertisers and for Facebook, enabling better profiling[1] and more targeted advertising. In addition, these numbers are publicly engaging and can produce a rippling effect, affecting users’ judgment of content (Valentino-DeVries and Soltani, 2012), and “may have consequences on how [users] act within the system” (Grosser, 2012). The intentional hiding of data is built into the browser add-on, causing the failure of the metrics display; the project thereby opens onto the more political, social and cultural aspects of aesthetics, for which White draws on feminist aesthetics and Foster’s anti-aesthetics (White, 2002, p.174) to explain the notion of failure.
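The Demetricator itself is a JavaScript add-on that rewrites Facebook’s live page; as a hedged, language-neutral sketch of the underlying idea, the Python fragment below strips the counts from metric phrases in a snippet of HTML, leaving only the labels. The pattern and the sample markup are invented for illustration and do not reflect Facebook’s actual markup or Grosser’s code.

```python
import re

# Hypothetical pattern: a count (digits, possibly with commas) followed by a
# metric label such as 'likes' or 'comments'.
METRIC = re.compile(r'\b[\d,]+\s+(likes?|comments?|shares?|friends?)\b')

def demetricate(html):
    """Replace phrases like '1,312 likes' with 'likes': keep the label, drop the number."""
    return METRIC.sub(lambda m: m.group(1), html)

print(demetricate('<span>1,312 likes</span> <span>87 comments</span>'))
# → <span>likes</span> <span>comments</span>
```

The real add-on does this continuously as the page mutates; the sketch only shows the single substitution that makes the metric disappear while the interface otherwise stays intact.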

Figure 6. Facebook Demetricator (2012) by Benjamin Grosser, image source:

To extend these political, social and cultural aspects of aesthetics, this article introduces a network art project. The likes of Brother Cream Cat (2013), also an add-on that operates on Facebook, is the latest collaborative project of Helen Pritchard and Winnie Soon. The artwork takes a popular Facebook cat, “Brother Cream” (a cat that lives in a 24-hour convenience store with the shop owner in Tsim Sha Tsui, Hong Kong), as an analogy to explore the politics of failure through continuously scraping and distorting Facebook data. In 2011 Brother Cream walked out of his shop and went missing; his fans created a Facebook account to find him, and on his return he became ‘Facebook famous’ through his ‘lots of likes’. Since then he has commanded over 1,000 visitors per day at the shop in Tsim Sha Tsui and has more than 145,000[2] fans on Facebook. The likes on his Facebook fan page become an instrument that keeps his life active, bringing more visitors (both online and offline), more merchandised products, more cat food and more job opportunities for this animal celebrity, Brother Cream.

Once the add-on is installed and activated, all of Facebook’s peculiar data (including images in posts, profile pictures and timeline images) is replaced with the latest Brother Cream trace (see Fig. 7), and special effects (both audio and visual, through interactive play, see Fig. 8) are intentionally applied to Brother Cream’s own Facebook fan page. The add-on thus intervenes in the usual behavior of browsing and using Facebook through a customised program, offering an augmented browsing experience on the fly. The image data on the Facebook page constantly mutates, and Brother Cream Cat’s trace participates actively in users’ social communication.

Figure 8. The Likes of Brother Cream Cat (2013), the effect on Facebook

Figure 9. The Likes of Brother Cream Cat (2013), the effect on the Brother Cream Cat fan page.

Starting from the landing page (see Fig. 10) through to the Facebook page, the interfaces depict a sense of chaos and messiness through intentional representation. The project uses a web scraping technique as opposed to the Graph API[3] (a standard specification offered by Facebook to developers), which adds another level of unpredictability and uncontrollable outcome. Web scraping, in this project, is a technique for extracting data from a webpage directly, through a program that communicates with Facebook without registration or authentication, and without following the official guidelines that Facebook provides. In computer science terms, client-side programming is used instead of server-side programming, which means examining Facebook’s source code. Studying Facebook’s HTML source code, through the “view source” option available in the browser menu, is one of the approaches in this project, used to identify the appropriate data (for example, distinguishing a friend’s image from a group’s image on Facebook). This is not a standard approach, and one of its major drawbacks is that the code is highly unstable. It is like finding a folder in a specific drawer: if Facebook changes the drawer’s location or swaps the folder’s position, the program will be unable to extract the right information, causing the whole add-on to malfunction.
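The drawer analogy can be sketched in code. The following Python fragment (using only the standard library’s HTMLParser) scrapes image sources by a hard-coded class name; the class name ‘profilePic’ and the sample markup are hypothetical, not Facebook’s real markup, but they show why a renamed attribute silently breaks such an add-on.

```python
from html.parser import HTMLParser

class ImageScraper(HTMLParser):
    """Collect <img> sources whose class attribute matches a hard-coded name.

    Scrapers must hard-code whatever names the platform happens to use in its
    markup, which is exactly why they break when the markup changes.
    """
    def __init__(self, wanted_class='profilePic'):
        super().__init__()
        self.wanted_class = wanted_class
        self.found = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == 'img' and self.wanted_class in attrs.get('class', '').split():
            self.found.append(attrs.get('src'))

page = '<div><img class="profilePic" src="/cream_cat.jpg"></div>'
scraper = ImageScraper()
scraper.feed(page)
print(scraper.found)   # ['/cream_cat.jpg']

# The same scraper fails silently once the platform renames the class:
scraper2 = ImageScraper()
scraper2.feed(page.replace('profilePic', 'pp-2014'))
print(scraper2.found)  # []
```

Nothing raises an error in the second case; the add-on simply stops finding data, which is the ‘malfunction’ the artists describe.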

By contrast, using Facebook’s Graph API would ensure that any change of drawer had no impact, or at least minimal impact, on the developer’s program. The API approach has also been the common way in which many Internet platform providers (LiveJournal being one of the earliest) have offered such a service publicly to developers since the early 2000s (O’Reilly, 2005). This method allows providers proper control over how the platform’s data is used and what kinds of data are used. The artists therefore put their program at risk by persisting with a scraping technique, which is also regarded as an earlier method of web data extraction (ibid.). One of the attempts of The likes of Brother Cream Cat is to escape from formalism, not only at the level of representation but also in how the work is constructed: to explore the parasitical relationship between the artists’ program and Facebook, and to document the different versions of the add-on in order to reveal the uncontrollable peculiar data and interfaces of Facebook. Potentially, the failure of the add-on signals a change in Facebook’s interface, as the platform might change a data format, location or name, such that the artists’ program no longer points to or extracts from the right path or the right object. The Likes of Brother Cream Cat, as an add-on, has to function with the existing representational data. In other words, the aesthetics of failure in this project lies in a coupling of the intentional and unintentional motifs embedded in the add-on, and who actually controls the add-on becomes blurred. Paradoxically, the artists intentionally intervene in the Facebook data, yet Facebook itself exerts a strong opposing force that might cause the add-on to malfunction unintentionally, turning Facebook back to its ‘normal’ state.
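By way of contrast, a hedged sketch of the API route: a structured response commits to named fields rather than to markup positions. The payload, URL-free setup and field names below are invented for illustration; they are not Facebook’s actual Graph API schema.

```python
import json

# Illustrative only: a structured API payload with named fields.
api_response = '{"id": "creamcat", "name": "Brother Cream", "likes": 145000}'

def page_likes(payload):
    """Read the like count from a structured API payload by field name."""
    return json.loads(payload)["likes"]

print(page_likes(api_response))  # 145000
# The presentation layer can change freely; as long as the provider keeps the
# "likes" field, the client keeps working, unlike a scraper bound to HTML.
```

This is the ‘proper control’ the text describes: the provider decides which fields exist, and the developer’s program depends only on that contract.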

This reminds us to think about Facebook’s frequent and seamless interface changes and the motives behind newly added features (for example the timeline, the likes history log and comment editing): they cultivate a more publicly engaging environment and enhance the precision of data analysis, making the platform more economically valuable. The versioning of software therefore not only provides enhanced features, such as an update or a fix of an application, but also documents changes in the technical media environment. There is a hidden yet strong force that can cause subsequent changes to the add-on.

Figure 10. The landing page of The Likes of Brother Cream Cat (2013).

In conclusion, network art has always encapsulated notions of anti-institutionalization and anti-commercialization. The artworks discussed here, from early browser art to recent browser add-ons, have used the strategy of artistic intervention to explore the matters of the Internet. This article suggests that the aesthetics of network art, in particular what White has described as intentional failure, lies not only in the representational failure of running software; it should also attend to the more hidden processes and politics of network culture in order to understand the underlying motives of unintentional failure. The relations of cultural and social processes are regarded as forces which have the capacity to keep an artwork functioning and alive, but which can also lead to malfunction and death.

# This artwork image may be protected by copyright. It is posted on the site in accordance with fair use principles.

[1] Profiling allows data, in particular users’ lifestyles, behaviors, patterns and habits, to be captured through an online platform.

[2] The figure is a snapshot as of 24 September 2013; see his fan page here:

[3] See the Facebook development page here,


Grosser, B., 2013. Facebook Demetricator. [Online] Available at: <> [Accessed September 20, 2013].

Liu, A., 2004. The Laws of Cool: Knowledge work and the culture of information, Chicago, London: The University of Chicago Press.

O’Reilly, T., 2005. What Is Web 2.0: Design Patterns and Business Models for the Next Generation of Software. [Online] Available at: <> [Accessed October 20, 2012].

Paul, C., 2003. Digital Art. Thames and Hudson.

Stallabrass, J., 2003. Internet Art: The Online Clash of Culture and Commerce. London: Tate Publishing.

Valentino-DeVries, J. and Soltani, A., 2012. How Private Are Your Private Facebook Messages? Digits: Tech News & Analysis From the WSJ. [Online] Available at: <> [Accessed September 20, 2013].

Wark, M., 2004. A Hacker Manifesto. Harvard University Press.

White, M., 2002. The Aesthetic of Failure: Net Art Gone Wrong, Angelaki, 7:1, pp.173-194.


Actions of Matter

In 1968 when Lucy Lippard gathered the collective conceptual practices of the time and packaged them up as “dematerialized” I was six. In a way I have always been dematerialized, or at least I can never remember a time when art was not.

So now as an artist practising in an era of the “internet of things”, where online services and digital fabrication have blurred the boundaries between the material and the immaterial, what constitutes materiality?

In this paper I want to examine parallels in the constructs of materiality within my own hybrid digital/sculptural practice and those of 1960s conceptual art practices, in particular Robert Morris’ performance work Site, 1964, and Allan Kaprow’s Eighteen Happenings in Six Parts, 1959, in order to develop an understanding of how we might go about engaging the digital as a material in a manner consistent with other material sculptural practices.


These two works from the 1950s/1960s serve as examples[1] of a period in which new methods of interrogating materiality were being explored, and present a method by which we might approach “the digital” in order to develop a practical understanding of digital materiality.




“These are forms of behaviour aimed at testing the limits of possibilities involved in that particular interaction between one’s actions and the materials of the environment.” Morris, R. (1970)

As artists associated with Lippard’s dematerialised “ultra-conceptual practices” (Lippard, 1973), both Morris and Kaprow were instrumental in our contemporary understanding of materiality. As Jacob Lillemose explains, Lippard’s dematerialisation of art as an object is not an argument for the disappearance of materiality but a rethinking of materiality in conceptual terms (Lillemose, 2008).

“…instead of understanding dematerialization as a negation or dismissal of materiality as such, it can be comprehended as an extensive and fundamental rethinking of the multiplicity of materiality beyond its connection to the entity of the object.” (Lillemose, 2008.)

This non-corporeal attitude to materiality establishes an argument where immateriality becomes a new material condition (Lillemose, 2008). With materiality defined as being immaterial, we can conceive of “the digital” as possessing materiality once we accept “the digital” as a structural method rather than a technological function.

“…dematerialization designates a conceptual approach to materiality whereas immateriality designates the new material condition – or just a new material” (Lillemose, 2008).

So what is this digital thing?

As loosely used terminology, digital largely functions as a qualifier of an object, for example digital media, digital network, digital camera… Thus digital media is distinct from “the digital” in the sense that it is an artefact of that which is digital. The digital is really the underlying structural method that results in the production of what we call digital media.

In this argument I am extending Lewis’ widely accepted definition of “the digital” as a discrete representation in opposition to the analogue, which he describes as a continuous representation (Lewis, 1971). While the differentiation between discrete and continuous modes provides a sound definition of “the digital”, I reject the necessity of any representational modality, as mediation through representational systems unnecessarily distances us from a subject.

While digital media operates from an imposed modality that is in representational deference to analogue materiality, the digital’s materiality should not be bound by representation any more than analogue material. Rather “the digital”, as proposed by Barbara Bolt in her counter-representation reading of Heidegger, should be located in a dynamic non-representational space directly between artist and material, thus eliminating the necessity of any representational mediation by digital media.

“According to such a counter-representational understanding of art, the work of art is no longer an object for a subject; the relationship between artist, objects, materials and processes is no longer one of mastery and all elements are co-responsible for the emergence of art” (Bolt, 2004).

It is precisely this co-dependent dynamic between human and non-human actants that Leonardi clarifies in regard to digital-media. Arguing for a definition of materiality that is inclusive of instantiations of non-corporeal agents, Leonardi stresses the affordance of materials rather than their physical properties, stating that it is in the interaction between artefacts and humans that the materiality is constituted (Leonardi, 2010).

“These alternative, relational definitions move materiality ‘out of the artefact’ and into the space of the interactions between people and artefacts. No matter whether those artefacts are physical or digital, their materiality is determined to a substantial degree by when, how and why they are used. These definitions imply that materiality is not a property of artefacts but a product of the relationships between artefacts and the people who produce and consume them” (Leonardi, 2010).

With materiality liberated from both representation (Bolt, 2004) and corporeality (Lillemose, 2008 and Leonardi, 2010), the argument for a materiality of intent within process returns us to the work of Lippard’s “ultra-conceptual” artist of the 1960s. Although predating Lippard’s seminal text on dematerialisation, aspects of Morris’ performance works of the 1960s taken in the context of his subsequent sculptural practice articulate this approach to materiality.

Site, originally performed by Morris and Carolee Schneemann in 1964, starts and finishes with Morris standing in front of a small white rectangular block of similar proportions to a large cuboid in the centre of the space. During the course of the performance Morris removes panels from the larger box, revealing a reclining nude figure posed as Olympia (Édouard Manet, 1863). The noise of a jack-hammer is also heard throughout the performance.

What is of interest here is not the narratives of the work but the interactions between Morris and the plywood. Morris is seen to manoeuvre the plywood slowly and deliberately through a series of actions: lifting, rolling, flipping… The artist is intently focused on the task at hand which, given the size and weight of the sheet, would have required some concentration and physical exertion.

While each action is short and relatively unimpressive, breaking it down in individual frames shows how a material dynamic is formed between the body and the plywood sheet.

As Morris moves the board from one side of his body to the other by rolling it over his back, the board becomes both subject and object. By the same token, the artist’s body is doubled, as if performing some unbounded cartwheel. In the tension of the space between the two neither is dominant: each yields to and demands of the other in the same way to constitute the materiality of the work.

Somewhat later, in the Phenomenology of Making (1970), Morris writes of finding form in the activity of making by testing the limits of a material against the body (Morris, 1970). Clearly, when Morris speaks here of interacting with a “material in relation to (rather than in control of)” it, he is expressing the idea of co-constituted materiality seen in Site (Morris, 1970).

Øform (2011) makes similar claims to a shared agency through the use of a haptic modelling system in which the performative actions of the artist constitute a materiality in a network with digital-media. To be clear, I am not suggesting that this work engages digital materiality. Rather it is seen as indicative of a means of engaging with a non-corporeal material agent that might subsequently be applied to materialising “the digital”.

Øform uses the Microsoft Kinect to track the spatial coordinates of the artist’s hands in order to generate 3D forms within CAD software. What is of interest to me here is not so much the resultant forms as the structural method through which they are achieved, which forces the body into a shared agency with the digital media.

Through algorithmic analysis of the gestures, the artist’s body becomes spatially disassociated from the virtual form, and the artist must defer his movements to the virtual content. Action becomes dissociated from outcome as anatomical norms of spatial organisation are redefined by the system.
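As a minimal sketch of such spatial dissociation (not Øform’s actual code, which is not documented here; the transform and coordinates are invented for illustration), tracked hand positions can be passed through a fixed transform before they reach the virtual form, so that bodily action and on-screen outcome no longer coincide:

```python
import math

def dissociate(point, angle=math.pi / 2):
    """Rotate a tracked (x, y, z) hand position about the vertical axis,
    remapping the performer's gesture before it drives the virtual form."""
    x, y, z = point
    c, s = math.cos(angle), math.sin(angle)
    return (c * x - s * z, y, s * x + c * z)

# A hand moving right (+x) now pushes the virtual form toward the viewer (+z),
# so the performer must defer their movement to the system's geometry.
print(dissociate((1.0, 0.0, 0.0)))
```

Under such a remapping the performer cannot rely on anatomical habit: the system, not the body, defines the spatial organisation of the outcome.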

As with Morris, the artist is intensely focused on the material subject, which in return instructs the movement of the body. The agency here is identical to the co-constituted materiality identified in Site: in the exchange between action and material neither is dominant. Each yields to and demands of the other in the same way to constitute the materiality of the work. (The software yields intent to the artist as the artist surrenders bodily action to the software.) It is in this engagement that the materiality of the work is contrived.

In a contemporary context any argument for shared agency is of course reliant on Latour’s Actor Network Theory. While Latour’s principle of irreduction supports an autonomous reading of “the digital”, his insistence on the equality of agents in a network fails to acknowledge the instigative and intentional role of the artist in the work.



Addressing this problem, Kirchhoff offers an interpretation of ANT that supports a shared agency of materiality while privileging embodied experience. For Kirchhoff, “material entities do not have agency as an intrinsic quality by virtue of their materiality” (Kirchhoff, 2009). Like Leonardi’s, Kirchhoff’s materiality exists only “if the concept of ‘material agency’ is a relational and asymmetrical quality… that emerges in the ‘symbiotic interplay’ between human embodiment and material properties…” (Kirchhoff, 2009).

If the staged performativity of Site engaged the body of the performer/artist in an inter-subjective dialogue with the plywood, then Allan Kaprow’s Happenings extends this further by actively drawing the audience into the network of the piece.

Despite preceding Site by several years, Kaprow’s early Happenings of the late 1950s were more “radical” in their disregard for performative conventions and less committed to formalised subject-object relations (Kelley, 2004).

“Kaprow had continually questioned the aesthetic conventions of framing the relationship between subject and object, the distinction between artist and audience and…” (Kelley, J. 2004, p. 34).

While both Morris’ and Kaprow’s works have been re-enacted and videoed in the recent rash of re-enactments, only photographic documentation exists of Kaprow’s original 18 Happenings in Six Parts. As a result, much of our understanding of 18 Happenings is based on Kaprow’s extensive notes, drawings, scores… or on descriptions by members of the audience.

Audience members were assigned to one of two rooms within the three-room installation in which the six sequential parts – simultaneous performances that involved eight overlapping sound tracks, ritualised movements, projected slides, spoken text and eccentric props – occurred. With unspontaneous movements lacking in emotion, performers carried out a variety of sustained choreographed tasks including playing musical instruments, striking matches, spray-painting plastic with kitchen cleaner and squeezing juice from oranges. The performance concluded with scrolls of text unfurling from the ceiling and performers walking out in single file (Kelley, 2004).

While such descriptions provide a sense of the experience, what is more important here than the specific actions are the structural implications of the work with regard to the role of the audience.




Developing out of Action Painting, in particular the work of Jackson Pollock (Kaprow, 1958), Kaprow's Happenings attempted to generate an environment that immersed the viewer inside the work, not just by putting them inside the performative space but by making them active agents in the work through tightly prescribed instructions that, in the case of 18 Happenings, fragmented the narrative by breaking the audience up, moving them around and creating ambiguous "free" time within the work (Rodenbeck, 2011).

“Being inside one was like being inside an abstract painting” (Kelley, J. 2004, p. 20).

This score, with its sparse instructions, is commonly seen as a precursor to the later development of interactive artworks. Although it is initially hard to see the audience as participants in the manner we accept or even expect today, the invitation to the audience to "consciously insert themselves"[2] (Rosenthal, 2007) into the works undoubtedly informs our understanding of the idea of interaction as a breaking down of the hierarchy between audience and artwork. As Noah Wardrip-Fruin and many others have observed:

"The 'Happenings' are a touchstone for nearly every discussion of new media as it relates to interactivity in art" (Wardrip-Fruin, 2003).

More than simply providing a precedent for current approaches to interactivity, these early works also highlight inter-action as a means of separating the digital from representational media. As Söke Dinkla expresses it in direct reference to Kaprow:

"The widespread judgment that interactive intercourse with computer systems prepares the ground for an emancipation from the media context, via the development from 'passive' to 'active' reception, is being euphorically defended by referring to the participatory art of the sixties" (Dinkla, 1996).

What we have in the Happenings' vision of interaction is not simply the prospect of a singularity of subjects that co-constitutes materiality, as with Morris, but a further liberation of subjects from representation.

I am not proposing Happenings as a means of accessing the digital but rather suggesting that their strategy of collapsing audience and artist relata, as an extension of the performative engagement with objects found in Morris’ work, suggests the digital might also be realised in a co-constituted materiality between two human agents as much as between human and non-human agents.

The coding of Kaprow's audience, via a score, to carry out a series of scheduled tasks is a strategy repeated in iForm, where participants were given a set of rules to structure their actions within a variable environment.

Programmed to perform a set of functions, ten participants, each with an iPhone, were dropped off at different locations along a circular bus route. At a designated time they opened a GPS app and started feeding geo-spatial data to a server. Their instructions were to remain on the bus until someone else from the group got on; at that point they were to catch the next bus in the opposite direction. This was to be repeated until all participants reached a designated bus stop. The performance lasted several hours. From the GPS data, a three-dimensional form was derived from the distances between participants rather than from their geo-spatial locations. The resulting form was 3D printed and exhibited. Like Kaprow's performers and audience, the participants in iForm were carrying out nonmatrixed actions through which they blindly assembled a concrete form.
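The structural point – that the form is driven by inter-participant distances rather than absolute positions – can be sketched in a few lines of Python. This is a hypothetical illustration, not the iForm implementation: the haversine helper and the sample coordinates are invented for the sketch.

```python
import math

def haversine_m(p, q):
    """Great-circle distance in metres between two (lat, lon) points."""
    R = 6371000.0  # mean Earth radius in metres
    lat1, lon1 = map(math.radians, p)
    lat2, lon2 = map(math.radians, q)
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = (math.sin(dlat / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

def distance_matrix(positions):
    """Pairwise distances between participants at one sampling instant.

    The matrix, not the raw coordinates, is what would drive the form:
    the same set of inter-participant distances yields the same shape
    wherever on the globe the group happens to be.
    """
    n = len(positions)
    return [[haversine_m(positions[i], positions[j]) for j in range(n)]
            for i in range(n)]

# Hypothetical GPS fixes for three participants at one timestep.
positions = [(56.1629, 10.2039), (56.1700, 10.1900), (56.1500, 10.2200)]
d = distance_matrix(positions)
```

Sampling such a matrix at successive timesteps would give a stack of relational "slices" from which a three-dimensional form could be lofted, independent of where the bus route actually runs.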

"If a nonmatrixed performer in a Happening does not have to function in an imaginary time and place created primarily in his own mind, if he does not have to respond to often imaginary stimuli in terms of alien and artificial personalities, if he is not expected either to project the subrational and unconscious elements in the character he is playing or to inflect and colour the ideas implicit in his words and actions, what is required of him? Only the execution of a generally simple and undemanding act… The performer merely embodies and makes concrete the idea" (Kirby, 1995).

Conforming to their instructions, iForm participants were isolated from both each other and the software constructing the form. Their structural function within the work is discrete – self-contained and digital in a way that parallels both the compartmentalised structure and likely experience of the audience in 18 Happenings (Kirby, 1995).

Broken into parts both temporally and spatially, the audience's experience was likely one of discontinuity in which it was impossible to perceive the whole of the work. Divided as they were across three spaces and distracted by multiple events, it is unlikely that any two people witnessed the same thing.

What I propose is occurring in 18 Happenings, then, is an emergence of a digital structural method that is a function of both a shared agency and fragmented isolation that relocates the individual at the spatiotemporal material centre of the work. What we have is not one continuous material but multiple co-constituted materialities all of which are inter-connected in the relational network of the piece.

While at first this seems contradictory in the sense that I am claiming both a continuous singularity and discrete individuality within the work, this is not at all problematic when we accept this as a state of the work rather than the participants. The work can be split across multiple sites, spaces and times that operate independently and at the same time function as a whole.


What is it then that constitutes materiality in these works, and how might this analysis assist in engaging the digital as a material within sculptural practice?

Materiality has been presented not as a corporeal property of a subject but as a materiality of intent that denies representation and is located within an exchange between co-dependent actants. The digital has been articulated as a structural method that governs relations within a network. Thus any effort to engage digital materiality within sculptural practice should focus on identifying operations that, like Morris' performative actions and Kaprow's scored events, interrogate materiality.

That the digital for the moment remains hidden behind representational interfaces points to the need to develop specific actions and processes that operate within that structural method in order to rematerialize the digital within sculptural practice.

[1] These works are both from early formative stages of the artists' practices and have the advantage of being more conceptually "open works" (Eco, 1962). Although Morris stopped doing performance works and moved on towards object-based work, the significance and origins of his interest in process are clearer in Site and Neo Classic. Kaprow's later Happenings became somewhat diluted by the influence of more theatrical strategies, and the role of the audience diminished.

[2] "…invitations to the event said 'you will become part of the happenings; you will simultaneously experience them'" (Tate, 2013).



Bolt, B (2004). Art beyond representation. London: I.B. Tauris.

Dinkla, S (1996). From Participation to Interaction: Toward the Origins of Interactive Art. In: Hershman Leeson, L ed. (1996). Clicking In: Hot Links to a Digital Culture. Seattle: Bay Press.

Kaprow, A and Kelley, J (1993). Essays on the blurring of art and life. Berkeley: University of California Press.

Kaprow, A, Rosen, B, Unterdörfer, M, Meyer-Hermann, E, Rosenthal, S and Lepecki, A (2007). Allan Kaprow. Göttingen: Steidl Hauser & Wirth.

Kelley, J and Kaprow, A (2004). Childsplay. Berkeley: University of California Press.

Kirby, M (1995). Happenings. In: Sandford, M ed. (1995). Happenings and other acts. London: Routledge.

Latour, B (1988). The pasteurization of France. Cambridge, Mass.: Harvard University Press.

Leonardi, P. (2010). Digital Materiality? How Artefacts without Matter, Matter. First Monday, 15(6).

Lewis, D (1971). Analog and Digital. Noûs, Vol. 5, No. 3 (Sept., 1971), pp. 321–327. Wiley.

Lillemose, J (2005). Conceptualising Materiality – art from the dematerialization of the object to the condition of immateriality. Argos festival, October 2005. [O]. Available:

Lippard, L (1968). The dematerialization of the art object. In: Alberro, A eds. (1999). Conceptual art: A Critical Anthology. Cambridge: MIT Press.

Morris, R (1970) “Some Notes on the Phenomenology of Making: The Search for the Motivated,” Artforum VIII no. 8, April 1970, pp. 62–66.

Rodenbeck, Judith F (2011). Radical prototypes : Allan Kaprow and the invention of happenings. Cambridge, Mass. : MIT Press.

Sandford, M ed. (1995). Happenings and other acts. London: Routledge.

Tate (2013). Performance Art 101: The Happening, Allan Kaprow. [online] Retrieved from: [Accessed: 18 Sep 2013].

Wardrip-Fruin, N and Montfort, N (2003). The New Media Reader. Cambridge, Mass.: MIT Press.

Object-disoriented Sound

Object-disoriented Sound: Listening in the Post-digital Culture

Notes, musings, refractions 

Budhaditya Chattopadhyay


For some time I have been deeply concerned with the mindfulness of listening and the subjective ramifications of auditory perception. These concerns essentially stem from questions of perpetual mobility and nomadism that are perhaps symptomatic of the post-digital culture. A nomadic listener is affected by a fleeting sound, appearing and diminishing in a way that triggers an amorphous stream of subjective contemplation and thought, bordering on the immediate known-ness of the sonic phenomenon but at once moving toward the realm of the unknown.

What is the 'unknown' embedded in a sonic phenomenon? A specific sound creates a specific listening state for the listener, who, instead of deciphering its objective meaning, location-specific identity and other spatial information, may take the phenomenon as a premise or entryway to a world hitherto unknown to him/her, however vaguely related to the imagined and remembered experience of various amorphous moods triggered by the temporality, if not the characteristic texture and tonality, of the sound. Today's wind may not sound like mere wind, and the lonely screeching of the windowpane may not sound like mere friction between glass and wood; they may sound like something more abstract, generating memories and imaginings of other realities parallel to the immediacy of the sonic event.

These sounds, as impermanent as they appear to the ears of a wandering listener, may open hidden doors and obscure openings for further perceptual meanderings in the realm of contemplation and thought, transcending the epistemic, knowledge-based identity that the sound would otherwise objectify. The epistemological illogic and ontological void created by such an object-disoriented sonic explosion, which the ancient Indian philosophers would call 'sphōta' (Barlingay, 2007), is the specific area of praxis in my current 'post-digital' research. In order to interpret the provocative term 'post-digital' in my own understanding, I would like to underscore the extensive and ever-growing nomadism of agents attuned to the psychogeographic evocation of physical locations and corporeal places. These wanderings substantially contribute to an emergent culture of primarily mobile and itinerant beings engaged in the liberated ebb and flow of events, phenomena and ephemera, which arguably operate beyond digital essentialism. The essentialism of the digital revolution, which was the predominant theme of the late 1990s and the early part of this millennium, starts to dissolve into an ever-growing field of intangible data and immoderate information, as Nicholas Negroponte aptly proclaimed: "The digital revolution is over" (1998). Along with this comes a sense of saturation across the prevailing digital divide between already-digital and rapidly digitized contents.

During this process, digital media was turning our world from a textual one into an audiovisual one. In this rapidly emerging audiovisual environment, we found that different forms of older media, such as recorded sound and other sound contents, were constantly moving: being relocated, reinterpreted and brought into conflict with globally dispersed digital media within an imminent convergence culture. These sound contents could be as varied as archival sound recordings, clips of music and songs, spoken words, environmental field recordings, and electro-acoustic samples. We could observe a certain movement of these sound contents from a localized state (the creative/productive end) to a globalized state (the consumptive end) and vice versa. For example, a piece of field recording was digitally mediated so as to be considered a work of sound art, or a 'traditional' song from one part of the world was transmitted via the internet to another part of the world as a 'folk' song. The question was whether a 'fluid-local' sound element was losing its characteristics or retaining its identity over the course of a 'hyper-global' shift. We could also ask how such locative sound elements were received and interpreted at the widest end of a rather volatile audience reception within the dispersing digital media and an expanding e-commerce. Central here was the ongoing dialogue between older sound contents from primarily locative analogue sources and digitally generated, ephemeral travelling sounds, whereby rapid digitization was rendering older sound contents as sonic artifacts. These phenomena contributed to the emerging 'post-digital' discourse by considering sonic artifacts as displaced, relocated and transformed – dissolving the digital divide between already-digital and rapidly digitized contents – and reinterpreted as a 'background' (Ihde, 1976) or elusive field of data.

Once this saturation is reached, argues Kim Cascone, in the domain of sound art and experimental music "the medium of digital technology holds less fascination for composers in and of itself" (Cascone, 2002). In deciphering the term 'post-digital aesthetics' in relation to experimental music, he speaks of the 'failure' of digital technology and the way it triggers subversive practices involving glitches, clipping, aliasing, distortion, etc. I, however, perceive this as the failure of a pervasive digital media/technology to identify, structure and archive the transient and elusive sound field from the nameless, placeless and faceless background world of 'data'. In this world of big data, all sounds essentially lose their locative characters, normative structures (digital, analogue or digitized), ontological source identities and epistemic, knowledge-based object-hood. Admittedly, at this stage my motivation lies in delving into the question of such object-disoriented behaviour of sound upon transient listening.

In his seminal writings, for example the famous article 'Aural Objects', film-sound scholar Christian Metz expressed serious doubt about the object-specificity of sonic phenomena in scholarly thinking. He focused instead on the 'characteristics' of sound, and emphasized the problem of locating sound's object-oriented or location-specific source. He stated: "Spatial anchoring of aural events is much more vague and uncertain than that of visual events" (Metz, 1980). Likewise, whenever a sound is digitally registered, it is mediated. Digitization causes sounds to dislocate from their original sources, turning them into discrete data in the nebulous digital media environment discussed above. In classical sound studies (Rick Altman et al.), scholars have already underscored sound's problematic relation to its object or source: "(…) sound is not actualized until it reaches the ear of the hearer, which translates molecular movement into the sensation of sound" (Altman, 1992). Altman speaks here of a sound event as defining the trajectory of the essential production and subsequent reception of a sound element. Its narrative, as Altman terms it, is hypothetically bound to the source that produces it. This source, the sounding object when producing sound, is spatially defined or connected to a place. These spatial sources of sound are by definition localized, but are not rendered until and unless carried by a medium to the point of reception. Therefore, sound contents are only recognized at different stages of digitization toward reaching a saturation state of an assumed 'post-digital' economy/ecology, whereby sound is freed from the object. Thus sound, by its very nature, implies mobility and subsequent object-disorientation in order to establish its recognition in the 'post-digital' domain. But the process of interpretation is more complex than it appears at the perceptual level of reception.

Sound seems 'less esoteric' in the post-digital culture because of our "newfound comfort with the immaterial world of pure data and information flowing through the cyberspace" (Dayal, 2013). The contemporary media environment allows for the separation of sounds from their locations and facilitates their travel across hyper-dispersed networks as background noise. A sound that is disembodied from its locational specificity undergoes multiple layers of mediation across its multiple receptions and interpretations outside of place, time and context, whether in an audio streaming network on the Internet, a digital sound composition published on a net-label, or the augmented space of an interactive installation work. In an interactive art piece, the identification of a nomadic sound event can be understood through its interpretation as a fertile auditory situation. The post-digital discourse essentially relates to the perpetual transience of these amorphous situations (Chattopadhyay, 2013). It is evident that, in this constant flow, the production and reception of sound under greater mobility and interactivity leads to its interpretation as itinerant auditory situations: a transformation of the original sound, ready to be re-interpreted to create a sense of cultural context within the post-digital milieu.

At this juncture, a nomadic listener floating across the post-digital milieu may interact with the background noise, the unknowable sounds of nameless, placeless and faceless sonic states; may sensitize his or her ears to the pseudo-objects of these sounds; and may deconstruct them within the listening self through their evocative capacity toward a sonic explosion, as streams of timeless reverie, rumination and musing. The 'unknown' embedded in the wandering shadows of sounds is explored and given a (con)text by the nomadic listener's intervention into their appearing and diminishing, leaving object-disoriented states of feeling or mood.

Let us indulge in further philosophical musings triggered by listening to sounds in the post-digital milieu, and attend to what John Cage claims as mindful: "[S]ilence is not acoustic. It is a change of mind". This requires us to set aside 'epistemic' issues of recognizing the source or 'object' of sound, and instead to focus on the subjective and inward perception of sound within the 'self' or 'mindfulness' of the listener. Following this methodology, we can examine the way memory, imagination and the personal experience of the nomadic listener alter the character of sound. Taking its point of departure from the epistemological basis of object-oriented sound, this paper introduces an alternative methodology of listening in the post-digital culture, which I term 'hyper-listening'. Taking a practice-based approach, I explore my ongoing project 'Doors of Nothingness' (2012–) and a series of upcoming sound installations/interventions, 'Mind your own dizziness' (2014–), that incorporate the concept of 'hyper-listening': by this I mean relating to the higher-level psychic, pre- and post-cognitive processes triggered by object-disoriented sounds in creating thought-provoking auditory situations. This method perhaps operates on the fringe of what artist Yolande Harris describes in her doctoral thesis: "To create situations where sound can affect and activate people's experiences in a personal way" (Harris, 2011). The works rely on intuitiveness in listening rather than the reasoning involved in deciphering the meaning of 'aural objects'. Grounded in a strong belief in inward contemplation, subjectivity and the enhanced 'selfhood' available to a wandering listener (owing to his/her ability to free the ears from object-specificity, be it spatial, temporal or locative), the project can, on the one hand, explore the personal or private nature of listening; on the other, it engages with the emergent sonic practices of the implicit post-digital culture.



Altman, Rick (1992). Sound Theory/Sound Practice. New York: Routledge.

Barlingay, Surendra Sheodas (2007). A Modern Introduction to Indian Aesthetic Theory: The Development from Bharata to Jagannātha. New Delhi: D. K. Print World.

Cascone, Kim (2002). “The Aesthetics Of Failure: ‘Post-Digital’ Tendencies in Contemporary Computer Music”. In Computer Music Journal 24:4, Winter 2002 (MIT Press). (Retrieved on 15th September, 2013)

Chattopadhyay, Budhaditya (2013). “Auditory Situations: Notes from Nowhere”. In Journal of Sonic Studies. Issue 4. Special Issue, Sonic Epistemologies.

Chattopadhyay, Budhaditya (2012). "Doors of Nothingness." In jərˈmān (English Department, University of Montana), June edition. (Retrieved on 1 August 2012)

Dayal, Geeta (2013). “Sound art”. In her blog. (Retrieved on 1 August 2013)

Harris, Yolande (2011). Scorescapes: On Sound, Environment and Sonic Consciousness. PhD thesis, Academy for Creative and Performing Arts, Faculty of Humanities, Leiden University. (Retrieved on 1 August 2013)

Metz, Christian (1980). “Aural Objects,” trans. Georgia Gurrieri. In Yale French Studies 60. (pp. 24-32).


About posts and comments

All participants (speakers and non-speakers) submit a blog post of 2000 words – deadline 29.9.2013 – and insert a link from their name on the programme page to the blog post.

Prior to the workshop, all participants are invited to respond and comment.

NB! You must be logged in to comment

At the event, participants in the day program are invited to give a 10 minute introduction speech, followed by 20 min conversation.


Welcome to the site for the upcoming Post-digital research conference/workshop to be held at Kunsthal Aarhus, 7-9 October 2013. The site contains draft texts as part of a peer review process.

Venue: Kunsthal Aarhus, J.M. Mørks Gade 13, 8000 Aarhus C, Denmark.

Draft schedule:

MONDAY 7th Oct
09.00 Welcome and introductions
09.30 Anne Wistrup
10.00 Kieran Nolan
10.30 James Charlton
11.00 break
11.30 Winnie Soon
12.00 Magnus Lawrie
12.30 lunch
13.30 Annet Dekker
14.00 Eric Snodgrass
14.30 Budhaditya Chattopadhyay
15.00 break
15.30 Robert Jackson
16.00 Jamie Allen + David Gauthier
end 16.30
17.30 dinner
Public event:
19.00 – 20.30 Florian Cramer, Alessandro Ludovico

Other participants include: Christian Ulrik Andersen, Morten Breinbjerg, Geoff Cox, Jonas Fritsch, Kristoffer Gansing, Georgios Papadopoulos, Lotte Philipsen, Morten Riis, Bodil Marie Thomsen.

TUESDAY 8th Oct
schedule to be added.
17.30 dinner
Public event:
19.00 – 20.30 Søren Pold, Josephine Bosma

WEDNESDAY 9th Oct
09.30 Graham Harman talk at University.
schedule to be added.
18.00 deadline for newspaper.

Travel: First of all, there are of course good train connections from Berlin and Hamburg. There are also two airports nearby (Aarhus and Billund) with bus services straight to the city centre.

Accommodation: Apart from any friends you may have, there are a number of places you might try: Air B&B is one option (–Denmark)
but other cheap hotels/hostels include or

Organised by:
Christian Andersen, Geoff Cox (Digital Aesthetics/Participatory IT Research Centre, Aarhus University) + Tatiana Bazzichelli, Kristoffer Gansing (reSource transmedial culture berlin/transmediale).