
Four Notes Towards Post-Digital Propaganda

 “Propaganda is called upon to solve problems created by technology, to play on maladjustments and to integrate the individual into a technological world” (Ellul xvii).

How might future research into digital culture approach a purported “post-digital” age? How might this be understood?

1.

A problem comes from the discourse of ‘the digital’ itself: a moniker which points towards arbitrary configurations of base-2 units, impersonal architectures of code, massive extensions of modern communication and ruptures in post-modern identity. Terms are messy, and it has never been easy to establish a ‘post’ from something when pre-existing definitions continue to hang in the air. As Florian Cramer has articulated so well, ‘post-digital’ is something of a loose, ‘hedge your bets’ term, denoting a general tendency to criticise the digital revolution as a modern innovation (Cramer).

Perhaps it might be aligned with what some have dubbed “solutionism” (Morozov) or “computationalism” (Berry 129; Golumbia 8): the former critiques a Silicon Valley-led ideology oriented towards solving liberalised problems through efficient computerised means; the latter establishes the notion (and the critique thereof) that the mind, and everything associated with it, is inherently computable. In both cases, digital technology is no longer just a business that privatises information, but the business of extending efficient, innovative logic to all corners of society and human knowledge, condemning everything else through a cultural logic of efficiency.

In fact, there is a good reason why ‘digital’ might as well be a synonym for ‘efficiency’. Before any consideration is assigned to digital media objects (i.e. platforms, operating systems, networks), consider the inception of ‘the digital’ as such: that is, information theory. Where information had been a loose, shabby, inefficient notion, vague and specific to the various media of communication, Claude Shannon compressed all forms of communication into a universal system with absolute mathematical precision (Shannon). Once information became digital, the conceptual leap of determined symbolic logic was set into motion, and with it, the ‘digital’ became synonymous with an ideology of effectivity. No longer would miscommunication be subject to human finitude, nor to matters of distance and time, but only to the limits of entropy and the matter of automating messages through the support of alternating ‘true’ or ‘false’ relay systems.
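
That limit can be stated compactly. The formula below is the standard modern statement of Shannon’s measure, given here as a gloss rather than anything quoted from his paper: the entropy of a source is the average information, in bits, carried by each symbol it emits, and no lossless encoding can beat it.

```latex
% Shannon entropy: the average information, in bits per symbol, of a
% source X that emits symbol i with probability p_i.
H(X) = -\sum_{i} p_{i} \log_{2} p_{i}
```

For a single relay with two equiprobable states the sum comes to exactly one bit, which is why the binary digit becomes the universal currency of the scheme: every message, whatever its original medium, is priced in bits and transmitted at a rate bounded only by entropy and noise.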

However, it would be quite difficult to envisage any ‘post-computational’ break from such discourses – and with good reason: Shannon’s breakthrough was only systematically effective through the logic of computation. So the old missed encounter goes: Shannon presupposed Alan Turing’s mathematical idea of computation to transmit digital information, and Turing presupposed Shannon’s information theory to understand what his Universal Turing Machines were actually transmitting. The basic theories of both have not changed, but the materials affording greater processing power, extensive server infrastructure and larger storage space have simply increased the means for these ideas to proliferate, irrespective of what Turing and Shannon actually thought of them (some historians even speculate that Turing may have made the link between information and entropy two years before Bell Labs did) (Good).

Thus a ‘post-digital’ reference point might encompass the historical acknowledgment of Shannon’s digital efficiency and Turing’s logic, but by the same measure open up a space for critical reflection on how such efficiencies have transformed not only work, life and culture but also artistic praxis and aesthetics. This is not to say that digital culture is reducibly predicated on efforts made in computer science, but instead to fully acknowledge these structures and account for how ideologies propagate reactionary attitudes and beliefs within them, whilst restricting other alternatives which do not fit their ‘vision’. Hence, the post-digital ‘task’ set for us nowadays might consist in critiquing digital efficiency and how it has come to work against commonality, despite transforming the majority of Western infrastructure in its wake.

The purpose of these notes is to outline how computation has imparted an unwarranted effect of totalised efficiency, and to give this effect the description it deserves: propaganda. The fact that Shannon and Turing had multiple lunches together at Bell Labs in 1943, held conversations and exchanged ideas, but did not share detailed methods of cryptanalysis (Price & Shannon), provides a nice contextual allegory for how digital informatics strategies fail to be transparent.

But in saying this, I do not mean that companies only use digital networks for propagative means (although that happens), but that the very means of computing a real concrete function is constitutively propagative. In this sense, propaganda resembles a post-digital understanding of what it means to be integrated into an ecology of efficiency, and of how technical artefacts are literally enacted as propagative decisions. Digital information often deceives us into accepting its transparency, and into holding it to that account: yet in reality it does the complete opposite, offering no given range of judgements by which to tell manipulation from education, or persuasion from smear. It is the procedural act of interacting with someone else’s automated conceptual principles, embedding pre-determined decisions which not only generate but pre-determine one’s ability to make choices about such decisions, like propaganda.

This might consist in a shift from ideological definitions of false consciousness, as an epistemological limit to knowing alternatives within thought, to an engagement with real programmable systems which embed such limits concretely, withholding the means to transform them. In other words, propaganda incorporates how ‘decisional structures’ structure other decisions, whether conceptually or systematically.

2.

Two years before Shannon’s famous Master’s thesis, Turing published what would become the theoretical basis for computation in his 1936 paper “On Computable Numbers, with an Application to the Entscheidungsproblem.” The focus of the paper was to establish the idea of computation within a formal system of logic, which, when automated, would solve particular mathematical problems put into function (Turing, An Application). What is not always taken into account is the mathematical context of that idea: the foundations of mathematics were already precarious well before Turing outlined anything in 1936. Contra the efficiency of the digital, this is a precariousness built in to computation from its very inception: the precariousness of solving all problems in mathematics.

The key word of that paper, and its key focus, was the Entscheidungsproblem, or decision problem. Originating from David Hilbert’s mathematical school of formalism, ‘decision’ means something more rigorous than the sorts of decisions made in daily life. It really concerns ‘proof theory’, or how analytic problems in number theory and geometry could be formalised, and thus efficiently solved (Hilbert 3). Solving a theorem is simply finding a provable ‘winning position’ in a game. As with Shannon, ‘decision’ is what happens when an automated system of function is constructed in a sufficiently rigorous way that an algorithm can always ‘decide’ a binary, yes-or-no answer to a mathematical problem, when given an arbitrary input, in a sufficient amount of time. It requires no ingenuity, intuition or heuristic gambles, just a combination of simple consistent formal rules and a careful avoidance of contradiction.

The two key words there are ‘always’ and ‘decide’. They name the progressive end-game of twentieth-century mathematicians who, like Hilbert, sought a simple totalising conceptual system to decide every mathematical problem and work towards absolute knowledge. All Turing had to do was make explicit Hilbert’s implicit computational treatment of formal rules, manipulate symbol strings and automate them using an ‘effective’ or “systematic method” (Turing, Solvable and Unsolvable Problems 584) encoded into a machine. This is what Turing’s thesis meant (discovered independently of Alonzo Church’s equivalent thesis (Church)): any function calculable by a systematic method can be computed by a Turing machine (Turing, An Application), or in Robin Gandy’s words, “[e]very effectively calculable function is a computable function” (Gandy).
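
To make ‘systematic method’ concrete, here is a minimal sketch of a Turing machine in Python. The stepper is generic; the example machine and its rule table are invented for illustration and are not drawn from Turing’s paper.

```python
# A minimal Turing machine: a finite rule table drives a read/write head
# along a tape. The 'invert' machine below is an invented example.

def run_turing_machine(rules, tape, state="start", halt="halt", max_steps=10_000):
    """rules maps (state, symbol) -> (write, move, next_state); move is -1 or +1."""
    cells = dict(enumerate(tape))        # sparse tape; unwritten cells read as ' '
    head = 0
    for _ in range(max_steps):
        if state == halt:
            break
        symbol = cells.get(head, " ")
        write, move, state = rules[(state, symbol)]   # the 'decision' at each step
        cells[head] = write
        head += move
    return "".join(cells[i] for i in sorted(cells)).strip()

# Example machine: invert every bit, halting at the first blank cell.
invert = {
    ("start", "0"): ("1", +1, "start"),
    ("start", "1"): ("0", +1, "start"),
    ("start", " "): (" ", +1, "halt"),
}

print(run_turing_machine(invert, "10110"))  # prints 01001
```

Everything the machine does is a table lookup: at each step the pair (state, symbol) decides the write, the move and the next state, and that lookup is the whole of what ‘decision’ means at this level.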

Thus effective procedures decide problems, and they resolve puzzles by providing winning positions (like theorems) in the game of functional rules and formal symbols. In Turing’s words, “a systematic procedure is just a puzzle in which there is never more than one possible move in any of the positions which arise and in which some significance is attached to the final result” (Turing, Solvable and Unsolvable Problems 590). The significance, or the winning position, becomes the crux of the matter for the decision: what puzzles or problems are to be decided? This is what formalism attempted to do: encode everything through the regime of formalised efficiency, so that all mathematically inefficient problems are, in principle, ready to be solved. Programs are simply proofs: if it could be demonstrated mathematically, it could be automated.

In 1936, Turing showed that certain effective procedures (such as the Universal Turing Machine) could simulate the functional decisions of all the other effective procedures. Ten years later, Turing and John von Neumann would independently show how physical general-purpose computers offered the same thing, and from that moment on, efficient digital decisions manifested themselves in the cultural application of physical materials. Before Shannon’s information theory offered the precision of transmitting information, Hilbert and Turing developed the structure of its transmission in the underlying regime of formal decision.

Yet there was also a non-computational importance here, for Turing was equally fascinated by what computation couldn’t decide. His thesis was quite precise, so as to elucidate that if a mathematical problem could not be proved, a computer was not of any use. In fact, the entire focus of his 1936 paper, often neglected by Silicon Valley cohorts, was to show that Hilbert’s particular decision problem could not be solved. Unlike Hilbert, Turing was not interested in using computation to solve every problem, but pursued it as a curious endeavour for surprising, intuitive behaviour. Most important of all, Turing’s halting, or printing, problem was influential precisely because it was undecidable: a decision problem which couldn’t be decided.

We can all picture the halting problem, even obliquely. Picture the frustrated programmer or mathematician staring at their screen, waiting to know whether an algorithm will halt and spit out a result, or provide no answer. The computer itself has already determined the answer for us; the programmer just has to know when to give up. But this is a myth, inherited with a bias towards human knowledge, and a demented understanding of machines as infinite calculating engines rather than concrete entities of decision. For reasons of space, Turing didn’t understand the halting problem in this way: instead he understood it as a contradictory example of computational decisions failing to decide on each other, on the grounds that there could never be one totalising decision or effective procedure. There is no guaranteed effective procedure to decide on all the others, and any attempt to build one (or invest in a view which might help build one) either has too much investment in absolute formal reason, or ends up with ineffective procedures.
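
The contradiction can be staged in a few lines of Python. This is the standard diagonal argument rather than Turing’s own formulation: halts is the hypothetical total decision procedure, and contrarian is the program built to defeat it.

```python
# The diagonal argument behind the halting problem, sketched in Python.

def halts(program, input_data):
    """Hypothetical total decider: True iff program(input_data) eventually halts.
    No effective procedure can actually fill this in; that is the point."""
    raise NotImplementedError("no such total procedure exists")

def contrarian(program):
    # Do the opposite of whatever halts() decides about a program fed to itself.
    if halts(program, program):
        while True:          # told it halts? then loop forever
            pass
    return                   # told it loops? then halt at once

# contrarian(contrarian) would halt if and only if it does not halt:
# the one procedure meant to decide all the others decides against itself.
```

Feeding contrarian to itself forces the supposed universal decider into deciding against its own decision, which is all the proof requires.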

Undecidable computation might be looked at as a dystopian counterpart to the efficiency of Shannon’s ‘digital information’ theory. A base-2 binary system of information represents one of two possible states, whereby a system can communicate with one digit only in virtue of the fact that there is one other digit alternative to it. Yet the perfect transmission of that information is only available to a system which can ‘decide’ on the digits in question, and establish a proof to calculate a success rate. If there is no mathematical proof to decide a problem, then transmitting information becomes problematic for establishing a solution.

3.

What has become clear is that our world is no longer simply accountable to human decision alone. Decisions are no longer limited to the borders of human decisions and ‘culture’ is no longer simply guided by a collective whole of social human decisions. Nor is it reducible to one harmonious ‘natural’ collective decision which prompts and pre-empts everything else. Instead we seem to exist in an ecology of decisions: or better yet decisional ecologies. Before there was ever the networked protocol (Galloway), there was the computational decision. Decision ecologies are already set up before we enter the world, implicitly coterminous with our lives: explicitly determining a quantified or bureaucratic landscape upon which an individual has limited manoeuvrability.

Decisions are not just digital, they can be continuous, as computers can be: yet decisions are at their most efficient when digitally transferred. Decisions are everywhere and in everything. Look around. We are constantly told by governments and states that they are making tough decisions in the face of austerity. CEOs and directors make tough decisions for the future of their companies, and ‘great’ leaders are revered for being ‘great decisive leaders’: not just making decisions quickly and effectively, but also settling issues and producing definite results.

Even the word ‘decide’ comes from the Latin ‘decidere’, which means to determine something and ‘to cut off’. Algorithms in financial trading know not of value, but of decision: whether something is marked by profit or loss. Drones know not of human ambiguity, but can only decide between kill and ignore, cutting off anything in-between. Constructing a system which decides between one of two digital values, even repeatedly, means cutting off and excluding all other possible variables, leaving a final result at the end of the encoded message. Making a decision, or building a system to decide a particular ideal or judgement, must force other alternatives outside of it. Decisions are always-already embedded into the framework of digital action, always already deciding what is to be done, how it can be done or what is threatening to be done. It would make little sense to suggest that these entities ‘make decisions’ or ‘have decisions’; it would be better to say that they are decisions, and that ecologies are constitutively constructed by them.

The importance of neo-liberal digital transmissions is not that they are innovative, or worthy of a zeitgeist break, but that they demonstrably decide problems whose predominant significance benefits individual efficiency and the accumulation of capital. Digital efficiency is simply about the expansion of automated decisions, and about what sort of formalised significances must be propagated to solve social and economic problems: solutions which create new problems in a vicious circle.

The question can no longer simply be ‘who decides’, but now, ‘what decides?’ Is it the cafe menu board, the dinner party etiquette, the NASDAQ share price, Google PageRank, railway network delays, unmanned combat drones, the newspaper crossword, the JavaScript regular expression or the differential calculus? It’s not quite right to say that algorithms rule the world, whether in algo-trading or in data capture; the uncomfortable realisation is that real entities are built to determine provable outcomes time and time again: most notably ones for accumulating profit and extracting revenue from multiple resources.

One pertinent example: consider George Dantzig’s simplex algorithm. This effective procedure (whose origins lie in multidimensional geometry) can always decide solutions for the large-scale optimisation problems which continually affect multi-national corporations. The simplex algorithm’s proliferation and effectiveness have been critical since its first commercial application in 1952, when Abraham Charnes and William Cooper used it to decide how best to optimally blend four different petroleum products at the Gulf Oil Company (Elwes 35; Gass & Assad 79). Since then the simplex algorithm has had years of successful commercial use, deciding almost everything from bus timetables and work shift patterns to share trades and Amazon warehouse configurations. According to the optimisation specialist Jacek Gondzio, the simplex algorithm runs at “tens, probably hundreds of thousands of calls every minute” (35), always deciding the most efficient method of extracting optimisation.
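
To see what such a decision looks like in practice, here is a toy blending problem in the spirit of the Gulf Oil application, solved with SciPy’s linear programming routine (whose solvers descend from the simplex family). Every number below is invented for illustration.

```python
# A toy linear programme loosely in the spirit of the 1952 Gulf Oil blending
# problem: maximise profit from two blends under limited stocks of two base
# products. All figures are invented.
from scipy.optimize import linprog

profit = [-4.0, -3.0]          # per-barrel profit of blends A and B,
                               # negated because linprog minimises
stocks = [[2.0, 1.0],          # barrels of base 1 needed per barrel of A, B
          [1.0, 2.0]]          # barrels of base 2 needed per barrel of A, B
limits = [100.0, 80.0]         # barrels of base 1 and base 2 available

result = linprog(c=profit, A_ub=stocks, b_ub=limits, bounds=[(0, None)] * 2)
print(result.x, -result.fun)   # optimal barrels of each blend, total profit
```

Given the stated profits and stocks, the routine always decides: it returns one provably optimal blend (here forty barrels of A and twenty of B, for a profit of 220), cutting off every alternative allocation.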

In contemporary times, nearly all decision ecologies work in this way, accompanying and facilitating neo-liberal methods of self-regulation and processing all resources through a standardised efficiency: from bureaucratic methods of formal standardisation, with banal forms ready to be analysed by one central system, to big-data initiatives and simple procedural methods of measurement and calculation. The technique of decision is a propagative method of embedding knowledge, optimisation and standardisation techniques in order to solve problems, driven by an urge to solve even the most unsolvable ones, including us.

Google do not build into their services an option to pay for the privilege of protecting privacy: the entire point of providing a free service which purports to improve daily life is that it primarily benefits the interests of shareholders and extends commercial agendas. James Grimmelmann has given a heavily detailed exposition of Google’s own ‘net neutrality’ algorithms and how biased they happen to be. In short, PageRank does not simply decide relevant results, it decides visitor numbers. He concluded on this note:

With disturbing frequency, though, websites are not users’ friends. Sometimes they are, but often, the websites want visitors, and will be willing to do what it takes to grab them (Grimmelmann 458).
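
PageRank is itself a decision procedure in exactly this sense: a fixed iteration that converges on a single ranking. Below is a minimal power-iteration sketch over an invented four-page web; the link graph and damping factor are illustrative only.

```python
# Minimal PageRank by power iteration over an invented four-page web.
# Each page's score is shared out along its outgoing links; the damping
# factor d models a surfer who occasionally jumps to a random page.
links = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
pages = sorted(links)
d, n = 0.85, len(pages)
rank = {p: 1.0 / n for p in pages}

for _ in range(50):                        # iterate towards the fixed point
    new = {p: (1 - d) / n for p in pages}
    for page, outgoing in links.items():
        share = rank[page] / len(outgoing)
        for target in outgoing:
            new[target] += d * share
    rank = new

print(sorted(rank.items(), key=lambda kv: -kv[1]))  # one decided ordering
```

Whatever the pages contain, the iteration decides a definite ordering of visit-worthiness: ‘relevance’ is whatever the fixed point says it is.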

If the post-digital stands for the self-criticality of digitalisation already underpinning contemporary regimes of digital consumption and production, then its saliency lies in understanding the logic of decision inherent to such regimes. The reality of the post-digital shows that machines remain curiously efficient whether we relish in cynicism or not. Such regimes of standardisation and determined results were already ‘mistakenly built in’ to the theories which developed digital methods and means, irrespective of what computers can or cannot compute.

4.

Why then should such post-digital actors be understood as instantiations of propaganda? The familiarity of propaganda is manifestly evident in religious and political acts of ideological persuasion: brainwashing, war activity, political spin, mind control techniques, subliminal messages, political campaigns, cartoons, belief indoctrination, media bias, advertising or news reports. A definition of propaganda might follow from all of these examples: namely, the systematic social indoctrination of biased information that persuades the masses to take action on something which is neither beneficial to them nor in their best interests: or, as Peter Kenez writes, propaganda is “the attempt to transmit social and political values in the hope of affecting people’s thinking, emotions, and thereby behaviour” (Kenez 4). Following Stanley B. Cunningham’s watered-down definition, propaganda might also denote a helpful and pragmatic “shorthand statement about the quality of information transmitted and received in the twentieth century” (Cunningham 3).

But propaganda isn’t as clear as this general definition makes out: in fact, what makes propaganda studies such a provoking topic is that nearly every scholar agrees that no stable definition exists. Propaganda moves beyond simple ‘manipulation’ and ‘lies’, or the derogatory, jingoistic representation of an unsubtle mood; propaganda is as much about the paradox of constructing truth, and the irrational spread of emotional pleas, as it is about endorsing rational reason. As the master propagandist William J. Daugherty wrote:

It is a complete delusion to think of the brilliant propagandist as being a professional liar. The brilliant propagandist […] tells the truth, or that selection of the truth which is requisite for his purpose, and tells it in such a way that the recipient does not think that he is receiving any propaganda…. (Daugherty 39).

Propaganda, like ideology, works by being inherently implicit and social. In the same way that post-ideology apologists ignore their symptom, propaganda is also ignored. It isn’t to be taken as a shadowy fringe activity, blown apart by the democratising fairy-dust of ‘the Internet’. As many others have noted, the purported ‘decentralising’ power of online networks offers new methods for propagative techniques, or ‘spinternet’ strategies, evident in China (Brady). Iran’s recent investment in video game technology makes sense only when you discover that 70% of Iran’s population is under 30 years of age, underscoring a suitable contemporary method of dissemination. Similarly, in 2011 the New York City video game developer Kuma Games was mired in controversy when it was discovered that an alleged CIA agent, Amir Mirza Hekmati, had been recruited to make an episodic video game series intending to “change the public opinion’s mindset in the Middle East” (Tehran Times). The game in question, Kuma\War (2006 – 2011), was a free-to-play First-Person Shooter series, delivered in episodic chunks, the format of which attempted to simulate biased re-enactments of real-life conflicts shortly after they reached public consciousness.

Despite his unremarkable leanings towards Christian realism, Jacques Ellul famously updated propaganda’s definition as the end product of what he had previously lamented as ‘technique’. Instead of viewing propaganda as a highly organised systematic strategy for extending the ideologies of peaceful warfare, he understood it as a general social phenomenon in contemporary society.

Ellul outlined two types, political and sociological propaganda. Political propaganda involves governmental and administrative techniques which intend to directly change the political beliefs of an intended audience. By contrast, sociological propaganda is the implicit unification of involuntary public behaviour which creates images, aesthetics, problems and stereotypes, the purpose of which is neither explicitly direct nor overtly militaristic. Ellul argues that sociological propaganda exists “in advertising, in the movies (commercial and non-political films), in technology in general, in education, in the Reader’s Digest; and in social service, case work, and settlement houses” (Ellul 64). It is linked to what Ellul called “pre” or “sub-propaganda”: that is, an imperceptible persuasion, silently operating within one’s “style of life” or permissible attitude (63). Faintly anticipating Louis Althusser’s Ideological State Apparatuses (Althusser 182) by nearly ten years, Ellul defines it as “the penetration of an ideology by means of its sociological context” (63). Sociological propaganda is inadequate for decisive action, paving the way for political propaganda – its strengthened explicit cousin – once the former’s implicitness needs to be transformed into the latter’s explicitness.

In a post-digital world, such implicitness no longer gathers wartime spirits, but instead propagates a neo-liberal way of life that is individualistic, wealth driven and opinionated. Ellul’s most powerful assertion is that ‘facts’ and ‘education’ are part and parcel of the sociological propagative effect: nearly everyone faces a compelling need to be opinionated and we are all capable of judging for ourselves what decisions should be made, without at first considering the implicit landscape from which these judgements take place. One can only think of the implicit digital landscape of Twitter: the archetype for self-promotion and snippets of opinions and arguments – all taking place within Ellul’s sub-propaganda of data collection and concealment. Such methods, he warns, will have “solved the problem of man” (xviii).

But information is of relevance here, and propaganda is only effective within a social community when it offers the means to solve problems using the communicative purview of information:

Thus, information not only provides the basis for propaganda but gives propaganda the means to operate; for information actually generates the problems that propaganda exploits and for which it pretends to offer solutions. In fact, no propaganda can work until the moment when a set of facts has become a problem in the eyes of those who constitute public opinion (114).

Looking at Ellul’s quote sideways, the issue isn’t that strategies have simply adopted contemporary technology to propagate to an impressionable demographic, but that information is always-already efficient and effective in its automation. And with that, we can look at the relationship between digital transmission and computational decision anew.

Here’s Turing again, who in his last published essay, “Solvable and Unsolvable Problems” (1954), made a passing remark about the Church-Turing thesis already outlined in his 1936 paper:

This statement is still somewhat lacking in definiteness, and will remain so [...] The statement is moreover one which one does not attempt to prove. Propaganda is more appropriate to it than proof, for its status is something between a theorem and a definition. In so far as we know a priori what is a puzzle and what is not, the statement is a theorem. In so far as we do not know what puzzles are, the statement is a definition which tells us something about what they are (Turing, Solvable and Unsolvable Problems, 188)

The statement in question not only refers to Turing’s thesis, but also alludes to the predetermined structures for how something can be effectively calculable (Rosser) and then automated by a machine. Turing wasn’t exactly prophetic in calling it propaganda, considering his contributions to cryptanalysis and intelligence. Indeed, the historical relationship between Turing’s contribution to decoding information for the Government Code and Cypher School (the forerunner of GCHQ) and the technologies developed there continues to play itself out in the ongoing NSA mass surveillance revelations (Hopkins).

Yet why would Turing define a mathematical idea as propaganda rather than proof? He was well aware that his statement was not an effective procedure in itself, which is to say it cannot be proved. It is certainly about proofs, or how one can prove certain things in a formal system and what computational methods can decide results, but it doesn’t give us knowledge about what computational or systematic procedures are. The statement only tells us that automated machines can decide the same winning conditions through equivalent algorithmic methods. The statement or thesis does not tell us why computation might be able to solve problems at all; moreover, it can’t even tell us whether a problem can be decided before one attempts to find a solution. There is no effective procedure to ‘decide’ every effective procedure, as per the halting problem. Thus, following Turing, there is no ‘correct’ way of putting this proof to practical use. By contrast, no-one can dispute the resolution of a mathematical ‘proof’: for unlike science, once it is proved, by its very nature it cannot be unproved, unless an error lies at its centre.

Pushing speculation to its extremes, this might be the reason why Turing understood his thesis as propaganda and not proof; formal systems certainly seem to offer effective procedures to problems, but unless a winning position is proved in advance, it can never fully justify itself in offering solutions in all cases. There is no effective procedure to guarantee a proof about what effective procedures are, and this is Turing’s propaganda: there is no guaranteed provable winning position about the reality of winning positions. There is no guaranteed calculation which calculates all other calculations. There is only propaganda.

Turing’s propaganda operates as if it can always produce idealised solutions to problems, but in its operation, must hide uncomfortable paradoxes which allow its communication to occur in the first place. In other words, there are only concrete methods of effective procedure which unavoidably propagate the view that all problems can be totally solved in advance.

For what is computation if it isn’t the technical means of enacting effective, efficient, propagated, pre-determined results by societal means? What if the machine were the propagandist? Frederic Charles Bartlett argued that propaganda was primarily a decisive method of suggestion, not simply designed to control psychological behaviour, but to acquire specific, effective results through purposeful action (Bartlett). Perhaps we could add to this the deeper realisation that propaganda is no longer confined to the limits of psychological behaviour, or the limits of societal communities, but extends to the limits of decisional machines which decide results in an infrastructure.

Perhaps a post-digital culture might address newer forms of propaganda emergent in computational culture: not just posters, pamphlets, zines and broadcasts, but also gamification, platform devices, spy-ware, apparatuses, services and subscriptions: each one only allowing certain pre-determined outcomes to be realised, each one already deciding (or propagating) a limited number of routes, which users mistake for their own ‘openness’. If there is one thing Silicon Valley would love to solve, in its self-congratulatory wallowing, it is detecting whether a certain problem always has a solution: and whenever it comes up with one, it usually has a market to satisfy and a propagative strategy to make it seem beneficial.

Digital information in a post-digital ecology doesn’t seem to want to be free (Polk), or at the very least, it doesn’t want to look like it is: rather, digital information simply wants to propagate itself as a watchdog for problems that are always-already resolved, refusing its own transparency in turn. The best we can hope for is to understand information’s propagative effect, and ask not of its truth, but of what it propagates. Following Orwell, we should admit that as far as digital innovation is concerned, “[a]ll propaganda is lies, even when one is telling the truth. I don’t think this matters so long as one knows what one is doing, and why” (Orwell, Davison & Angus 229).


—————

Sources.

Althusser, Louis. “Ideology and Ideological State Apparatuses (Notes Towards an Investigation)”. In Lenin and Philosophy and Other Essays. Translated by Ben Brewster. New York: Monthly Review Press. 1971. pp. 127-186. Print.

Bartlett, F. C. Political Propaganda. Cambridge: Cambridge University Press. 1940. Print.

Berry, David. M. The Philosophy of Software: Code and Mediation in the Digital Age. London: Palgrave Macmillan. 2011. Print.

Brady, Anne-Marie. Marketing Dictatorship: Propaganda and Thought Work in Contemporary China. Lanham MD: Rowman & Littlefield. 2008. Print.

Church, Alonzo. “An unsolvable problem of elementary number theory”. American Journal of Mathematics. Vol 58. 1936. pp. 345–363. Print.

Cramer, Florian. “Post-digital: a term that sucks but is useful (draft 2).” Post-digital Research. Kunsthal Aarhus. Oct. 7-9, 2013. Web. <http://post-digital.projects.cavi.dk/?p=295>

Cunningham, Stanley B. The Idea of Propaganda: A Reconstruction. Westport, CT: Praeger, 2002. Print.

Daugherty, William J. “The creed of a modern propagandist”. In A Psychological Warfare Casebook. Edited by William J. Daugherty and Morris Janowitz. Baltimore: Johns Hopkins University Press. 1958. Print.

Ellul, Jacques. Propaganda: The Formation of Men’s Attitudes. Trans. Konrad Kellen & Jean Lerner. New York: Random House, 1973. Print.

Elwes, Richard. “The world maker.” New Scientist. Aug. 11, 2012. pp. 33-37. Print. Also published as: Elwes, Richard. “The Algorithm that runs the world.” New Scientist. Physics & Math. Aug. 13, 2012. Web. <http://www.newscientist.com/article/mg21528771.100-the-algorithm-that-runs-the-world.html?page=1>

Galloway, Alexander. Protocol: How Control Exists After Decentralization. Cambridge: MIT Press. 2004. Print.

Gandy, Robin. “Church’s Thesis and the Principles for Mechanisms”. In The Kleene Symposium. Edited by J. Barwise, H.J. Keisler, and K. Kunen. Amsterdam: North-Holland Publishing Company. 1980. pp. 123–148. Print.

Gass, Saul I. & Assad, Arjang A. An Annotated Timeline of Operations Research: An Informal History. New York: Kluwer. 2005. Print.

Golumbia, David. The Cultural Logic of Computation. Cambridge, MA: Harvard University Press. 2009. Print.

Good, Irving J. “Studies in the History of Probability and Statistics. XXXVII A. M. Turing’s Statistical Work in World War II”. Biometrika, Vol. 66: No. 2. 1979. pp. 393–396. DOI: 10.1093/biomet/66.2.393. Print.

Grimmelmann, James. “Some Skepticism About Search Neutrality”. In The Next Digital Decade: Essays on the Future of the Internet. Edited by Berin Szoka and Adam Marcus. Washington, D.C.: TechFreedom. 2010. pp. 435–460. Print.

Hilbert, David. “Probleme der Grundlegung der Mathematik” [Problems Concerning the Foundation of Mathematics]. Mathematische Annalen 102 (1930): pp. 1-9. Trans. Elisabeth Norcliffe. Print.

Hopkins, Nick. “From Turing to Snowden: how US-UK pact forged modern surveillance”, Guardian Online: The NSA Files: Decoded. Dec. 2, 2013. Web. <http://www.theguardian.com/world/2013/dec/02/turing-snowden-transatlantic-pact-modern-surveillance>

Kenez, Peter. The Birth of the Propaganda State: Soviet Methods of Mass Mobilization 1917 – 1929. Cambridge: Cambridge University Press. 1985. Print.

Morozov, Evgeny. To Save Everything, Click Here: Technology, Solutionism, and the Urge to Fix Problems that Don’t Exist. London: Allen Lane (Penguin). 2013. Print.

Orwell, George. All Propaganda is Lies, 1941–1942. Edited by Peter Davison, with Sheila Davison & Ian Angus. London: Random House. 2001. Print.

Polk Wagner, R. “Information Wants to Be Free: Intellectual Property and the Mythologies of Control”. Columbia Law Review, Vol. 103. May 2003; U of Penn, Inst for Law & Econ Research Paper No. 03-22; U of Penn Law School, Public Law Working Paper No. 38. Print. Also available at SSRN: <http://ssrn.com/abstract=419560> <http://dx.doi.org/10.2139/ssrn.419560>

Price, Robert & Shannon, Claude. E. “Claude E. Shannon: An Interview Conducted by Robert Price”. IEEE History Center, Interview #423. 28 July, 1982. Interview (Audio file).

Rosser, John. B. “An Informal Exposition of Proofs of Gödel’s Theorem and Church’s Theorem”. The Journal of Symbolic Logic. Vol. 4, No. 2. 1939. pp. 53–60. Print.

Shannon, Claude E. “A Mathematical Theory of Communication”. Bell System Technical Journal, Vol. 27. 1948. pp. 379–423, 623–656. Print.

Tehran Times, “Transcript – Confessions of the arrested CIA spy aired on Iranian TV,” Political Desk, Tehran Times Website. Dec 18, 2011. Web.<http://www.tehrantimes.com/politics/93662-transcript-confessions-of-the-arrested-cia-spy-aired-on-iranian-tv>

Turing, Alan. “On Computable Numbers, with an Application to the Entscheidungsproblem”. Proceedings of the London Mathematical Society, Series 2, Vol. 42 (1936-7). pp. 230–265. Print.

Turing, Alan. “Solvable and Unsolvable Problems.” In The Essential Turing: Seminal Writings in Computing, Logic, Philosophy, Artificial Intelligence, and Artificial Life: Plus Secrets of Enigma. Edited by Jack Copeland, (Oxford: Oxford University Press, 2004), pp. 582-595. Print. Originally published as Turing, Alan. “Solvable and Unsolvable Problems.” Science News. No. 31. (1954) pp. 7 – 23. Print.

On Remembering a Post-Digital Future.

We have always been post-digital, or at least I cannot recall a time when art wasn’t.

To claim this is surely ridiculous, as the ‘post’ condition demands the prior instantiation of a digital state that purportedly did not begin until the mid-1970s[1]. Yet if, for a moment, we entertain the idea that art has always been post-digital, in what way might this make sense? How might this enable a re-reading of pre-digital practices and inform our understanding of future post-digital practice?

1. The case for a post-digital anthrax.

In pursuing this question we should, of course, take note of the precedent of Latour’s We Have Never Been Modern (Latour, 1993). In its function as antecedent to the Post-Modern, Latour’s claim appears not to be susceptible to the same redundancy as that made in regard to the post-digital. The modern does not, after all, explicitly refer to its precedents in the way the terms post-modern or post-digital might. However, in Latour’s attempt to reconnect the social and the natural worlds by denying the distinction between nature and culture, We Have Never Been Modern operates from a similar retroactive position – a position in which the Modern assumes distinction from that which came before it. In this sense the Modern, too, was always post-conditional. This is not simply a case of semantic positioning but reflects fundamental aspects of Latour’s work on irreductions in regard to discovery and prior events.

“We always state retrospectively the previous existence of something, which is then said to have been discovered” (Latour, 1988).

Inasmuch as naming something might be considered a discovery of sorts, the post-digital has always existed, just as the anthrax bacillus existed before Pasteur named it (Latour, 1988). Discovery is not creation. More than this, then: naming, like discovery, works backward in time, creating that which existed before its existence was known. “Once again time does not move in one direction” (Latour, 1988).

In arguing as he has that time is a configurable control mechanism pursuant to a force of labour beyond subjective or objective perception (Latour, 1996), Latour challenges an anthropocentric world view that promotes humans as the arbiters of existence. The post-digital, like anthrax, may always have existed. It is not a state created by our observance of it, or something metaphysically conjured up exclusively for our amusement. It may previously quite happily have gone about its business undisturbed by human interest.

While the logic of a mind-independent existence is clearly viable in regard to extant entities such as anthrax, we must go one step further to accept phenomena such as the post-digital in this way. For surely a human idea cannot exist before it was thought of?

Extending Latour’s assertion that the world is comprised of relational networks formed by independent actants, Graham Harman’s Object Oriented Ontology (OOO) allows thoughts to operate as active agents on an equal footing with objects (Harman, 2013). For Harman, ideas are simply objects, and thus capable of existing independently of our recognition of them. Here there is a subtle but significant difference with Latour’s notion of irreduction as it affects our reading of the post-digital. Harman’s light-hearted aside that “I am a genius in something that doesn’t exist yet” (Harman, 2013) should be read not as claiming that all ideas have been thought and are simply waiting for humans to discover them – this would suggest some universalising apeiron that Harman clearly rejects. Rather, Harman’s statement should be seen as talking about the phenomenon of being a genius rather than the subject of his genius. Thus it can only be in hindsight of brilliance that we declare someone to be a genius, as the knowledge they have created becomes recognised. The idea of genius, like the idea of the post-digital, is like a programming variable waiting for instantiation: it must be declared before it can be defined.

We must consider then the possibility that the post-digital as a recognition-independent phenomenon existed not simply before Nicholas Negroponte claimed the digital revolution to be over in 1998 (Negroponte, 1998) or Kim Cascone coined the term in 2000 (Cascone, 2000), but before the digital itself.

Indeed Cascone, in coining the term, grounds the post-digital in pre-digital practices of the early twentieth century.[2] It is, according to Cascone, this shift in focus from foreground to background – from notes to noise – which leads to the glitch in digital sound processing (Cascone, 2000). While Cascone tends to draw on historical practices as precursors to the emergence of the post-digital glitch, I want to suggest that practices such as those of John Cage and the Futurists are not simply groundwork for an emergent genre but are in fact recognition of an existing post-digital practice. If you like: the post-digital before the ‘discovery’ of the post-digital.

In this sense the post-digital might be far closer to Latour’s anthrax bacillus than first acknowledged. It too may have been quite happily going about its business, oblivious to the accolade of critical recognition. Furthermore, if Cascone can find examples of the post-digital before even the digital era, the very nature of the digital must also be called into question.

2. Grounding the rabbit-hole.

Before we chase our own post-digital rabbit-tail down a futile, rhetorical rabbit-hole, it would be sensible to ground this argument within a digital ontology, in the hope that it may provide some terra firma in which to burrow.

If the digital is grounded in the material world, as John Wheeler would have us believe, it should help solidify the position of the post-digital as a state of practice (Wheeler, 1990). At the bottom of Wheeler’s ontological rabbit hole is the ‘it from bit’ (Wheeler, 1990) – the notion that every aspect of the physical world stems from a yes/no immaterial source. It from bit brings an abrupt dead-end to the rabbit hole and levels the ground by reducing the apeiron so scorned by Harman and other Speculative Realists to a simple binary decision at the lowest level. There is no master plan or grand scheme; simply a 0 and a 1 – a digital response in which nothingness cedes to physics through the act of observation.

This binary function is the fundamental nature of the digital, which operates as a set of discrete packets of information, as opposed to the analogue, which adopts a smooth and continuous state. The oppositional relationship between the digital and the analogue that is the basis for Digital Philosophy’s claim that the world is ultimately finite (Miller, 2013) stems from Lewis’s mathematically grounded definitions of the digital as discrete, and the analogue as continuous, forms of representation (Lewis, 1971).
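
Lewis’s distinction can be put formally; the notation below is my own shorthand rather than Lewis’s. A digital representation draws its values from a countable set of discrete states, an analogue representation from a continuum.

```latex
% Digital: values drawn from a countable, discrete set (here, n-bit strings).
x_{\mathrm{digital}} \in \{0,1\}^{n}
% Analogue: values drawn from a continuum of smoothly varying states.
x_{\mathrm{analogue}} \in \mathbb{R}
```

Everything that follows about the post-digital’s levelling of the material playing field turns on refusing to decide, at the level of the world itself, between these two value spaces.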

Indeed the seduction of the digital era was the distinction that it drew in regards to the analogue by offering an enlightenment in which each unit was perfect and infallible – infinitely lossless re/production at all levels. The analogue, by contrast, with its lax attitude to the world was degenerate and impure.

If anything, the post-digital is a rejection of this either/or dichotomy and an acknowledgment that an epistemic agent cannot establish whether nature is itself analogue or digital (Floridi, 2008). It simply does not follow that the world is ontologically either digital or analogue just because it appears so.

Instead we are left with the alternative position that the perception of a discrete or continuous mode is dependent on the level of abstraction assumed by an epistemic agent. As Luciano Floridi’s level of abstraction argument succinctly puts it, “reality can be observed as being either digital or analogue, depending on the epistemic position of the observer … and the level of abstraction adopted” (Floridi, 2008). Drawing both on Kant’s antinomies (Kant, 1964) and Young’s interference experiment (Harrington, 2011), Floridi[3] suggests that the oppositional digital/analogue framework on which Wheeler’s “it from bit” relies is untenable.

In refuting the distinction between the analogue and the digital, it is as if Floridi has stripped non-human agents of agency and reduced matter to an indeterminate grey mush in which the digital and the analogue are only distinguished in our perception of them. Although verging on an anthropocentric model, how, within such a framework, can we understand the nature of digital materiality, which is central to our positioning of post-digital art practice?

As the digital loses its allure in the afterglow, as Transmediale’s 2014 thematic statement proposes (Transmediale, 2013), we have seen the proliferation of practices that are distinctly or inherently uninterested in the distinction between digital and analogue materiality. The digital has become simply another studio material that no longer assumes a privileged position as it vies for studio space alongside paint and plaster. Indeed the fusion of digital and analogue functions – as typified by 3D printing, robotics and sensor-inclusive practices – exemplifies the untenable position of an “it from bit” argument that promotes a universal materiality.

Instead we see an engagement with materiality from the perspective of the work – a sort of conceptual-materialism that brings both analogue and digital materiality into play with each other. But how do either analogue or digital states possess materiality as non-corporeal concepts, neither being bound to a substance?

While affirming material agency, binding materiality to substance denies objects the potential of a primary role in a Latourian network, and denies the idea of equity between physical and metaphysical objects that is proposed by Speculative Realism. Instead, materiality might be treated as a non-corporeal state that is distinguished from material substance not just by a parallel etymology[4] but also, as Kant suggests, in his treatment of materie as differentiated from substance[5] (Kant, 1964), and as Heidegger suggests, in his assertion that “thingness” “does not lie at all in the material of which it consists, but in the void that holds it” (Heidegger, 1975). While both Kant and Heidegger support, in different ways, the reading of a substance-independent materiality, they maintain an anthropocentric position[6] that conflicts with the flat ontology of Speculative Realism.

It is Graham Harman again who reconciles this anthropocentric conflict in his critique of Heidegger’s Zuhandenheit – readiness-to-hand. In Harman’s theory of objects[7], objects are not ontologically exhausted by human perception. They remain independent and able to enter into a non-human Latourian network. If materiality is neither a default state of substance nor an attribute of human perception, the very idea of materiality seems doubtful unless we allow for a form of co-constitution that is formed by the relata between objects.

It is precisely this co-dependent dynamic between human and non-human actants that Leonardi (2010) clarifies in regard to digital media. Arguing for a definition of materiality that is inclusive of instantiations of non-corporeal agents, Leonardi (2010) stresses the affordances of materials rather than their physical properties, stating that it is in the interaction between artefacts and humans that materiality is constituted.

‘This alternative, relational definition moves materiality ‘out of the artefact’ and into the space of the interactions between people and artefacts. No matter whether those artefacts are physical or digital, their materiality is determined to a substantial degree by when, how and why they are used. These definitions imply that materiality is not a property of artefacts but a product of the relationships between artefacts and the people who produce and consume them’ (Leonardi 2010: 13).

At risk of falling into another anthropocentric stance, Leonardi fails to extend the argument to allow for a materiality constituted solely between non-human actants. Drawing again on Heidegger we can see how – in the example of the jug (Heidegger, 1975) – materiality is defined by a co-constitutional relation with the water that fills it.

Co-constituted materiality, then, might be thought of as an Object Oriented Philosophy version of Merleau-Ponty’s ‘intentional arc’, in which the object extends beyond itself while remaining within itself. To reinterpret Young’s reading of Merleau-Ponty: co-constituted objects such as materiality thus loop through objects, loop through objects and the world, and loop through objects and the virtual world (Young, 2011).

It is the ability of the co-constituted object to overreach itself while remaining embodied, to transcend subjectivity by entering into a relational schema, that emerges as a method by which materiality is actualised. Materiality is both an independent object – in an OOO sense – and an object that is dependent on the structural method of the actant network that realises it. Of course this definition of materiality as a structural method applies equally to both analogue and digital modes. In fact, it is these continuous and discrete states that constitute the underlying structural methods, which ultimately underpin materiality.

3. The life of Zoog – a Post-Proposition.

The central role of structural method in materiality is played out in the more-than-confusing linguistic parallels between Object Oriented Programming (OOP)[8] and Object Oriented Ontology (OOO). As a core feature of OOP, the nature of the object as an abstract concept has clear parallels to the nature of physical objects, to the extent that in many introductory OOP texts the first object class named is a Person, a Car or, as is the case with Daniel Shiffman, a Zoog – a ‘Processing-born being’ (Shiffman, 2008). Shiffman’s Zoog, like a person, has a childhood, must learn to walk and eventually reproduces through the programmed Variables, Conditionals and Functions that define it.

Object Oriented Programming’s use of concepts like object, inheritance and encapsulation are more than metaphorical aids. They are indicative of the interconnectedness of physical and technological digital materiality that grounded the digital in a material structural method well before Kim Cascone’s work on the aesthetics of failure recognised post-digital disillusionment.
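
The parallel is easiest to see in code. Shiffman’s Zoog is written in Processing; the rough Python transliteration below (its details are invented, not Shiffman’s) shows the borrowed vocabulary at work: state encapsulated in an object, new beings instantiated from a class, behaviour inherited and overridden.

```python
# A rough Python transliteration of Shiffman's Processing-born Zoog.
# Details are invented for illustration.

class Zoog:
    def __init__(self, x, y):
        self.x, self.y = x, y          # encapsulation: Zoog's state lives inside it

    def walk(self, step=1):
        self.y += step                 # behaviour: a function acting on that state

    def reproduce(self):
        return Zoog(self.x, self.y)    # instantiation: a new object from the class

class NightZoog(Zoog):                 # inheritance: a NightZoog is-a Zoog
    def walk(self, step=1):
        self.y -= step                 # overridden behaviour

zoog = Zoog(100, 100)                  # declared and defined: a discrete being
zoog.walk()
child = zoog.reproduce()
```

Each Zoog is a discrete, self-contained unit of state and behaviour, which is exactly the quality the argument below finds in Kaprow’s performers.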

‘Object oriented methodology with a promise “… everything in life is an object” seemed more like commonsense even before it was proven to be meaningful’ (Mehta, 2012).

It is no surprise, then, that OOP terminology emerged at MIT in the early 1960s[9], at precisely the time when Lucy Lippard’s ‘ultra-conceptual’ artists were dematerialising the art object and rethinking materiality. As Jacob Lillemose explains, Lippard’s dematerialisation of art as an object is not an argument for the disappearance of materiality but a rethinking of materiality in conceptual terms (Lillemose, 2008). When Lippard describes conceptual art as having emerged from two directions – “art as idea and art as action” (Lippard, 1973) – she fails to recognise that an action can be an idea, and thus the misnomer that conceptual art is not concerned with materiality doesn’t hold.[10]

‘[I]nstead of understanding dematerialization as a negation or dismissal of materiality as such, it can be comprehended as an extensive and fundamental rethinking of the multiplicity of materiality beyond its connection to the entity of the object’ (Lillemose, 2008).

Meanwhile, around the same time in MIT computer labs, OOP was attempting to make sense of dematerialised objects by establishing a programming structure grounded in material objects. While I accept the argument that, like most metaphorical terms, OOP’s object analogy now wears thin through overuse (Ewert, 2012), I also assert that OOP’s ability to model the world is less significant than its ability to inform the world about its own material state. In developing a programming language grounded in object metaphor, OOP reflected back to us something new about the state of the material world – the structural methods that underpin objects.

While we can thus see both the development of OOP and the dematerialisation of art as symptomatic of a broader desire to re-engage with materiality[11], seminal conceptual art works such as Allan Kaprow’s 18 Happenings in Six Parts (1959)[12] deepen the connection by engaging systems that are clearly aligned to digital structural methods[13].

Kaprow’s Happenings generated an environment that immersed the viewer inside the work, not just by putting them inside the performative space but by making them active agents in the work through tightly prescribed instructions which, in the case of 18 Happenings in Six Parts, fragmented narrative by breaking the audience up, moving them around and creating ambiguous ‘free’ time within the work (Rodenbeck, 2011).

Kaprow can be seen as effectively treating both human (performers and audience) and non-human objects as programmable units that execute simple ‘non-matrixed’ actions that embody and make the idea concrete (Kirby, 1995). Their function as programmable objects within the work is discrete and autonomous. Each actant is performing a task that is self-contained and digital in a way that parallels methods of encapsulation and instantiation in OOP.

What I propose is occurring in 18 Happenings in Six Parts, then, is an instance of a digital structural method that is a function of both a shared agency and a fragmented isolation that relocates the individual at the spatiotemporal centre of the materiality that is the work. What we have is not one continuous material but multiple co-constituted materialities all of which are inter-connected in the relational network of the piece.

In illustrating the ability of non-technological practices to realise a digital materiality by operating through a digital structural method, the work liberates the digital from technology and from the specific delineators of the digital era. The digital is no longer the exclusive domain of the computer. It is a material state defined by a structural method. The potential for the digital to exist prior to the advent of digital technology re-positions not only the digital but also the post-digital that might now be considered as more than simply a refutation of digital technologies.

The idea that art has always been post-digital now seems less ludicrous, not simply because the digital has been shown to be an enduring material state, but because of the parallels between post-digital disillusionment and an unbounded digital materiality.

The post-digital’s disinterest in the distinction between digital and analogue materiality is a levelling of the material playing field so that any distinction between them is no longer the definitive factor. Both are objects not as form but as method. In an ironic twist, the promises of a digital immateriality made by technology have instead found reality in the co-constituted interactions of human and non-human agents as material methods.

As a structural method the digital is not dependent on the technological constructs of the digital era that it is commonly associated with. The body – perhaps the most analogue of all objects – has been shown, through the example of Kaprow’s work, as capable of constructing a co-constituted digital structure, thus chronologically freeing the digital from specific media histories. In this sense “the digital” predates the development of digital-technologies, rather than being a condition determined by it.

4. After the coup?

If a new materiality in the guise of the post-digital has risen up and overthrown the governance of technologies that have for so long appeared to dictate its condition, what comes next? Is the new regime as susceptible to corruption as the old, or are we witnessing some new world order?

If the digital afterglow attempts to find anything, it is not a new pathway in the wasteland of the digital aftermath (Transmediale, 2014), but the retracing of a pathway that appeared long buried in the plethora of digital gadgetry that litters the material landscape.

There is nothing new about the post-digital, at least not in the sense of it being chronologically tethered to the digital era. Rather, the post-digital is a renewed interest in the materiality of the world that includes digital materiality. It is the epiphany that the digital as a structural method was a material long before the first 8-bit string.

The rethinking of digital practices proposed by the post-digital is not really that radical after all, then. While the so-called post-digital may be a symptom of resistance to the commodification of digital culture, it is not simply a nostalgic yearning for Jurassic technologies, as proposed by Andersen and Pold (2013). The post-digital might instead be considered as a neo-material state in which the materiality of “objects” is better understood not as a physical condition but in non-corporeal terms, as a relational structural method.

Although neo-materialism, in its Marxist positioning of human subjects as objects of labour (Simon, 2013), shares much in common with the post-digital’s rejection of the technological object, my use of the term here concerns the materiality of the digital and the post-digital. In this way, the post-digital is an affirmation of the significance of method rather than form in materiality, in a way that is not only compatible with a neo-material positioning of labour relations but also a further affirmation of the relevance of Speculative Realism’s non-anthropocentric positioning of objects in regard to materiality.

Whatever we call this rediscovered state of materiality that is emerging as post-digital, it is not a cybernetic post-human fusion of co-constituted technological flesh in which the digital is grafted onto the body to realise a new materiality (Mitchell and Thurtle, 2004).

Even if the neo-material body turns out to be digital after all, as it might conceivably do once we accept materiality as structural method, this is not a wetware art dream in which we find out that the body has always been digital. Far from being a dream, though, the so-called post-digital has simply woken us up to what other non-human objects knew all along.

Art has always been post-digital; we are only now remembering that it is.

Bibliography.

Andersen, C. U., and S. B. Pold. “Ten Theses on Cassette Tapes, History, and Interface Criticism.” Web log post. Post-digital Research. CAVI, Aarhus University, 30 Sept. 2013. Web. 4 Oct. 2013.

Bolt, Barbara. Art Beyond Representation: The Performative Power of the Image. London: I.B. Tauris, 2004. PDF.

Cascone, Kim. “The Aesthetics of Failure: ‘Post-Digital’ Tendencies in Contemporary Computer Music.” Computer Music Journal 24.4 (2000): 12-18. Print.

Dipan, M. “I Think It Was the Churn of Software Projects Prior to OO Days. OO Helped by Adding the Fundamentally Critical Concept – Model the Real World.” Web log comment. programmers.stackexchange.com. Stack Exchange, 1 Mar. 2012. Web. 18 Nov. 2013.

Dombrowski, D. A. “Heidegger’s Anti-Anthropocentrism.” Between the Species Winter & Spring (1994): 26+. Cal Poly Digital Commons. Web. 26 Nov. 2013. <http://digitalcommons.calpoly.edu/cgi/viewcontent.cgi?article=1871&context=bts>.

Ewert, Winston. “Does Object Oriented Programming Really Model the Real World?” Web log post. Programmers. Stack Exchange, 2 Mar. 2012. Web. 2 Dec. 2013.

Floridi, L. “Informational Realism.” Selected Papers from the Computers and Philosophy Conference (CAP2003), Canberra, Australia. Eds. J. Weckert and Y. Al-Saggaf. CRPIT 37. ACS, 2004. 7-12.

Harman, Graham. “Materialism Is Not the Solution.” AIAS Guest Lecture. AIAS Auditorium, Aarhus. 9 Oct. 2013. Lecture.

Harrington, Bill. “Thomas Young’s Double Slit Experiment.” MIT Video. MITvideo, 26 Oct. 2011. Web. 2 Dec. 2013. <http://video.mit.edu/watch/thomas-youngs-double-slit-experiment-8432/>.

Heidegger, Martin, and Albert Hofstadter. Poetry, Language, Thought. New York: Harper & Row, 1975. Print.

Kant, Immanuel, and Norman Kemp Smith. Critique of Pure Reason. London: Macmillan, 1964. Print.

Kirby, M. “Happenings.” Happenings and Other Acts. Ed. M. Sandford. London: Routledge, 1995. Print.

Latour, Bruno. The Pasteurization of France. Cambridge, MA: Harvard UP, 1988. Print.

Latour, Bruno. We Have Never Been Modern. Cambridge, MA: Harvard UP, 1993. Print.

Leonardi, Paul M. “Digital Materiality? How Artifacts without Matter, Matter.” First Monday 15.6-7 (2010): n. pag. Print.

Lewis, D. “Analog and Digital.” Noûs 5 (1971): 321-27. JSTOR. Web. 10 Mar. 2013. <http://www.jstor.org/stable/2214671>.

Lillemose, J. “Conceptualizing Materiality – Art from the Dematerialization of the Object to the Condition of Immateriality.” Histories and Theories of Intermedia, 2008. Web. 1 Oct. 2013. <http://umintermediai501.blogspot.co.nz/2008/01/conceptualizing-materiality-art-from.html>.

Lippard, Lucy. “The Dematerialization of Art.” Conceptual Art: A Critical Anthology. Eds. Alexander Alberro and Blake Stimson. Cambridge, MA: MIT Press, 1999. 46-50. Print.

Miller, D., and E. Fredkin. “What Is Digital Philosophy?” Digital Philosophy, 2013. Web. 25 Nov. 2013. <http://www.digitalphilosophy.org/about/>.

Mitchell, Robert, and Phillip Thurtle. Data Made Flesh: Embodying Information. New York: Routledge, 2004. Print.

Negroponte, N. “Beyond Digital.” Wired 6.12 (1998): n. pag. Print.

Rodenbeck, J. F. Radical Prototypes: Allan Kaprow and the Invention of Happenings. Cambridge, MA: MIT Press, 2011. Print.

Shiffman, Daniel. Learning Processing: A Beginner’s Guide to Programming Images, Animation, and Interaction. Amsterdam: Morgan Kaufmann/Elsevier, 2008. Print.

“Transmediale 2014.” Transmediale, n.d. Web. 2 Dec. 2013. <http://www.transmediale.de/festival>.

Wardrip-Fruin, Noah, and Nick Montfort, eds. The New Media Reader. Cambridge, MA: MIT Press, 2003. Print.

Wheeler, John A. “Information, Physics, Quantum: The Search for Links.” Complexity, Entropy, and the Physics of Information: Proceedings of the Santa Fe Institute Workshop, May–June 1989. Redwood City, CA: Addison-Wesley, 1990. 309+. Print.


[1] Although there is no definitive starting point, I take the release of the Apple-1 in 1976 as marking the proliferation of digital technology typified by the digital age.

[2] Cascone identifies both the Futurists and Cageian attention to noise from the 1950s as key identifiers of post-digital music.

[3] Floridi’s papers against a digital ontology lay the groundwork for Informational Structural Realism.

[4] As explained by JeeHee Hong, material and materiality are ambivalent terms that refer both to physical and non-physical matter (Hong).

[5] That the philosophical concept of substance is an a priori condition for our experience.

[6] For Heidegger, “humans are both a kind of entity and the clearing in which entities can be manifest” (Dombrowski, 1994).

[7] First laid out in Harman’s Tool-Being (2002) and later developed by Levi Bryant into Object-Oriented Ontology in 2009.

[8] OOP is a programming paradigm organized around objects rather than actions.

[9] Although Simula (1965) is the first recognized OOP language, its origins can be found in the work of MIT’s artificial intelligence group in the late 1950s and in Ivan Sutherland’s Sketchpad (1963). http://www.computerhistory.org/timeline/?category=sl

[10] Lippard acknowledges the deficiencies of the term in regard to the materiality of objects in the preface to Six Years: The Dematerialization of the Art Object … (Lippard, 1973).

[11] The Counterculture movement of the 1960s is taken as a rethinking of materiality as an idea and in action.

[12] Kaprow’s Happenings are seen as ‘a touchstone for nearly every discussion of new media as it relates to interactivity in art’ (Wardrip-Fruin 2003: 1). More than simply providing a precedent for current approaches to interactivity, early works such as Kaprow’s 18 Happenings in Six Parts also highlight inter-action as an exchange in which the materiality of the work is co-constituted by independent agents.

[13] A fuller analysis of materiality in Kaprow’s Happenings will be included in the upcoming publication Digital Movement: Essays in Motion Technology and Performance, eds. Popat & Salazar.

A Dialogue on Cassette Tapes and their Memories

Tape-in leader
Have we reached an end point of the cultural history of computing? To advertise the first Macintosh computer in 1984, Apple released a famous commercial video directed by Ridley Scott. In a dystopian future, the Macintosh will save civilization from a totalitarian state with obvious references to both George Orwell’s Big Brother and allegedly also the IBM mainframe systems that were controlling the market at the time. The future will not be like Orwell’s 1984 because Apple’s computer interface will redefine what computing means. It will no longer be an interface for conformity that absorbs the worker, but an interface for individual expression and cultural taste. No doubt, the Macintosh took part in a history where computers redefined cultural consumption, communication and the arts. The computer, and not least the smart phone and tablet, has grown to become a primary medium for cultural production and consumption.

Three decades later, the tables are turning. According to a leaked NSA presentation it is now Apple who is Big Brother, and enthusiastic iPhone customers who are the zombies living in a surveillance state (Rosenbach et al). In other words, the promise of a digital revolution also implies a reaction where dominant actors remain faithful to the institutions of intellectual property, as Stuart Moulthrop noted as early as 1991. The imagined free world of cultural computing has turned into a business of “controlled consumption” (Striphas; Andersen and Pold “Controlled Consumption…”). To prevent piracy, software and hardware providers such as Apple, Amazon and Google have introduced a new cultural business model that involves a licensing system for cultural software and content, combined with the locking down of software into hardware and IT “appliances”. By this, the user ceases to be a user. Instead of being able to use the computer, by accessing for instance the file system, the user relies on constant updates to manage the computer. The hardware is cheap, and the licenses and updates often come for free. However, as a wise pig once said, ‘if what you eat is free, you are the product’. Cultural production becomes a kind of consumption – a matter of uploading content into the cloud, and selecting filters and other pre-configurations afforded for next to nothing by service suppliers who have a clear interest in increasing the volume of content and users on their platforms. Simultaneously, otherwise passive cultural consumption is turned into the production of data about what is read, looked at, listened to, etc. (including where and by whom) – data that is valuable in marketing, and apparently for others too. In several cases, the providers of controlled consumption have been caught delivering surveillance data to military, state and industry intelligence. In this way, participatory network culture has been subsumed under a strictly monopolizing business model. The computer, which was originally developed as a military technology but redefined as emancipatory and revolutionary by Apple and others, is now back where it began: as a military intelligence technology.

What strategies of resistance and critique are left in this contemporary totalitarian digital culture? In a “post-digital” era of reaction (rather than revolution), the digital no longer seems to induce any disruption, as Florian Cramer notes (Cramer “Post-digital: A Term…”). With controlled consumption, the digital blends freely into popular culture – with no distinction between “analogue” and “digital”, “online” and “off-line”. Paradoxically, today’s disruption seems to originate in a fascination with forgotten and obsolete technologies. Even within controlled consumption, “old” media – for instance “Polaroid” filters and square formats on Instagram – hold a fascination. However, the disruptive fascination with the obsolete seems to be of a different kind, where the distinction between “digital” and “analogue” is replaced with a distinction between “shrink-wrapped” and “Do-It-Yourself,” as Florian Cramer also notes. The fascination with vinyl records, floppy disks, pneumatic tubes, and other historical and lost materials and platforms is in this sense a reaction to the “shrink-wrapped”. In contrast to Instagram and the use of services and filters, the ethics surrounding a disruptive use of old technologies originates in a hacker ethic.

It was not only Apple that believed in a digital uprising. Also in 1984, Steven Levy published a seminal book on hackers as ‘heroes of the computer revolution’. Levy’s hacker ethic included free access to all computers and all information, mistrust of authorities, as well as an insistence on beauty and art. In many ways, this ethic has always been in opposition to Apple’s. Where Apple believed that the digital revolution would happen through user-friendly design and aesthetically and perceptually pleasing hardware and software, hackers turned to the poetics of hardware and software, foregrounding their constructive elements. This involves both an inquiry into programming and circuit bending, and an inquiry into the social institutions that follow technologies, as described by Cornelia Sollfrank in her text on women hackers. In contrast to the “good” digital revolution carried out through user-involvement in interface design in the eighties, “hacking” even developed criminal connotations (in ignorance of the hacker ethic of respecting people’s data). Following this, when hacker/maker culture now inquires into Jurassic technologies it is a different kind of inquiry than the aesthetic appreciation of Polaroid images on Instagram: it is not an inquiry into the perceptually pleasing, but an inquiry into the poetics of materials and the social constructions of media technologies.

To sharpen the critical inquiry into Jurassic technologies, we suggest following two dimensions. First, we ask: how do we perceive history? The desire for the old is not merely nostalgia for a lost aesthetics; rather, it implies an alternative view on history – the memory of the past – itself. In this critical perspective, excavating the past is an attempt to challenge the course of events that has led to the techno-social constructions of controlled consumption and shrink-wrapped agency. In this light, inquiring into lost media technologies establishes imaginary correspondences with past practices and production modes that only exist in our memory. Second, we ask: whose memory? On the one hand, vinyl records, cassette tapes, floppy disks and so forth are media that contain human memories as texts, sounds and images. On the other hand, following an inquiry into the poetics of materials and how our memories are stored through, for instance, phonography and magnetism, the technologies also seem to remember the humans. In other words, a reinvestment in old media is also an excavation of the materials’ own reality.

In the following, we discuss these two dimensions of a “post-digital” critique by setting up a dialogue between two compact cassettes. “Cassette A” represents how we remember cassette tapes, and how our memories of material practices reflect the subsumption of network culture by controlled consumption. “Cassette B” represents how the cassette tape as a material remembers us. The dialogue between the two cassette tapes is based on fragile timing mechanisms – neither linear nor compatible with digital clock frequencies, they may get slightly out of sync.

I.A The Consortium for the Preservation of Cassette Tape
In the summer of 2013, The Consortium for the Preservation of Cassette Tape presented CASSETTE MEMORIES, ‘a media archaeological excavation of the cassette tape and its use – from a human and tape perspective’ (a workshop at Roskilde Festival, initiated by Andrew Prior, Morten Riis and Søren Pold in collaboration with Roskilde Libraries). The workshop explored the overlooked sound archives of cassette tapes residing in closets, second-hand shops and flea markets, and invited participants to discover the material of cassette tapes by disassembling them, making loops, and remixing old cassette tapes. Simultaneously, the participants’ memories of their practices of playing and recording were documented on cassette recorders. Cassette tapes are deeply associated with our childhood memories of recording voices, listening to music and creating mixtapes. In this sense, the cassette tape represents our past when found in an old drawer, and brought to the workshop to be tampered with, cut up, and looped in new ways. But it is also a recollection of poor signals and incompatible noise reduction.

I.B – Cassette representation (or, the question concerning representation) 
By posing the question of how the tape recorder represents and understands the world, we can get closer to the actual physical operational technology itself, as an exposition of length, time and magnetism and their way of representing reality. For the scientist, the tape recorder was traditionally used to document and record the sounds of the world, which could then be brought back to the lab for further analysis focused on the spoken or auditory content of the tape – as opposed to an investigation of how the sound of the tape itself understands its surroundings. Later, digital technology made the tape recorder obsolete, but the analysis still focuses solely on the content, making the medium somewhat unimportant. But there is a different approach, in which the cassette tape recorder is transformed into an object of carpentry; a term inspired by the work of Graham Harman and developed further within the object-oriented ontology of Ian Bogost.

II.A Cassette materialism
What is it that the tape records? What does it show us when brought to the workshop? In his essay “Theses on the Philosophy of History” Walter Benjamin writes: “To articulate the past historically does not mean to recognize it ‘the way it really was (Ranke).’ It means to seize hold of a memory as it flashes up at a moment of danger” (Thesis VI). It seems clear that Benjamin criticizes historicism. We cannot seize hold of the past merely by describing a level that pre-determines a logical course of events. History as ‘the way it really was’ is more ambiguous (as Benjamin’s criticism of the founder of modern, source-based history, Leopold Ranke, also indicates). In his theses, Benjamin explicitly addresses historical materialism, and in continuation of this, we propose to explore the revival of cassette tapes as a material history pointing beyond a simple revelation of material and technological determination. This implies that it is not merely the productive forces (our tools, instruments, technology, knowledge, etc.) that define our history as a changing mode of production (tribal, feudal, capitalist, etc.) in a simple one-way – techno-deterministic – direction. In other words, cassette memories are not just revelations of how social relations are most fundamentally production relations, or of the essential role of the cassette tape in the making of a ‘pro-sumer capitalist system’ (or whatever one chooses to call it). Technology and the processing of magnetic signals did not make history and did not define our language and social relations in new ways, nor did any other technology. Technology and material production levels are always met with specific cultural interpretations and practices. Likewise, cassette tapes are used through a myriad of practices that still carry potentials.

II.B – The carpentry of cassettes
A central term for philosopher and game designer Ian Bogost, as unfolded in his book Alien Phenomenology from 2012, is the notion of carpentry, described as the philosophical practice of making things. As philosophical lab equipment (Bogost 100), carpentry becomes a perspective on creative work that poses philosophical questions, as when matter is employed specifically for philosophical use (Bogost “Carpentry vs. Art”), executing what could be denoted as applied ontology. This follows from the claim that writing is dangerous for philosophy because writing is only one form of being, a comment on the assumption that we relate to the world only through language (Bogost Alien Phenomenology 90). At the core of carpentry lies the understanding that philosophy is practice just as much as it is theory: the practice of constructing artefacts as a philosophical practice (Bogost 92). The term extends the ordinary sense of woodcraft to include any material, and additionally it draws on Graham Harman’s philosophical sense of “the carpentry of things” (Bogost 93), a term that refers to “how things fashion one another and the world at large” (Bogost 93). But in Bogost’s terminology carpentry “entails making things that explain how things make their world” (Bogost 93), thus enabling not only theory in practice but, moreover, practice as theory (Bogost 111).

The term carpentry is unfolded within the larger context of object-oriented ontology or philosophy, which originates in the speculative realism of Graham Harman, Ray Brassier, Quentin Meillassoux and Iain Hamilton Grant. At its core, a speculative realist is opposed to correlationism – the view that being exists only as a correlate between mind and world, placing humans at the center (Bogost 4; Harman). As an example, Heidegger claims that objects can exist outside human consciousness, but their being exists only in human understanding (Bogost 4). Thus, to be a speculative realist “one must abandon the belief that human access sits at the center of being, organizing and regulating it like an ontological watchmaker” (Bogost 5), and instead shift focus to include all possible objects, holding that all things exist equally – thus introducing notions of flat or tiny ontology.

Ultimately this means that when humans are removed from the center of the equation, more focus is directed towards the various objects that the world consists of – which for Bogost means investigating, for instance, what it is like to be a pixel within a computer game.

III.A Cassette tape interfaces
Rather than beginning by discussing whether to prioritize the auditory signs of the recorded voices or the signals embedded in the materiality of tapes, we suggest illuminating the relation between the sign and the signal (see Andersen and Pold Interface Criticism). What is a magnetic cassette tape in this perspective? Along with other productive forces and technologies, cassette tapes must be seen as part of the same realm as language, in the sense that language, too, turns out to be material (as on a cassette tape), and this material is in itself a speech act (at the workshop people talked about sending their voices to their loved ones across the Atlantic and about the investment and gesture of recording and giving away a mixtape). A qualitative separation of material signal processing and media representation is therefore futile. In every way, the material of the cassette tape (the playback head, the noise reduction system, etc.) is as much a social and linguistic construct (including DIN- and IEC-defined standards and protocols for equalization) as it is the physical manifestation of a representation (of a memory, a voice, a recording). This ambiguous double nature allows for a critique of the social and political reality of the technology.

III.B – Magnetic operations
Material that is capable of being magnetised is referred to as ferrous, and the molecules of such a material are linked together in the form of a crystal structure (Earl 21). Each complete crystal element contains a certain number of molecules, depending on the material; ferric oxide, for example, which forms the basis of the coating of Fe tape, has eight molecules per element (Earl 21). The crystal elements can be regarded as domains of randomly oriented magnetic fields, but when the material is magnetised the domains are swung from their randomly distributed positions into alignment. The strength of the resulting magnet is determined by the number of domains in alignment. When all the domains are in alignment the material is said to be magnetically saturated, that is, incapable of accepting further magnetism or producing a greater magnetic field (Earl 22). The tape recorder’s ability to capture and reproduce auditory content centres on three tape heads – erase, record and playback – each containing an electromagnet with the ability to convert an electrical signal into a magnetic force that can be stored on the passing magnetic tape, and conversely to convert the magnetic content of the tape into electrical current.
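
Read as a model rather than as physics, the domain account can be sketched in a few lines of code. The following is a toy simulation (the numbers are illustrative assumptions, not measured tape parameters) in which an applied field aligns a proportion of the domains until none remain to align and the material saturates.

```python
# Toy model of magnetic saturation: an applied field aligns a share of
# the randomly oriented domains; once all domains are aligned, a
# stronger field cannot increase the stored magnetisation.
# Illustrative numbers only, not measured tape parameters.

N_DOMAINS = 1000

def magnetisation(field_strength, sensitivity=0.01):
    """Return the number of aligned domains for a given applied field."""
    aligned = int(field_strength * sensitivity * N_DOMAINS)
    return min(aligned, N_DOMAINS)   # saturation: bounded by N_DOMAINS

for field in (10, 50, 100, 200):
    m = magnetisation(field)
    state = "saturated" if m == N_DOMAINS else f"{m} domains aligned"
    print(f"field {field:>3}: {state}")
```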

IV.A The cassette tape as a document of barbarism
Benjamin’s thinking is an encouragement to think of the renewed interest in the cassette tape as something that flashes up in a moment of danger. The historical materialist must therefore address history differently, as Benjamin puts it: ‘There is no document of civilization, which is not at the same time a document of barbarism. […] A historical materialist therefore dissociates himself from it as far as possible. He regards it as his task to brush history against the grain.’ (Thesis VII) With no attempt to recreate a media history, CASSETTE MEMORIES recalls the lost potentials of cassette tapes in relation to a contemporary digital culture. The cassette tapes are explored as a “configuration pregnant with tensions” in order to recognize a “revolutionary chance” and “blast a specific era out of the homogeneous course of history” (Thesis XVII).

IV.B – The danger of erasing  
At the erase head, a high-frequency (approximately 80 to 100 kHz), high-amplitude signal is sent to the erase electromagnet, randomising the magnetic particles on the tape and erasing any previously recorded material. Music varies in frequency and amplitude, and so does the magnetic field from the record head that imprints the magnetic picture of the audio signal on the tape. On playback, the tape scrolls under the playback head and the moving magnetic fields induce a varying voltage in the head. This voltage is an electrical representation of the magnetic signal on the tape. Subsequently, the signal is passed through an equalisation and amplification circuit so that the recorded music becomes audible in the connected speakers.

V.A The danger of techno-cultural discourse
Techno-cultural discourse leads to the belief that technology represents a history of increased efficiency, and that the conditions of present digital technologies (producing, sharing, mixing, etc.) can maximize individual freedom and social production. CASSETTE MEMORIES challenges these myths by exploring a past discourse in the present – as a potential criticism. The return to old media holds no essence but expresses an awareness of how our material technologies are also signs, and our signs technological, and of how the coupling of signs and material by digital technology leads to a form of control.

It is not of particular interest that the cassette tape, as a tool for reproduction and cultural participation, has contributed to our contemporary social reality of product relations (participatory labor). What is interesting is the discourse and the myths around the technologies. They have led to the belief that the employment of technology represents a god-given chain of events leading to increased efficiency, and that the maxims of the technology (producing, sharing, mixing, etc.) can create individual freedom and mastery when navigating through social reality (an idea not unlike Georgios Papadopoulos’ critique of a totalizing market (21)). Such constructs cannot be addressed as material determinism, but CASSETTE MEMORIES can challenge these myths by exploring a past discourse in the present – as a potential criticism. In this way, the return to old media does not hold an essence. The material turn is realist in the sense that it expresses an awareness – not of how materials are more real than signs – but of how our technologies are also signs, and our signs technological, and of how the coupling of signs and material in technology also incorporates a form of control.

V.B – The “sound on sound button” (or, the switch of carpentry)
The switch of carpentry enables a recording method in which layers of sound become superimposed upon each other. This “sound on sound button” – which in CASSETTE MEMORIES was built into a modified cassette recorder – disables the erase head of the tape recorder and reconfigures the cassette machine into an object of carpentry. The button gives us the possibility to display and monitor the cassette tape’s state of magnetic saturation, a state in which all possible resources of the ferrous coating on the tape are used. This shows the true personality of the recording medium and its attempt to capture the complex pulsating sound waves of humans talking, walking, and playing music onto the tape. The recorded sounds gradually become more and more saturated, forcing the magnetic domains in the same direction, but still leaving room to listen to the contours of the previously recorded material while new recordings are layered on top.
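
A toy signal model (an assumption-laden sketch, not the actual recorder's circuit) captures the behaviour described above: with the erase head disabled, each recording pass adds to the magnetisation already on the tape, and the sum is clipped at the saturation level, so the contours of earlier layers persist beneath the new material.

```python
# Sound on sound as a toy signal model: with the erase head disabled,
# each pass is added to what is already on the tape, and the result is
# clipped at the tape's saturation level. Illustrative values only.

SATURATION = 1.0   # maximum magnetisation the coating can hold

def record_pass(tape, new_signal):
    """Superimpose a new layer onto the existing tape contents."""
    return [max(-SATURATION, min(SATURATION, old + new))
            for old, new in zip(tape, new_signal)]

tape = [0.0] * 8                                         # blank loop
layer_1 = [0.4, -0.3, 0.5, 0.2, -0.4, 0.3, -0.2, 0.1]    # first pass
layer_2 = [0.5] * 8                                      # second pass

tape = record_pass(tape, layer_1)
tape = record_pass(tape, layer_2)

# The first layer still shapes the result, pushed towards saturation.
print(tape)
```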

VI.A Cassette tape allegories
The cassette tape does not hold a truth but is an allegory. As an allegory, the cassette tape and CASSETTE MEMORIES seize ‘hold of a memory as it flashes up at a moment of danger.’ It establishes an imaginary correspondence to another historical moment, but not as a yearning for a lost time (to paraphrase a notion of history present in the writings of Marcel Proust). There is no radical power in looping and cutting up tapes today, but the imaginary construction represents another way of experiencing producing, sharing, mixing, etc. – as Florian Cramer characterizes post-digital strategies, it can be seen as “a form of social networking that is not controlled or data-mined by those companies [Google, Apple, Amazon, and Facebook]” (“Post-digital Writing…” 237).

VI.B – Compact cassette time
Time is a crucial factor. When recording on a compact cassette, time is measured in the length of tape played by the tape recorder at an average speed of 4.76 cm/sec. The specific cassette recorder used in CASSETTE MEMORIES is the Philips D6260, and according to the service manual the tape speed can vary by up to 3%, making the notion of accurate time questionable.
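
The arithmetic behind that doubt is easy to make explicit. A short sketch (using only the nominal 4.76 cm/sec speed and the 3% tolerance quoted above) shows how much tape a nominal minute occupies and how far ‘one minute’ can drift on playback:

```python
# Tape time as tape length: at a nominal 4.76 cm/sec, one 'minute' of
# recording occupies about 285.6 cm of tape. With the quoted 3% speed
# tolerance, that length plays back in roughly 58.3 to 61.9 seconds.

NOMINAL_SPEED = 4.76    # cm/sec
TOLERANCE = 0.03        # +/- 3% speed variation

length = 60 * NOMINAL_SPEED                           # cm per nominal minute
fastest = length / (NOMINAL_SPEED * (1 + TOLERANCE))  # playback runs fast
slowest = length / (NOMINAL_SPEED * (1 - TOLERANCE))  # playback runs slow

print(f"one 'minute' of tape: {length:.1f} cm")
print(f"playback time: {fastest:.1f} s to {slowest:.1f} s")
```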

If time is length – or, more accurately, the execution of length – then the precision of the tape recorder and the idea of an “operative tape recorder” become extremely important (which to a great extent references Wolfgang Ernst’s notion of micro-temporality). But things get even more complex when using a 1-minute continuous-loop cassette with sound on sound recording, as was the case in CASSETTE MEMORIES. This method challenges the notion of documented time (seconds, hours, days, years). Time gets transferred into complex states of recorded time, real time, machine time, past time, and tape time (the execution of tape length), creating a compound of different conceptualisations of time existing as layers on top of each other.

VII.A Contemporary interface culture
What is a contemporary interface culture? Mobile interfaces like smartphones and tablets represent a new generation of the interface, a generation that integrates earlier developments as well as – what seems to be a qualitative turn – a totalitarian controlled consumption interface coupled with a ‘war on general-purpose computing’ (Doctorow).

The first human-computer interfaces are technical control panels with switches. Often the agenda is related to automation, and the computer is used for batch processing that demands no user input. Later on, textual interfaces such as the command line interfaces of DOS and UNIX make real-time interaction possible. The Macintosh in 1984 marks another moment in the history of interfaces, where the graphical user interface leaves the labs (where it was developed through the 1960s and 1970s by Ivan Sutherland, Douglas Engelbart, Alan Kay, Adele Goldberg and others). The GUI is an integral part of real-time interaction in the personal computer, and also the main object of inquiry for interface design and Human-Computer Interaction. With the Web, and especially Web 2.0, the interface is supplemented with a communicative, networked, and social dimension. In combination with mobile interfaces, data surveillance and sensing, physical space is increasingly saturated with computation – leading to new techno-myths of a totalizing technology, exemplified in the buzz around smart cities, cloud computing, quantification of the self, gamification, big data, etc. Myths are powerful illusions that tend to shape our reality. Hence, the interface becomes ubiquitous and totalitarian – an impenetrable surface, seamlessly attached to all things and behaviours in a process of invisible immaterialisation.

VII.B – OOO, OOP, OOMT <=> micro temporal media archaeology
The self-made “sound on sound switch” and the use of loop cassettes change the tape recorder’s status from a technological object into an object of carpentry – philosophical lab equipment used to practice philosophy. Layers of sound become superimposed upon each other; furthermore, various notions of recorded time get superimposed upon each other, making the sound on sound loop tape difficult to analyse in a traditional textual manner and forcing us to shift our analytical perspective towards the actual recording technology itself.

These philosophical questions posed by carpentry reveal an alternative reality of the operational tape recorder. This reality is – following the thoughts of Wolfgang Ernst – somewhat un-historical, meaning that the specific function of the machine lies outside history and human discourse. However, it is not outside the discourse of the cassette tape itself. The perspective is thus shifted towards the medium itself as an operating entity (Ernst “Towards a Media Archaeology”). Thus, a merger of object-oriented ontology and media archaeology presents itself, bringing an awareness of the moment when media themselves become active “archaeologists of knowledge” (Ernst “Media Archaeography” 239). From a media archaeological point of view, only technical media are able to register real physical signals. The cassette tape not only preserves the memory of human cultural language, but also the knowledge of how the cassette recorder stores and operates the magnetic domains of the ferrous coating of the running tape. The “carpentry” of an artistic performative context exposes the knowledge embodied in the operational technology and reconfigures it into a philosophical practice; that is, it exposes the saturation of the physical material and uncovers questions regarding our understanding of documented time. In addition, such perspectives reflect the use of our current digital technologies for documenting our sounding reality, by stressing the importance of paying attention to the media archaeological moment of the operational machine.

VIII.A Contemporary interface criticism
What is a contemporary interface criticism? Can we disrupt the development of interfaces and the corporate subsumption of a digital revolution sketched out above? Are there new ways of reconfiguring the contemporary interface culture? A post-digital response to the interface’s invisible process of immaterialization is a reconfiguration of signal and sign – of the material processes of computation and their social and political realm; of material and social procedures and protocols. If current materialist practices with bygone media aim to be more than a parenthesis in this reconfiguration (more than a trendy, hipster revival of the old, which could just as well be subsumed in trendy new apps for the iPhone), they need to question their notion of material and materialism in a way that embraces a potential for criticism, if not redemption, of current interfaces and their culture – in the words of Benjamin, a “weak Messianic power” (Thesis II).

VIII.B – Cassette types
Type I – Ferric oxide (HF-ES90)
Type II – Chromium dioxide, CrO2 (CR-E II)
Type III – Ferro-chrome (FeCr90)
Type IV – Metal-formulated (Metal-ES60)

Tape-out leader
How can the carpenter contain the political reality of the historical materialist? How can the historical materialist contain the reality of the material?

Works cited:

Andersen, Christian Ulrik & Søren Bro Pold. “Controlled Consumption Culture.” The Imaginary App. Eds. Paul D. Miller and Svitlana Matviyenko. Cambridge, Massachusetts: The MIT Press, forthcoming. Print.

—. “Controlled Consumption Interfaces.” A Peer-reviewed Journal About 1.2 (2013). Web < http://www.aprja.net/?p=168>

— (eds.). Interface Criticism: Aesthetics Beyond Buttons. Aarhus: Aarhus University Press, 2011. Print.

Benjamin, Walter. “Theses on the Philosophy of History”. Illuminations. Ed. W. Benjamin. Trans. Harry Zohn. New York: Schocken Books, 1985. Print.

Bogost, Ian. Alien phenomenology, or, What it’s like to be a thing. Minneapolis: University of Minnesota Press, 2012. Print.

Bogost, Ian. “Carpentry vs. Art: What Is the Difference?” Web <http://www.bogost.com/blog/carpentry_vs_art_whats_the_dif.shtml>

Cramer, Florian. “Post-digital Writing.” Anti-Media – Ephemera on Speculative Arts. Rotterdam: Institute of Network Cultures / nai010 Publishers. 227-239. Print.

Cramer, Florian. “Post-digital: a term that sucks but is useful (draft 2).” Post-digital Research. Kunsthal Aarhus. Oct. 7-9, 2013. Web < http://post-digital.projects.cavi.dk/?p=295>

Doctorow, Cory. “The Coming War on General Purpose Computing.” 28th Chaos Communications Congress, Berlin, 2012. Keynote. Web < http://craphound.com/?p=3848>

Earl, J. Cassette Tape Recorders. Watford, Herts: Fountain Press, 1977. Print.

Ernst, W. Towards a Media Archaeology of Sonic Articulations. Paper presented at the Hearing Modern History, Berlin, 2010. Web <http://www.medienwissenschaft.hu-berlin.de/medientheorien/downloads/publikationen/ernst-towards-a-media-archaeology-of-sonic-articulations.pdf>

Ernst, W. “Media Archaeography.” Media Archaeology: Approaches, Applications, and Implications. Eds. E. Huhtamo and J. Parikka. Berkeley and Los Angeles: University of California Press, 2011. 239-255. Print.

Harman, G. Brief SR/OOO tutorial. Jul. 23, 2010. Web <http://doctorzamalek2.wordpress.com/2010/07/23/brief-srooo-tutorial/>

Levy, Steven. Hackers – Heroes of the Computer Revolution. New York: Anchor Press/Doubleday, 1984. Print.

Moulthrop, Stuart. “You Say You Want a Revolution? Hypertext and the Laws of Media.” 1991. The New Media Reader. Eds. N. Wardrip-Fruin and N. Montfort. Cambridge, Massachusetts & London, England: The MIT Press, 2003. 691-704. Print.

Papadopoulos, Georgios. Notes Towards a Critique of Money. Maastricht: Jan van Eyck Academie, 2011. Print.

Pold, S. , A. Prior, and M. Riis. Cassette Memories. Workshop at Roskilde Festival, 2013. Web <http://darc.imv.au.dk/?p=2936>

Rosenbach, M., L. Poitras and H. Stark. “iSpy: How the NSA Accesses Smartphone Data.” Spiegel Online International, Sep. 9, 2013. Web <http://www.spiegel.de/international/world/how-the-nsa-spies-on-smartphones-including-the-blackberry-a-921161.html>

Sollfrank, Cornelia. “Women Hackers – a report from the mission to locate subversive women on the net.” Next Cyberfeminist International. Rotterdam, 1999. Web <http://www.obn.org/hackers/text1.htm>

Object-disoriented sound: listening in the post-digital culture


Prelude: the sonic explosion  

For some time, I have been deeply concerned with the mindfulness of listening and the subjective ramifications of auditory perception. The thoughts that envelop these concerns essentially stem from questions of perpetual mobility and nomadism that are perhaps symptomatic of the contemporary post-digital culture. A nomadic listener is affected by a fleeting sound, which appears and diminishes as it triggers an amorphous stream of subjective contemplation and thoughts, bordering on the immediate known-ness of the sonic phenomenon yet simultaneously moving toward the realm of the unknown.

What is the ‘unknown’ embedded in a sonic phenomenon? Does it operate outside of the reality of the sonic objecthood? Even object-oriented philosophers like Graham Harman have argued that the reality of anything outside of the correlation between thought and being remains unknowable. Harman has further criticized early phenomenologists’ approaches to sonic phenomena as reductive: “If I hear a door slam, then I hear a door slam, and this experience must be described in all its subtlety; to explain this experience with a scientific theory of sound waves and eardrum vibrations is derivative, since all we encounter directly is the experience of the door slamming” [1].

If we explore such a sonic phenomenon, we may find that a specific sound gives rise to a listening state inside the listener, who may, in a nomadic condition, indulge in taking the phenomenon as a premise or entryway into a world that he or she did not previously know. The listener may address the sound by relating it to the imagining and remembrance of a number of amorphous moods triggered by the temporality of listening, instead of deciphering its objective meaning, location-specific identity, and other spatial information embedded in the characteristic texture and tonality of the sound. Today’s wind may not sound like mere wind, and the lonely screeching of the windowpane may not sound like mere friction between glass and wood; these may sound like something more abstract, in the sense that they generate memories and imaginations of other realities that deviate and refract in response to the immediate materiality of the sonic event. These sounds, as impermanent as they appear to the ears of a wandering listener, may open hidden doors and obscure entrances for further perceptual meanderings in the spiritual realm of contemplation and thoughts transcending the epistemic knowledge-based identity that the sound would otherwise objectify. The epistemological problems and ontological questions posed by such an object-disoriented sonic explosion are precisely the area of interrogation and praxis in my current ‘post-digital’ research. Ancient Indian philosophers would describe this sonic explosion in terms of ‘dhvani’ and ‘sphōta’, meaning that “A sound changes into language and acquires meaning only after a certain explosion of sounds” (Barlingay 27), accentuating the subjective and mental resonances of sound through which a conceptual entity is perceived by the listener.

Fugue: the post-digital milieu

In order to interpret the provocative term ‘post-digital’ in my own understanding, I wish to underscore the extensive and ever-growing nomadism of agents attuned to the psychogeographic evocation of physical locations and corporeal places in the post-globalized universe of intense mobility. In this nebulous cosmos of rapid flow, the production, mobility, and reception of sound contents are decisive to the formation of the notion of ‘post-digital’ via the extensions of social networks, greater interactivity/interpenetration, and the personalization of the media. These features result in an increase in mobility and a disembedding of sound contents as social acts beyond mere geographical limits. The technologies initiate an awareness of wider worlds beyond local horizons. But these phenomena are intensely engaged with economic and cultural shifts too. As early as 1995, David Morley was writing about this future in Spaces of Identity (co-written with Kevin Robins):

“We emphasize two keys…on the one hand, technological and market shifts are leading to the emergence of global image industries and world markets; we are witnessing the ‘deterritorialisation’ of audiovisual productions and the elaboration of trans-national systems of delivery. On the other hand, however there have been significant developments towards local production and local distribution networks” (Morley 1-2).

Within the merging local-global boundaries, one culture develops constant awareness of the existence of the other. Cultural components like images and sounds travel through this dispersed space in mutual interaction, influencing and infusing each other, although the aspects of travel prevail over these implied interactions. These ‘deterritorialised’ wanderings substantially contribute to an emergent culture of primarily mobile and itinerant beings engaged in the liberated ebb and flow of events, phenomena, and ephemera, which operate arguably beyond digital essentialism. This essentialism of the digital revolution, which was the predominant theme of the late 1990s and the early part of this millennium, starts to dissolve into an ever-growing field of intangible data and immoderate information, with Nicholas Negroponte aptly proclaiming: “Like air and drinking water, being digital will be noticed only in its absence, not by its presence. Face it – the digital revolution is over” (Negroponte 12). Alongside this comes a sense of saturation across the prevailing digital divide between already-digital and rapidly digitized contents. During this process, digital media were turning our world into an augmented one. In this rapidly emerging environment, we found that different forms of older media, such as recorded sound and other sound contents, were constantly moving, being relocated, reinterpreted, and brought into conflict with purely digital contents within an imminent convergent culture. These sound contents could be as varied as archival sound recordings, clips of music and songs, spoken words, environmental field recordings, and electro-acoustic samples. We could observe a certain movement of these sound contents from a localized state (the creative/productive end) to a globalized state (the consumptive end) and vice versa. For example, a piece of field recording was digitally mediated so as to be considered a work of sound art, or a ‘traditional’ song from one part of the world was transmitted via the internet to another part of the world as a ‘folk’ song. The question was whether a ‘fluid-local’ sound element was losing its characteristics or retaining its identity over the course of a ‘hyper-global’ shift. We could also ask how such locative sound elements were received and interpreted at the widest end of a rather volatile audience reception within the dissemination of digital media technology and the establishment of e-commerce. In this very context, Robert Pepperell and Michael Punt have aptly decoded the term ‘post-digital’: “The term ‘Postdigital’ is intended to acknowledge the current state of technology whilst rejecting the implied conceptual shift of the ‘digital revolution’ – a shift apparently as abrupt as the ‘on/off’, ‘zero/one’ logic of the machines now pervading our daily lives. New conceptual models are required to describe the continuity between art, computing, philosophy and science that avoid binarism, determinism or reductionism” (Pepperell and Punt 2).

The central question arising from this interest in the sonic was the ongoing dialogue between older sound contents from primarily locative analogue sources and digitally generated, ephemeral traveling sounds, with rapid digitization rendering the interpretation of older/analogue sound contents as digitalized sonic artifacts beyond the mere binarism, determinism, or reductionism of old vs. new or digital vs. non-digital. These phenomena contributed to the evolving ‘post-digital’ discourse by regarding digitalized artifacts as displaced, relocated, and transformed, thereby dissolving the digital divide between already-digital and rapidly digitized contents on the one hand and their reinterpretations as a ‘background’ (Cascone, quoting Ihde) or elusive field of data on the other.

Once this saturation is reached, Kim Cascone argues that, in the domain of sound art and experimental music, “the medium of digital technology holds less fascination for composers in and of itself” (Cascone). In deciphering the term ‘post-digital aesthetics’ in relation to experimental music, he speaks of the “failure” of digital technology and the way in which it triggers subversive practices with glitches, clippings, aliasing, distortion, etc. I, however, perceive this as a failure of a pervasive digital media technology to identify, structure, and archive the transient and elusive sound field from the nameless, placeless, and faceless background world of data. In this world of ‘big data’ (Rasmus Helles and Klaus Bruhn Jensen), ‘data abundance’, and ‘data flood’ (Steve Lohr), itinerant sound data essentially loses its locative character, normative structure (digital, analogue, or digitized), ontological source identity, and epistemic knowledge-based objecthood.

Coda: sounding the post-digital

Such behaviors of sound are accentuated in the post-digital universe of ‘big data’, contributing to the elusive identity of the ‘digital (sound) object’ (compared to ‘non-digital’ objects, devices, and systems) and posing problems of authentication and/or preservation, thereby proliferating a sense of ‘absence’ in a digital sound object’s recognition, identification, and negotiation of the corresponding knowledge-structure upon a network of listening. In their work ‘A Theory of Digital Objects’, Jannis Kallinikos, Aleksi Aaltonen, and Attila Marton claim that “digital objects are marked by a limited set of variable yet generic attributes such as editability, interactivity, openness and distributedness that confer them a distinct functional profile” (Kallinikos, Aaltonen, and Marton). This leads to a profound sense of ‘instability’: digital sound objects become evasive and fleeting artifacts that contrast with the solid and self-evident nature of already-old sound media, such as sound recordings on tape, CD, file systems, or other types of storage. The fluid and mutating nature of this universe of digital objects and their diffusion across the social fabric make them difficult to authenticate, preserve, or archive in the social memory and knowledge base. These invisible digital objects, carrying a multitude of sound contents, problematize sound’s objecthood, rendering sounds more as ephemera than artifacts.

On the other hand, sound does indeed seem ‘less esoteric’ in this post-digital milieu because of our “newfound comfort with the immaterial world of pure data and information flowing through the cyberspace” (Dayal, quoting Gopnik). The contemporary media environment allows the separation of sounds from their locations and facilitates their travel across hyper-dispersed networks as background noise. A sound that is disembodied from its locational specificity undergoes multiple layers of mediation across its multiple receptions and interpretations outside of place, time, and context, whether in an audio streaming network on the internet, a digital sound composition published on a net label, or the augmented space of an interactive installation work. In an interactive art piece, the identification of a sound event can be understood through its interpretation as an augmented situation for re-embodied experience by inter-subjective interaction. The post-digital discourse essentially relates to the perpetual transience of these amorphous but fertile auditory situations (Chattopadhyay) into temporality. It is evident that, in this constant flow, the production and reception of sounds under greater mobility and interactivity lead to their interpretation as itinerant auditory situations – transformations of the original sounds, ready for re-interpretation beyond their objecthood in post-digital culture. Admittedly, at this stage, my motivation lies in delving into the question of sound’s object-disoriented behavior upon transient listening.

Variation I: object disorientation of sound

Let me elaborate on what I mean by the ‘object-disoriented behavior’ of sound. To do this, we need to go back in time and excavate the term ‘sound object’. Pierre Schaeffer, arguably the founder of musique concrète, coined the term ‘sound object’ (objet sonore), which paved the way for a new kind of perception, ‘acousmatic listening’. To Schaeffer, the ‘sound object’ was an intentional representation of sound to its listener. With the rise of new audio technologies, the ‘sound object’ recorded on magnetic tape or other media was no longer referred to a sound source – hence the musical exploration of the ‘acousmatic experience’ of sounds that one hears without seeing the causes behind them. The emphasis here was on the reduced listening state instead of causal listening, to borrow Michel Chion’s terminology. The problem here is the imposition of the word ‘object’ over ‘sound’. The intrinsic flaw in reduced listening as Schaeffer conceptualized it in ‘The Theory of Sound Object’ is that it assumes that sound has an ‘a priori content’ (Demers) that is separate and distinct from any cultural or historical associations it might have subsequently acquired. According to scholars such as Joanna Demers, this assertion is problematic on both practical and theoretical counts. Listeners have difficulty hearing sounds divorced from their associations; at the same time, it is nearly impossible for the human listening faculty not to ascribe a multiplicity of causes to a sonic phenomenon. Furthermore, in practice, the listener is almost certain to simultaneously create imagined gestures or link a sound to its illusory myriad sources, evoking some kind of contemplative and thoughtful imagery in this process of mental resonance and mindful personalization into various listening states.

In his seminal writings, for instance in the famous article ‘Aural Object’, film-sound scholar and early phenomenologist Christian Metz expresses serious doubts about the object specificity of sonic phenomena in scholarly thinking following Schaeffer. He instead focuses on the ‘characteristics’ of sound and emphasizes the problematic aspects of locating sound’s object-oriented or location-specific source. He states that “Spatial anchoring of aural events is much more vague and uncertain than that of visual events” (Metz 29). In classical sound studies (Rick Altman et al.), scholars have already underscored the issue of sound’s problematic relation to its object or source and emphasized its interpretative nature over its production: “Sound is not actualized until it reaches the ear of the hearer, which translates molecular movement into the sensation of sound” (Altman 19). Altman speaks here of a sound event as defining the trajectory of the essential production and subsequent reception of a sound element. Its narrative, as Altman terms it, is hypothetically bound to the source that produces it. This source, the sounding object when producing sound, is spatially defined or connected to a place. These spatial sources of sound are by definition localized but are not rendered until and unless they are carried by a medium to reach the point of reception. By the same token, a sound is mediated whenever it is digitally registered. Digitization dislocates sounds from their original sources, turning them into discrete data in the nebulous post-digital environment as discussed above. Sound contents are thus only recognized at different stages of digitization toward reaching a saturation state of an assumed ‘post-digital’ economy/ecology, by which process they are freed from the object. Sound thus, by its very nature, implies mobility and subsequent object disorientation in order to establish its recognition in the ‘post-digital’ domain. However, the process of interpretation is more complex than it appears at its perceptual level of reception. Contributing to this discourse, New Media scholar and theorist Frances Dyson argues concerning the ‘sound object’ that one must “first – find a way of discussing and representing sound unhinged from the visual object, second, find a device (the tape recorder) that will somehow enable such a representation, and finally, mask the mediation of that device by arguing for an ontological equivalence between the reproduced sound and the original sonic source” (Dyson 54). This ontological equivalence might be difficult for a listener to establish in a nomadic condition in which a specific sound presents a multitude of amorphous listening states inside the listener’s mind, leading to a sonic explosion of object-disoriented but mood-based streams of thoughts within the nomadic listener’s consciousness.

Variation II: the nomadic listener

At this juncture, a nomadic listener floating across the post-digital milieu may interact with background noise or with the unknowable sounds of a nameless, placeless, and faceless flow of sound data, which inculcates a sort of ‘semantic fatigue’, so that the sounds eventually seem cut adrift from their sources or origins (Demers) in the mind of the listener. In this process the listener may sensitize his or her ears to the pseudo-object of the sounds and deconstruct them into his or her listening self through an evocative capacity: a sonic explosion of timeless sonic states of interconnected reveries, ruminations, and musings. The ‘unknown’ embedded in the wandering shadows of sounds is explored and given a context by the nomadic listener’s intervention into their appearing and diminishing, leaving object-disoriented states of feeling or mood.

Variation III: hyper-listening

Let us indulge in further philosophical musings triggered by listening in the post-digital milieu and attend to John Cage’s mindful claim: “Silence is not acoustic. It is a change of mind” [2]. This requires us to set aside ‘epistemic’ issues of recognizing the source or ‘object’ of sound and instead focus on the subjective and inward perception of sound within the ‘self’ or ‘mindfulness’ of the nomadic listener. Following this methodology, we can examine the way in which the memory, imagination, and personal experience of the itinerant listener alter the character of sound. Taking my point of departure in the epistemological basis of the sound object, I now introduce an alternative methodology of listening in post-digital culture, which I term ‘hyper-listening’: by this I intend to relate to the higher-level psychic, pre- and post-cognitive processes triggered by listening to object-disoriented sounds, in terms of creating thought-provoking auditory situations. This method perhaps operates on the fringe of what artist Yolande Harris (2011) describes in her doctoral thesis as creating “situations where sound can affect and activate people’s experiences in a personal way”, while at the same time expanding the idea of ‘experience’ to include conscious contemplation. Much of this argument resonates with Roy Ascott’s recent writings, in which he speaks of the “interconnectedness, nonlocality and the inclusion of consciousness” [3] embedded in new media art, including process-based artistic practices with sound and listening. According to Ascott, “Process-based art implies field awareness, in contrast to the object dependency of much art practice”. This leads to what he claims to be “the shamanic path to immersion in the spiritual domain, where interaction with psychic entities is the means, transformation of consciousness is the goal and the emergence of new knowledge the outcome” (Ascott). Much of this line of thinking may be arguable, but what is essential is the potential for inclusivity in listening. In his seminal work ‘Listening’, Jean-Luc Nancy argues that a philosopher is one who hears but cannot listen, “or who, more precisely, neutralizes listening within himself, so that he can philosophize” (Nancy). Operating on this premise, the methodology of ‘hyper-listening’ challenges the epistemic discourse in sound that equates ‘listening’ with ‘understanding’, ‘audibility’ with ‘intelligibility’, and the ‘sonic’ with the ‘logical’. ‘Hyper-listening’ explores the contemplative and mindful potential of sonic phenomena at the nomadic listener’s end, emphasizing an indolent mood of elevated thoughtfulness.

Finale: Mind Your Own Dizziness

Addressing a practice-based approach, I turn to my ongoing project ‘Doors of Nothingness’ (2012-)[4] and a series of upcoming sound installations/interventions, ‘Mind Your Own Dizziness’ (2014-)[5], which incorporate the concept of ‘hyper-listening’. Taking my point of departure in the phenomenological premises of sound, I make subjective and personal experience the basis of these works, which frame spatial sound phenomena in their entirety, including the mental and emotive context of the listener’s situation. The thought processes activated by sonic phenomena arguably transcend epistemic comprehension of the source identity of sound, outlining the auditory situation in a context that delineates sound events beyond immediately accessible meanings, expanding on and transcending the existing knowledge structure. The works rely on intuitiveness in listening rather than the reasoning involved in deciphering the meaning of ‘aural objects’. The strong belief in inward contemplation, subjectivity, and the enhanced ‘selfhood’ available to a nomadic listener (because of his or her ability to free the ears of object specificity, whether spatial, temporal, or locative) means that the project on the one hand explores the personal or private nature of listening, while on the other it engages with the emergent sonic practices of an implicit post-digital culture.

 

Notes:

[1] Graham Harman quotes Husserl, in Kimbell, Lucy. “The Object Fights Back: An Interview with Graham Harman”. Design and Culture 5(1): 103-117 (2013).

[2] See Maria Popova’s review of Kay Larson’s ‘Where the Heart Beats: John Cage, Zen Buddhism, and the Inner Life of Artists’, here: http://www.brainpickings.org/index.php/2012/07/05/where-the-heart-beats-john-cage-kay-larson/

[3] See ‘Technoetic Pathways toward the Spiritual in Art’ by Roy Ascott, here: http://www.facebook.com/notes/roy-ascott/technoetic-pathways-toward-the-spiritual-in-art/10151612039371073

[4] See project page here: http://budhaditya.org/projects/doors-of-nothingness/

[5] See project page here: http://budhaditya.org/projects/doors-of-nothingness/mind-your-own-dizziness/

 

Works cited:

Altman, Rick. Sound Theory/Sound Practice. New York: Routledge, 1992. (Print).

Barlingay, Surendra Sheodas. A Modern Introduction to Indian Aesthetic Theory: The Development from Bharata to Jagannātha. New Delhi: D. K. Print World, 2007. (Print)

Cascone, Kim. “The Aesthetics of Failure: ‘Post-Digital’ Tendencies in Contemporary Computer Music”. Computer Music Journal 24.4 (Winter 2000). (Web)

Chattopadhyay, Budhaditya. “Auditory Situations: Notes from Nowhere”. Journal of Sonic Studies 4 (Special Issue: Sonic Epistemologies) (2013). (Web)

Chattopadhyay, Budhaditya. “Doors of Nothingness.” In jərˈmān June edition (2012). (Web)

Dayal, Geeta. “Sound art”. theoriginalsoundtrack.com, 2013. (Web)

Demers, Joanna. “Field Recording, Sound Art and Objecthood”. Organised Sound 14.1 (2009): 39-45. (Web)

Dyson, Frances. Sounding New Media: Immersion and Embodiment in the Arts and Culture. University of California Press, 2009. (Print)

Harris, Yolande. Scorescapes: On Sound, Environment and Sonic Consciousness. PhD thesis, Academy for Creative and Performing Arts, Faculty of Humanities, Leiden University, 2011. (Web)

Helles, Rasmus and Jensen, Klaus Bruhn. “Introduction to the Special Issue ‘Making Data – Big Data and Beyond’”. First Monday 18.10 (7 October 2013). (Web)

Kallinikos, Jannis, Aaltonen, Aleksi, and Marton, Attila. “A Theory of Digital Objects”. First Monday 15.6 (7 June 2010). (Web)

Lohr, Steve. “The Age of Big Data”. The New York Times. 11 February (2012). (Web)

Metz, Christian. “Aural Objects,” trans. Georgia Gurrieri. Yale French Studies 60: 24-32 (1980). (Print)

Morley, David and Robins, Kevin. Spaces of Identity: Global Media, Electronic Landscapes and Cultural Boundaries. London and New York: Routledge, 1995. (Print)

Nancy, Jean-Luc. Listening. Trans. Charlotte Mandell. New York: Fordham University Press, 2007. (Print)

Negroponte, Nicholas. “Beyond Digital”. Wired, December Issue 6.12 (1998). (Web)

Pepperell, Robert and Punt, Michael. The Postdigital Membrane: Imagination, Technology and Desire. Bristol: Intellect Books, 2000. (Print)

 

Budhaditya Chattopadhyay

Post Digital Liveness in Software

Latest version

“Software has become our interface to the world, to others, to our memory and our imagination”  (Manovich)

Since the invention of the Internet and the World Wide Web, data has been massively generated, consumed, manipulated, reproduced and circulated in network culture. The demand for real-time delivery of data through ubiquitous software is tremendously high, especially in developed cities such as London and Hong Kong. Browsing online data, such as news with text and images, through a mobile browser, or communicating with peers through WhatsApp, Facebook and Twitter, has become part of everyday life. Software has permeated everyday life, from physical to networked environments: it can be situated in a physical mobile device or computer, but it can also be available on the Internet, as in social media applications. The functions built into software, together with network technologies, have made the increasing demand for instantaneity possible. One can access, or interface with, the world instantly, getting close to events happening in other parts of the city, and even connecting remotely to the world. These experiences of proximity and immediacy through screen representations, manifested via technology, have constituted the understanding of liveness (Scannell 84; Zemmels; Bourdon 532).

Likewise, the notion of liveness has been expressed in digital art through the utilization of online data. Producing immediate, dynamic and unpredictable live experiences has become one mode of artistic representation (in artworks such as Listening Post by Mark Hansen and Ben Rubin, eCLOUD by Aaron Koblin, Nik Hafermaas and Dan Goods, and The Rhythm of City by Mar Canet and Varvara Guljajeva).

Audiences can easily notice these live happenings through the manifestation of the artist’s software. But in post-digital discourse, how might the concepts of imperfect digital processes (Andrews, 2000) or even technological failure (Cascone, 2000) provide an alternative understanding of liveness? This essay tries to open up that discussion through artistic research and practice, focusing on the technical and political digital processes that shape an artwork. Andrews suggests that a post-digital approach is not to examine the functions and “mundane tasks” of a software application, but to think about processes “as a combination of the material processes”. In my articulation of a post-digital approach in this specific context of art, including both network and software, I propose that post-digital liveness is realized through examining the “material substrate”, that is, the black box of the software and its supporting code, rather than by analyzing the representation of the artwork and its reception by audiences.

An overview of (digital) liveness

The notion of liveness that I refer to here is associated with technology and is situated in a digital environment. The term ‘liveness’ has been widely used in various media and performance contexts to describe the actual happening of events, and is often tied to the reception of audiences via data representation: for example, watching a live broadcast programme, in the form of audio and visuals, on a television, or reading an Internet post, in the form of text and image, in the Facebook application on a mobile device. The experience and feeling of liveness have been discussed in the existing literature (Auslander; Scannell; Zemmels).

Theorist Auslander argues that liveness is a contingent term, owing to changes in the technological environment. His concept of liveness is fundamentally grounded in recording technologies, in which “the live can be defined only as that which can be recorded” and “the live is actually an effect of mediatization” (56). As such, the feeling or experience of liveness is manipulated by technology. Derrida and Stiegler remind us that the ‘live’ in any transmission based on recording technologies is never truly live; instead, audiences’ perception and experience are artificially transformed via constant manipulation (40). These technologies enable a proximal relationship to be established with a source event: the sensation of presence in a remote environment, happening or having happened at another time and place. The representation of data “can powerfully produce the effect of being-there, of being involved (caught up) in the here-and-now of the occasion” (Scannell 84).

Within the domain of the Internet, Zemmels argues that the notion of presence is substantially intensified by the shorter retrieval time for accessing specific data within a larger amount of information. Consider video streaming today: data is stored in servers and databases, while machine code and real-time network technology allow selected data to be reproduced, manipulated and streamed as live. Zemmels’ notion of presence constitutes the experience of “immediacy” and “intimacy” through instant delivery and connection over distance.

In fact, the demand for instant delivery, which Gere describes as instantaneity, permeates digital culture (1). Using real-time technology to deliver data has become one of the important features of all sorts of technological artifacts. Parisi defines real time as follows:

“The capacity of software of media technologies to retrieve information live, and to allow this information to add new data to programming. Real-time technologies can be only understood in terms of the ‘aliveness’ of data” (266).

The real-time manipulation and transmission of data is live in itself insofar as liveness implies a capacity to transmit and deliver a message “as it happens” (Marriott 69). Liveness can thus be examined detached from content, pointing towards the live transmission per se (Bourdon 534).

The ‘movement’ of data in a transmission process consists not only of active technical transmission flows from one side to the other, but also of socio-cultural processes. Indeed, data production comes with ideologies, reflecting present culture, thinking, beliefs, interactions and daily living. These are embedded and synthesized in the produced content, for instance the production of a reality TV show or a picture posted on Facebook. As Feuer discusses, the representation of content “is a reflection of the living, constantly changing present” (13); it is therefore always in a state of becoming, and the perceived present has always been mediated. Liveness is an ideology, promoting a sense of current happening.

Moving into post-digital liveness

Another recent perspective in the liveness discourse looks into the matters of life. Pugliese uses the phrase “liveness test”, denoting “a sign of life” and the presence of a human body in the context of biometric systems of identification (Pugliese 118). The term “liveness detection” is used in a similar fashion (Tan & Schuckers). In 2011 the Transmediale festival held a conference, BODY:RESPONSE – Biomedial Politics in the Age of Digital Liveness, which suggested that networked environments and technology have been shifting the understanding of the living body from the biological to the “social and political” body extended from online society. Technology governs the social body and social relations through online platforms, communication devices and application gadgets. The ways in which a body connects to society are dramatically expanded through social practices in online environments. Contemporary philosophies of biopolitics also appear in various scholars’ writing (Pasquinelli; Liu; Parikka; Karppi; Munster), opening up critical perspectives on politics and the networked body. In the context of digital liveness, biopolitics, drawing upon Foucauldian discourse analysis, concerns digital life (such as the life expectancy and health condition of a network/artifact/software), regulatory controls, social relations, production, reproduction and population. Parikka discusses liveness in the context of software-based art archives, in which documentation represents the “living environment” (124) of a particular moment in time. The understanding of liveness is therefore also open to the discussion of nonhuman bodies.

This article uses my collaborative artwork as a case study to open up discussions on the biopolitics of post-digital liveness, focused not on a human body but on a digital networked body. The work is a piece of software that connects to the network platform Facebook. However, the artwork is not intended to address well-functioning, perfect and promising qualities and representations. As the post-digital concerns the imperfect, or what Andrews describes as “flaws inherent in digital processes”, post-digital liveness is, indeed, an investigation of the digital life process that leads to software flaws, which is also close to Cascone’s notion of ‘failure’: “bugs, application errors [and] system crashes”.

Liveness: The representational experience through the screen

The case study is a small piece of application software made collaboratively by Helen Pritchard and me, and commissioned by Arnolfini. The artwork The likes of Brother Cream Cat is a browser add-on that provides an augmented browsing experience of Facebook through traces of a Facebook-famous cat, Brother Cream of Hong Kong, investigating the social process between a developer’s software and the online social media platform Facebook.

In 2011 “Brother Cream Cat” was lost on the street, and his fans created a Facebook fan page to find him; on his return he became ‘Facebook famous’ through his ‘lots of likes’ (Soon & Pritchard). Brother Cream Cat’s attraction permeates both physical and digital live networks. Since being lost and found, he has engaged over 1,000 first-time and returning fans per day at his store and has accumulated more than 150,000[1] likes on his Facebook fan page. The likes become an instrument, as well as a starting point, for sustaining his wellbeing by attracting more visitors (both online and offline), more merchandise, more cat food and more job opportunities for this animal celebrity, Brother Cream.

The add-on is developed to intervene in Facebook browsing behavior in real time. Once audiences have installed and activated the add-on, the small piece of software runs in the browser, and all the existing Facebook image data (including images in any post, profile or timeline area) are replaced with the latest available Brother Cream trace (see image 1). When a user visits the Brother Cream fan page in particular, all the cat images uploaded by his fans are overlaid with a customized line of text (see image 2), and tailored text and audio respond instantly once the like/unlike button of a Cream Cat post is clicked. As such, the add-on intervenes in the usual behavior of browsing and using Facebook through custom-made software, offering a real-time augmented browsing experience. The image data on Facebook is constantly mutating, and the live trace participates actively in human social interaction through real-time technology, including the network and software. The liveness of Brother Cream is experienced through the representation of text, audio and image data, allowing instant feedback in response to users’ click actions.
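
To make the mechanism concrete, here is a minimal sketch, in TypeScript, of how a content script for such an add-on might behave. Everything in it is an illustrative assumption: the trace URL, the audio file and the like-button selector are hypothetical placeholders, not the artists’ actual code.

// Hypothetical content script for such a browser add-on. The trace URL,
// audio file and like-button selector are illustrative assumptions.

const TRACE_URL = "https://example.org/latest-cream-trace.jpg"; // assumed source of the latest trace

// Replace every image currently on the page with the latest Brother Cream trace.
function replaceImages(): void {
  document.querySelectorAll<HTMLImageElement>("img").forEach((img) => {
    img.src = TRACE_URL;
  });
}

// Facebook injects new posts into the DOM as the user scrolls, so the
// replacement is re-run on every mutation to keep the intervention live.
new MutationObserver(replaceImages).observe(document.body, {
  childList: true,
  subtree: true,
});

// Respond instantly when a like/unlike button is clicked.
document.body.addEventListener("click", (event) => {
  const target = event.target as HTMLElement;
  if (target.closest("[data-testid='like']")) { // assumed selector for the like button
    new Audio("https://example.org/meow.mp3").play(); // tailored audio response
  }
});

replaceImages();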

Image 1: Screenshot of The likes of Brother Cream Cat on Facebook

Image 2: Screenshot of The likes of Brother Cream Cat on the Brother Cream fan page

Critical Code Studies: examining post digital liveness

In addition to the liveness of the specific representational objects of Brother Cream Cat, the project also examines the notion of post-digital liveness in a wider cultural context through an in-depth investigation behind the screen.

The add-on is made to address the notion of liveness by continuously scraping Facebook data and intervening in the user experience of browsing Facebook in real time. However, like any other software production, the add-on can potentially malfunction, leading to the release of a newer version. In this post-digital era, one has to think beyond the polished screen and software, departing from a critical reflection on software disruption. This article therefore suggests that the potentially malfunctioning add-on might provide insight for rethinking liveness, moving from representational to socio-technical and socio-political realization, and argues that a newer software version is not simply a new fix or update: it encompasses the social forces which shape the post-digital liveness of a piece of software.

To understand the social forces within and around Facebook, Pritchard and I take the approach of critical code studies, initiated by Mark Marino in 2006: a method of studying code itself rather than focusing on the representation, usability or interface design of a piece of software. Studying how an algorithm is implemented might not even be necessary; Marino proposes treating “code itself as a cultural text worthy of analysis and rich with possibilities for interpretation”. The available Facebook code, including but not limited to source code, the application programming interface (API), the Facebook developers’ site and its documentation, and the terms and conditions, provides a useful way to understand the architecture of the Facebook infrastructure, as well as its biopolitical implications in a broader social and cultural context.

Investigating post-digital liveness: more than just technical APIs

An application programming interface (API) is a standard interface offered by Web 2.0 service providers. Developers, designers, artists, anyone, can register for a platform account and retrieve services and online data via the API in the software they develop. In network art[2] in particular, there is arguably a growing trend for artists, for example JODI, Jonathan Harris & Sep Kamvar, Jer Thorp and Shu Lea Cheang, to employ available APIs in their works. As such, more artistic data practices have been brought to the network art scene, and this public interface, the API, becomes the “art-making enabler” (Soon).

For Facebook, the release of an API provides much broader opportunities to enhance its popularity on the Internet and sustain its business, inasmuch as more third-party applications are developed. Thus more data pass through the API from Facebook’s databases, reproduced and appearing in other interfaces: what have become known as ‘Facebook apps’. However, the API should not be considered only as a tool; it has to be understood from a socio-political perspective, in terms of how providers manage or govern the use of their data, encompassing a highly complex socio-technical relation.
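
By way of illustration, the sanctioned route through the API can be sketched as follows. This is an assumed example, not the artwork’s code: the page ID and access token are placeholders a developer would have to register for, and the available fields vary across Graph API versions.

// Sketch of the sanctioned route: retrieving fan-page data through the
// Graph API. PAGE_ID and ACCESS_TOKEN are placeholders a registered
// developer would obtain; field names vary across API versions.

const PAGE_ID = "<page-id>";
const ACCESS_TOKEN = "<access-token>";

async function fetchPageLikes(): Promise<void> {
  const url = `https://graph.facebook.com/${PAGE_ID}` +
    `?fields=name,fan_count&access_token=${ACCESS_TOKEN}`;
  const res = await fetch(url);
  if (!res.ok) {
    // Access can be rate-limited, revoked or blocked at any time.
    throw new Error(`Graph API request failed: ${res.status}`);
  }
  const page = await res.json();
  console.log(`${page.name} has ${page.fan_count} likes`);
}

fetchPageLikes();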

Facebook has released APIs since 2006, and developers have to comply with its rules, which range from concrete to ambiguous instructions: for example, the limit on query requests per day via developers’ programs, and the condition which states: “Quality of content: you are responsible for providing users with a quality experience and must not confuse, defraud, mislead, spam or surprise users.”[3] Clearly, the rules are set for maximum benefit to Facebook. In this regard, I wonder whether The likes of Brother Cream Cat, as an artwork, surprises users via its messy interface and bizarre interaction. Undoubtedly, Facebook has the right to withdraw and block an application’s access to data retrieval, and even reserves the right to take legal action. Being a well-behaved developer, on the contrary, guarantees a stable delivery of data (Bucher).

The developer is in a passive position when using Facebook’s service to conduct data extraction, even though the data is contributed freely by the public; Facebook has full control over granting access and deciding what data should be opened from its databases and made available to the public through algorithms, a technical execution of data inclusion and exclusion. All user data is fundamentally “the sole and exclusive property of Facebook” (Lodi 242). Since all Facebook apps have to go through a registration process and remain under constant monitoring, Facebook is, in other words, controlling what is made available in the market, cultivating desirable and favorable apps through the labor market, and governing the whole population of the developer community.

How might one escape all these conformities? The likes of Brother Cream Cat is made to escape these rituals and this monitoring by using an alternative yet long-established method, one that is not formally verified and approved: web scraping.

Investigating post-digital liveness: The use of web scraping technique

Before the wide availability of the APIs released by Web 2.0 providers, developers and artists had to rely on web scraping techniques to harvest web data. Web scraping is an automated process of web data extraction, written in a scripting language, in which “specific fields or data elements [are extracted directly] from pages on the Web and other Internet sources” (Marres & Weltevrede 316). Authorization is not required: one can simply program a script and start fetching web data. However, Marres and Weltevrede point to the possible legal issues of web scraping, as it may go against a site’s “terms of use” (320).
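
A minimal sketch of the scraping alternative, under the same hedges, might look like this. The pattern match is deliberately naive and invented for illustration; it extracts exactly the kind of ‘dirty’, unstructured data discussed below.

// Sketch of the unsanctioned route: fetch the public fan-page HTML and
// pull image URLs out with a naive pattern match. No authorization is
// required, but the pattern breaks whenever the markup changes.

async function scrapeImageUrls(pageUrl: string): Promise<string[]> {
  const res = await fetch(pageUrl);
  const html = await res.text();
  // Extract <img src="..."> values: "dirty", unstructured data.
  const tags = html.match(/<img[^>]+src="([^"]+)"/g) ?? [];
  return tags
    .map((tag) => tag.match(/src="([^"]+)"/)?.[1])
    .filter((src): src is string => Boolean(src));
}

scrapeImageUrls("https://www.facebook.com/<fan-page>").then(console.log);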

Marres and Weltevrede further discuss the ‘dirty’ web data (322) extracted with scraping techniques. The source is hard to understand without its data schematics being properly revealed, and the web data collection process is “unstructured” (316) and “messy” (322). In addition, web scraping is considered an unstable method, because web interfaces and data elements at the source change substantially (Tseng 2), which impacts the development of apps.

In fact, neither web scraping nor the API is stable. Though Facebook tries to maintain platform stability by giving advance notice of API code changes and offering more comprehensive documentation and guidelines, many developers still suffer from its frequent code updates. According to one online web service company[4], the documentation and services related to the Facebook API underwent a total of 64 changes in just 30 days. Chunk, an engineer at Facebook, states that Facebook updates its code (not only the API but the entire platform) at least daily, for various enhancement purposes, to sustain its economic activities and the Facebook population.

Nevertheless, The likes of Brother Cream Cat is expected to cope with these platform changes through the continuous release of different versions of the add-on software, just like any other software practice, in order to maintain the liveness and functioning of the artwork.
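
One hedged sketch of what such coping might look like in code: each release hard-codes its guesses about Facebook’s markup and falls back through a list of selectors until all of them go stale and a new version has to ship. Both selectors are invented for illustration.

// Each add-on release hard-codes the selectors that matched Facebook's
// markup at release time; when the platform changes, the scrape silently
// fails and a new version has to ship.

const POST_IMAGE_SELECTORS = [
  "div.userContentWrapper img", // assumed match for one era of the markup
  "div[role='article'] img",    // assumed fallback for a later redesign
];

function findPostImages(): HTMLImageElement[] {
  for (const selector of POST_IMAGE_SELECTORS) {
    const found = Array.from(document.querySelectorAll<HTMLImageElement>(selector));
    if (found.length > 0) return found;
  }
  // Every selector is stale: the add-on is, in this essay's terms,
  // dead until the next version release.
  return [];
}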

Software versioning: the production of software release

What makes a provider introduce a newer software release? Chunk’s response is that each Facebook software update provides greater software. Perhaps this can be understood as “greater” interfaces, “greater” functions and “greater” stability to drive Facebook’s business, keeping users and expanding possible online connections. Mark Zuckerberg, the CEO of Facebook, highlights: “we work to bring the next five billion people online and into the knowledge economy”[5]. The Facebook population can therefore be expected to expand continuously. Given the 66% increase in advertising revenue from 2012 to 2013, the direct and interrelated forces between healthy (active) users, business relations and Facebook’s monitoring systems become apparent. Every new update of the software can be seen as an act of production in the world of capitalism: an economic process that nonetheless exists in the technical practice of code release, controlling the network population in relation to the machinery of production. In other words, the software signifies “a power to foster life” (Foucault 138) and is entangled with the optimization of efficiency and effectiveness, directing the engaging forces from macro interactions among advertisers, technology and users down to micro individual behaviors. Post-digital liveness includes hidden forces that foster life; in the context of software, this means keeping the software running smoothly, hence prolonging active connectivity in a socio-political dimension.

In fact, such software changes are commonly seen in today’s software culture: hotfixes and security updates for operating systems, and software updates for all kinds of applications. The reasons behind them range from protecting users’ security and privacy to offering better experiences, features and functions. Arguably, one of the hidden agendas of software companies is to implement a range of mechanisms that reinforce controlling, monitoring and optimizing via data tracking. Facebook is one example of a company that actively analyzes user behaviors, such as the tracking of users’ on-screen cursor behavior reported by the Wall Street Journal[6] in 2013. As such, software should also be considered a control apparatus, a power exercised on individual live connections, as life, through the black box of algorithms. These micro tracking techniques are implemented down to the individual level, driving potential consumption. All these controls are hidden, yet integrated into the normal release of software, offering an incomplete and distorted picture to its members. According to Foucault, the notion of life is biopolitical and involves a disciplinary power that is “centered on the body as a machine”. He explains this power as:

“its disciplining, the optimization of its capabilities, the extortion of its forces, the parallel increase of its usefulness and its docility, its integration into systems of efficient and economic controls, all this was ensured by the procedures of power that characterized the disciplines” (139).

Social Reproduction via APIs: extension of life via third-party production

As a production platform, Facebook’s population includes not only the healthy users who frequently and actively engage with Facebook, but also the external communities who participate in developing Facebook apps. The API is one of the ways in which Facebook extends its active user population through third-party applications. Facebook offers comprehensive guidelines and interfaces for its APIs, facilitating the reproduction of user data and the production of ‘Facebook apps’ in creative ways. This is what Hardt and Negri describe as biopower, “a situation in which … power is the production and reproduction of life itself” (24). For example, the mobile app Candy Crush Saga implements a life system: when players have used up their default maximum number of lives, lives can be restored by asking Facebook friends for help. This social interaction of obtaining extra lives is implemented via the Facebook API: the app posts a request message on a friend’s wall, the player’s friend accepts the help message, and Facebook then informs the app to grant extra lives so that play can continue. As a result, social reproduction is made possible via third-party software.
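
That flow can be sketched with the request dialog of the (since-deprecated) Facebook JavaScript SDK; the parameter values here are assumptions, not Candy Crush Saga’s actual implementation.

// Sketch of the extra-lives flow using the request dialog of the
// (since-deprecated) Facebook JavaScript SDK. Values are illustrative.

declare const FB: {
  ui(params: Record<string, string>, callback: (response: unknown) => void): void;
};

function askFriendsForLives(): void {
  FB.ui(
    { method: "apprequests", message: "I am out of lives, please send one!" },
    (response) => {
      // On acceptance, the game's backend would be notified via the API
      // and credit an extra life so that play can continue.
      console.log("Request sent:", response);
    }
  );
}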

Access to Facebook’s databases via the API is motivating for developers; it immediately creates networks of relations via individual behaviors, for instance the actions of liking, posting or sharing. Gerlitz and Helmond describe this as an “interconnected” (7) network relation, whereby Facebook data keep circulating exponentially among networks of networks. They point out that this interconnection is intentionally implemented in Facebook’s business as part of Zuckerberg’s agenda, which is “to build a more comprehensive map of connections and create better, more social experiences for everyone”[7]. This social reproduction thus operates in conjunction with wealth and desire, producing “subjectivities” such as “needs, social relations, bodies, and minds” (Hardt & Negri 32). Viewed in a socio-technical and socio-political context, the API contributes to the liveness of both the developer’s software and the Facebook platform. For Facebook, these new relations enrich its entire business: app users are recruited and attracted through third-party software, thereby affecting the dynamics of the Facebook population.

Nevertheless, third-party applications have to keep updating in order to cope with Facebook’s changes and to keep up with the latest technology. In 2010, Facebook announced significant changes to the API with the introduction of the Open Graph format via the Open Graph Protocol[8], which also implied the deprecation of the former REST API[9]. Backwards compatibility, or legacy support, is seen as highly time-consuming and expensive to maintain (Bisbal et al. 103), so companies tend not to support both new and old systems. Facebook, as a listed company, also has to be cost-effective in growing its revenue and business. Hence, not only are new features no longer supported in the old API format, but existing, functioning features are gradually being removed from the Facebook platform[10] entirely. As such, developers are forced to change their software to avoid potential malfunction.

Software failure: The malfunction of The likes of Brother Cream Cat

The web scraping technique, versus the standard API, can achieve the same result for the add-on, allowing Brother Cream Cat to permeate the network. However, the use of a different method and language of code crafting is more than a technical implementation. Indeed, in this artistic context code has a ‘voice’: it maintains the liveness of the software and escapes Facebook’s regulatory control. Cox argues that machine code is not only an instrument for executing creative instructions but also encloses a “subjectivity and sociality” that “connects with political expression and allows for a wider understanding of power relations” (3).

Indeed, the versioning of software marks the disappearance of old interfaces, old functions, and old regulations and policies. In The likes of Brother Cream Cat, a malfunctioning add-on means the death of live connections to Facebook in a literal sense. Using the web scraping technique might prolong its life and allow it to escape Facebook’s disciplinary practice, but it can hardly escape the frequent code changes and releases of the Facebook empire. The fragility of the add-on The likes of Brother Cream Cat expresses the notion of post-digital liveness through possible software failure on both a conceptual and a practical level.

In this artwork, social forces have the capacity to keep the work well functioning and live, but they can also lead to malfunction and failure. In the wider context of software culture, the revision of software on the one hand provides enhanced features, such as an update or a fix, from a technological and functional perspective; on the other hand, it documents the changes, history and moment of a technological media environment, including but not limited to capitalist, mainstream and commercial demands, conformity, political decisions, regulatory control and ideological practices. As such, investigating the socio-political and socio-technical forces that exist behind, and beyond, the screen representation might help us to understand the constitution of post-digital liveness in software.



[1] The data is gathered from the Brother Cream Cat’s Facebook fan page: https://www.facebook.com/pages/%E5%B0%96%E6%9D%B1%E5%BF%8C%E5%BB%89%E5%93%A5/117969648299869

[2] The genre of network art is usually grouped under the bigger umbrella of media art, meaning art based in or on Internet Cultures, examining the everyday medium of the Internet.

[3] See the full Facebook developers’ policy here: https://developers.facebook.com/policy/

[4] The data is collected as of 25 Nov 2013 through the website: https://www.apichangelog.com/api/facebook

[5] See the statement appears in the “Facebook Reports Third Quarter 2013 Results” here: http://investor.fb.com/releasedetail.cfm?ReleaseID=802760

[6] See the news report, Facebook Tests Software to Track Your Cursor on Screen, here: http://blogs.wsj.com/cio/2013/10/30/facebook-considers-vast-increase-in-data-collection/

[8] See the post- Building the Social Web Together here: https://www.facebook.com/notes/facebook/building-the-social-web-together/383404517130

[9] See the details of the deprecated REST API here: https://developers.facebook.com/blog/post/616/

[10] See the Facebook developer roadmap here: https://developers.facebook.com/roadmap/completed-changes/

////////////////////////////////////////////////////////////////////////

Version 1: older version

Behind the social network: Rethinking liveness in software

“Software has become our interface to the world, to others, to our memory and our imagination”  (Manovich)

Software has permeated everyday life, from physical to online networked environments, and has become ubiquitous, at least in economically advanced societies. The functions built into software, together with network technologies, have made the increasing demand for instantaneity through real-time delivery possible in contemporary culture. This experience of proximity and immediacy to the world through screen representations has constituted the understanding of liveness.

With the increasing penetration of software into everyday life, software becomes a medium offering live data, with feedback and response through computational processes. Indeed, each piece of software has a life expectancy (the period of time in which the software still functions well and can be executed), implied in the incremental number count of software versioning. How does software shape our understanding of liveness, the matters of live and life, that exists behind, and beyond, the screen representation? This article suggests examining liveness from two key perspectives. The first relates to real-time data manipulation, pointing directly to the live transmission process within computational systems. The other looks into the matters of life, in particular life expectancy, and its cultural implications for software. Using Marino’s critical code studies and coding practice, this article explores the notion of liveness in relation to software. These approaches are concerned neither with the representational and semiotic analysis of a screen nor with phenomenological study, but with investigating the hidden forces that exist behind the screen.

About software

Software is broadly defined as a program of instructions to be run by a computer. Instructions include logic and functions by which data, parameters and their values can be manipulated. Software therefore means not only industry software available for sale, such as graphics or accounting software, but also software that is freely situated on the Internet, such as Facebook, and software independently developed by artists and developers. This article focuses on software that exists in networked environments.

In this article, a small piece of application software is made to investigate the socio-technical process between a developer’s software and the online social media platform Facebook. The artwork The likes of Brother Cream Cat (Soon & Pritchard) is a browser add-on that provides an augmented browsing experience of Facebook through traces of a Facebook-famous cat, Brother Cream of Hong Kong.

The add-on is made to address the notion of ‘live’ by continuously scraping Facebook data and intervening in the user experience of browsing Facebook in real time. However, like any other software production, the add-on will potentially malfunction, leading to the release of a newer version. In this post-digital era, one has to think beyond the polished screen and software, departing from a critical reflection on software disruption. This article therefore suggests that the potentially malfunctioning add-on might provide insight for rethinking liveness, moving from representational to socio-technical and socio-political realization, and argues that a newer software version is not simply a new fix or update from a technical perspective, but encompasses the social forces which shape the liveness of a piece of software.

An overview of (digital) liveness

The notion of liveness that I refer to here is associated with technology and is situated in a digital environment. The term has been used in various media and performance contexts to describe the actual happening of events, and is often tied to the reception of audiences via data representation. Related terms such as live broadcasting and real-time technology have been discussed extensively by various scholars (Gere; Feuer; Auslander; Donati & Prado), and characteristics of liveness, for example immediacy and presence, have been discussed in the existing literature (Auslander; Scannell; Zemmels).

Theorist Auslander argues that liveness is a contingent term, owing to changes in our technological environment. His concept of liveness is fundamentally grounded in recording technologies, in which “the live can be defined only as that which can be recorded” and “the live is actually an effect of mediatization” (56). The concept of reproducibility is applicable not only to performance art but to wider forms of media that employ technology, such as radio and television broadcasting and real-time technology on the Internet. Similarly, Derrida and Stiegler remind us that the ‘live’ in any transmission based on recording technologies is never truly live; instead, audiences’ perception and experience are artificially transformed via constant manipulation (40). These technologies enable a proximal relationship to be established with an event as it happens: the sensation of presence in a remote environment, happening or having happened at another time and place. The representation of data “can powerfully produce the effect of being-there, of being involved (caught up) in the here-and-now of the occasion” (Scannell 84).

Within the domain of the Internet, Zemmels argues that the notion of presence is substantially intensified by the shorter retrieval time for accessing specific data within a larger amount of information. Consider video streaming today: data is stored in servers and databases, while machine code and real-time network technology allow selected data to be reproduced, manipulated and streamed instantly across regions as live. Zemmels’ notion of presence constitutes the experience of “immediacy” and “intimacy” through instant delivery and connection over distance.

In fact, the demand for instant delivery, which Gere describes as instantaneity, permeates digital culture (1). Using real-time technology to deliver data has become one of the important features of all sorts of technological artifacts. Parisi defines real time as follows:

“The capacity of software of media technologies to retrieve information live, and to allow this information to add new data to programming. Real-time technologies can be only understood in terms of the ‘aliveness’ of data” (266).

The real-time manipulation and transmission of data, whether through TV, radio or the Internet, is live in itself insofar as the immediacy of liveness is the capacity to transmit and deliver a message “as it happens” (Marriott 69). Likewise, Bourdon argues that liveness is not a matter of the transmitted content but of the live transmission itself (534). However, the ‘movement’ of data in a transmission process consists not only of active technical transmission but also of socio-cultural processes. Indeed, the production of data is about present culture, ideologies and daily living, which are constantly shifting and being synthesized in the content itself. As Feuer discusses, the representation of content “is a reflection of the living, constantly changing present” (13); it is therefore always in a state of becoming, and the present is constantly being mediated. Perhaps the transmission of data might also be considered a live process, and liveness always just a snapshot of a temporal, becoming moment, at the same time representing a state of being that can be regarded as an assemblage: “[a] combination of heterogeneous elements” (Callon and Caliskan 9) involving the interaction between technical, social and cultural elements.

The element of biopolitics has also been seen in recent discourses of liveness, signifying the matters of life in relation to politics. In fact, contemporary philosophies of biopolitics, such as the articulations of Giorgio Agamben and of Hardt & Negri, are largely based on Michel Foucault’s writings of the 1970s. His concept of biopolitics rests on the correlation of the biological body and life, governmental technologies and sovereign power within a political context, but it has been extended from the physical living body to the wider network and social body (Parikka; Berardi), fitting contemporary digital culture and online neoliberal governance. In the context of digital liveness, biopolitics is about digital life (such as the life expectancy and health condition of a network/artifact/software), regulatory controls, social relations, production, reproduction and population. It becomes a critical approach for apprehending the phenomena and politics of Internet culture (Pasquinelli; Liu; Parikka; Karppi; Munster). As such, liveness points directly to the matters of life and politics, rather than to audiences’ experience and perception, or to the fact of transmission and interaction within a computational system.

The liveness of the animal celebrity, Brother Cream Cat

Liveness: The representational experience

This article introduces a network art project, The likes of Brother Cream Cat (2013), that addresses liveness from a socio-technical perspective. It is an add-on that functions during Facebook browsing, and is the most recent collaborative artistic production of Helen Pritchard and Winnie Soon. In 2011 “Brother Cream Cat” was lost on the street, and his fans created a Facebook fan page to find him; on his return he became ‘Facebook famous’ through his ‘lots of likes’ (Soon & Pritchard). Brother Cream Cat’s attraction permeates both physical and digital live networks. Since being lost and found, he has engaged over 1,000 first-time and returning fans per day at his store and has accumulated more than 150,000[1] likes on his Facebook fan page. The likes become an instrument, as well as a starting point, for sustaining his wellbeing by attracting more visitors (both online and offline), more merchandise, more cat food and more job opportunities for this animal celebrity, Brother Cream. These entanglements have been made apparent through the artists’ strategy of exaggerating and intervening in the ‘likes’.

The add-on is developed to intervene in Facebook browsing behavior in real time. Once audiences have installed and activated the add-on, the small piece of software runs in the browser, and all the existing Facebook image data (including images in any post, profile or timeline area) is replaced with the latest available Brother Cream trace (see image 1). When a user visits the Brother Cream fan page in particular, all the cat images uploaded by his fans are overlaid with a customized line of text (see image 2), and tailored text and audio respond instantly once the like/unlike button of a Cream Cat post is clicked. As such, the add-on intervenes in the usual behavior of browsing and using Facebook through custom-made software, offering a real-time augmented browsing experience. The image data on Facebook is constantly mutating, and the live trace participates actively in human social interaction through real-time technology, including the network and software. The liveness of Brother Cream is made apparent through the representation of text, audio and image data, allowing instant feedback in response to users’ click actions.

Image 1: Screenshot of The likes of Brother Cream Cat on Facebook

Image 2: Screenshot of The likes of Brother Cream Cat on the Brother Cream fan page

Critical Code Studies: examine software liveness

In addition to the liveness of the specific representational objects of Brother Cream Cat, the project also examines the notion of software liveness in a wider cultural context through an in-depth investigation behind the screen. To understand the social forces within and around Facebook, the artists take the approach of critical code studies, initiated by Mark Marino in 2006: a method of studying code itself rather than focusing on the representation, usability or interface design of a piece of software. Studying how an algorithm is implemented might not even be necessary; Marino proposes treating “code itself as a cultural text worthy of analysis and rich with possibilities for interpretation”. The available Facebook code, including but not limited to source code, the application programming interface (API), the Facebook developers’ site and its documentation, and the terms and conditions, provides a useful way to understand the architecture of the Facebook infrastructure as well as its biopolitical implications in a broader social and cultural context.

More than just technical APIs

An application programming interface (API) is a standard interface offered by Web 2.0 service providers. Developers, designers, artists, anyone, can register for a platform account and retrieve services and online data via the API in the software they develop. In network art[2] in particular, there is arguably a growing trend for artists, for example JODI, Jonathan Harris & Sep Kamvar, Jer Thorp and Shu Lea Cheang, to employ available APIs in their works. As such, more artistic data practices have been brought to the network art scene, and this public interface, the API, becomes the “art-making enabler” (Soon).

For Facebook, the release of an API provides much broader opportunities to enhance its popularity on the Internet and sustain its business, inasmuch as more third-party applications are developed. Thus more data pass through the API from Facebook’s databases, reproduced and appearing in other interfaces: what have become known as ‘Facebook apps’. However, the API should not be considered only as a tool; it has to be understood from a socio-political perspective, in terms of how providers manage or govern the use of their data, encompassing a highly complex socio-technical relation.

Facebook has released APIs since 2006, and developers have to comply with its rules, which range from concrete instructions, for example the number of query requests per day via a developer’s program and the availability of an explicit ‘log out’ option in an app’s interface, to vague conditions. One of the conditions states: “Quality of content: you are responsible for providing users with a quality experience and must not confuse, defraud, mislead, spam or surprise users.”[3] Clearly, the rule is set for maximum benefit to Facebook. In this regard, I wonder whether The likes of Brother Cream Cat, as an artwork, surprises users. Undoubtedly, Facebook has the right to withdraw and block an application’s access to data retrieval, and even reserves the right to take legal action. Being a well-behaved developer, on the contrary, guarantees a stable delivery of data (Bucher).

One thing to bear in mind is that the developer is in a passive position when using the service, even though the data is contributed freely by the public; Facebook has full control over granting access and deciding what data should be opened from its databases and made available to the public through algorithms, a technical execution of data inclusion and exclusion. All user data is fundamentally “the sole and exclusive property of Facebook” (Lodi 242). Since all Facebook apps have to go through a registration process and remain under constant monitoring, Facebook is, in other words, controlling what is made available in the market, cultivating desirable and favorable apps through the labor market, and governing the whole population of the developer community.

The API is one of the ways in which Facebook extends its active user population through third-party applications. It establishes social relations and utilizes labor beyond Facebook’s members through the development of apps with APIs. Facebook executes tight regulatory control while at the same time keeping the Facebook platform active by relying on external labor contributions. Facebook offers comprehensive guidelines and interfaces for its APIs, facilitating the reproduction of user data and the production of ‘Facebook apps’. This is what Hardt and Negri describe as biopower, “a situation in which … power is the production and reproduction of life itself” (24).

How might one escape all these conformities? The likes of Brother Cream Cat is made to escape these rituals and this monitoring by using an alternative yet long-established method, one that is not formally verified and approved: web scraping.

The use of web scraping technique

Before the wide availability of the APIs released by Web 2.0 providers, developers and artists had to rely on web scraping techniques to harvest web data. Web scraping is an automated process of web data extraction, written in a scripting language, in which “specific fields or data elements [are extracted directly] from pages on the Web and other Internet sources” (Marres & Weltevrede 316). Authorization is not required: one can simply program a script and start fetching web data. However, Marres and Weltevrede point to the possible legal issues of web scraping, as it may go against a site’s “terms of use” (320).

Marres and Weltevrede further discuss the ‘dirty’ web data (322) extracted with scraping techniques. The source is hard to understand without its data schematics being properly revealed, and the web data collection process is “unstructured” (316) and “messy” (322). In addition, web scraping is considered an unstable method, because web interfaces and data elements at the source change substantially (Tseng 2), which impacts the development of apps.

In fact, neither web scraping nor the API is stable. Though Facebook tries to maintain platform stability by giving advance notice of API code changes and offering more comprehensive documentation and guidelines, many developers still suffer from its frequent code updates. According to one online web service company[4], the documentation and services related to the Facebook API underwent a total of 64 changes in just 30 days. Chunk, an engineer at Facebook, states that Facebook updates its code (not only the API but the entire platform) at least daily, for various enhancement purposes, to sustain its economic activities and the Facebook population.

Nevertheless, The likes of Brother Cream Cat is expected to cope with these platform changes through the continuous release of different versions of the add-on software, just like any other software practice, in order to maintain the liveness and functioning of the artwork. The add-on uses web scraping as opposed to the Facebook API, and Brother Cream Cat permeates the network via Facebook browsing, intervening in users’ usual browsing behavior regardless of their regions and time zones. The software keeps parsing the scraped data from the Cream Cat fan page, and the live trace of Brother Cream Cat that appears on screen comes attached with invisible social entanglements, including commerce, data laws, copyright and geo/biopolitics.

Software versioning: the production and reproduction of code

The production of software release

What makes a provider introduce a newer software release? Chunk’s response is that each Facebook software update provides greater software. Perhaps this can be understood as “greater” interfaces, “greater” functions and “greater” stability to drive Facebook’s business, keeping users and expanding possible online connections. Mark Zuckerberg, the CEO of Facebook, highlights: “we work to bring the next five billion people online and into the knowledge economy”[5]. The Facebook population can therefore be expected to expand continuously. Given the 66% increase in advertising revenue from 2012 to 2013, the direct and interrelated forces between healthy (active) users, business relations and Facebook’s monitoring systems become apparent. Every new update of the software can be seen as an act of production in the world of capitalism: an economic process that nonetheless exists in the technical practice of code release, controlling the network population in relation to the machinery of production. In other words, the software signifies “a power to foster life” (Foucault 138) and is entangled with the optimization of efficiency and effectiveness, directing the engaging forces from macro interactions among advertisers, technology and users down to micro individual behaviors. The liveness that exists in software includes forces that foster life, the continuation of its smooth running, hence prolonging active connectivity in a socio-political dimension.

In fact, such software changes are now common in software culture: hotfixes and security updates for operating systems, and software updates for all other kinds of applications. The reasons behind them range from protecting users’ security and privacy to offering better experiences, features and functions. Arguably, one hidden agenda of software companies is to implement a range of mechanisms that reinforce control, monitoring and optimization via data tracking. Facebook is one example, actively analyzing user behaviour, such as tracking users’ cursor movements on screen, as The Wall Street Journal reported[6] in 2013. As such, software should also be considered a control apparatus, whose power is exercised on individual live connections – as life – through the black box of algorithms. These micro tracking techniques are implemented down to the individual level, driving potential consumption. All these controls are hidden, yet integrated into the normal release of software, which offers an incomplete and distorted picture to its members. According to Foucault, the notion of life is biopolitical and subject to a disciplinary power that is “centered on the body as a machine”. He explains this power as:

“its disciplining, the optimization of its capabilities, the extortion of its forces, the parallel increase of its usefulness and its docility, its integration into systems of efficient and economic controls, all this was ensured by the procedures of power that characterized the disciplines” (139).

Social Reproduction: extension of life via third-party production

As a production platform, Facebook’s population includes not only the healthy users who frequently and actively engage with Facebook, but also the external communities who participate in developing Facebook apps. Facebook offers the API and a comprehensive set of guidelines, allowing developers to use its materials freely and produce creative extensions of Facebook. For example, the mobile app Candy Crush Saga implements a life system: when players have used up their default maximum number of lives, lives can be restored by asking Facebook friends for help. This social give and take of lives is implemented via the Facebook API, by posting a request message to a friend and by the friend accepting the help message. Due to the availability of the API, social reproduction is made possible via third-party software.
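
By way of illustration, such an “ask a friend for a life” request could be issued as a simple HTTP call; the endpoint and parameters below follow the general shape of the Graph API of that period, but they are reconstructions, not the game’s or Facebook’s actual code:

    # Illustrative sketch of a third-party "ask a friend for a life" request.
    # The endpoint and parameters follow the general shape of the Graph API
    # of that era, but are reconstructions, not King's or Facebook's code.
    import urllib.parse
    import urllib.request

    def ask_for_life(friend_id, access_token):
        params = urllib.parse.urlencode({
            "message": "Please send me an extra life!",
            "access_token": access_token,  # hypothetical app/user token
        }).encode("ascii")
        url = "https://graph.facebook.com/%s/apprequests" % friend_id
        with urllib.request.urlopen(url, data=params) as response:
            return response.read()  # the API answers with the created request id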

Access to Facebook’s databases through the API is attractive to developers: it immediately creates networks of relations out of individual behaviours, for instance the actions of liking, posting or sharing. Gerlitz and Helmond describe this as an “interconnected” (7) network relation, whereby Facebook data keeps circulating among networks of networks exponentially. They point out that this is intentionally implemented in Facebook’s business as part of Zuckerberg’s agenda, which is “to build a more comprehensive map of connections and create better, more social experiences for everyone”[7]. Thus, social reproduction operates in conjunction with wealth and desire, producing “subjectivities” such as “needs, social relations, bodies, and minds” (Hardt & Negri 32). Viewed in a socio-technical and socio-political context, the API contributes to the liveness of both the developer’s software and the Facebook platform. For Facebook, these new relations enrich the entire business: app users are recruited and attracted through third-party software, affecting the dynamics of the Facebook population.

Third-party applications have to keep updating in order to cope with Facebook’s changes and to keep up with the latest technology. In 2010, Facebook announced significant changes to the API with the introduction of the Open Graph format via the Open Graph Protocol[8], which also implied the deprecation of the former “REST API”[9]. Backwards compatibility and legacy support are seen as highly time-consuming and expensive to maintain (Bisbal et al. 103), and companies therefore tend not to support both new and old systems. Facebook, as a listed company, also has to be cost-effective in growing its revenue and business. Consequently, not only are new features unsupported in the old API format, but existing, functioning features are gradually removed from the Facebook platform[10] entirely. As such, developers are forced to change their software to avoid potential malfunctions.
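
The Open Graph format itself is simple: a page declares its identity through <meta property="og:..."> tags that any client can read. A minimal Python sketch for extracting them (the target URL is a placeholder):

    # Reading a page's Open Graph metadata (og:title, og:image, ...).
    # The protocol itself is public; the URL below is a placeholder.
    import urllib.request
    from html.parser import HTMLParser

    class OGParser(HTMLParser):
        def __init__(self):
            super().__init__()
            self.og = {}

        def handle_starttag(self, tag, attrs):
            if tag == "meta":
                attr = dict(attrs)
                prop = attr.get("property", "")
                if prop.startswith("og:"):
                    self.og[prop] = attr.get("content", "")

    page = urllib.request.urlopen("https://example.org/page")
    parser = OGParser()
    parser.feed(page.read().decode("utf-8", "replace"))
    print(parser.og)  # e.g. {'og:title': '...', 'og:type': '...', 'og:image': '...'}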

Software malfunction: The likes of Brother Cream Cat

The production of The likes of Brother Cream Cat is prepared to suffer potential software malfunctions, which will lead to software updates and new version releases. Although the web scraping technique can achieve the same result as the standard API – an add-on that lets Brother Cream Cat permeate the network – the choice of a different method and language of code crafting is more than a technical implementation. Indeed, code has a ‘voice’ in this artistic context, maintaining the liveness of the software and escaping the regulatory control of Facebook. Cox argues that machine code is not only an instrument for executing creative instructions, but also encloses a “subjectivity and sociality” that “connects with political expression and allows for a wider understanding of power relations” (3).

Indeed, the versioning of software implies the disappearance of old interfaces, old functions, old regulations and old policies, regardless of the reasoning behind them. In The likes of Brother Cream Cat, a malfunctioning add-on means the death of its live connections with Facebook in a literal sense. Even leaving the add-on aside, changes and forces can still act upon the software without any interference from the artist. In other words, the liveness of this software is subject to the code changes made by Facebook. Using the web scraping technique might prolong its life and help it escape the disciplinary practice of Facebook, but it can hardly escape the frequent code changes and releases of the Facebook Empire.

Conclusion:

The notion of liveness discussed here moves beyond the representation of an artifact. In this article, liveness refers to matters of life: the ability to maintain a live connection with others. At a technical level, it is about real-time connectivity between systems or applications. However, liveness is more than a technical notion, and I have argued that it comes with other forces. Thinking through the notion of the software version in the context of networked environments, on the one hand it encompasses different functions, interfaces and logics that help us interact with the world; on the other hand, it documents the social forces that alter the process of data interaction. It offers a point of departure for rethinking the inclusions and exclusions enacted by the software provider. These social forces include disciplinary practices and software ideologies. As such, investigating the socio-political and socio-technical forces that exist behind, and beyond, the screen’s representation helps us understand the constitution of liveness in software.


[1] The data is gathered from the Brother Cream Cat’s Facebook fan page: https://www.facebook.com/pages/%E5%B0%96%E6%9D%B1%E5%BF%8C%E5%BB%89%E5%93%A5/117969648299869

[2] The genre of network art is usually grouped under the bigger umbrella of media art, meaning art based in or on Internet Cultures, examining the everyday medium of the Internet.

[3] See the full Facebook developers’ policy here: https://developers.facebook.com/policy/

[4] The data is collected as of 25 Nov 2013 through the website: https://www.apichangelog.com/api/facebook

[5] See the statement in the “Facebook Reports Third Quarter 2013 Results” here: http://investor.fb.com/releasedetail.cfm?ReleaseID=802760

[6] See the news report, Facebook Tests Software to Track Your Cursor on Screen, here: http://blogs.wsj.com/cio/2013/10/30/facebook-considers-vast-increase-in-data-collection/

[9] See the details of the deprecated REST API here: https://developers.facebook.com/blog/post/616/

[10] See the Facebook developer roadmap here: https://developers.facebook.com/roadmap/completed-changes/

//////////////////////////////////////////////////////////////////

Works cited

Agamben, Giorgio. Homo Sacer: Sovereign Power and Bare Life. California: Stanford University Press,1998. Print.

Auslander, Philip. Liveness: Performance in a mediatized culture. 2nd ed. Oxon: Routledge, 2008. Print.

Berardi, Franco. Conference: BODY: RESPONSE – Biomedial Politics in the Age of Digital Liveness. 2011. Web. 31 Jul 2013. <http://www.transmediale.de/content/conference-bodyresponse-%E2%80%93-biomedial-politics-age-digital-liveness>

Bisbal, Jesus et al. Legacy Information Systems: Issues and Directions. IEEE Software, 16(5), 1999 pp.103-111. Web. 27 Nov 2013. <http://csis.pace.edu/~marchese/CS775/Proj1/legacyinfosys_directions.pdf>

Bourdon, Jerome. Live television is still alive: on television as an unfulfilled promise. Media, Culture & Society, 22(5), 2000 pp.531-556.

Bucher, Taina. Objects of intense feeling: The case of the Twitter API. Computational Culture: a journal of software studies, 2013. Web. 27 Nov 2013. <http://computationalculture.net/article/objects-of-intense-feeling-the-case-of-the-twitter-api>

Caliskan, Koray and Callon, Michel. Economization, part 2: a research programme for the study of markets. Economy and Society, 39(1), 2010  pp.1-32. Print.

Cheang, Shu Lea. UKI, available at http://www.u-k-i.co/viralgame1/. 2009. Web. 27 Nov 2013.

Rossi, Chuck. Ship early and ship twice as often. Facebook. 2012. Web. 27 Nov 2013. <https://www.facebook.com/notes/facebook-engineering/ship-early-and-ship-twice-as-often/10150985860363920>

Cox, Geoff. Speaking Code: Coding as Aesthetic and Political Expression. MIT Press, 2013

Derrida, Jacques & Stiegler, Bernard. Echographies of Television: Filmed Interviews. Cambridge, Oxford, Malden: Polity Press, 2002. Print.

Donati, Luisa Paraguai & Prado, Gilbertto. Artistic Environments of Telepresence on the World Wide Web. Leonardo, 34(5), 2001 pp.437–442. Web. 27 Nov. 2013. <https://www.academia.edu/1062274/Artistic_environments_of_telepresence_on_the_world_wide_web>

Feuer, Jane. The Concept of Live Television: Ontology as Ideology. In Regarding Television: Critical approaches – An anthology, ed. Ann E. Kaplan. Los Angeles: The American Film Institute, 1983. Print.

Foucault, Michel. The History of Sexuality. Volume I: An Introduction. Tr. Robert Hurley. New York: Vintage Books, 1978.

Gere, Charlie. Art, Time and Technology. Oxford, New York: Berg, 2006. Print.

Gerlitz, Carolin & Helmond, Anne. The Like economy: Social buttons and the data-intensive web. New Media & Society, 2013. Print.

Hardt, Michael & Negri, Antonio. Empire. Cambridge, Massachusetts, London: Harvard University Press, 2001. Print.

Harris, Jonathan & Kamvar, Sep. We Feel Fine, available at http://www.wefeelfine.org/  2005.

JODI. GEO GOO, available at http://geogoo.net/ 2008.

Karppi, Tero. Digital Suicide and the Biopolitics of Leaving Facebook. Slow Media, (20). 2011. Web. 27 Nov 2013.  < http://www.transformationsjournal.org/journal/issue_20/article_02.shtml>

Liu, Shih-Diing. The emergence of Internet biopolitics in China. 思想 (11), 2009 pp.57-77. Web. 27 Nov 2013.  <http://www.academia.edu/614380/_The_emergence_of_internet_biopolitics_in_China>

Lodi, Simona. Illegal Art and other stories about social media. In Unlike Us Reader: Social Media Monopolies and their alternatives, eds. Geert Lovink and Miriam Rasch. Amsterdam: Institute of Network Cultures, 2013.

Manovich, Lev. Software Takes Command. New York, London, New Delhi, Sydney: Bloomsbury, 2013. Web. 27 Nov. 2013. <http://issuu.com/bloomsburypublishing/docs/9781623566722_web>

Marino, Mark C. Critical Code Studies. Electronic Book Review, electropoetics. 2006. Web. 27 Nov 2013. <http://www.electronicbookreview.com/thread/electropoetics/codology>

Marriot, Stephanie. Time and time again: ‘live’ television commentary and the construction of replay talk. Media, Culture and Society, 18(1), 1996 pp.69-86.

Munster, Anna. From a Biopolitical “Will to Life” to a Noopolitical Ethos of Death in the Aesthetics of Digital Code. Theory, Culture & Society, 28(6), 2011 pp.67–90.

Marres, Noortje & Weltevrede, Esther. Scraping the Social? Issues in real-time social research. Journal of Cultural Economy, 6(3), 2013 pp.313-335. Print.

Tseng, Chun-hsiung. Virtual Browsing Environment for Mashups. International Conference on Advanced Information Technology AIT, 2011.

Parikka, Jussi. Digital Contagions: A media archaeology of Computer Viruses, Peter Lang International Academic Publishers, 2009. Print.

Parisi, Luciana. Contagious Architecture: Computation, Aesthetics, and Space. Eds. Massumi, Brian and Manning, Erin. Cambridge, Massachusetts, London: The MIT Press, 2013. Print.

Pasquinelli, Matteo. Animal Spirits. 2008. Web. 27 Nov 2013. < http://matteopasquinelli.com/docs/animal_spirits_introduction.pdf>

Pasquinelli, Matteo. Google’s PageRank Algorithm: A Diagram of the Cognitive Capitalism and the Rentier of the Common Intellect. In Deep Search, ed. Felix Stalder. London: Transaction Publishers, 2009. Print.

Scannell, Paddy. Radio, Television & Modern Life.  Oxford, Massachusetts: Blackwell Publishers Ltd, 1996. Print.

Soon, Winnie & Pritchard, Helen. The likes of Brother Cream cat, available at http://project.arnolfini.org.uk/brothercreamcat 2013.

Soon, Winnie & Bevington, W.M (ed). The Public Interface as an Art-Making Enabler. Parsons Journal for Information Mapping, 3(4), 2011 pp.1-7

Thorp, Jer. Art and the API. 2013. Web. 27 Nov 2013. http://blog.blprnt.com/blog/blprnt/art-and-the-api

Zemmels, David. Liveness and Presence in Emerging Communication Technologies. 2004. Web. 27 Nov. 2013. <http://david.zemmels.net/scholarship/Comm7470.html>

CRITICAL INFRASTRUCTURE

“Environments are invisible. Their groundrules, pervasive structure,
and overall patterns elude easy perception.”
“If a work of art is to explore new environments, it is not to be regarded
as a blueprint but rather as a form of action-painting.”
—Marshall Mcluhan
(Mcluhan 1967, 68; 1987, 325)

CRITICAL INFRASTRUCTURE - Survey Image

Infraduction

The essay and ideas included here are a discussion of the topics raised through CRITICAL INFRASTRUCTURE, an artistic research and production residency that took place in the lead-up to the transmediale festival, afterglow, 2014. The project’s initiation was about uncovering the resources and reserves of physical and material energies, signals and data that scaffold the very possibility of post-digital art-and-technology practices. Through a series of public workshops, and an installation project situated within the transmediale 2014 festival, CRITICAL INFRASTRUCTURE’s ‘post-digitality’ is not only historical-temporal, but immediate, dredged up from below, in the present. The artistic project stemming from the research and public events creates a media-archaeological site-survey, revealing the data and depth of the present moment of an art and technology festival, in the Haus der Kulturen der Welt, in Berlin, on Earth. As such, the project intends a kind of post-digital institutional critique, as well as reflecting something of the “geological turn” in media and media theory through the landscape-survey form. When “data mining” and circuit-bent archaeologies (Parikka and Hertz, 424) are powerful metaphors and methods for artistic knowledge practices, we perform a survey of the media-technical landscape.

The project spanned the autumn of 2013, received the gracious support of the Canada Council for the Arts and the Danish Arts Council, and was hosted by transmediale 2014 and the Zentrum für Kunst und Urbanistik (ZKU), Berlin.

Post-digitality and Infrastructure

“… a new poetics giving flesh to a ‘voice from below’, an eloquent voice of the mute. It purported to decipher the signs written on faces, walls, clothes – to travel under the visible stage and disclose the secrets hidden underground.”
— Jacques Rancière
(Rancière, 15) 

If there is something of value in seeking out what “post-digital” might mean for artists, technologists, and researchers, we first and foremost think it temporally. That is, what we grasp at is ‘afters’ and ‘befores’—placing developments and destinies along imagined timelines. Going “post-” presupposes a hopeful and helpful epochal exit-strategy of lateral reasoning and longitudinal conclusions. Post-digitality smudges across the many real and re-imagined tendencies and nostalgias, regularities and inconsistencies that lie in the wake of a dampened digital euphoria. The result, in our current moment, seems to favour a very tight cybernetic loop, as we re-visit, re-wire, re-create, re-source, re-new, and re-surface the dreams and nightmares of 20 years of somehow anticlimactic technological emissions. The overly enthusiastic 20-something ages into a seasoned, skeptical 30-something, embarrassingly sweeping the dusts of digital idealism from the 1990s and 2000s under an IKEA rug. But this dust sifts its way back up through the weft and weave—and we, as with other techno-utopic waves and generations before us, are called to wonder, “What happened?”

With CRITICAL INFRASTRUCTURE, alongside time-based concepts, we speculate another “way of seeing” the post-digital: to look down, into and through the sediments of a technological present we re-main a re-action to. If “post-” usually refers to that which comes after, let’s look here at what lies below—charting a course not in terms of eras, generations and epochs, but through layers, vertical gradients, veneers and strata—driving our “post-” into the ground. The afterglow, the hangover, of the digital booms and busts we have been experiencing since the late 80s evidences a very real layering of matter: the dirt and dusts of the digital systems, interconnects and protocols that now wrap the Earth. What matters (that is, presents itself with all its material agency) is technical-trash, overfilled (an)archives, dendritic digital distensions—the bursting at the seams of attentional and intentional gutters.

These gutters of dirt and dust are passageways to geological thinking, pointing to the “anthropocene,” our current geological age (during which humans and our activities have a dominant influence over climate and environment). Our contributions to the geological record over the course of this era will primarily show the effects of technical media: the electrification, then wiring, then wirelessing, of the globe. For material reminders, consider how the modern engineering concepts of backward-compatibility and innovation, respectively, resonate with proto-geoscientist Steno’s 17th-century stratigraphic laws of superposition and cross-cutting: “At the time when the lower stratum was being formed, none of the upper strata existed,” and “If a body or discontinuity cuts across a stratum, it must have formed after that stratum.” (Brookfield, 143) CRITICAL INFRASTRUCTURE, a project of methodological and conceptual misappropriations, extends the work of geological and archaeological media thinking. How might we perform a core drill of media and its technical systems?

CRITICAL INFRASTRUCTURE?

“…infrastructure is not a substrate which carries information on it, or in it, in a kind of mind-body dichotomy. The discontinuities are not between system and person, or technology and organisation, but rather between contexts.”
—Star & Ruhleder
(Star and Ruhleder, 114)

The mercurial character of technical infrastructure is what renders it critical, in two ways. These constellations of technologies are by definition ceaseless and foundational, in the way that the U.S. Department of Homeland Security describes them: “Critical infrastructure are the assets, systems, and networks, whether physical or virtual, so vital to the United States that their incapacitation or destruction would have a debilitating effect on security, national economic security, national public health or safety, or any combination thereof.” (Homeland Security Website) But they are also, in a sense, critical of themselves, unstable and doomed ultimately to breakdown and failure. Paul Virilio frames the broad, pharmacological relation of infrastructures this way: “When you invent the ship, you also invent the shipwreck; when you invent the plane you also invent the plane crash; and when you invent electricity, you invent electrocution…Every technology carries its own negativity, which is invented at the same time as technical progress.” (Virilio, 89)

Looking at the post-digital as infra-digital (below-digital, sub-digital) outlines a superorganism. It is an image of the technical that intends to take account of specific contexts and micro-relations of both creation and use. A post-digital minerality, or elementality, shows the desire, the need, to bring the digital euphoria that erupted twenty years ago down to size, down to protocol, down to implementation, down to its gritty, grimy details. The depth of the problems created and solved with technical media might require an engagement with them that is unseductive, respectful, humble—even boring. Contemporary creative practices attest to the resurgence of these purportedly boring things, which have renewed resonance and interest. The online culture and art making that we identify as post-digital overflow with concern for the mundane object, the muted image, simple interactions. For examples, load up a few Tumblrs: “Things Fitting Perfectly Into Other Things” (http://thingsfittingperfectlyintothings.tumblr.com) or “The Jogging” (http://thejogging.tumblr.com), with its particular brand of Duchampian manoeuvring. Jack Strange’s 2008 work ‘g’—an exhibition piece in which a lead ball is placed on the ‘g’ key of a MacBook laptop—places technological dullness on a pedestal. Gone is the art-and-technology of the “New Media Artist,” aiming at some terrifically preposterous future of art, or of the media. Technical media is composed of embarrassingly simple, commonplace, repeated elements (the micro-switching of a WiFi router, the ordinary hand-to-mouse gestures of a film editor, etc.). The exciting exhilaration of “Where do you want to go today!?” digitality is set against its monstrous monotony: the repetition of keystrokes, clicks, logic gates, ethernet routers and seemingly never-ending lists. (“Where do you want to go today?” was Microsoft Corporation’s global campaign slogan for most of the mid-90s.)

'g' (2008), Jack Strange

‘g’ (2008), by Jack Strange. A “g” key of a laptop is held down by a lead ball, repeating the letter into a Microsoft Word document.

There is a thing that exists in the world, a half-serious post-digital counter-strike, known as “The Society for People Interested in the Study of Boring Things.” One of The Society’s charter members, Susan Leigh Star, has described their activities, characteristically, as a list of things: “Among the boring topics presenters brought to the table were: the inscription of gender in unemployment forms used by the city government in Hamburg, Germany; the difficulties of measuring urine output in a post-surgical ward in the Netherlands, and how to design better cups for metrication; the company mascot and the slogans used by a large Midwestern insurance firm in its attempts to build corporate cultures; and how nematologists use computers to keep track of their worm specimens.” Star continues that, “what they have in common is a concern with infrastructure, the invisible glue that binds disciplines together, within and across their boundaries.” (Star Got Infrastructure?) 

Relying on, and extending, Star’s discussions of infrastructure elsewhere (Star The Ethnography of Infrastructure), we can sketch the outlines of a concept of infrastructure that is full of contradictions. Infrastructures are:

  • embedded, but give themselves to experience as secreted access points;
  • transparent in terms of how we use them, but opaque in terms of how they work;
  • articulated at human scale but operational only at much larger and smaller scales;
  • material and systemic, as well as learned and practiced;
  • locally articulated, but rely on a globally “installed base”;
  • designed to be reliable and established, but existentially insecure, unpredictable and precarious.

The infrastructure of media-technics is a lively area for cultural and artistic activity, and for realist, unidealized approaches to creative work. What we provide with art-and-technology are “punctualized building blocks,” (Parikka and Hertz, 427) and condensation points for the misty haze of technology as it ascends into “the cloud.” We can no longer study or use a thing called technology: “Think of technology as a verb, not a noun.” (Red Burns) Likewise, we can never claim to step outside of the technological: “I don’t see an outside, but see technology everywhere, even where it purportedly is not… Is it never not on?” (Ronell The Fable of Media Technology) Using Heidegger’s terminology to discuss the experience of use, and the design of informational systems, Star writes: “Within a given cultural context, the cook considers the water system a piece of working infrastructure integral to making dinner; for the city planner, it becomes a variable in a complex equation. Thus we [should] ask, when—not what—is an infrastructure… infrastructure occurs when local practices are afforded by a larger-scale technology, which can then be used in a natural, ready-to-hand fashion.” (Star Steps Toward an Ecology of Infrastructure)

A fascination for infrastructure in art making can serve to point out the links between institutional, economic and political structures, and commonplace material systems. These “always-on” systems allow for, and (to a lesser degree) are allowed by, art-and-technology practices. These banal systems are what we are not supposed to care about, not supposed to notice, while awestruck and immersed, blown away by the spectacle, the narrative, the classically aesthetic. What lies beneath? “You wouldn’t be interested,” anyway. And if we do notice these underlying systems, then something has gone, often terribly, wrong. Infrastructural technologies are like DJs—you only really notice them when they suck. CRITICAL INFRASTRUCTURE is a characterisation of the technological that shares much in common with the Critical Engineering Manifesto, though that document is prescriptive of the technologist instead:

“The Critical Engineer looks beyond the ‘awe of implementation’ to determine methods of influence and their specific effects.”
— J. Oliver, G. Savicic, D. Vasiliev
(The Critical Engineering Manifesto)

When something works—really works—it becomes infrastructure. We give this name to something we are not normally aware enough of to name at all. As Douglas Adams put it, “Technology is a word that describes something that doesn’t work yet.” (Adams How to Stop Worrying and Learn to Love the Internet) So, infrastructures are at once easily detected and indiscernible—they are everywhere and nowhere, at once. These dynamics of appearance and disappearance, of visibility and invisibility, are perhaps fundamental to what it is to be technological. But there are other ways and reasons that technologies disappear, and some of them are motivated by worrying realpolitiks of knowledge and access, as well as by the social relations incumbent upon late capitalism.

The Infrastructure of Institutions / Institution of Infrastructure

There are significant impediments to understanding large and complex technologies, and one mode of invisibility is brought about through a purposeful projection of tedium. For example, “one of bureaucracies’ most effective, least appreciated weapons is its tedious technical reports. Like frigid February elections in Chicago, these fat volumes dissuade all but the most faithful.” (Espeland, 109) There is a particular colour of grey used in the telecommunications industry that, at least in industry folklore, has been psychologically proven to be the world’s most boring colour. This cognitive camouflage marks everything technological that is intended to be uniformly dull and uninteresting. The seemingly colourless cross-connection boxes that stand aloof in the urban landscape are like tombstones of a bygone digital era, an invasive species whose presence we aren’t supposed to notice. Fuller and Goffey define “grey media” as those “databases, group-work software, project-planning methods, media forms, and technologies that are operative far from the more visible churn of messages about consumers, empowerment, or the questionable wisdom of the information economy.” (Fuller, 9)

Sichert Product Palette

The Sichert family of cross-connection and KVz—Kabelverzweiger, or “cable fan-out”—cabinets for outdoor use. These grey boxes are used to connect trans-regional and trans-national telecommunications infrastructure to individual subscribers and households, in what is known in the industry as “the last mile.” (Image with the explicit permission of Julian von Hardenburg, Berthold Sichert GmbH management — http://sichert.com)

“Networks can no longer be conceived of as intrinsically utopian. On the contrary, they are now the third terrain (alongside nations and markets) on which the bitter competition for wealth and power are undertaken… they retain, in layers, older formations – network security, network discipline, and network sovereign power over life and death.”
— Sean Cubitt
(Cubitt, 312)

Infrastructures and institutions are related: they are conjoined twins—the former generally thought to be the latter’s more obstinate, material counterpart. The practices of institutions create and sustain infrastructures, and, reciprocally, institutions require the channels and stratifications scaffolded by them. If infrastructures order and delimit a kind of imperceptibly-opaque, fragile, material-technological hyperobject (Morton, 130),  institutions do the same kind of work for social, political and even personal life. Infrastructures and institutions may not be so different, beneath their commonplace surfaces:

“’an idea or something that has been learned can also be considered as having material-objective force in its consequences and mediations,’ the understanding of the material nature of ideas, and their relation to medial activity such as reading, navigation, and calculating, has become commonplace.”
— Matthew Fuller, Evil Media
(Fuller, 214)

And this is where a tension between impressions and realities, a politics of knowledge, at individual and community scales, becomes highly pronounced. Bureaucracies and institutions express a set of techniques that are also present in the design and development of technical infrastructure: abstraction, compartmentalisation, classification, oblivious interiorities — the list of tendentious strategies spins round and round, centrifuging imbalances of both knowledge and power.

Histories and studies of science and technology in the industrial age are witness to multifarious accounts of dangerous and productive complicities like this (Eisenhower famously termed the U.S.’s initial version of such an infrastructure the “military-industrial complex” as early as 1961 (Eisenhower, Farewell to the Nation)). A more personal, illustrative account comes from Colleen Black, one of 75,000 residents of Oak Ridge, Tennessee, whose wartime years in America were spent unwittingly processing uranium for the bombs dropped on Hiroshima and Nagasaki in 1945. Asked how almost the entire population of the town could have worked at the processing facility without knowing its incendiary purpose, she recalled: “You’d be climbing all over these pipes, and testing the welds in them. Then they had a mass spectrometer there, and you had to watch the dials go off, and you weren’t supposed to say that word, either. And the crazy thing is, I didn’t ask. I mean, I didn’t know where those pipes were going, I didn’t know what was going through them … I just knew that I had to find the leak and mark it.” Ms. Black is here speaking of a fearsome impedance matching sometimes achieved by institutions and infrastructures. When capitalism, its institutions, and comprehensive technologies collude, no one needs to know anything: “If somebody was to ask you, ‘What are you making out there in Oak Ridge,’ you’d say, 79 cents an hour.” (National Public Radio, Secretly Working To Win The War In ‘Atomic City’)

Godspeed You Black Emperor! - Yanqui U.X.O. Album Art (back)

Godspeed You Black Emperor!’s Yanqui U.X.O. back cover, showing  relationships between music publishing and recording industries and the military-industrial complex.  (Used with the permission of Don Wilkie—Constellation Records, Montreal, Canada.)

So, nobody gets to know everything. Technologies, when they become infrastructural, are never fully understood by any one person. Try asking a car mechanic to fix household plumbing, a supercomputer programmer to reconfigure a Microsoft Windows network, or a WordPress php coder to build a robot. There are vectors of re-integration, signs of domain hopping, but by and large, and more and more, we just have to “find the leak and mark it,” and wait for the cable repair man to show up. And these contradictorily interdependent autonomies manifest themselves all the way down. The telecommuting MacBook Pro graphic designer and the resident of a developing-world megacity are different in every way, save this: each is subject to the imposed vulnerability and inflicted impotence of institutional, technical infrastructures. The result is a devolving chain of irresponsibility (where responsibility is “the ability to respond,” as well as its more common meaning). As these infrastructural systems ascend from our physical, then our perceptual, then our conscious realities, we are called upon to think about them less and less, and the consequences get more and more gnarly. It gets to the point where, even when we would like to find out where the pipes are going and what is going through them, we cannot. When confronted with highly complex technological systems, “individuals [are] simply incapable of bearing full responsibility for their effects,” as Jane Bennett argues in attempting to trace causal logic (blame) for the North American power blackout of 2003. (Bennett, 24)

Globally, the scaffolding of institutional and governmental power through technological artefacts, often taking the form of territorialisation through instrumental measurement, has long been part of the infrastructural bargain. Techniques include “dependence on imported equipment rather than self-sustaining networks, and an absence of R&D in the colonized territory.” For electrical power, for example, these are “techniques which keep the regional power companies in thrall to larger global corporate networks of goods and services.” (Cubitt 314) Information and network archival infrastructures work in much the same way—cartographic mapping and scientific investigation (the “quantification” movements of the 18th and 19th centuries) were serviceable preludes to Western European powers’ dominion over the New World, the Indian subcontinent and Africa, among others. German and British geographers, map makers and natural scientists certainly thought themselves to be doing a great, inherent service to the world. And the preplanning of today’s superpowers seems no less an irreproachably admirable bargain: Google just wants to know, and we just want free email.

Measuring Infrastructure

“Whenever things were frightening, it was a good idea to measure them.”
—Daniel Kehlmann, Measuring the World
(Kehlmann 16)

The promise that base metals held for the alchemist, and the capacities that scryers attributed to globes of rock crystal, is the promise that “data” brings to our present moment. Richard Wright’s essay for “Software Studies, A Lexicon” (2008) points to the archive fever and historical anxiety from which contemporary techniques of data visualisation arose: “In 1987 the US National Science Foundation published their ‘Visualisation in Scientific Computing’ report (ViSC) that warned about the ‘firehose of data’ that was resulting from computational experiments and electronic sensing.” (Fuller 78) Artists, “creative technologists,” designers and programmers are, right this moment, developing an enormity of alternate perspectives on comma-delimited lists, spreadsheets and other seemingly humdrum data formats and sources. The tools they employ often involve a surprisingly potent mix of simple statistical techniques, aesthetic schemes, and data massaging.
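
As a deliberately humdrum instance of the practice described – my own sketch, with an invented readings.csv standing in for any such source – the gesture can be as small as this:

    # A deliberately humdrum visualisation: turning a comma-delimited list
    # into a picture. The file name and columns are invented for the sketch.
    import csv
    import matplotlib.pyplot as plt

    timestamps, values = [], []
    with open("readings.csv") as f:  # hypothetical sensor log
        for row in csv.DictReader(f):
            timestamps.append(row["time"])
            values.append(float(row["watts"]))

    plt.plot(timestamps, values)
    plt.xlabel("time")
    plt.ylabel("power draw (watts)")
    plt.title("One slice through the firehose of data")
    plt.show()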

But the whole endeavour reveals a quintessential epistemic irony of our data-age: data is collected in order to characterise the truth of an object or event; but, having collected too much data, of a kind that is impossible to comprehend directly, we elaborate a whole literature of symbols, infographics, explanations and visualisations. As Vilém Flusser puts it, “…every mediation between man and the world, [is] subjected to an internal dialectic. They represent the world to man but simultaneously interpose themselves between man and the world (‘vorstellen’). As far as they represent the world, they are like maps; instruments for orientation in the world. As far as they interpose themselves between man and the world, they are like screens, like coverings of the world.” (Flusser 2007) We drill down, slice and sieve the database—digital dowsing, attempting to “strike oil” or to “sift gold” from these stratifying datasets. And here again is why geological thinking is more than an inter-disciplinary conceit. We find ourselves inventing a new tectonics of the database, an elaborate succession of measurements and multiple working hypotheses, that we hope will bring us closer to the realities we seek to characterise. But there is much to be said for the insights wrought by looking at the data perspectivally. Perhaps “a landscape is best viewed with a single source of light—the sun, one light bulb, a lone candle, a lone writer – so that all the shadows and highlights are true to each other.” (Coupland Extraordinary Canadians Marshall Mcluhan) In order to study something highly non-linear, perhaps we must first arrange it, slice through it, in or with a line.

Infrastructures, networks of materials and people, piping and protocols, seem a favourable source for ever more data to be distilled and visualised. Operating at the dashboard—via interfaces that try to convey new understandings through illustration—we can decide to engineer awareness in almost innumerable ways. Can we imagine an “infrastructural proprioception” of a kind similar to the “social proprioception” that social media allows for? (Thompson Clive Thompson on How Twitter Creates a Social Sixth Sense) There will exist a data-space for infrastructure, all the way up, and all the way down. It would seem that withdrawn technological entities call us toward them, inevitably, in this way:

“Thus what is a mere procedure of mind in the translation of sense-awareness into discursive knowledge has been transmuted into a fundamental character of nature. In this way matter has emerged as being the metaphysical substratum of its properties, and the course of nature is interpreted as the history of matter.”
—Whitehead, The Concept of Nature
(Quoted in Latour, What is the Style of Matters of Concern)
(Whitehead 16; Latour 43)

Performing Infrastructure

Technology slips from the invisible to the visible in a number of ways, some already outlined, and some more intentional and performative than others. The most obvious is perhaps through internal or external failure. This breakdown, as self-critique by and of infrastructure itself, is a reading that Sean Cubitt gives of Mcluhan’s influential description of electric light: “The electric light is pure information. It is a medium without a message.” (Mcluhan 1964: 15) Infrastructural breakdown, here exemplified by the existentialism of electricity and light, can be “an assertion of the criticality of the medium to our innately communicative species.” (Cubitt 15) When a large power blackout happens, it increasingly means a complete severing of all cultural communicative ties—arenas for public and private interaction are artificially lit, and social spheres (in the West, at least) are nearing complete metastasis from situated to networked, analog to digital, neighbourhood to online.

More interesting than breakdowns are instances where infrastructural performers and human actors do a more explicit double act. A favourite story regarding such a vaudevillian ploy involves one Harvey Schultz of New York City. During a press conference in advance of the 1987 National Football League Super Bowl game, Schultz hinted to the public at large that it might be a good idea for football fans to “stagger their bathroom visits” during the game—so as to avoid a potentially hydraulically catastrophic “Super Flush.” The exacting news outlets of the moment took the story and ran with it. Hearsay about the Super Flush is an important mechanism for the rendering of infrastructure in the minds of those of us who would use it unwittingly. The important thing about Schultz’s peculiarly artful institutional critique that day at the press conference is not whether what he said was true (it was not), but that it made present, perhaps for the first time: New Yorkers have toilets; they are each part of a massively interconnected system, all connected to an otherwise unnoticeable aqueduct. Schultz did no less than render the infrastructure of plumbing and sewage visible in the consciousness of millions of people.

The Tri-City Herald - Super Bowl flush warning - January 25th, 1987

The Tri-City Herald article from January 25th, 1987, reporting on the possibility of a “Super Flush” occurring due to toilet activity during the Super Bowl football game. Harvey Schultz, then New York City’s Commissioner of Environmental Protection, urged “Don’t rush—and think before you flush.”

Along with breakdowns (hoaxed or otherwise), we could add a further mode by which infrastructures move from the mysterious to the manifest. Correlation, a process known to statisticians and scientists that serves to establish links between data derived from individual processes, can further serve to elucidate infrastructures. Marshall Mcluhan expressed correlation in a more felt manner, emphasizing an underlying inclination of systems and people toward patterns and connectivity: “When information is brushed against information… the results are startling and effective. The perennial quest for involvement, fill-in, takes many forms.” (Mcluhan 1967:103)

Consider a phenomenon in the United Kingdom’s power industry known as “Television Pickup.” By quite a large majority, the English like to make tea, and to watch television drama. Whenever a particularly popular drama or sport programme on the BBC ends, the entire viewing public gets up from the television and makes tea. During these mass-brew events, millions of electric kettles are turned on all at once, just prior to which the national electrical grid system goes into mini-emergency mode. The largest pickup recorded for the TV drama EastEnders happened on April 5th, 2001, when an estimated 22 million viewers watched to find out ‘Who shot Phil Mitchell?’. (BBC 2007) The post-episode power load rose by 2290 megawatts, and the population of the UK at the time was 58.7 million. (Wikipedia United Kingdom Census 2001) Television Pickup is a correlation between media, behaviour and electrical supply—and it is this correlation, revealing unexpected infrastructural causalities, that allows for an awareness of subsystems, and of how they interrelate. (British Broadcasting Corporation Britain From Above) Through unexpected correlations and causal relationships, technologies are drawn out from their transparent fog, their immanent and pervasive haziness.
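
A rough back-of-envelope calculation (mine, assuming a typical 3 kW electric kettle) gives the correlation a sense of scale: 2290 MW ÷ 3 kW ≈ 760,000 kettles switched on within the same few minutes – roughly one kettle for every 77 people counted in the census.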

The performance of infrastructures, as the rendering-present of unwitting, unwanted or unthought systems, has its place and prelude in artistic practice. The methods developed by artists and activists associated with forms of “Institutional Critique” treat the institutional infrastructures of art as fodder for artworks that expose and elaborate them. Institutional Critique serves as a perforative and performative interrogation of the value and support structures of the museum, gallery, catalogue and official welcome. Amongst artist Andrea Fraser’s well-known works is “Museum Highlights: A Gallery Talk” (1989). The scripted dialogue in these interventions includes not only an exposition of art-historical and aesthetic concerns, but also discussions of material infrastructure (water, electrical lighting), museum sponsorship, and cultural-economic and political agendas more widely: “Jane walks into the Coat Room, gesturing toward the drinking fountain at the far end. Addressing the drinking fountain: Hmm, ‘a work of astonishing economy and monumentality … it boldly contrasts with the severe and highly stylised productions of this form.’” (Fraser 120)

One thing that makes the work interesting is that it may not matter whether what Fraser is saying is wholly accurate or factual. A narrated dataset of factoids and excerpts, the work presents an appropriately incoherent and unlocatable constellation of information and messaging (some of it lifted from official museum publications) that the audience is left to interpolate between and within. This is infrastructural theatre of the superorganism of the art museum, and of the art world, all strings attached. But what in the post-digital landscape could be thought potent for enlivening and reinvigorating this kind of theatre, and could serve as a further “new departure point for what used to be called institutional critique”? (Holmes Extradisciplinary Investigations)

Andrea Fraser (1989), Museum Highlights: A Gallery Talk
Andrea Fraser, as Jane Castleton, highlights the drinking fountain as part of Museum Highlights: A Gallery Talk at the Philadelphia Museum of Art, 1989.

Interminable Terminals

CRITICAL INFRASTRUCTURE—that is, technological materials that are at once constitutive of social and political meaning, while reflexively analytic and self-destructive—allows art and technology practices to move “Towards a New Critique of Institutions,” as Brian Holmes suggests, through extradisciplinary, or perhaps anti-disciplinary, approaches. (Holmes Extradisciplinary Investigations) A critically infrastructural study (as artwork, as whatever) might appropriate from the grey media of engineering, instrumentation, and the technical disciplines, creating less an artistic gesture and more an articulation of live research. How “raw” can the “data” of an “art world” be, and how might it be performed for its artists and audiences? How might such infrastructural data be presented in public, such that we are prompted or called to draw an appropriate panoply of individual, evolving conclusions? There are no truths to be evoked, but relationships and resonances can be modelled and estimated, meanings evoked, tendencies charted: further attempts at living in a world we seek to understand. These are extradisciplinary methods and strategies, and a reassessment of the post-digital technological landscape seems necessary: an infrastructural account of the heaving, bristling detritus the digital has left in its wake.

WORKS CITED

“The Jogging.” Available here: http://thejogging.tumblr.com, accessed November 26, 2013. Web.

“Things Fitting Perfectly Into Other Things.” Available here: http://thingsfittingperfectlyintothings.tumblr.com, accessed November 26, 2013. Web.

Adams, Douglas. “DNA/How to Stop Worrying and Learn to Love the Internet.” 1999. Available here: http://www.douglasadams.com/dna/19990901-00-a.html, accessed November 26, 2013. Web.

Bennett, Jane. Vibrant Matter. Duke University Press Books, 2010. Print.

British Broadcasting Corporation, “Britain From Above.” 2008. Available here: http://www.bbc.co.uk/britainfromabove/archive.shtml, accessed November 26, 2013. Television programme.

British Broadcasting Corporation, “Can you have a big ‘switch off’?” 2007. Available here: http://news.bbc.co.uk/2/hi/uk_news/magazine/6981356.stm, accessed November 26, 2013. Web.

Brookfield, Michael E. Principles of Stratigraphy. John Wiley & Sons, 2008. Print.

Burns, Red. Address to the incoming Interactive Telecommunications Programme, 2002. Subsequently available here: https://twitter.com/WIRED/status/376847688676802560, accessed November 26, 2013. Lecture.

Coupland, Douglas. “Extraordinary Canadians Marshall Mcluhan.” Penguin Group (Canada), 2013. Print.

Cubitt, Sean. “Electric Light and Electricity.” Theory, Culture & Society 30.7-8 (2013): 309–323. Journal Article.

Eisenhower, Dwight. “Military–Industrial Complex – Wikipedia, the Free Encyclopedia.” Available here http://en.wikipedia.org/wiki/Military–industrial_complex, accessed November 26, 2013. Web.

Espeland, Wendy Nelson. The Struggle for Water: Politics, Rationality, and Identity in the American Southwest (Chicago Series in Law and Society). Chicago: University of Chicago Press, 1998. 109-110. Print.

Flusser, Vilém. “Our Images” (translated from the Portuguese by Rodrigo Maltez Novaes). Available here http://www.flusserstudies.net/pag/15/flusser-our-images.pdf, accessed November 26, 2013. From Vilém Flusser, “Post-History, Flusser Archive Collection,” edited by Siegfried Zielinski, Univocal Publishing, Minneapolis 2013, p. 91-98. http://www.univocalpublishing.com/books/102-post-history-by-vilem-flusser

Fraser, Andrea. “Museum Highlights: a Gallery Talk.” October 57 (1991): 104. The MIT Press. Stable URL: http://www.jstor.org/stable/778874. Journal Article.

Fuller, M. and Goffey, A. “Evil Media.” MIT Press, 2012. 232 pages. Print.

Fuller, M. “Software Studies: A Lexicon. Leonardo (Series)”, MIT Press, 2008. 334 pages. Print.

Heidegger, Martin. “The Question Concerning Technology”, Trans. W. Lovitt with revisions by D. F. Krell, in D. F. Krell (ed.) Martin Heidegger: Basic Writings, revised and expanded edition, London: Routledge, 311–41. 1993. Print.

Parikka, J. and Hertz, G. “Zombie Media: Circuit Bending Media Archaeology Into an Art Method.” Leonardo 45.5 (2012): 424–430. Web.

Holmes, Brian. “Extradisciplinary Investigations. Towards a New Critique of Institutions | Eipcp.Net.” Available here http://eipcp.net/transversal/0106/holmes/en, accessed November 26, 2013. Web.

Homeland Security. “About the Infrastructure Information Collection Division | Homeland Security.” Available here http://www.dhs.gov/about-infrastructure-information-collection-division, accessed November 26, 2013. Web.

Homeland Security. “What Is Critical Infrastructure? | Homeland Security.” Available here http://www.dhs.gov/what-critical-infrastructure, accessed November 26, 2013. Web.

Kehlmann, Daniel. “Measuring the World.” Quercus Publishing, 2010. Print.

McLuhan, Marshall and Fiore, Quentin. “The Medium Is the Massage: An Inventory of Effects.” Gingko Press GmbH, 2011 (Original 1967). 160 pages. Print.

McLuhan, Marshall et al. Letters of Marshall McLuhan. Oxford University Press, USA, 1987. Print.

McLuhan, Marshall. The Gutenberg Galaxy. University of Toronto Press, 2011. Print.

Microsoft Corporation. “Where Do You Want to Go Today? – Wikipedia, the Free Encyclopedia.” Available here: http://en.wikipedia.org/wiki/Where_do_you_want_to_go_today, accessed November 26, 2013. Web.

Morton, Timothy. “The Ecological Thought.” Cambridge, Massachusetts: Harvard University Press, 2010. Print.

National Public Radio, “Secretly Working to Win the War in ‘Atomic City’.” Available here http://www.npr.org/2013/03/03/172908135/secretly-working-to-win-the-war-in-atomic-city, accessed November 26, 2013. Radio Programme.

Oliver, J. , Savicic, G., Vasiliev, D. “Critical Engineering Manifesto.” Available here http://criticalengineering.org, accessed November 26, 2013. Web.

Rancière, Jacques. “From Politics to Aesthetics?.” Paragraph 28.1 (2005): 13–25. Web.

Ronell, Avital. Recorded address to the Medienwissenschaft group at the University of Basel. “Avital Ronell – The Fable of Media Technology: On My Watch.” Available here http://blogs.mewi.unibas.ch/archiv/130, accessed April 1, 2013. Lecture.

Star, Susan Leigh, “Got Infrastructure? How Standards, Categories and Other Aspects of Infrastructure Influence Communication.” The 2nd Social Study of IT workshop at the LSE ICT and Globalization, 22-23 April 2002. Available here: http://is2.lse.ac.uk/events/ssit2/, accessed November 26, 2013. Web.

Star, Susan Leigh, and Karen Ruhleder. “Steps Toward an Ecology of Infrastructure: Design and Access for Large Information Spaces.” Information Systems Research 7.1 (1996): 111–134. Web.

Star, Susan Leigh. “The Ethnography of Infrastructure.” American Behavioral Scientist 43.3 (1999): 377–391. Journal Article.

Virilio, Paul. Politics of the Very Worst. New York: Semiotext(e), 1999. Print.

Whitehead, Alfred North, “The Concept of Nature.” Cosimo, Inc., 2007 (Original 1947). 212 pages. Print.

Wikipedia, “United Kingdom Census 2001.”  Available here:  http://en.wikipedia.org/wiki/United_Kingdom_Census_2001, accessed November 26, 2013. Web.

Do not Return to Sender – Why post-digital aesthetic research should actually distinguish between artist, critics, and audience

By Lotte Philipsen

One significant advantage of moving from a digital to a post-digital paradigm is that a post-digital paradigm enables us to approach art in a more open and critical way than what has been practiced under the digital paradigm. Specifically, a post-digital paradigm allows us to conduct aesthetic research into contemporary works of art that make use of digital technology in ways that are not automatically identical to what technological or cultural research would do. Carsten Strathausen has termed the latter a ‘rational’, ‘info-’, or ‘techno-’ aesthetics, whose ‘heroes are Boscovich, Boole, Turing, and Bense instead of Aristotle, Kant, Hegel, or Adorno.’ (Strathausen, 59) The following will account for the digital paradigm’s neglect, within its discursive framework, of a thorough address of the aesthetic dimensions of new art forms, before moving on to investigate one primary requisite for doing so: (re)establishing an awareness of the different subject positions of artist and audience, respectively.

Techno-essentialism in a digital paradigm

In a digital paradigm, analyses of and debates on the role of new technology in art have had an overall essentialist character, in the sense that the questions asked basically centred on ‘what is “interactive”, or “networked”, or “digital” (etc.) art?’ These are good and highly relevant questions, but they lack one important component that it is now appropriate to investigate in a post-digital paradigm, that is: according to whom? Or, in other words: from which specific subject position are such questions asked? From the position of the artist, the curator/critic, the user, the implied audience or the actual audience? By not explicating which subject positions are addressed when carrying out analyses of new art forms, the results of those analyses are staged as virgin-born truths radiating from the works of art.

The confusion between these different subject positions results from the fact that, in the digital paradigm, academic theory on so-called new media art has tended to interpret works of art according to their technological features. Survey books on new media art or digital art are organised either as descriptions/analyses of individual artists or works, or according to technological subgenres like ‘video art’, ‘network art’, ‘interactive art’, ‘telepresence’ etc. (see Rush, Giannetti, Tribe & Jana, Paul, and Shanken) As a result, attempts to critically investigate tendencies across different works of art do not distinguish between the specific technical features applied in a work of art and what is actually encountered by the average member of the audience.

Art404: "Five Million Dollars 1 Terabyte"

Art404: “Five Million Dollars 1 Terabyte”

Consider, for instance, the work “5 Million Dollars, 1 Terabyte” by Art 404 (exhibited at Transmediale 2012), which consists of a black terabyte hard drive exhibited in a vitrine. No matter how hard we look, smell, taste, listen or touch the hard drive, we will never be able to extract the most important feature of this work of art – the decisive factor that transforms the terabyte drive from a dull object of everyday life into something that potentially gives rise to aesthetic experience for the audience: the fact that this particular hard drive contains illegally downloaded material worth five million dollars. The only way of becoming aware of this crucial piece of information is by reading the catalogue text or visiting Art 404’s website. Thus, in reality there is a gap between the experience gained from actually encountering the work in the gallery and that gained from reading about it – a gap that is not really addressed in the aesthetic research of the digital paradigm, but which may be taken into account in a post-digital one.

The subject position of the audience especially seems to have been neglected in the digital paradigm, insofar as audience experiences were assumed in aesthetic analyses to be identical to the artist’s intention, the curatorial/critical framing, or theoretical accounts of the technical characteristics and potentials of new art types. If, for instance, the use of a specific technology in a work of art was considered to have interactive, or critical, or alienating potentials, it was more or less automatically assumed that the audience’s/users’ experience would correspond to those potentials, without paying much attention to the fact that different contexts and subject positions invite different aesthetic considerations. In this sense, aesthetic research within a digital paradigm is governed by essentialism rather than contextualism.

Requisite for a post-digital aesthetic research

The post-digital turn paves the way for us, once again, to consider the genuinely aesthetic potentials of works that make use of new media and technology – without automatically subjecting aesthetic experience to technology. Hence, we may now put the ‘naïve’ questions of a radical aesthetics of reception to the field of contemporary art, such as: Are new media of aesthetic relevance in a work of art if they go unnoticed by the audience? How do we elaborate on the fact that the same work of art potentially gives rise to different kinds of aesthetic experiences, depending on which subject positions (artist, curator/critic, user, audience) engage with the work and in what manner (as intended by someone else or not)? And how do we consider the aesthetic appeal of works of art whose medium is not accessible to our physical senses? In order to investigate such aesthetic questions thoroughly, it is necessary to insist (once again) that the subject positions of artist and audience be separated.

But why should we still insist on a separation between the artist and the audience when the field of so-called new media art in many cases is characterised by crowd creation and interactivity that urges co-creation to the extent that such a distinction might seem irrelevant? For instance, the Ars Electronica Prix category of ‘Digital Communities’ consists of works in which such a distinction may seem absurd, since the digital communities function collectively in the participants’ everyday life.

An example could be the 2013 Golden Nica winner “El Campo de Cebada”, the name of an enclosed city square in Madrid, where residents and the council work together – on the physical place and via online social media – to define the use of the square. (Fischer-Schreiber, 200-203) No artist or artist group is credited for this ‘work’, since it is genuinely a collective project. However, when considered from an art (or at least cultural) institutional point of view – as is the case with Ars Electronica – the prime purpose of “El Campo de Cebada” is to prompt aesthetic reflection rather than immediate function, even if it is the functional dimensions that prompt reflection.

Whereas in Madrid the square is inhabited, in the context of Ars Electronica it is ‘exhibited’, and this sole act of exhibiting automatically installs “El Campo de Cebada” as an object for reflective aesthetic judgement by others than its producers. As Thierry de Duve puts it with reference to Duchamp’s readymades: ‘[T]he sentence “this is art,” by which a readymade is both produced as a work of art and judged to be one, ought to be read as an aesthetic reflexive judgment with a claim to universality in the strictest Kantian sense.’ (de Duve, 320)

Now, participating in “El Campo de Cebada” may (or may not) result in aesthetic reflective judgements among the individuals who engage in the project on an everyday basis in Madrid, but the moment the project is framed by Ars Electronica as an outstanding work belonging to the ‘Digital Communities’ category, a non-participating audience is created for the project, and it becomes an object of potential aesthetic reflective judgement for that group of people too.

Estrangement from determined purposes is, in Kantian terms, basically the definition of art. Furthermore, any work of art (whether it makes use of digital media or not) has at least two different subject positions: the creator and the audience. The DNA of a work of art is its presentation to someone, somehow. Otherwise it is not art. Therefore, the subject position of an audience is crucial – not just to art, but also to aesthetic reflection, since, according to Kantian thinking, the latter resides in this subject position.

Furthermore, especially in the realm of so-called new media art, there is more than one audience subject position. As described by Dominic Lopes, in interactive art we may distinguish between the ‘user’ (who explores a work by generating displays in a prescribed manner) and the ‘audience’ (who explore a work by watching users generate displays by interacting with it). The difference can be illustrated with reference to the work “OCTO P7C-1” by the Telekommunisten group (exhibited at Transmediale 2013). The exhibition of this spectacular ‘Intertubular Pneumatic Packet Distribution System’ was, tongue in cheek, described by the Telekommunisten as a demonstration to ‘potential investors and partners’.

OCTO at Transmediale 2013

In the exhibition, Lopes’ term ‘users’ describes those visitors who engaged actively with OCTO by, for instance, writing/drawing/crafting messages for the postal tubes or sending/receiving such messages by communicating commands to the OCTO staff working the distribution centre. The distinctive sound accompanying each packet’s travel through the tube system, the messages, the conversations between users and OCTO workers etc. are all different kinds of audible, visual and sensual displays by which the user gradually explores the physical and semiotic dimensions of the work (and potentially enters into aesthetic relations with it).

In addition to the user, who acts in accordance with a prescribed manner staged by the creators of the work, the subject position of what Lopes terms the ‘audience’ is of relevance when investigating the aesthetic implications of a work like OCTO. The audience do not engage directly with the work as the users do, but they watch how users interact with OCTO and observe how displays are generated as a result of this interaction. As such, the audience explores the work too, albeit in a different manner than users (and may enter into aesthetic relations with the work).

The reason the subject position that Lopes calls ‘audience’ has been left out of the equation in the digital paradigm is that the potential aesthetic reflective judgement associated with this subject position does not fit a techno-essentialist view of new media art. An audience may experience what in the digital paradigm might be described as an ‘interactive, networked installation’ in a very non-interactive, non-networked manner. To be honest, how many of us have engaged actively, ‘face-to-face’, with all the works of art that we know and even value for having provided us with aesthetic experiences? And even ‘users’, who do interact actively with a work, may have aesthetic experiences that differ from the technologically defined ones governing a digital paradigm. After all, aesthetic experience is a matter of individual judgement of taste.

Towards a radical aesthetics of reception

In conclusion, post-digital research into contemporary art’s aesthetic dimensions should take as its point of departure what we may call a radical aesthetics of reception – not to be confused with what is traditionally known as the aesthetics of reception of the Constance School. The difference between the Constance School’s aesthetics of reception and a radical aesthetics of reception lies in the fact that the former, as accounted for by Peter Hohendahl, seems grounded in a formalism that centres on the work/phenomenon, whereas a radical aesthetics of reception would take more profoundly into account the aesthetics of Immanuel Kant (aesthetic experience results from individual, subjective feelings and not from a concrete object/phenomenon) and the subject position that Roland Barthes termed the ‘reader’. Hence, in a radical aesthetics of reception there is no such thing as aesthetic meaning in the artistic text – there are not even blanks (calculated by the artist or accidental) in the text – since all aesthetic qualities of a work derive from the receiver of the work, who therefore ultimately becomes the work’s aesthetic (but not technical) producer.

Especially when it comes to works of contemporary art that make use of new media and technologies which may not yet be fully culturally established, it seems obvious that the technical and cultural uncertainties surrounding the works may boost the potential for ‘readers’ to gain aesthetic experiences from encountering such works, due to the lack of an overall concept by which the works might be comprehended rationally. It seems paradoxical, therefore, when survey books within a digital paradigm attempt to account for the aesthetic characteristics of such works of art by subsuming them under determined technological categories. A post-digital, radical aesthetics of reception acknowledges that art’s receivers – whether in the subject position of user or audience – may encounter works of art in ways not even imagined by the artist or the curator/critic, and that such encounters may lead to aesthetic experience (just as they may not).

 

References:

Barthes, R.: Image, Music, Text, 1999 [1977], Noonday Press. “The Death of the Author”, pp. 142-148 and “From Work to Text”, pp. 155-164.

De Duve, T.: Kant after Duchamp, 1996, MIT Press.

Fischer-Schreiber, I. (ed.): CyberArts 2013, 2013, Hatje Cantz.

Giannetti, C.: Ästhetik des Digitalen, 2004, Springer.

Hohendahl, P.: ”Beyond Reception Aesthetics” in New German Critique, no. 28, winter 1983, pp. 108-146.

Lopes, D.: A Philosophy of Computer Art, 2010, Routledge.

Paul, C.: Digital Art, 2008, Thames & Hudson.

Rush, M.: New Media in Art, 1999 + 2005, Thames & Hudson.

Shanken, E. (ed): Art and Electronic Media, 2009, Phaidon.

Strathausen, C.: ”New Media Aesthetics” 2009, in Koepnick & McGlothlin (eds.): After the Digital Divide?,  Camden House.

Tribe, M. & Jana, R.: New Media Art, 2006, Taschen.

www.telekommunisten.net/octo/ (visited 6 Oct. 2013)

Post Digital Publishing, Hybrid and Processual Objects in Print

1. How a medium becomes digital (and how publishing did)

For every major medium we can recognize at least three stages in the transition from analogue to digital, in both production and consumption of content.

The first stage concerns the digitalization of production. It is characterized by software beginning to replace analogue and chemical/mechanical processes. These processes are first abstracted, then simulated, and then restructured to work using purely digital coordinates and means of production. They become sublimated into the new digital landscape. This started to happen with print at the end of the seventies with the first experiments with computers and networks, and continued into the eighties with so-called “Desktop Publishing”, which used hardware and software to digitalize print production (the “pre-press”), a system perfected in the early nineties.

The second stage involves the establishment of standards for the digital version of a medium and the creation of purely digital products. Code becomes standardized, encapsulating content in autonomous structures which are universally interpreted across operating systems, devices and platforms. This is a definitive evolution of standards meant for production purposes (consider PostScript, for example) into standalone standards (here the PDF is an appropriate example, enabling digital “printed-like” products), which can be defined as a sub-medium, intended to deliver content within certain specific digital constraints.

The third stage is the creation of an economy around the newly created standards, including digital devices and digital stores. One of the very first attempts to do this came from Sony in 1991, which tried to market the Sony Data Discman as an “Electronic Book Player” [1] — unfortunately using closed coding which failed to become broadly accepted. Nowadays the mass production of devices like the Amazon Kindle, the Nook, the Kobo, and the iPad — and the flourishing of their respective online stores — has clearly accomplished this task. These online stores are selling thousands of e-book titles, confirming that we have already entered this stage.

2. The processual print as the industry perceives it (entertainment)

Not only are digitalization processes yet to kill off traditional print, but they have also initiated a redefinition of its role in the mediascape. If print increasingly becomes a valuable or collectable commodity and digital publishing also continues to grow as expected, the two may more frequently find themselves crossing paths, with the potential for the generation of new hybrid forms. Currently, one of the main constraints on the mass-scale development of hybrids is the publishing industry’s focus on entertainment.

Let’s take a look at what is happening specifically in the newspaper industry: on one hand we see up-to-date printable PDF files to be carried and read while commuting back home in the evening, and on the other hand we have online news aggregators (such as Flipboard and Pulse) which gather various sources within one application with a slick unified interface and layout. These are not really hybrids, but merely the products of ‘industrial’ customisation — the consumer product ‘choice’ of combining existing features and extras, where the actual customising is almost irrelevant.

Even worse, the industry’s best effort at coming to terms with post-digital print is currently the QR code — those black-and-white pixelated square images which, when read with the proper mobile phone app, allow the reader access to content (almost always a video or web page). This kind of technology could be used much more creatively, as a means of enriching the process of content generation. For example, since they use networks to retrieve the displayed content, printed books and magazines could include QR codes as a means of providing new updates each time they are scanned – and these updates could in turn be made printable or otherwise preservable. Digital publications might then send customised updates to personal printers, using information from different sources closely related to the publication’s content. This could potentially open up new cultural pathways and create unexpected juxtapositions. [2]

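A minimal sketch of this update mechanism, using the Python qrcode library; the update URL and issue identifier are invented placeholders, not an existing service:

```python
# Sketch: generate a QR code that links a printed page to a live update
# endpoint. The URL and issue identifier are hypothetical; any server
# could serve freshly rendered, printable content at such an address.
import qrcode  # pip install qrcode[pil]

issue_id = "2013-11"  # hypothetical identifier of the printed issue
update_url = f"https://example.org/updates/{issue_id}"  # placeholder endpoint

img = qrcode.make(update_url)       # returns a PIL image of the code
img.save(f"update-{issue_id}.png")  # ready to be placed on the printed page
```
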
3. Printing out the web

Many possibilities emerge from the combination of digital and print, especially when networks (and therefore infinite supplies of content that can be reprogrammed or recontextualized at will) become involved. A number of different strategies have been employed to assemble information harvested online in an acceptable form for use in a plausible print publication.

One of the most popular strategies renders large quantities of Twitter posts (usually spanning a few years) into fictitious diaries. “My Life in Tweets” by James Bridle is an early example, realized in 2009 [3], which collected all of the author’s posts over a two-year period, forming a sort of intimate travelogue. The immediacy of tweeting is recorded in a very classic graphical layout, as if the events were annotated in a diary. Furthermore, various online services have started to appeal to the vanity of Twitter micro-bloggers, for example Bookapp’s Tweetbook (book-printing your tweets) or Tweetghetto (a poster version).

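To make the mechanics of such diary-making concrete, here is a minimal sketch that orders a tweet archive chronologically; the tweets.json file and its fields are assumptions about the export format, not Bridle’s or Bookapp’s actual tooling:

```python
# Sketch: collate a tweet archive into a chronological, diary-like text,
# in the spirit of "My Life in Tweets". Assumes a JSON export where each
# entry has ISO-dated "created_at" and "text" fields; real formats vary.
import json
from datetime import datetime

with open("tweets.json", encoding="utf-8") as f:
    tweets = json.load(f)

tweets.sort(key=lambda t: t["created_at"])  # oldest first, like a diary

for tweet in tweets:
    day = datetime.strptime(tweet["created_at"][:10], "%Y-%m-%d")
    print(f"{day.strftime('%d %B %Y')}: {tweet['text']}")
```
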
Another very popular “web sampling” strategy focuses on collecting amateur photographs, with or without curatorial criteria. Here we have an arbitrary narrative employing a specific aesthetic in order to create a visual unity that is universally recognizable, due to the ubiquity of online life in general and especially the continuous and unstoppable uploading of personal pictures to Facebook.

A specific sub-genre makes use of pictures from Google Street View, reinforcing the feeling that the picture is real and has been reproduced without retouching, while also reflecting on the accidental nature of the picture itself. Michael Wolf’s book “a series of unfortunate events” [4] points to our very evident and irresistible fascination with “objets trouvés”, a desire that can be instantly and repeatedly gratified online.

Finally, there is also the illusion of the instant curation of a subject, which climaxes in the realization of a printed object. Looking at seemingly endless pictures in quick succession online can completely mislead us about their real value. Once a picture is fixed in the space and time of a printed page, our judgements can often be very different.

Such forms of “accidental art”, obtained from a “big data” paradigm, can lead to instant artist publications such as Sean Raspet’s “2GFR24SMEZZ2XMCVI5… A Novel”, which is a long sequence of insignificant captcha texts, crowdsourced and presented as an inexplicable novel in an alien language [5].

There are traces of all the above examples in Kenneth Goldsmith’s performance “Printing Out The Internet” [6]. Goldsmith invited people to print out whatever part of the web they desired and bring it to the gallery LABOR in Mexico City, where it was exhibited for a month (which incidentally also generated a number of naive responses from environmentally concerned people). The work was inspired by Aaron Swartz and his brave and dangerous liberation of copyrighted scientific content from the JSTOR online archive [7]. It is what artist Paul Soulellis calls “publishing performing the Internet” [8].

All this said, the examples mentioned above are yet to challenge the paradigm of publishing — maybe the opposite. What they are enabling is a “transduction” between two media. They take a sequential, or reductive, part of the web and mould it into traditional publishing guidelines. They tend to compensate for the feeling of being powerless in the face of the elusive and monstrous amount of information available online (at our fingertips), which we cannot comprehensively visualize in our minds.

If print is the quintessence of the web, such practices sometimes indulge in something like a “miscalculation” of the web itself — the negotiation of this transduction reduces the web to a finite printable dimension, denaturalizing it. According to Publishers Launch Conferences’ co-founder Mike Shatzkin, in the next stage “publishing will become a function… not a capability reserved to an industry…” [9]

4. Hybrids: calculated content is shaped and printed out

This “functional” aspect of publishing can, at its highest level, imply the production of content that is not merely transferred from one source to another, but instead produced through a calculated process in which content is manipulated before being delivered. A few good examples can be found in pre-web avant-garde movements and experimental literature, in which content was unpredictably “generated” by software-like processes. Dada poems, for example, as described by Tristan Tzara, are based on the generation of a text arbitrarily created out of cut-up text from other works. [10] A member of the avant-garde literary movement Oulipo later created a similar concept: Raymond Queneau’s “Cent Mille Milliards de Poèmes” [11] is a book in which each page is cut into horizontal strips that can be turned independently, allowing the reader to assemble an almost infinite quantity of poems, with an estimated 200 million years needed to read all the possible combinations. That an Oulipo member created this was no accident – the movement often played with the imaginary of a machinic generation of literature in powerful and unpredictable ways.

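The arithmetic behind Queneau’s book is easy to verify; a quick sketch, assuming (as Queneau himself did) a reading rate of roughly one sonnet per minute, around the clock:

```python
# Sketch: the combinatorics of "Cent Mille Milliards de Poèmes".
# Ten sonnets of fourteen interchangeable lines yield 10**14 combinations;
# the reading-time figure assumes one sonnet per minute, non-stop.
poems = 10 ** 14                        # 100,000,000,000,000 possible sonnets
minutes_per_year = 60 * 24 * 365.25
years = poems / minutes_per_year
print(f"{poems:,} poems ≈ {years / 1e6:.0f} million years of reading")
# ≈ 190 million years, the same order as the "200 million" cited above
```
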
Contemporary experiments are moving things a bit further, exploiting the combination of hardware and software to produce printed content that also embeds results from networked processes, thus getting closer to a truly hybrid form.

Martin Fuchs and Peter Bichsel’s book “Written Images” [12] is an example of the first ‘baby steps’ of such a hybrid post-digital print publishing strategy. Though it’s still a traditional book, each copy is individually computer-generated, thus disrupting the fixed ‘serial’ nature of print. Furthermore, the project was financed through a networked model (using Kickstarter, the very successful ‘crowdfunding’ platform), speculating on the enthusiasm of its future customers (and in this case, collectors). The book is a comprehensive example of post-digital print through its combination of several elements: print as a limited-edition object; networked crowdfunding; computer-processed information; hybridisation of print and digital — all residing in a single object — a traditional book. This hybrid is still limited in several respects, however: its process is complete as soon as it is acquired by the reader; there is no further community process or networked activity involved; once purchased, it will forever remain a traditional book on a shelf.

A related experiment has been undertaken by Gregory Chatonsky with the artwork “Capture” [13]. Capture is a prolific rock band, generating new songs based on lyrics retrieved from the net and performing live concerts of its own generated music lasting an average of eight hours each. Furthermore, the band is very active on social media, often posting new content and comments. But we are talking here about a completely invented band. Several books have been written about them, including a biography, compiled by retrieving pictures and texts from the Internet and carefully (automatically) assembling them and printing them out. These printed biographies are simultaneously ordinary and artistic books, becoming a component of a more complex artwork. They plausibly describe a band and all its activities, while playing with the plausibility of the skilful automatic assembly of content.

Another example of an early hybrid is “American Psycho” by Mimi Cabell and Jason Huff [14]. It was created by sending the entirety of Bret Easton Ellis’ violent, masochistic and gratuitous novel “American Psycho” through Gmail, one page at a time. They collected the ads that appeared next to each email and used them to annotate the original text, page by page. In printing it as a perfect-bound book, they erased the body of Ellis’ text and left only chapter titles and constellations of their added footnotes. What remains is American Psycho, told through its chapter titles and annotated relational Google ads only. Luc Gross, the publisher, goes even further in predicting a more pervasive future: “Until now, books were the last advertisement-free refuge. We will see how it turns out, but one could think about inline ads, like product placements in movies etc. Those mechanisms could change literary content itself and not only their containers. So that’s just one turnover.”

Finally, why can’t a hybrid art book be a proper catalogue of artworks? Les Liens Invisibles, an Italian collective of net artists, have assembled their own, called “Unhappening, not here not now” [15]. It contains pictures and essential descriptions of 100 artworks, completely invented but consistently assembled through images, generated titles and short descriptions, including years and techniques for every “artwork”. Here a whole genre (the art catalogue or artist monograph) is brought into question, showing how a working machine, properly instructed, can potentially confuse a lot of what we consider “reality”. The catalogue, indeed, looks and feels plausible enough, and only those who read it very carefully can have doubts about its authenticity.

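As an illustration of the principle – not Les Liens Invisibles’ actual generator – a sketch of how plausible catalogue entries can be recombined from invented vocabularies:

```python
# Sketch: assembling plausible-looking catalogue entries from recombined
# fragments, in the spirit of "Unhappening, not here not now".
# All vocabularies below are invented for illustration.
import random

adjectives = ["Silent", "Recursive", "Borrowed", "Invisible"]
nouns = ["Archive", "Horizon", "Feedback", "Monument"]
techniques = ["inkjet print on aluminium", "single-channel video", "mixed media"]

def invented_artwork() -> dict:
    """Return one fictitious but plausibly formatted catalogue entry."""
    return {
        "title": f"{random.choice(adjectives)} {random.choice(nouns)}",
        "year": random.randint(1998, 2013),
        "technique": random.choice(techniques),
    }

for work in (invented_artwork() for _ in range(3)):
    print(f"“{work['title']}”, {work['year']}, {work['technique']}")
```
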
5. Conclusions

Categorising these publications under a single conceptual umbrella is quite difficult, and even if they are not yet as dynamic as the processes they incorporate, it is not trivial to define any of them as either a ‘print publication’ or a ‘digital publication’ (or a print publication with some digital enhancements). They are the result of guided processes and are printed as a very original (if not unique) static repository, more akin to an archive of calculated elements (produced in limited or even single copies) than to a classic book, so confirming their particular status. The dynamic nature of publishing can be less and less extensively defined in terms of the classically produced static printed page. And this computational characteristic may well lead to new types of publications, embedded at the proper level. It can help hybrid publications function in both roles: maintaining their role as publications while also serving as the most up-to-date static picture of a phenomenon, in a single copy or a few copies, like a tangible limited edition. And since there is still plenty of room for exploration in developing these kinds of processes, it is quite likely that computational elements will extensively produce new typologies of printed artefact, and in turn new attitudes and publishing structures. Under those terms it will be possible for the final, definitive digitalization of print to produce very original and still partially unpredictable results.

References

[1] http://en.wikipedia.org/wiki/Data_Discman
[2] Alessandro Ludovico. Post-Digital Print, Onomatopee, Eindhoven, 2012, ISBN 9789078454878.
[3] http://booktwo.org/notebook/vanity-press-plus-the-tweetbook/
[4] http://photomichaelwolf.com/#asoue/14
[5] Sean Raspet. 2GFR24SMEZZ2XMCVI5L8X9Y38ZJ2JD 25RZ6KW4ZMAZSLJ0GBH0WNNVRNO7GU 2MBYMNCWYB49QDK1NDO19JONS66QMB 2RCC26DG67D187N9AGRCWK2JIHA7E2 2H1G5TYMNCWYM81O4OJSPX11N5VNJ0 A Novel. PoD, 2013, 516 pages.
[6] http://printingtheinternet.tumblr.com/
[7] http://tech.mit.edu/V131/N30/swartz.html
[8] http://soulellis.com/2013/05/search-compile-publish/
[9] http://www.idealog.com/blog/atomization-publishing-as-a-function-rather-than-an-industry/
[10] Florian Cramer. “Concepts, Notations, Software, Art”, 2002. http://www.netzliteratur.net/cramer/concepts_notations_software_art.html
[11] http://en.wikipedia.org/wiki/Hundred_Thousand_Billion_Poems
[12] http://writtenimages.net/
[13] http://chatonsky.net/project/capture/
[14] http://www.mimicabell.com/gmail.html
[15] http://www.atypo.org/it/work/unhappening-not-here-not-now/

Prototyping the Future of Arcade Cabinet Emulation (draft)

Introduction:

This paper is a background research piece for the development of an interactive installation that prototypes a possible future trajectory for arcade videogame emulation. The project aims to explore how the experience of interfacing with complete arcade videogame cabinets can be recreated in virtual reality space. As an interactive experience, it is intended not just to authentically recreate the aesthetics of the videogame’s input and feedback mechanisms, but also the full physical design of the cabinet, including the appearance of the enclosed game circuitry.

Emulation as Platform Augmentation:

An emulator is a software or hardware system that recreates the architecture of one computer system on another platform. Through the virtual machine of an emulator it is possible to experience a computer system transplanted as a subroutine of a more advanced platform, whether hardware- or software-based. They are computers within computers.

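At its core, any emulator is a fetch-decode-execute loop re-implemented in software; a toy sketch of the principle (the two-instruction machine below is invented, not any real arcade CPU):

```python
# Sketch: the core of an emulator is a loop that fetches, decodes and
# executes guest instructions in software. This toy guest machine has
# only two instructions and is purely illustrative.
def run(program: list[tuple], memory: dict) -> None:
    pc = 0  # program counter of the emulated (guest) machine
    while pc < len(program):
        op, *args = program[pc]          # fetch and decode
        if op == "SET":                  # execute: write a value to memory
            memory[args[0]] = args[1]
        elif op == "ADD":                # execute: add a value in place
            memory[args[0]] += args[1]
        pc += 1                          # advance, as the guest CPU would

mem = {"score": 0}
run([("SET", "score", 100), ("ADD", "score", 50)], mem)
print(mem)  # {'score': 150}
```
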
Emulation is a legal grey area, tolerated to an extent by the owners of the emulated systems. Upon boot-up, the MAME emulator presents a splash screen reminding the user that they must legitimately own a copy of the game ROM they are about to load. In practice, however, most users don’t actually own the rare and costly game PCBs that physically contain the game code. Instead they simply use an online search engine to obtain the required ROM files illicitly.

Emulators replicate the functionality of a past platform while also leveraging the additional affordances offered by the emulation host. For example, MAME features a memory editor and disassembler that allow users to edit a game’s code as it runs, viewing changes to the end-user experience immediately. In this case the emulator takes a system that was designed purely for the ‘play only’ consumer space and augments it with a developer-level interface. With the additional use of an assembler package and an EPROM burner, it is possible to transfer this new code creation to an EPROM chip, and in turn to an arcade PCB, thus allowing the hacked game to be played on the original arcade hardware platform.

When a game originally designed for playback on a cathode ray tube display is presented through the clear viewfield of an LCD or LED display, it gains pixel-sharp clarity, but also loses part of the original monitor colouration that was taken into consideration by game designers. The CRT filter built into the Atari 2600 emulator Stella addresses this issue, allowing for the image ghosting and colour mixing that help to partially mask the system’s high level of sprite flickering. Similarly, the SLG-1000 hardware device by Arcade Forge recreates the scanlines of bulky CRT tubes on flat-panel HD displays, improving aesthetic authenticity when playing classic games by turning an outdated display limitation into an essential feature.

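A naive sketch of the scanline principle – darkening every other row of a frame – is shown below; this illustrates the general idea only, not Stella’s or Arcade Forge’s actual algorithms, and the input filename is a placeholder:

```python
# Sketch: a crude scanline effect of the kind the SLG-1000 produces in
# hardware, approximated by halving the brightness of every other row.
from PIL import Image

frame = Image.open("frame.png").convert("RGB")  # placeholder input frame
pixels = frame.load()

for y in range(0, frame.height, 2):      # every other row ~ a CRT scanline gap
    for x in range(frame.width):
        r, g, b = pixels[x, y]
        pixels[x, y] = (r // 2, g // 2, b // 2)  # halve brightness

frame.save("frame-scanlines.png")
```
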
The Physiology of an Arcade Cabinet:

In comparison with home computers and videogame consoles, the underlying technology powering arcade videogame platforms is lesser known. Each arcade PCB is a standalone computer. These devices range from bespoke PCBs for single games such as Pong, to standards based upon home console technologies, like the Sega Naomi, which is closely related to the Sega Dreamcast console, to adapted PC-compatible machines.

One main unifying standard between these disparate hardware types is the JAMMA standard. It is not the only standard of its kind, but it is the most prolific. Up until 1985, arcade game manufacturers used a variety of different wiring systems in the design of their cabinets. This lack of hardware interchangeability led to increased costs for arcade owners, who had to replace entire cabinets each time they bought a new game. The JAMMA standard, agreed by the Japan Amusement Machinery Manufacturers Association, introduced a 56-pin connector for connecting game PCBs to cabinets, allowing the exchange of JAMMA PCBs between compatible machines in a manner similar to swapping a game cartridge on a home console system. These pins allow the connection of a power source, speakers, monitor, coin-slot switch, and the action buttons and joysticks or other controller peripherals.

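As a data structure, the standardized edge might be sketched as below; the signal categories follow the paragraph above, while the actual pin assignment of the real 56-pin standard is deliberately not reproduced here:

```python
# Sketch: the JAMMA cabinet-to-PCB interface modelled as signal classes.
# Categories only — this is illustrative, not the real pinout.
from dataclasses import dataclass, field

@dataclass
class JammaEdge:
    """Signal classes a JAMMA cabinet exposes to any compatible game PCB."""
    power: list = field(default_factory=lambda: ["GND", "+5V", "+12V"])
    video: list = field(default_factory=lambda: ["red", "green", "blue", "sync"])
    audio: list = field(default_factory=lambda: ["speaker+", "speaker-"])
    coin: list = field(default_factory=lambda: ["coin switch"])
    controls: list = field(default_factory=lambda: ["joysticks", "action buttons"])

# Swapping a game is then just plugging a different PCB into the same edge:
cabinet_edge = JammaEdge()
print(cabinet_edge.power)
```
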
Structurally, arcade cabinets are unglamorous, built from the same materials as their kitchen-cabinet namesakes. Indeed, Atari’s Irish operation in the 1970s bought a local furniture manufacturer to produce arcade cabinets for the European market [ link ]. Wear and tear on these wooden frames in the arcade environment has led to high collectors’ prices for well-preserved originals. This battle damage adds character, but is also a problem for their preservation. Rust, chipped fiberboard, and split veneers all add up to heavy restoration projects worthy of a Discovery Channel show.

An arcade cabinet is a host shell for the game logic contained on the arcade board, and in many cases the design of this enclosure adds an additional level of atmosphere and immersion to the game that is difficult to recreate outside of its natural environment. At the most basic level, these enhancements typically amount to cabinet artwork and an illuminated title marquee that seek to sell the game narrative to prospective punters. At the high end of the market, arcade games move close to simulator territory, adding enhancements such as hydraulics and force feedback. Many of the arcade cabinet designs by Yu Suzuki for Sega meet this level.

Recreating the Arcade Cabinet as a Digital Artifact:

While working at Sega Japan, Yu Suzuki was responsible for the design of several of Sega’s arcade hits, including Hang On (1985), Afterburner (1987), ThunderBlade (1987), and Out Run (1986). Each of these games was available in both a simple stand-up (SD) and a sit-down deluxe (DX) model, and the deluxe models all brought a high level of technical and aesthetic polish to their cabinet design. For instance, the deluxe model of Hang On takes the shape of a 500 lb reproduction of a Ducati motorcycle, which the player must lean left and right upon to steer. It is a game that demands that the player move their whole body weight to control it.

Suzuki’s emphasis on the physical design of the arcade game recognises that the cabinet is the most immediate part of a game’s ‘attract mode’: “with arcade games, the cabinet is the most important thing. When you see a cabinet, that’s usually when you decide whether you want to play a game or not… The form is the most important thing when you buy a car, right?” Yu Suzuki, Sega (Ashcroft, p.131-132).

In the pioneering 3D sandbox games Shenmue (1999) and Shenmue II (2001) on the Sega Dreamcast console, Yu Suzuki recreated a number of his coin-operated arcade videogames in virtual space. The interactive 3D renderings of his deluxe arcade cabinets include the aforementioned Hang On and Out Run, in addition to Space Harrier (1985), which is widely credited as the first sit-down arcade cabinet. Each game is a full emulation of the original system, and the player can walk around the virtual space and inspect the design and artwork of the arcade cabinets from different angles, all while sampling the ambiance of a 1980s Japanese arcade amusement centre.

Upon starting each virtual arcade game, the player viewpoint switches from a third-person perspective to the arcade monitor view, which completely replaces the playfield. The design decision to momentarily switch out of the surrounding environment and allow the diegetic onscreen space of the emulated system to take over the host game’s screen space is understandable, since these sub-games are not critical to the overall narrative. Also, the 1998 Dreamcast hardware was already pushed to its maximum when emulating the aforementioned arcade games, so adding any image filtering or other graphical embellishments would have been beyond its capabilities.

This perspective on the monitor is developed a step further in the arcade games included as part of Grand Theft Auto: San Andreas. When a player steps up to a coin-op to play either Let’s Get Ready to Bumble, Go Go Space Monkey, or Duality, the screen is taken over by the coin-op, except that, unlike Shenmue, the view takes a step backwards. GTA:SA acknowledges the medium of the CRT screen, showing the tube’s curvature as well as the surrounding plastic bezel.

GTA:SA modder ThePaddster has modified the arcade machine textures from San Andreas, replacing them with the artwork for Bally Midway’s Mortal Kombat (1992). Unfortunately the modification does not change the sub-games, but the effect of changing the cabinet graphics is interesting, and a tangible step towards a customisable virtual arcade, where game ROMs manifest as digital game cabinets in a 3D space instead of 2D images in a folder.

In a visual and touchscreen interface style common to mobile and tablet conversions of arcade and console titles, Capcom’s Mega Man II on iPhone uses an onscreen representation of an arcade cabinet facade to frame its emulated Nintendo Entertainment System game. This style of virtual arcade machine takes a further step back from the monitor than GTA:SA, incorporating a joystick control panel as well as the game logo embedded into a representation of an arcade cabinet marquee. The additional graphics also form a necessary visual filler between the game’s original display ratio and the widescreen aspect of the iPhone.

The next logical step in improving the experiential and aesthetic quality of the virtual arcade machine is to take an additional step back in perspective, to encompass both the onscreen space and the peripheral vision of the player. While this expanded view adds distractions to the sub-game experience, it can be argued that blocking out the ambiance of the immediate environment causes existing virtual coin-op gaming experiences to lose a level of reality and authenticity.

Prototyping a Virtual Reality Arcade Machine Emulator:

A prototype aims to provide the experience of using a technology, whilst not necessarily using the same technology as the envisioned end product. This one is intended as a demo of an arcade emulation style that goes beyond displaying the arcade artwork in 2D form, instead wrapping it around a 3D model of the particular coin-op machine, while allowing the player to view the inside of the arcade machine.

At the time of writing, the powerful and affordable Oculus Rift development kit has made virtual reality a viable option, more than two decades after the first commercial attempts at immersive VR. By using a virtual reality headset the user can experience the playfield from a real-world perspective.

Used as part of the digital arcade prototype, this would allow momentary glances at the digital arcade cabinet’s control panel and frame during gameplay. The player could also opt to move away from the screen and inspect the cabinet internally, viewing the PCB from the perspective of the arcade operator while accessing information on its hardware specifications.

The Computerspielmuseum in Berlin has a Pong cabinet with plexiglass fitted to the back so that visitors can view the circuitry of the machine. This is an important consideration, as the electronics of this artifact are as noteworthy a part of the interface as the controllers and audiovisual feedback. A complete VR arcade cabinet simulator should include an option to view the internal structure of the cabinet itself.

This internal view of the digital arcade cabinet serves three purposes. Firstly, it provides an operator-level interface for the user, beyond the game calibration screens that allow operators to change in-game variables such as the default number of lives and difficulty levels. Secondly, it demystifies the internal structure of the arcade machine, presenting the internal aesthetics of the wiring and circuitry as a visible and essential part of the overall cabinet build. Thirdly, it provides a historical and educational document of the machine hardware that is impervious to wear and tear.

A real consideration, if this concept prototype were to become an actual emulation system, is the workload involved in sourcing and producing 3D models. Emulation software relies on community effort for the continued updating of the source code, as well as for the procurement of the less legal items such as ROM files, game artwork and instruction manuals. For a 3D arcade cabinet emulator to succeed, it would need an open format that allows the community to create their own 3D cabinets, complete with exterior artwork and interior game wiring and PCBs.

In an exhibition setting, the VRAME installation could take the form of a minimal pedestal containing a harness for the VR headset along with a control panel using physical game controls. A square outline on the ground could signify the object now built in virtual space. The second option is to remove the controls, instead using a wireless gesture-capturing system to match the player’s hand movements to 3D representations of their hands, registering collisions with the digital renderings of the control panel. Both options have their pros and cons. The gesture-based version keeps the physicality of the emulated control system purely digital, allowing it to change and adapt dynamically. On the other hand, the tangible controller adds a grounded, solid, yet distant link between the playing human and the cyber arcade cabinet.

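A sketch of how the gesture-based variant might register a button press – an axis-aligned bounding-box test between the tracked hand position and a control mapped into 3D space; all coordinates and sizes below are invented:

```python
# Sketch: registering a virtual button press in the gesture-based setup.
# A tracked hand position is tested against the bounding box of a control
# mapped into the virtual cabinet's 3D space. Values are illustrative.
def inside(point: tuple, box_min: tuple, box_max: tuple) -> bool:
    """True if a 3D point lies within an axis-aligned bounding box."""
    return all(lo <= p <= hi for p, lo, hi in zip(point, box_min, box_max))

button_min, button_max = (0.40, 1.00, 0.20), (0.46, 1.04, 0.26)  # metres
hand = (0.43, 1.02, 0.23)  # position reported by the gesture tracker

if inside(hand, button_min, button_max):
    print("fire button pressed")  # forward the event to the emulator core
```
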
(draft version 1.1, Oct 4th)

Post-digital: a term that sucks but is useful (draft 2)

% “Post-digital”, a term that sucks but is useful
% Florian Cramer
% October 2013

(Preliminary disclosure)

When first confronted with the term “post-digital” half a decade ago through my students, I found it – in an age of cultural, social and economic ruptures driven to non-trivial extents by computational digital technology – rather silly. Today, in the age of ubiquitous mobile devices, drone wars and the gargantuan data operations of Google, the NSA and other global players, it may appear even sillier: as either ignorance of our times or a Thoreauvian-Luddite withdrawal from them. The latter option is tempting, yet naive. For the arts, it boils down to the history of the 19th century Arts and Crafts movement repeating itself, with its program of handmade production as resistance to industrialization. And indeed, this history is partly repeating itself in today’s renaissance of artists’ printmaking, handmade film labs, limited vinyl editions and what have you. But on closer inspection, the dichotomy between digital big data and neo-analog DIY isn’t as clear-cut as it may seem. And this is where the attribute “post-digital” makes sense.

post-what?

From a philosophical standpoint, one can only agree with Geoff Cox and his critique of the term “post-digital” as a questionable continuation of other nouns prefixed with “post”, from postmodernity to posthistoire. I would like to frame it, however, within more pragmatic, popular and colloquial frames of reference, both with regard to the prefix “post” and to the notion of “digital”. Rather than with “postmodernity” and “posthistoire”, the “post” in “post-digital” could be compared with post-punk (punk culture continued in ways that both were and were not punk), post-communism (still the reality in all former Eastern Bloc countries), even postcolonialism and, to a lesser extent, post-apocalyptic (pop-cultural, Mad Max style). None of these words would be done justice if one identified them as Hegelian historico-philosophical categories. Rather, they describe mutations that are often still ongoing: postcolonialism does not simply mean the end of colonialism akin to Hegel’s and Fukuyama’s end of history, but its transformation into less clearly visible power structures that are still in place, have left their mark on languages and cultures, and most importantly still govern geopolitics and global production chains.

Likewise, “post-digital” refers to a popular cultural – rather than media theoretical – notion of “digital”, the kind of connotation nicely illustrated by contemporary Google image search results on the word “digital”:

Google image search result for “digital”, 9/2013

“Post-digital” first of all means any media aesthetics that leaves behind these clean high tech, high fidelity connotations. The word was coined by musician Kim Cascone in 2000 in relation to glitch aesthetics in contemporary electronic music [1]. In the same year, the Australian sound and media artist Ian Andrews broadened it into a “post-digital aesthetics” that rejects the “idea of digital progress” and “a teleological movement toward ‘perfect’ representation” [2]. Andrews, in other words, thought of “post-digital” as an antidote to techno-Hegelianism.

Both Cascone’s and Andrews’ papers were firmly based on the culture of audiovisual production. In this world, “digital” had been synonymous with “better” for a long time: the launch of the Fairlight sound sampler in 1979, the digital audio CD in 1982, the MIDI standard in the same year, software-only digital audio workstations in the early 1990s, real-time programmable software synthesis with Max/MSP in 1997. Similar teleologies are still at work in video and TV technology, with the ongoing transitions from SD to HD and to 4K, from DVD to BluRay, 2D to 3D, always sold with the same narrative of innovation, improvement, and cleaner reproduction. Cascone and Andrews simply rejected this. “Post-digital” might have been confusingly named because Cascone’s glitch music actually was digital, even based on digital sound processing artifacts. But it should rather be seen as a reaction to an age where even tripods are being sold with “digital” stickers attached in order to suggest that they are new, improved technology.

"digital" tripod

“digital” tripod

In this sense, “post-digital” reenacted older forms of resistance to formalist, mathematically driven progress narratives in music – namely the opposition to serialist composition in 20th century contemporary music that started with John Cage, continued with the early minimal music of La Monte Young and Terry Riley and did not end with improvisation/composition collectives such as AMM and Cornelius Cardew’s Scratch Orchestra. The serialism of Stockhausen, Boulez and their contemporaries was digital in the most literal sense of the word, since it broke down all parameters of musical composition into computable values and applied numerical transformations to them. In the later age of mass consumer media technology, computations shifted from composition to signal processing, and from production to reproduction. (This sometimes involved the same companies, such as Philips – which founded a studio for contemporary electronic music in the 1950s and co-developed the audio CD in the early 1980s.)

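Serialism’s “digital” core is easy to make concrete: pitches become the numbers 0–11 and composition becomes arithmetic on them. A sketch of the standard twelve-tone transformations, applied to an arbitrary example row (the row itself is invented):

```python
# Sketch: twelve-tone serialism as arithmetic on countable units.
# Pitches are integers 0-11; composition applies numerical transformations.
row = [0, 11, 7, 8, 3, 1, 2, 10, 6, 5, 4, 9]  # arbitrary example tone row

def transpose(r, n):                 # shift every pitch by n semitones
    return [(p + n) % 12 for p in r]

def invert(r):                       # mirror intervals around the first pitch
    return [(2 * r[0] - p) % 12 for p in r]

def retrograde(r):                   # play the row backwards
    return list(reversed(r))

print(transpose(row, 5))
print(invert(row))
print(retrograde(row))
```
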
Most serialist music, however, was not electronic but composed with pen and paper and performed by orchestras. This highlights a crucial point: unlike in the colloquial usage of the word (and its common understanding in the arts and humanities), “digital” does not have to involve electronics at all. In this sense, the technical-scientific notion of “digital” can – paradoxically enough – be applied to devices that would be called “post-digital” in the arts and humanities.

What is post-digital then?

(I am trying to reiterate and systematize points I had written down in previous publications.)

Going back to Cascone and Andrews, but also to post-punk, postcolonialism and Mad Max, “post-digital” most simply describes the messy state of media, arts and design after their digitization, or at least after the digitization of vital parts of their communication. While contemporary visual art, for example, is only slowly beginning to accept net artists as regular contemporary artists (and among them, rather those whose work is gallery-compatible, like Cory Arcangel’s), its discourse and networking have already profoundly changed through the e-flux mailing list, art blogs and the electronic e-flux journal. These media have largely superseded paper art periodicals in circulation, power and influence, at least for artists and curators. Likewise, paper newspapers have become post-digital, or post-digitization, media where they shift their emphasis from news (for which the Internet is faster) to reportage and commentary.

“Post-digital” thus refers to a state in which disruption through digital information technology has already occurred – which can often mean, as for Cascone, that it is no longer perceived as disruptive. In this sense, the term “post-digital” is positioned against the term “new media”. At the same time, as its negative mirror, it exposes (perhaps even deconstructs) the latter’s hidden teleology: if “post-digital” evokes critical reactions concerning the philosophy of history embedded in the prefix “post”, then it also reveals the previous lack of such criticality towards the older term “new media”, which is no less Hegelian.

Furthermore, “post-digital” describes a perspective on digital information technology that is no longer focused on technical innovation or improvement, even rejecting innovation narratives. Consequently, the distinction between “old” and “new” media collapses in theory as well as in practice. As Kenneth Goldsmith observes, his students “mix oil paint while Photoshopping and scour flea markets for vintage vinyl while listening to their iPods”[3]. Each medium is chosen for its own particular material aesthetics, including artifacts. Lo-fi and misbehavior are embraced, just as in Cascone’s digital glitch and jitter music, but just as much in analog grain, dust, scratches or hiss, as a form of practical exploration and research that understands media through their misbehavior. This approach of using technology against its high-end promises boils down to practically the same thing that computer hackers do, namely taking systems apart and using them against their design intentions.

Post-digital risograph printmaking, audio cassette production, mechanical typewriter experimentation and vinyl DJing clearly overlap with hipster retro media trends, including the digital simulation of analog lo-fi in popular smartphone apps such as Instagram. The rediscovery and repurposing of these “vintage” media with a hacker spirit, on the other hand, sets them apart from this culture.

Non-digital media technologies such as the ones mentioned above become post-digital when they are not simply nostalgically revived, but functionally repurposed in (often critical) relation to digital media technologies: zines that become anti- or non-blogs, vinyl as anti-CD, cassette tapes as anti-mp3, analog film as anti-video.

On the other hand, ethics and conventions that became mainstream with Internet communities and Open Source/peer-to-peer culture become retroactively applied to the making of non- and post-digital media products, such as in collaborative zine making events (which are in extreme opposition to the hyper-individualist zine cultures of the post-punk 1980s and 1990s). If one maps Lev Manovich’s 2001 taxonomy of “new media” as “Numerical Representation”, “Modularity”, “Automation”, “Variability” and “Transcoding” to a contemporary zine fair or risography community art space, then “modularity”, “variability” and – in a more loosely metaphorical sense – “transcoding” would still apply to the contemporary cultures of working with these “old” media. In other words, “post-digital” can usefully describe “new media”-style approaches to working with (so-called) “old media”.

This hacker and community-oriented approach shifts the previous dichotomies of “old” and “new” media, analog and digital to a new difference of shrink-wrapped versus Do-it-yourself. No medium embodies this better than the magazine and web site Make, published by O’Reilly since 2005 and instrumental in the foundation of the contemporary maker movement. Make covers 3D printing, Arduino hardware hacking, FabLab technology, as well as classical DIY and crafts, and hybrids between them.

Conversely, the 1990s/early-2000s equation that analog mass media (such as newspapers and radio) are corporate while “new media” such as web sites are DIY has no longer been true ever since user-generated content was co-opted into corporate social media and mobile apps. The Internet as a self-run alternative space – central to many activist and artists’ online projects from The Thing onwards – simply is no longer intuitive for anyone born after 1990 who identifies the Internet with corporate, registration-only services.[4]

The Maker movement, whether in FabLabs or on zine fairs, embodies a shift from the purely symbolic, as privileged in digital systems (for which the login is the perfect example), towards the indexical: from code to traces, and from text to context. While 1980s post-punk zines, for example, resembled manifestos (such as those of the Berlin Dadaists in the 1920s) and 1980s Super 8 films (such as the Cinema of Transgression) created underground narratives against mainstream cinema, the majority of contemporary zines and analog films tend to shift from content to pure materiality where the medium, such as paper or celluloid, indeed is the message; from semantics to pragmatics, and from meaning to being.[5]

When ‘post-digital’ is ‘digital’ and vice versa

From a technological and scientific point of view, the word “digital” is wrongly understood and used by Cascone. That also applies to most of what is commonly labelled “digital art”, “digital media” and “digital humanities”. If something is “digital”, it neither has to be electronic, nor involve binary zeros and ones. It does not even need to be attached to electronic computers or any other kind of computational device.

Conversely, analog does not mean non-computational or pre-computational, since there are also analog computers. (Using water and two measuring cups for computing additions and subtractions – of quantities that can’t be exactly counted – is a simple example of analog computing.) “Digital” simply means that something is divided up into exactly countable units – countable with whatever system one uses, whether zeros and ones, decimal numbers, strokes on a beer mat or the digits of one’s hand. (Which is why “digital” is called “digital”; in French, for example, the word is “numérique”.) Therefore, the Western alphabet is a digital system, the movable types of Gutenberg’s printing press constitute a digital system, the keys of a piano are a digital system, and Western musical score notation is digital, aside from such non-discrete value instructions as adagio, piano, forte, legato, portamento, tremolo and glissando. Floor mosaics made from monochrome tiles are digitally composed images. These examples show, too, that “digital” never exists in any perfect form but is always abstracted and idealized from matter that, by nature and the laws of physics, has chaotic properties and often ambiguous states[6].

“Analog” conversely means that something has not been chopped up into discrete, countable units, but consists of a signal that in itself has no discrete units, gradually and continuously changing: a sound wave, light, a magnetic field such as on an audiotape but also on a computer hard disk, the electrical flows in any circuit including computer chips, a painted color gradient. The fingerboard of a violin is analog, because it is fretless – undivided – while the fingerboard of a guitar is digital, because frets divide it into single notes. What is commonly called “analog” photographic and cine film is actually a hybrid of analog and digital: the particles of the film emulsion are analog, because they are in organic-chaotic order and not reliably countable like pixels, while the single frames of a film strip are digital, since they are discrete, chopped up and unambiguously countable.

This means that media, in the technical sense of storage, transmission, computation and display devices, are always analog: the voltage in a computer chip is analog, and only through filtering can one determine whether a given voltage level counts as a “zero” or a “one” (which is why worn-out hardware can make bits flip and turn zeros into ones); the sound waves produced by a sound card and a speaker are analog; etc. An LCD screen is a hybrid digital-analog system, because its display has discrete, countable, single pixels, but the light they emit is an analog continuum.

There is hence no such thing as digital media,[7] only digital or digitized information: chopped-up numbers, letters, symbols and whatever other abstracted units as opposed to continuous, wave-like signals such as physical sounds and visuals. Most “digital media” devices are really analog-to-digital-to-analog converters. An mp3 player with a touchscreen interface, for example, takes analog, non-discrete gesture input, translates it into binary control instructions that trigger computational information processing of a digital file, ultimately decoding it into analog electricity that another analog device, the electromagnetic mechanism of a speaker or headphone, turns into analog sound waves. The same principle applies to almost any so-called digital media device, whether it’s a photo or video camera, or a military drone. As soon as something is perceivable (and thus aesthetic), it takes the form of non-discrete waves and therefore is analog.

“Digital art” based on the technical definition of “digital” would, however, likely be called “post-digital” or even “retro analog” by art curators and media studies scholars: stone mosaic floors made from Internet image memes, for example, installations with mechanical typewriters[8], or countdown loops on a Super 8 or 16mm film projector.

The everyday colloquial meaning of “digital” is metonymical: anything connected to computational electronic devices, even if it is a tripod. It is a notion fostered and solidified not least by marketing and product advertising. Some eyebrows should thus be raised when the humanities simply take it over without asking any questions, in the concept of the “digital humanities” for example. In that sense, “post-digital” art, design and media works – whether or not they should actually be called post-digital – often make up for this lack of critical reflection on digitality.

Conclusion (draft of the draft)

In the year 2000, the notions of the “post-digital” proposed by Cascone and Andrews were somewhat contradictory, in that they rejected the rhetoric of “new media” while heavily relying on it. Cascone’s paper drew on a “Wired” column by Nicholas Negroponte, Ian Andrews’ paper on Lev Manovich’s “Generation Flash”, which advocated the very opposite of the analog/digital and retro/contemporary hybridizations that are associated with the term “post-digital” today. If post-digital aesthetics consists, metaphorically speaking, of postcolonial practices in a communications world taken over by an industrial-military complex of a handful of global players, then it is perhaps easiest to think of it as the opposite of (Ray Kurzweil’s and Google’s) Singularity movement, the Quantified Self movement and other forms of techno-gnosis.

Nevertheless, it is often driven by structurally similar fictions of agency:[9] the fiction of agency over one’s body in the Quantified Self movement, the fiction of the self-made in the DIY and Maker movements, the fiction of more direct engagement with media in handmade film labs. (Sociologically, both cybergnostic and post-digital cultures might be seen as either over-affirmations of, or scepticism towards, system complexity and desires for agency. – to be elaborated)


  1. Kim Cascone, “The Aesthetics of Failure”, Computer Music Journal, vol. 24, no. 4, December 2000, 12–18  ↩

  2. Ian Andrews, “Post-digital Aesthetics and the Return to Modernism”, 2000 (accessed September 2013)  ↩

  3. Kenneth Goldsmith, Uncreative Writing, Columbia University Press, 2011, 226  ↩

  4. In a project on Open Source culture with Bachelor students from the Willem de Kooning Academy Rotterdam, it turned out that a number of students believed that website user account registration was a general feature and requirement of the Internet.  ↩

  5. It is debatable to what degree this reflects the influence of non-Western, particularly Japanese (popular) culture on contemporary Western visual culture, particularly in illustration (which amounts to a large share of contemporary zine making). This influence is even more clearly present in digital meme and imageboard culture.  ↩

  6. This is what Friedrich Kittler meant in his opaquely written essay “There is no Software”, Stanford Literature Review, vol. 9, 1992 (1991), 81–90.  ↩

  7. Even the piano, if considered a medium, is digital only to the degree that its keys implement abstractions of its analog-continuous strings.  ↩

  8. Such as Linda Hilfling’s contribution to the exhibition MAKEDO at V2_, Rotterdam, 29–30 June 2007.  ↩

  9. This is how Aldje van Meer, coordinator of CrossLab at the Willem de Kooning Academy, interprets art students’ preference for working non-electronically – to “rather make a poster than a website”.  ↩