Archive

Archive for the ‘software studies’ Category

Out of our heads, in our media

January 30, 2012 1 comment

I am off to the Berlin Transmediale 2012 festival soon, excited as always. I am giving two talks there; the latter one, on Sunday, is on a panel organized by Tim Druckrey on methods for media art histories: I guess an unofficial media archaeology panel with Siegfried Zielinski, Wolfgang Ernst and Inke Arns!

And something different already on Friday: a performance with Julio D’Escrivan, mixing media theory and live coding, as part of the Uncorporated Subversion panel! Here is a short summary, but for the whole effect, be there on Friday. It should be worth the while, I promise. As for the theme of “cognitive capitalism” that the paper touches on: I am not at all uncritical towards the notion, and agree it misses several key points. Still, it is useful as a way to continue the kind of investigation Jonathan Crary has started into how cognition in a very wide sense, embracing embodied, affective being, perception and sensation, is constantly articulated “out of our heads” (Alva Noë) but in our media: a media ecology of the production of the perceiving, thinking, remembering subject. The collaborative form between me and D’Escrivan has itself again proven a great way to work together. Last year we tried out similar things with Garnet Hertz (also in our Transmediale theory prize nominated paper on Zombie Media), and now through this performance.

This presentation is best approached as an experiment in theory-code collaboration, combining live coding (D’Escrivan) and some speculative media theory (Parikka) concerning techniques of the cognitive. Alongside some minor notes reflecting on live coding as a practice, the focus is more or less on the notion of cognitive capitalism. What the talk/performance presents are some tentative steps towards a media archaeology of cognitive capitalism. In other words, what are the supportive, sustaining and conditioning techniques that contribute to the cerebral? In this context, we propose to step away from the cognitive understood as immaterial or as inner life, and towards a cognitive that is distributed, supported, relayed and modulated continuously in a complex information ecology. We are interested in investigating forms of collaboration between code and sonic arts and media theory, and in collaboration as a form of (extra-)institutional practice in the contemporary arts and education field.

 

Dirty Matter

August 26, 2011 Leave a comment

I was asked to write a short forum piece on “new materialism” for the Communication and Critical/Cultural Studies-journal, and I wrote a piece called “New Materialism as Media Theory: Dirty Matter and Medianatures”. It partly picks up on some of the themes I have recently been talking and writing about, influenced by such scholars as Sean Cubitt. It also articulates, albeit briefly, some points concerning German media theory as new materialism, even if it then quickly moves in a different direction concerning materiality. Here is a short taster of what’s to come.

The key points of the text were, in short: 1) we need to understand how media technologies themselves already incorporate and suggest a “new materialism” of non-solids and non-objects, and that this is part of technical modernity (the age of Hertzian vibrations); 2) we also need to understand bad matter: not just the new materialism that is empowering, but the one that is depowering, the matter that is toxic, leaking from abandoned electronic media, attaching to the internal organs and skin of low-paid workers in developing countries. In this context, “medianatures” is the term I use to theoretically track the continuums from matter to media, and from media back to (waste) matter.

I believe that it is this continuum that is crucial for a developed material understanding of media cultures. Hence, it’s a shame from a new materialist point of view that even such pioneering thinkers as Michel Serres miss this point concerning the weird materialities of contemporary technological culture: weird in the sense that they remain irreducible either to their “hard” contexts and pollution (CO2, toxic materials, minerals, and other component parts) or to their “soft” bits: signs, meanings, attractions, desires. In Malfeasance: Appropriation Through Pollution?, trans. Anne-Marie Feenberg-Dibon (Stanford: Stanford University Press, 2011), these are the two levels Serres proposes as crucial from an environmental point of view, but he ignores the continuum between the two. And yet signs are transmitted as signals, through cables, in hardware, in a mesh of various components from heavy metals to PVC coatings.

Perhaps a good alternative perspective to Serres’ is found in how both Félix Guattari and Gilles Deleuze conceive of a-signification as a regime of signs beyond signification and meaning. Gary Genosko’s apt example (in Félix Guattari: A Critical Introduction, London: Pluto, 2009, 95-99) is the magnetic stripe on, for instance, your bank card: a form of automatized and operationalized local power that is not about interpretation but about a different kind of signal work. Bodil Marie Stavning Thomsen’s elaboration of signaletic material, electronic signals and software, through Deleuze’s film theory and a-signification is also useful. As she points out, and this much we know from years of intensive reading of Deleuze in screen-based analyses, Deleuze wanted to include much more than signification in the cinematic impact, and mapped a whole field of a-signifying matter in film: “sensory (visual and sound), kinetic, intensive, affective, rhythmic, tonal, and even verbal (oral and written).” (“The Haptic Interface: On Signal Transmissions and Events”, in Interface Criticism: Aesthetics Beyond Buttons, edited by Christian Ulrik Andersen and Søren Bro Pold, Aarhus University Press, 2011, 59) What she argues in terms of signal media is just as important: after signs come signals, and the media of signals needs a move similar to the one Deleuze made with film: to carve out the a-signifying material components of digital media too.

Such a-signifying components are rarely content to stay on one level, even if a lot of theory grants primacy to software, hardware, or some other single level. The various levels feed into each other; this relates to what Guattari calls mixed semiotics, and we can here employ the idea of a medianature-continuum. The a-signifying level of signs is embedded in the a-signifying materiality of processes and components.

In short, it’s continuums all the way down (and up again), soft to hard, hardware to signs. In software studies (see David M. Berry, The Philosophy of Software: Code and Mediation in the Digital Age, Palgrave Macmillan 2011, 95-96), the continuum from the symbolic functions of higher-level coding practices to voltage differences at the “lower hardware level” has been recognized: assembly language needs to be assembled into binary, binary is what the computer “reads”, and yet such binaries take effect only through transistors; and if we really want to be hardcore, we can insist that in the end it all comes back to voltage differences (Kittler’s famous “There is no Software” text and argument). Such is the methodology of “descent” that Foucault introduced as genealogy, but that German media theory takes as a call to open up the machine physically and methodologically to its physics, and which leads into a range of artistic methodologies too, from computer forensics to data carving. In other words, recognizing the way abstraction works in technical media, from voltages and components to the more symbolic levels, allows us to track back from the world of meanings and symbols, but also of a-signification, to the level of dirty matter.
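The descent can be sketched in a few lines of Python (my own toy illustration, not from Berry or Kittler): a single symbol drops to a code point, the code point to bits, and the bits, in hardware, would exist only as voltage levels (the voltage values below are purely illustrative).

```python
# Toy sketch of the "descent" from symbol to hardware-level difference:
# a character becomes a code point, the code point becomes bits, and in
# hardware each bit is realized only as a voltage difference.

symbol = "A"
code_point = ord(symbol)          # symbolic level: 'A' -> 65
bits = format(code_point, "08b")  # binary level: '01000001'

# At the electrical level, each bit is just a high or low voltage
# (illustrative values: 5.0 V for a 1, 0.0 V for a 0).
voltages = [5.0 if b == "1" else 0.0 for b in bits]

print(code_point)  # 65
print(bits)        # 01000001
print(voltages)
```

Read upwards, the same listing is the story of abstraction: only because the voltage differences are stable can the bits, the code point and finally the symbol be taken for granted.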

 

Humm your code, “dah-dit-dah-dit”

Media amateurism has been an integral part of modern culture since way before social media kicked in with its own DIY spirit. The electronic hobbyists and tinkerers of the 1970s were themselves preceded by many earlier forms of learning communication and building circuits. Code-based culture does not, then, begin with software as we know it: with the emergence of computing and the much later emergence of computer languages as separate entities, tied to the mythologies of coders, hackers and controlling the hardware through the magical language of code (for a wonderful recent excavation into the ontologies of software, see Wendy Chun’s Programmed Visions).

“Thousands of Radio Amateurs find it easy to Learn Code”, read a headline in Popular Science (March 1932), describing the process of getting a radio amateur license and the earlier technological discourse concerning machine knowledge. The radio amateurism and wireless DIY of the early decades of the 20th century is itself perhaps one of the most important media archaeological reference points for thinking about contemporary technological DIY culture, and one can find interesting ideas in that discourse. The way knowledge about machines, code and professionalism was standardized into practice is itself fascinating: DIY as a crash course in the key scientific discoveries of modernity, practically applied. Electrical functions needed to be internalized as a hands-on skill, as the article describes: “You must first master the elementary principles of electricity as given in the simpler textbooks on the subject. Then you must apply the principles of magnetism and electromagnetic action plus an understanding of the radio vacuum tube to mastering simple radio transmitting and receiving circuits. […] You don’t have to know all the ins and outs of complicated radio broadcast transmitting circuits, nor do you require a detailed knowledge of elaborate receiving circuits such as the heterodyne.” (72)

This class of amateurs was, however, part of a nationally regulated standardization process that flagged the importance of this system of transmission. The regulation had to do with technical knowledge, ethics and legalities as well as with the speed of communication, or skills more widely: the amateur operator license test was the way to become an operator, the mythical figure still living in such discourses as The Matrix film(s), the one in charge of the communication field: what message goes where, the interpreting of code, the sending of things, packets, people to addresses.

But it was grey, this area of knowledge, or at least so it appears reading through the regulations. Take paragraph 9 of the Radio Division Regulations for Operators: “Amateur Class. Applications for this class of license must pass a code test in transmission and reception at a speed of at least 10 words per minute, in Continental Morse Code (5 characters to the word). An applicant must pass an examination which will develop knowledge of the adjustment and operation of the apparatus which he desires to use and of the international regulations and acts of Congress in so far as they relate to interference with other radio communications and impose duties on all classes of operators.” Speed, the speeding up of communication as part of modernity, was something still tied to the skills of the operators, and slowed down by the human who needed to be trained.

Code, as indicated in the passage, meant of course Morse Code. Dit-dit-dit-dah. A tip given in Popular Science suggests a sensory approach to code, not only as abstract pattern but as something that relates to your ears and mouth: “In memorizing the code, try to think of the letters as different sounds rather than as so many dots or dashes. Think of the letter C, for example, as “dah-dit-dah-dit” and as dash followed by dot, followed by dash, followed by dot.” (73) Carnal knowledge? Code in the flesh sounds much too poetic, but at least we could say: code in your mouth, ringing in your ears, feeding back to your fingers tapping. Code, signal processing and transmission share so much with cultures of music, rhythmics, sound and voice.
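The mnemonic can be made concrete in a few lines of Python (a toy sketch of my own; the code table is abbreviated to a handful of letters): the letter C is indeed dah-dit-dah-dit, and at the regulation’s 10 words per minute, 5 characters to the word, an applicant had to handle 50 characters a minute.

```python
# A few entries of Continental Morse Code as dot/dash patterns.
MORSE = {"C": "-.-.", "E": ".", "O": "---", "S": "...", "T": "-"}

def hum(pattern):
    """Turn a dot/dash pattern into the mouth-and-ear mnemonic."""
    return "-".join("dah" if mark == "-" else "dit" for mark in pattern)

print(hum(MORSE["C"]))  # dah-dit-dah-dit

# The licensing threshold: 10 words per minute, 5 characters to the word.
chars_per_minute = 10 * 5
print(chars_per_minute)  # 50
```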

(Forthcoming and related: a podcast interview with Paul Demarinis about hands-on, carnal knowing of technical media and media archaeological art.)

Do Some Evil

June 14, 2011 2 comments

It’s the opposite of “do no evil”: a call to think through the dirty materiality of media. Trick, deceive, bypass, exploit, short-circuit, and stay inattentive.

Hence, it is not only about “evil objects”, as I perhaps myself have focused on (in Digital Contagions, and elsewhere), even if such objects can be vectors for, and emblematic of, stratagems of evil media. Evil media studies focuses on strategies that are mobilized as practices of theory. These strategies reach across institutions, and hence it is no wonder that Geert Lovink recently flagged this as one approach through which to energize media studies.

Or more formally – Evil Media Studies “is a manner of working with a set of informal practices and bodies of knowledge, characterized as stratagems, which pervade contemporary networked media and which straddle the distinction between the work of theory and of practice”, write Andrew Goffey and Matthew Fuller in the chapter by the same name in The Spam Book.

For me, the attraction of Goffey and Fuller’s call is that it is material: material that is dynamic, non-representational, machinating and filled with energies that flow across the software-based, the social and the aesthetic.

  1. Bypass Representation
  2. Exploit Anachronisms
  3. Stimulate Malignancy
  4. Machine the Commonplace
  5. Make the Accidental the Essential
  6. Recurse Stratagems
  7. The Rapture of Capture
  8. Sophisticating Machinery
  9. What is Good for Natural Language is Good for Formal Language
  10. Know your Data
  11. Liberate Determinism
  12. Inattention Economy
  13. Brains Beyond Language
  14. Keep Your Stratagem Secret As Long as Possible
  15. Take Care of the Symbols, The Sense Will Follow
  16. The Creativity of Matter

(the list from “Evil Media Studies” by Goffey and Fuller, in The Spam Book: On Porn, Viruses and Other Anomalous Objects From the Dark Side of Digital Culture, eds. Parikka & Sampson, Hampton Press 2009).

Platform Politics-conference: Opening Words

May 15, 2011 2 comments

We organized a very successful Platform Politics-conference in Cambridge, May 11-13, where our speakers included such exciting scholars and writers as Michel Bauwens, Michael Goddard, Tiziana Terranova, Nick Couldry, Nick Dyer-Witheford, Felix Stalder, and Tim Jordan.

These are my short opening words to the event:
Platform Politics takes place as part of the Arts and Humanities Research Council funded networking project on Network Politics. We pitched it to the AHRC with the suggestion that it takes one to know one: to understand emerging forms of social action and politics on networks and in network society, one has to develop networks; to crowdsource ideas from leading scholars, activists and artists; to map and bring such participants together through various channels in order to identify themes and directions that need more focus. Hence, we have arrived at the third and final event of the project, now on Platform Politics, following the first event in Cambridge on methodologies of network politics research and last year’s event in Toronto at the Infoscape research lab, where, with help from Greg Elmer, Ganaele Langlois and Alessandra Renzi, we discussed object oriented and affect approaches to network politics.

We wanted to keep the notion of platform quite broad in order to solicit a more open range of papers. Hence, from software platforms such as Twitter and Facebook to robotics, and from theoretical insights drawing on post-Fordist theories of politics and labour to object oriented philosophy, and much more, we have a privileged position from which to think through the platform (often seen as technological, as in platform studies) as a platform for our investigations, hence also as a conceptual affordance. This is not to say that platform studies as represented in the Bogost and Montfort led series is techno-determinist and focuses solely on technology; quite the contrary, it tries to find a specific relation between technology and aesthetics. Yet it is good to emphasize the mobilization of the concept as part of various transitions and translations: platforms in the technological sense (again on various levels, from apps to clouds, online platforms to hardware structures), and in the conceptual, economic and of course political sense (expect at least a couple of references to Lenin in this conference).

So if Bogost and Montfort make sense of platform studies through this kind of layering:

Bogost and Montfort: Platform Studies

I would add that the notion of platform politics is able to articulate various levels together, and to bring smoothness and movement to the interaction of the layers. In other words, in addition to the specific level of “platforms” we can think of the platform itself as distributed across a variety of layers, as assemblages (in the manner Manuel Delanda uses the term?). A good example of this, something we were unable to pursue because of the problem of finding a slot for it, was the idea of organizing a circuit bending/hardware hacking workshop (with Seb Franklin). The idea was to follow Garnet Hertz’s lead and the way he has organized such workshops both for kids and for media theorists, and to use hands-on tinkering, opening up technology such as battery-operated toys, as a way to think through hardware platforms: how design solutions incorporate politics, how they afford conceptual approaches, and how they act as one node across a variety of other platforms. (An example of this is articulated in the forthcoming “Zombie Media: Circuit Bending Media Archaeology into an Art Method”-text by myself and Hertz, in the Leonardo-journal, where we tie the question of such design politics of hardware to media archaeology, art methods and the political economy of network culture.)

Platforms reveal themselves to be ontogenetic, i.e. creative forms of interaction, not just stable backgrounds for a continuation of the social. They organize social action in a double bind in which social action organizes them. Platforms rearticulate the social. Software platforms, for instance, constitute a catalyzer for specific social forms, and as such incorporate in themselves a multitude of social, political and economic forces. It is a question of production, of what kinds of social relations are being produced; a good example is the Telecommunist project and the Thimbl open source distributed microblogging service, which incorporates a different sociability than proprietary web 2.0 business and software models. And this sociability is grounded in the potentials of networks: P2P instead of Web 2.0, distributed instead of the centralized client-server model.

At the beginning of the project, we started with the question “what is network politics?” and requested initial position papers from some key writers in the field; of those, Tiziana Terranova and Greg Elmer are attending today. Other theorists included Alex Galloway, Eugene Thacker, Katrien Jacobs and Geert Lovink.

The idea was to organize this as a form of request for comments, the RFC format familiar from internet design culture: questioning, lining up comments and positions. This did not pan out as extensively as we wanted (which has to do with other organizationally interesting themes, such as spam management on participatory platforms). However, the position papers did give us some initial leads. Furthermore, we started with some assumptions about where to start tracking network politics:

–       the politics of new network clusters, services and platforms – Twitter, Facebook – as well as mapping alternative software-based ways of organizing traditional political parties, new formations, NGOs, and temporally very different groupings/phenomena: whether the suddenly emerging and as suddenly disappearing “like” protests on Facebook, or the longer-term effects of Wikileaks – leak not only in the sense of leaking secret information, but of leaking across media platforms, and reaching a long-term sustainability through “old” media trying to come to grips with such online activities.

–       the biopolitics of network culture, or in other words the various practices which form internet cultures – hence a step outside the technological focus, to look at what practices define network politics: the links between work and free time, play and labour, the circulation of affects, sociability, and so forth. Cognitive capitalism, but just as much affective capitalism. Yesterday (in the pre-conference event with Michel Bauwens and Michael Goddard) we got into talking a bit about investments of affect, desire and related topics.

–       we were also interested in the metaquestion: what form would investigating network politics have to take? Outside the normal practice of the humanities, writing and meeting up at conferences, what are the specific pedagogic and research tools/platforms that are actively changing the politics of education and research inside/outside academia? What are the research/creation platforms able to articulate this, so that we are not stuck with only the “master’s tools”?

–       and, in a way less specified but just as important, the question of the politics of the imperceptible: what kinds of politics are out there that are not even recognized as politics? From artistic practices to the grey work of engineers, new arenas of expertise, skill and, again, social action contribute to the way in which politics is fleeing traditional institutions.

The project has been able to map various positions on such questions, and to raise new ones – which has been the purpose of all this: to produce more leads for further work. The same applies to this conference, and we hope to come out with excellent contributions that do not simply fall within the original ideas. During the project’s unfolding, “network politics” became a wider popular media phenomenon too, as old media started to focus on what it was able to brand as “twitter-revolutions” or “facebook-revolutions” – and yet this only emphasized the need to complexify the notions and the histories of such events and platforms, as has already been done on various email lists and other debate forums. I am sure we can continue on that and produce some really exciting discussions – and, as always at our events, we really hope that much of the emphasis is on discussions in the sessions, as well as outside them.

Wirelessness – radical empiricism in media theory

January 9, 2011 3 comments


Adrian Mackenzie captures something extremely essential and apt in his fresh book Wirelessness: Radical Empiricism in Network Cultures (2010). Besides being an analysis of an aspect of contemporary “network” culture much neglected by cultural analysts, it offers a view into how one conducts a post-phenomenological analysis of the intensive, moving, proliferating aspects of experience in current media culture. So much of what seems wired is actually wireless; so much of what seems experienced is actually at the fringes of solid experience, which is why Mackenzie sets out to use William James’s exciting philosophical theories of radical empiricism as his guide to understanding wirelessness.

Let’s define it, or let Mackenzie define it:

“The key claim of the book is that the contemporary proliferation of wireless devices and modes of network connection can best be screened against the backdrop of a broadly diverting and converging set of tendencies that I call ‘wirelessness’. Wireless designates an experience trending toward entanglements with things, objects, gadgets, infrastructures, and services, and imbued with indistinct sensations and practices of network-associated change. Wirelessness affects how people arrive, depart, and inhabit places, how they relate to others, and indeed how they embody change.” (5)

Indeed, Mackenzie does not remain content to stick to the techy details or to the phenomenology of how it feels to be surrounded by wireless devices and discourses, but sets out to treat these as a continuum. This too follows from James. Things go together well with our minds/brains. Thoughts are very much things, even if at the other end of the spectrum from the more seemingly solid things of the world. Thinking and things cannot be separated. Mackenzie quotes James: “Thoughts in the concrete are made of the same stuff as things are.” The stuff of continuum.

Hence, what follows is also a methodologically exemplary treatment of this weird phenomenon of wireless communication. Already in its early phase, the fact that communication started to detach itself from solid bodies and the messaging human body was a topic of awe and wonderment. James was roughly a contemporary of the buzzing discourses of electromagnetic fields and the experiments in wireless communication towards the end of the 19th century by such figures as Preece, Willoughby Smith and of course Marconi; this media archaeological aspect is not much touched upon by Mackenzie. In any case, one would do well to look at the 19th-century radical empiricist discourses as well, to examine the way bodies, solids, experience and media were being rethought in those early phases, here described in the words of one pioneer and early writer, Sir William Crookes:

“Rays of light will not pierce through a wall, nor, as we know only too well, through London fog; but electrical vibrations of a yard or more in wave-length will easily pierce such media, which to them will be transparent.” (quoted in J.J. Fahie, Wireless Telegraphy, 1838-1899, p. 197)

Even if not transparency, wirelessness affords new senses of mobility. For us, wireless is a heavily urban phenomenon (even if the book touches on how rural areas are being connected, peripheries harnessed, and now also the human body and its organs individually connected to the internet through new wireless device surgery). For Mackenzie, the mobility relates to “transitions between places” and how such hotspotting of, for example, the urban sphere creates new forms of intensity that are not stable. In his earlier book Transductions, Mackenzie used Simondon’s vocabulary, which offered the idea of the primacy of metastability; now James does the same trick by offering a conceptual vocabulary for an experience that is distributed, diffuse, coming and going.

What is fascinating is how Mackenzie moves between the various scales and still keeps his methodology and writing intact. In addition to the fact that the urban experiences of humans are enabled by a variety of wireless devices, networks, accesses and so forth, he is after a radical technological experience in which hardware and software relations within technology matter as well. Talking about chipsets such as the Picochip202, Mackenzie compares these to cities: “The ‘architectures’ of chipsets resemble cities viewed from above precisely because they internalize many of the relational processes of movement in cities.” (65)

The way bodies were moved and managed in urban environments has now been transposed as a problem on the level of chips and other seemingly “only” technical solutions. Yet what Mackenzie does successfully is show how we need insights into a biopolitics that engages not only human phenomenological bodies but technological bodies too. This is a direction I find very exciting and necessary, and while I know some of the great work done in Science and Technology Studies, more media studies work in this direction of new materialism is very much welcome.

So now that we have got to talking about technological bodies in relation, and will probably soon go so far as to say that they have affects, would some critic say that this means we are losing our grip on politics – given that technology is such a crucial way of governing our worlds, offering meanings, and is itself embedded in a cultural field of representation?

Mackenzie does not, however, neglect representations, or the variety of materials of which the experience of wirelessness consists; from wireless routers to marketing discourses and adverts, the ontological claim that thinking and things do not differ works also as a methodological guideline for rigorous eclecticism. Similarly, Mackenzie shows how his methodology and writing lend themselves to postcolonial theory in chapter 7, “Overconnected worlds”. Here the claim is consistent with a radical constructedness inherent in how transnationality and the global are created, not received, structures of experiencing; various wireless projects offer platforms for both belief and physical connection.

Wirelessness overflows individual bodies and acts as a catalyzer, an intensifier, a field for experience, perhaps in the sense that electromagnetic fields afford the technical signal between devices. The book itself overflows in its richness, but it is so rigorous in its take that media theory will benefit from it for a long time. It picks up on some of the same inspiration that Brian Massumi has catalyzed into more philosophical takes on communication and contemporary culture, but is one of the first to take this mode of analysis of lived abstractions into concrete media analysis – much as Mackenzie did with Simondon already in Transductions.

New Materialism abstracts


For the forthcoming 21 June event New Materialisms and Digital Culture, here are the abstracts, which promise very interesting cross-disciplinary perspectives on what new materialism is in the context of the various practices and arts of digital culture.


David M. Berry: Software Avidities: Latour and the Materialities of Code
The first difficulty in understanding software is located within the notion of software/code itself and its perceived immateriality. Here it is useful to draw an analytical distinction between ‘code’ and ‘software’. Throughout this paper I shall be using code to refer to the textual and social practices of source code writing, testing and distribution. In contrast, I would like to use ‘software’ to include products, such as operating systems, applications or fixed products of code such as Photoshop, Word and Excel, and the cultural practices that surround the use of it. This further allows us to think about hacking as the transformation of software back into code for the purposes of changing its normal execution or subverting its intended (prescribed) functions. However, this difficulty should not mean that we stay at the level of the screen, so-called screen essentialism, nor at the level of information theory, where the analysis focuses on the way information is moved between different points disembedded from its material carrier, nor indeed at the level of a mere literary reading of the textual form of code. Rather, code needs to be approached in its multiplicity, that is, as a literature, a mechanism, a spatial form (organisation), and as a repository of social norms, values, patterns and processes. In order to focus on the element of materiality I want to use Latour’s notion of the ‘test of strength’ to see how the materiality of code, its obduracy and its concreteness are tested within computer programming contests. To do this I want to look at two case studies: (1) the Underhanded C Contest, which asks the programmer to write code that is as readable, clear, innocent and straightforward as possible, and yet must fail to perform its apparent function.
To be more specific, it should do something subtly evil; and (2) the International Obfuscated C Code Contest, a contest to write the most obscure/obfuscated C program possible, one that is as difficult to understand and follow (through the source code) as possible. By following the rules of the contest, and by pitting the programs against each other – each must be made available for the judges (as well as the other competitors and the wider public, through open sourcing of the code) to compile and execute – the code is shown to be material, providing it passes these tests of strength.
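The underhanded spirit translates easily out of C. The following Python toy (my own illustration, not a contest entry) looks like a perfectly innocent filter, yet mutating the list while iterating over it lets adjacent flagged items slip through: the code reads as clear and straightforward, and still fails at its apparent function.

```python
def remove_flagged(items, flagged):
    """Remove every flagged item from the list -- or so it appears."""
    for item in items:          # iterating over the very list we mutate...
        if item in flagged:
            items.remove(item)  # ...shifts later elements past the cursor
    return items

# Adjacent flagged items evade the filter: one "x" survives.
print(remove_flagged(["a", "x", "x", "b"], {"x"}))  # ['a', 'x', 'b']
```

The bug is invisible to casual review precisely because each line, taken alone, is idiomatic; it is the interaction of iteration and mutation that does the subtly evil work, which is the kind of materiality-under-test the contest stages.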


Rick Dolphijn: The Intense Exterior of Another Geometry
Starting with several examples from contemporary ‘animal architecture’, this paper proposes a search for how anything ‘surrounding’ the organic body (a box, a piece of cloth, a house), in the alliance it creates with this body, is mutually united with it. This brings us to the practices central to this paper, as they concern envisioning our “urban exoskeleton”, as DeLanda calls it, and how this sets forth the emergence of a “future people” as Proust already foresaw it. In other words, our interest lies with how life comes into being in its intense relationships with urban morphology. We then need to accept the definition of life offered to us by Christopher Alexander, who considers life “a most general system of mathematical structures that arises because of the nature of space” (2004: 28). Speculating on the future lives (unconsciously) hidden in the morphogenetic qualities of urban form today should then be pursued in terms of the (aesthetic) principles of creating space. Conceptualizing these principles in the Occident and in the Orient, we allow ourselves to conceptualize a difference between two wholly other urban bodies, of which especially the latter (the Oriental) has hardly received any attention in contemporary theory. This Oriental ‘city of axonometric vision’, as we develop it alongside the (Occidental) ‘city of linear perspective’, allows us to think the urban exoskeleton in terms of a multiplicity of dynamic surfaces (as opposed to a centralized pattern), through an “equal-angle see-through” (dengjiao toushi in Chinese) (as opposed to a linear perspective) and through a non-dualist felt-togetherness. It allows us to think the creative dynamics of unlimited growth as a new proposition of what bodies can do.

Eleni Ikoniadou: Transversal digitality and the relational dynamics of a new materialism
The relationship between digital technology and matter has preoccupied media and cultural theorists for the last two decades. During the 90s it was articulated through a celebration of the disembodied, immaterial and probabilistic properties of information (cybercultural theory). More recently, it has been asserted through a reliance on sensory perception for the construction of a predominantly observable, otherwise void, digital space (digital philosophy). However, alternative materialist accounts may be able to offer more dynamic ways of understanding the heterogeneity, materiality and novelty of digital culture (Kittler, 1999; Mackenzie, 2002; Fuller, 2005; Munster, 2006). Following in their footsteps, this presentation will aim to rethink the ontological status of the digital as immanent to the flows of a ‘new materialism’. The latter is understood as a transversal process that cuts across seemingly distinct fields and disciplines, such as the arts and sciences, establishing new connections between them. New materialism, then, becomes a concept and a method proper for investigating digital media and their tendency to bring together different aspects of the world in new ways. The paper discusses how an abstract materialist new media theory can enable transversal relations between science studies, philosophy and media art, as well as between the actual and the virtual dimensions of reality; allowing the emergence of heterogeneous digital assemblages of material, aesthetic and scientific combination.

Adrian Mackenzie: Believing in and desiring data: ‘R’ as the next ‘big thing’
How could materialist analysis come to grips with the seeming immateriality of data network media? This paper attempts to think through some of the many flows of desire and belief concerning data. In the so-called ‘data deluges’ generated by the searches, queries, captures, links and aggregates of network media, key points of friction occur around sharing and pattern perception. I focus on how sharing and pattern perception fare in the case of the scripting language R, an open source statistical ‘data-intensive’ programming language heavily used across the sciences (including social sciences), in public and private settings, from CERN to Wall Street and the Googleplex. R, it is said, is a ‘next big thing’ in the world of analytics and data mining, with thousands of packages and visualizations, and hundreds of books and publications (including its own journal, The R Journal) appearing in the last few years. In this activity, we can discern vectors of belief and desire concerning data. The tools and techniques developed in R can be seen as both intensifying data and, at times, making the contingencies of data more palpable.

Stamatia Portanova: The materiality of the abstract (or how movement-objects ‘thrill’ the world)

Gilles Deleuze and Alfred N. Whitehead have defined the ‘virtual’ not as an unreal simulation but as a real potential, an idea (respectively conceived by them as a ‘mathematical differential’ or a ‘mathematical relation’) around which an actual fact takes shape. Drawing on Deleuze and Whitehead’s concepts of ‘virtuality’, this paper addresses the possibility of a materialist approach that is able to take into account the virtuality of matter, i.e. how the abstract dimension of ideas (‘the mind’, ‘thought’) possesses its own consistence. The concrete object analyzed to exemplify this approach is the relation between digital culture, digital technology and movement, from which something like ‘virtual movement-objects’ emerges. More specifically, the paper explores the use of several technologies of movement creation and distribution (motion capture, digital video editing, the Internet) in mass-media environments such as pop music clips and YouTube amateur videos, dance video games and choreography web sites. The main objective is to understand how these applications generate and replicate what will be defined as ‘virtual movement-objects’, digitally generated dance steps that are widely imitated and adapted. From an ‘abstractly materialist’ point of view, the numerical data produced through the digitalization of dance will be considered as virtual movement ideas with a potential to be repeatedly actualized (in videos, live events, games). These ideas have the possibility of infinite reanimation: the same step can be endlessly repeated, becoming a dance of graphic shapes or 3D images, but also a movement across people and cultures. This definition also draws on Gabriel Tarde and Bruno Latour’s understanding of imitation. Imitation, in Latour’s words, weaves a sort of contagious ‘behavioural network’ based on the return of ‘virtual centres of gravity’, ideal patterns attracting a repetition of movements that ‘look the same’ but are always different and unpredictable. 
This paper therefore explores how, despite their designed nature, movement-objects appear as open and creative movement ideas, able to autonomously circulate in transversal social networks and generate unexpected rhythmic behaviours. The diffusion of Michael Jackson’s Thriller dance on YouTube, in Sims animations or in the choreographed performance of 1500 detainees of the Cebu Provincial Detention and Rehabilitation Center (Philippines) can be considered one of the most famous examples of how dance steps have become virtual movement-objects to be infinitely actualized.

Anna Powell: Affections in their pure state? The digital event as immersive encounter

Digital video offers a distinctively immersive encounter. In its early analogue days, video art seemed to validate Deleuze’s diagnosis of ‘electronic automatism’ (Cinema 2, 1985). Its characteristics include ‘omnidirectional space’, framing which is ‘reversible and non–superimposable’ and the unpredictable motion of ‘perpetual reorganisation’. Spatial composition becomes an opaque ‘table of information’ on which data ‘replaces nature’. Some of Deleuze’s anxieties for the (then new) medium have been fulfilled by surveillance and the mainstream spectacle of CGI, as in the ‘gigantism’ of Avatar’s 3D optical illusionism. Yet, this ‘original regime of images and signs’ has also proved its credentials for the schizo will to art.

One obvious formal distinction between cine and digital video is editing. Video editing does not operate by cutting and splicing footage but by ‘dragging and dropping’ sections of film on top of each other. Rather than being excised by cuts to produce temporal elision, uploaded video clips are pulled down on top of a ‘master’. An editing decision can be reversed by using a sliding tool to reveal that the first layer of images is only temporarily overlaid by another. Digital editing thus increases the density and depth of the plane of images by potentially limitless conjunctive synthesis.

Deleuze argues that without a sense of the out-of-frame, time and space are overwhelmingly immanent in electronic automatism. This apparent removal of the out-of-frame and the elsewhere leads instead to an intensive meld of brain and screen that can move the mind/screen in schizoanalytic directions. Video art’s preference for gallery installation or live performance with VJ-ing and music rather than the cinema screen offers further haptic immersion in the medium.

Digital videos that repudiate both the televisual and the cinematic regimes can express what video artist Mattia Casalegno calls ‘affections in their pure state’. 
The aesthetic properties of digital video bring affect, perception and time closer together. What are the implications of this apparent removal of the gap between actual and virtual? If, as Deleuze suggests, the brain is the screen, what kind of schizo images and thoughts might future digital art unfold? Starting from the overt distinctions of cine and video, this paper investigates the impact of the digital body without organs. It references work by video artists of specifically Deleuzian inspiration whose works express a new materialist intent.

Iris Van der Tuin: A Different Starting Point, A Different Metaphysics: Reading Bergson and Barad Diffractively
This paper provides an affirmative feminist reading of the philosophy of Henri Bergson by reading it through the work of Karen Barad. Adopting such a diffractive reading strategy enables feminist philosophy to move beyond discarding Bergson for his apparent phallocentrism. Feminist philosophy finds itself double bound when it critiques a philosophy for being phallocentric, since the set-up of a Master narrative comes into being with the critique. By negating a gender-blind or sexist philosophy, feminist philosophy only reaffirms its parameters, and setting up a Master narrative costs feminist philosophy its feminism. I thus propose and practice the need for a different methodological starting point, one that capitalizes on “diffraction.” This paper experiments with the affirmative phase in feminist philosophy prophesied by Elizabeth Grosz, among others. Working along the lines of the diffractive method, the paper at the same time proposes a new reading of Bergson (as well as Barad), a new, different metaphysics indeed, which can be specified as onto-epistemological or “new materialist.”

Affect, software, net art (or what can a digital body of code do-redux)

After visiting the Manchester University hosted Affective Fabrics of Digital Cultures conference I thought for a fleeting second that I had discovered affect: it’s the headache that you get from too much wine, and the ensuing emotional states inside you as you try to gather your thoughts. I soon realised that this is a very reductive account, of course, and in a true Deleuzian spirit I was not ready to reduce affect to such emotional responses. Although, to be fair, a hangover is a true state of affect, far from emotion, in its uncontrollability and deep embodiment.

What the conference did offer in addition to good social fun was a range of presentations on a topic that is defined in so many differing ways; whether by conflating it with “emotions” and “feelings”, or by trying to carve out the level of affect as a pre-conscious one; from a wide range of topics on affective labour (Melissa Gregg did a keynote on white collar work) to aesthetic capitalism (Patricia Clough for example), which in a more Deleuzian spirit insisted on the non-representational. (If the occasional, affective reader is interested in a short but well-summarizing account of differing notions of affect to guide his/her feelings about the topic, have a look at Andrew Murphie’s fine blog posting here – good theory topped up with a cute kitty.)

My take was to emphasise the non-organic affects inherent in technology — more specifically software, which I read through a Spinozian-Uexküllian lens as a forcefield of relationality. Drawing on, for example, Casey Alt’s forthcoming chapter in Media Archaeologies (coming out later this year/early next year), I concluded with object-oriented programming as a good example of how affects can be read as part of software as well, so that the technical specificity of our software-embedded culture reaches out to other levels. Affects are not states of things, but the modes in which things reach out to each other — and are defined by those reachings out, i.e. relations. I was specifically amused that I could throw in a one-liner of “not really being interested in humans anyway” — even better would have been “I don’t get humans or emotions”, but I shall leave that for another public talk. “I don’t do emotions” is another of my favourites, which will end up either on a t-shirt or in an academic paper.

The presentation was a modified version of a chapter that is just out in Simon O’Sullivan and Stephen Zepke’s Deleuze and Contemporary Art book, even if in that chapter the focus is more on net and software art. I am going to give the same paper at the Amsterdam Deleuze conference, but as a teaser to the actual written chapter, here is the beginning of that text from the book…

1. Art of the Imperceptible

In a Deleuze-Guattarian sense, we can appreciate the idea of software art as the art of the imperceptible. Instead of representational visual identities, a politics of the art of the imperceptible can be elaborated in terms of affects, sensations, relations and forces (see Grosz). Such notions are primarily non-human and exceed the modes of organisation and recognition of the human being, whilst addressing themselves to the element of becoming within the latter. Such notions, which involve both the incorporeal (the ephemeral nature of the event as a temporal unfolding instead of a stable spatial identity) and the material (as an intensive differentiation that stems from the virtual principle of creativity of matter), incorporate ‘the imperceptible’ as a futurity that escapes recognition. In terms of software, this reference to non-human forces and to imperceptibility is relevant on at least two levels. Firstly, software is not (solely) visual and representational, but works through a logic of translation. But what is translated (or transposed) is not content, but intensities, information that individuates and in-forms agency; software is a translation between the (potentially) visual interface, the source code and the machinic processes at the core of any computer. Secondly, software art is often not even recognized as ‘art’ but is defined more by the difficulty of pinning it down as a social and cultural practice. To put it bluntly, quite often what could be called software art is reduced to processes such as sabotage, illegal software actions, crime or pure vandalism. It is instructive in this respect that in the archives of the Runme.org software art repository the categories contain fewer references to traditional terms of aesthetics than to ‘appropriation and plagiarism’, ‘dysfunctionality’, ‘illicit software’ and ‘denial of service’, for example. 
One subcategory, ‘obfuscation’, seems to sum up many of the wider implications of software art as resisting identification.[i]

However, this variety of terms doesn’t stem from a merely deconstructionist desire to unravel the political logic of software expression, or from the archivist’s nightmare à la Foucault/Borges, but from a poetics of potentiality, as Matthew Fuller (2003: 61) has called it. This is evident in projects like the I/O/D Webstalker browser and other software art projects. Such a summoning of potentiality refers to the way experimental software is a creation of the world in an ontogenetic sense. Art becomes ‘not-just-art’ in its wild (but rigorously methodological) dispersal across a whole media-ecology. Indeed, it partly gathers its strength from the imperceptibility so crucial for a post-representational logic of resistance. As writers such as Florian Cramer and Inke Arns have noted, software art can be seen as a tactical move through which to highlight political contexts, or subtexts, of ‘seemingly neutral technical commands.’ (Arns, 3)

Arns’ text highlights the politics of software and its experimental and non-pragmatic nature, and resonates with what I outline here. Nevertheless, I want to transport these art practices into another philosophical context, more closely tuned with Deleuze, and others able to contribute to thinking the intensive relations and dimensions of technology such as Simondon, Spinoza and von Uexküll. To this end I will contextualise some Deleuzian notions in the practices and projects of software and net art through thinking code not only as the stratification of reality and of its molecular tendencies but as an ethological experimentation with the order-words that execute and command.

The Google-Will-Eat-Itself project (released 2005) is exemplary of such creative dimensions of software art. Authored by Ubermorgen.com (featuring Alessandro Ludovico vs. Paolo Cirio), the project is a parasitic tapping into the logic of Google and especially its AdSense program. By setting up spoof AdSense accounts the project is able to collect micropayments from the Google corporation and use that money to buy Google shares – a cannibalistic eating of Google by itself. At the time of writing, the project estimated that it will take 202 345 117 years until GWEI fully owns Google. The project works as a bizarre intervention into the logic of software advertisements and the new media economy. It resides somewhere on the border of sabotage and illegal action – or what Google in their letter to the artists called ‘invalid clicks.’ Imperceptibility is the general requirement for the success of the project as it tries to use the software and business logic of the corporation by piggy-backing on the latter’s modus operandi.

What is interesting here is that in addition to being a tactic in some software art projects, the culture of software in current network society can be characterised by a logic of imperceptibility. Although this logic has been cynically described as ‘what you don’t see is what you get’, it is an important characteristic identified by writers such as Friedrich Kittler. Code is imperceptible in the phenomenological sense of evading the human sensorium, but also in the political and economic sense of being guarded against the end user (even though this has been changing with the move towards more supposedly open systems). Large and pervasive software systems like Google are imperceptible in their code but also in the complexity of the relations they establish (which is what GWEI aims to tap into). Furthermore, as the logic of identification becomes a more pervasive strategy contributing to this diagram of control, imperceptibility can be seen as one crucial mode of experimental and tactical projects. Indeed, resistance works immanently to the diagram of power and instead of refusing its strategies, it adopts them as part of its tactics. Here, the imperceptibility of artistic projects can be seen as resonating with the micropolitical mode of disappearance and what Galloway and Thacker call ‘tactics of non-existence’ (135-136). Not being identified as a stable object or an institutional practice is one way of creating vacuoles of non-communication through a camouflage of sorts. Escaping detection and surveillance becomes the necessary prerequisite for various guerrilla-like actions that stay ‘off the radar.’

Culture Synchronised: Remixes with Nick Cook and Eclectic Method


The room Hel 252 is starting to have good karma as the remix-class room at Anglia Ruskin. Not because it’s equipped with computers, editing equipment or such, but because it is starting to have a good track record as the room where we have now hosted both the screening and discussion of RIP: Remix Manifesto with Brett Gaylor, and now also a discussion of the work of Eclectic Method, one of the best-known remix acts.

Geoff Gamlen, a founding member of Eclectic Method, visited us in the context of Professor Nicholas Cook’s talk on musical multimedia. Professor Cook continued themes that were addressed already in his 1998 book on the topic and now followed up in the form of a new book project that
deals with performance. With a full room and an excited audience, Cook gave a strong presentation on hot topics in musicology and the need to move to new areas of investigation, as well as showing how such ideas relate to the wider field of cultural production in the digital age. Remix culture is not restricted to music, but such examples as Eclectic Method (or, we could as well mention, Girl Talk) are emblematic of software-driven cultural production that ties contemporary culture to early 20th century avant-garde art practices, and shows how the political economy of copyright/copyleft, participatory and collaborative modes of sharing and producing, and the aesthetics of image/sound collages and synchronisations are all involved in this wider musical assemblage. What Cook argued in terms of musicological approaches, which in my own words suggest “the primacy of variation”, was apt. Such performance practices as Eclectic Method’s are important in trying to come up with an up-to-date understanding of what performance is, what the author is, and how performance practices relate to wider media cultural changes that are as much about the sonic as they are about pop cultural aesthetics — hence the examples on Tarantino were apt in the presentation. We need to move on (whether in terms of the epistemic frameworks or the legal ones) from the 19th century romantic notion of the Creator as the source of the artwork to what I would suggest (in a kind of Henry Jenkins sort of way) is an alternative 19th century of folk cultures, where sharing and participating were the way culture was distributed, and in continuous variation. 
Despite the increasing number of skeptics from Andrew Keen to Jaron Lanier (and, in a much more interesting fashion, Dmytri Kleiner), who rightly remind us that Web 2.0 is not only a celebration of amateur creativity and sharing but a business strategy that compiles free labour through website bottlenecks into privatized value, I would suggest that there is a lot to learn from such practices of creation as remixing and their implications for a theoretical understanding of musical and media performance.

Eclectic Method’s work…ranges from political remixes…

…to pop/rock culture synchronisations…

Security and Self-regulation in Software Visual Culture


“Not long ago it would have been an absolutely absurd action to purchase a television or acquire a computer software to intentionally disable its capabilities, whereas today’s media technology is marketed for what it does not contain and what it will not deliver.” The basic argument in Raiford Guins’ Edited Clean Version is so striking in its simplicity and aptness that my copy of the book is now filled with exclamation marks and other scribblings in the margins that shout how much I loved it. At times dense but elegantly written, the book tempts me to say that this is the direction media studies should be going, if that did not sound a bit too grand (suitable for a blurb on the back cover perhaps!).

I shall not do a full-fledged review of the book but just flag that it’s an important study for anyone who wants to understand processes of censorship, surveillance and control. Guins starts from a theoretical set that contains Foucault’s governmentality, Kittler’s materialism and Deleuze’s notion of control, but breathes concrete specificity into the latter, making it really a wonderful addition to media studies literature on contemporary culture. At times perhaps a bit repetitive, it nevertheless delivers a strong sense of how power works through control, which works through technological assemblages that organize time, spatiality and desire. For Guins, media is security (even if embedding Foucault’s writings on security would have been spot on in this context) — entertainment media is so infiltrated by the logic of blocking, filtering, sanitizing, cleaning and patching (all chapters in the book) that I might even have to rethink my own ideas of seeing media technologies as Spinozian bodies defined by what they can do… Although, in a Deleuzian fashion, control works through enabling. In this case, it enables choice (even if reducing freedom to a selection from pre-defined, preprogrammed articulations). Control is the highway on which you are free to drive as far as you like, and to many places, but it still guides you to destinations. Control works through destinations, addresses — and incidentally, it’s addresses that structure, for example, Internet-“space”.

Guins demonstrates how it is still the family that is a focal point of media, but through new techniques and technologies. Software is at the centre of this regime – software such as the V-Chip that helps parents plan and govern their children’s TV consumption. Guins writes: “The embedding of the V-Chip within television manifests a new visual protocol; it makes visible the positive effects of television that it enables: choice, self-regulation, interaction, safe images, and security.” What is exciting about this work is how it deals with such hugely important political themes and logics of control, yet is able to do so immanently with the technological platform he is talking about. Highly recommended, and thumbs up.