Archive
>Nick Cook talk on Beyond reference: Eclectic Method’s music for the eyes
>Another ArcDigital and CoDE talk coming up…
Professor Nicholas Cook, Cambridge University:
Beyond reference: Eclectic Method’s music for the eyes
Date: Tuesday, 11 May 2010
Time: 17:00 – 18:15
Location: Anglia Ruskin University, East Road, Cambridge, room Helmore 252
Screen media genres from Fantasia (1940) to the music video of half a century later extended the boundaries of music by bringing moving images within the purview of musical organisation: the visuals of rap videos, for example, are in essence just another set of musical parameters, bringing their own connotations into play within the semantic mix in precisely the same way as do more traditional musical parameters. But in the last two decades digital technology has taken such musicalisation of the visible to a new level, with the development of integrated software tools for the editing and manipulation of sounds and images. In this paper I illustrate these developments through the work of the UK-born but US-based remix trio Eclectic Method, focussing in particular on the interaction between their multimedia compositional procedures and the complex chains of reference that result, in particular, from their film mashups.
Professor Nicholas Cook is currently Professor of Music at the University of Cambridge, where he is a Fellow of Darwin College. Previously, he was Professorial Research Fellow at Royal Holloway, University of London, where he directed the AHRC Research Centre for the History and Analysis of Recorded Music (CHARM). He has also taught at the University of Hong Kong, University of Sydney, and University of Southampton, where he served as Dean of Arts.
He is a former editor of the Journal of the Royal Musical Association and was elected a Fellow of the British Academy in 2001.
http://en.wikipedia.org/wiki/Nicholas_Cook
The talk is organized by the Cultures of the Digital Economy Institute at Anglia Ruskin University and the Anglia Research Centre in Digital Culture (ArcDigital).
The talk is free and open for all to attend.
>Does Software have Affects, or, What Can a Digital Body of Code Do?
>I am going to attach here an abstract I submitted today for the Deleuze Studies conference in Amsterdam. It's something I did for a book coming out soonish on Deleuze and Contemporary Art:
Can software as a non-human constellation be said to have “affects”? The talk argues that just as we need a mapping of the various affects of organic bodies-in-relation in order to understand the modes of control, power and production in the age of networks, we need a mapping of the biopolitics of software and code too. If we adopt a Deleuze-Spinozian approach to software, we can shift focus from the body of code as a collection of algorithms to bodies interacting and affecting each other. What defines a computational event? The affects it is capable of. In a parallel sense to how the tick is defined through its affects and potentials for interaction, software is not only a stable body of code but an affordance, an affect, a potentiality for entering into relations. This marks a move from the metaphoric 1990s cyberdiscourse that adopted Deleuzian terms like the rhizome to a different regime of critique, one that works through immanent critique on the level of software. The talk works through software art to demonstrate the potentials of thinking software not as an abstract piece of information but as processes of individuation (Simondon) and interaction (Deleuze-Spinoza). A look at software practices and discourses around net art and related fields offers a way of approaching the language of software as a stuttering of a kind (Jaromil). Here dysfunctionalities turn into tactical machines that reveal the complex networks in which software is embedded. Software spreads and connects into the economics, politics and logics of control society as an immanent force of information understood in the Simondonian sense. The affects of software do not interact solely on the level of programming, but act in multiscalar ecologies of media which are harnessed in various hacktivist and artist discourses concerning the politics of the Internet and software.
Encountering the Sonicity installation project today (only as a website, though), I continued thinking about this. The project turns light, humidity and other environmental data, including the presence of people, into input for algorithmic sonification through Max/MSP, and further into visualisation.
What intrigues me in this is the process of transformation and transposition between various sensory regimes: translations from input into data and further into sound, image, etc. This somehow connects for me to considerations of affect (bodies in relationality, a variety of heterogeneous bodies) as well as to the materiality of code and data (especially their becoming sonorous and visible, and hence touching human bodies directly too). “The changing data is what affects what you see and experience. Live XML feeds are coming from the real time sensors… The sensors monitor temperature, sounds, noise, light, vibration, humidity, and gps. The sensor network takes a constant stream of data which is published onto an online environment where each different interface makes representations of the XML.” (Sonicity website)
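Purely to make that pipeline concrete, here is a minimal sketch of such a transposition from sensor readings to sound parameters, written in Python rather than Max/MSP; the feed structure, tag names and mapping ranges are my own assumptions for illustration, not Sonicity's actual XML or patches.

```python
# A minimal sketch of sensor-data sonification: environmental readings arrive
# as XML and are transposed into sound parameters (pitch, loudness).
# The feed structure and tag names below are hypothetical, not Sonicity's XML.
import xml.etree.ElementTree as ET

SAMPLE_FEED = """
<sensors>
  <reading type="temperature" value="21.5" min="0" max="40"/>
  <reading type="light"       value="640"  min="0" max="1000"/>
  <reading type="humidity"    value="55"   min="0" max="100"/>
</sensors>
"""

def scale(value, lo, hi, out_lo, out_hi):
    """Linearly map a sensor value from its own range onto a sound-parameter range."""
    span = (hi - lo) or 1.0
    return out_lo + (value - lo) / span * (out_hi - out_lo)

def sonify(xml_text):
    """Turn each reading into a (sensor, frequency in Hz, amplitude 0-1) triple."""
    for r in ET.fromstring(xml_text).findall("reading"):
        v, lo, hi = (float(r.get(k)) for k in ("value", "min", "max"))
        freq = scale(v, lo, hi, 110.0, 880.0)   # pitch rises with the reading
        amp = scale(v, lo, hi, 0.1, 0.9)        # louder when the value is high
        yield r.get("type"), freq, amp

if __name__ == "__main__":
    for sensor, freq, amp in sonify(SAMPLE_FEED):
        print(f"{sensor:12s} -> {freq:6.1f} Hz at amplitude {amp:.2f}")
```

The point of the sketch is simply that the "translation" between regimes is a chain of parameter mappings: the same stream of values can just as easily drive a visualisation as an oscillator.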
Naturally, such transpositions could be connected to earlier avant-garde synaesthesia; László Moholy-Nagy's explorations into the interconnectedness of sound with visual regimes are exemplary here (see Douglas Kahn's Noise Water Meat, pp. 92-93), especially when synaesthesia is understood not only as an aesthetic category but as irreducibly laboratorial. Such synthetic processes, which make us think about the interrelations of heterogeneous sensations and their sources, work through the new technologies and sciences of sound and perception. Indeed, whether code/software has affects is no sillier a question than “I wonder how your nose will sound” (Moholy-Nagy).
Operational Management of Life

Management of life, in terms of processes, decisions and consequences, is probably an emblematic part of life in post-industrial societies. Increasingly, such management does not take place only on the level of individuality, but of dividuality: managing the data clouds, traces, and avataric transpositions of subjectivity in online environments. This is the context in which J. Nathan Matias' talk on operational media design made sense (among other contexts, of course), and it provided an apt and exciting example of how, through media design, we are able to understand wider social processes.
Nathan addressed “operationalisation” as a trend that can be incorporated into various platforms, from SMS services to online self-management tools. More concretely, “operational media” can be seen as a management, filtering and decision mechanism that can be built into services and apps of various kinds. Nathan's talk moved from military contexts of “command and control” (the operationalisation of strategy into tactical operations) to such apps as Pepsi's blatantly sexist Amp Up Before You Score, which allowed the (male) user to find “correct” and functional responses to a variety of female types. In addition, Nathan's talk introduced the general idea of computer-assisted information retrieval and management, which to me was a great way of branding a variety of trends as “operational media”. He talked about visualisation of data, augmented reality, filtering of data, expert-, crowd- and computer-assisted information gathering, and a variety of other contexts in which the idea works.
“Should I eat this croissant?”, considered in terms of its calories, the workout time needed to burn it off again, the time available, and so on, is one example of the operationalisation of decisions in post-Fordist societies, where high-tech mobile tools tap into both work and leisure activities.
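A toy sketch of such a decision helper, with entirely made-up calorie and burn-rate figures, gives a sense of how trivially this kind of "operational" logic can be encoded:

```python
# A toy "should I eat this croissant?" helper. The calorie count and the
# burn rate are illustrative assumptions, not nutritional or physiological facts.
CROISSANT_KCAL = 300        # assumed energy content of one croissant
BURN_KCAL_PER_MIN = 10.0    # assumed calories burned per minute of jogging

def should_i_eat_it(minutes_free_today: float) -> str:
    """Operationalise the decision: eat it only if the workout fits the schedule."""
    minutes_needed = CROISSANT_KCAL / BURN_KCAL_PER_MIN
    if minutes_needed <= minutes_free_today:
        return f"Eat it: {minutes_needed:.0f} min of jogging fits your free time."
    return f"Skip it: you'd need {minutes_needed:.0f} min but only have {minutes_free_today:.0f}."

print(should_i_eat_it(45))
print(should_i_eat_it(20))
```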
Another example is the service offered by Nathan's employer KGB (not the spies, but the Knowledge Generation Bureau; see their recent Super Bowl ad here). The KGB service is one example of mobile-based operational services which, within the character space of an SMS, try to provide accurate answers to specific questions and hence differ from e.g. search engines.
Of course, one could start to contextualise “operational media” from a critical theory perspective. Is it a form of behaviourism enhanced by digital apps, one that does not only assume but strengthens assumptions about the possibility of streamlining complex human actions? Is it a mode of media design that further distances the management of life by delegating it to external services? Is it hence a form of biopower of a commercial kind that ties in with various processes, from the physiological to the cultural (such as labour), and provides its design solutions for them? In any case, Nathan's expertise in this field made for a very enjoyable talk, and a good demonstration of a scholar/designer working in software studies.
Operational Media: Functional Design Trends Online (guest talk)
Operational Media: Functional Design Trends Online
Tuesday, February 16, 17.00-18.30, Helmore 252 at Anglia Ruskin, East Road, Cambridge
Two prominent visions have guided the development of Internet technology from its beginning: the never-ending information space of creativity and information; and the networked tool for action. Now that markets for media production and search are saturated and stalling, second generation web tech has shifted focus to media that helps people make decisions and get things done. This lecture provides an introduction to key issues in the information design and software engineering of operational media.
Bio: J. Nathan Matias is a software engineer and humanities academic based in Cambridge, UK. His work focuses on enhancing human capabilities and understanding with digital media. Recent work has included digital history exhibits, work in online documentary, research on visual collaboration, and a visual knowledge startup. He currently spends half of his time as a software engineer on SMS information services for the Knowledge Generation Bureau, and half on digital media projects.
All welcome!