Archive
Antikythera
The new Antikythera program headed by Benjamin Bratton launches, and I am happy to be involved as affiliated faculty.
It is a “program reorienting planetary computation as a philosophical, technological, and geopolitical force,” and you can check out the website for details as well as how to apply. The selected studio participants (Los Angeles) will be fully supported with a housing provision and a monthly stipend of 4,000 USD. The program also covers additional research and field trips.
Astronomical imaging to operational images
A short excerpt from the forthcoming Operational Images book (University of Minnesota Press, 2023) on early 20th-century astrophotography, which offers a glimpse into an alternative lineage of operational images, planetary computation, and data/measurement.
“However, discussing Leavitt gives us insight into the work of remote sensing and images that become operationalized for data analysis, in addition to dispelling the often overly male-inventor and scientist-focused narratives (including those concerning operational images). This applies to many of the other contributions too: the photographic plate collection becomes the scaffolding for advanced work in spectral classification and cataloging (Annie Jump Cannon) and remote sensing (calculation of “chemical composition of stars” in Cecilia Payne-Gaposchkin’s work), alongside the Leavitt Law as an example of operational images, too: “a tool to calculate distance in space with the use of Cepheid variable fluctuations.” Leavitt’s and others’ contribution thus is not so much on photography as a particular technology or genre of images but as the basis for an analysis of data and an extraction of features that become significant for questions of remote sensing. Such work becomes part of the lineage of operational images that are less interesting as images on their own but rather as part of a broader infrastructure of skills, labor, techniques, and technologies, and how institutions assemble images for their particular needs and uses.”
[…]
In our astronomical example, something shifted when light was observed, recorded, and sent across a geographical distance—perhaps a lot of things transformed, from the planes of observation and the appreciation of high-altitude air and clear skies to the photographic exposure and the plates that became one link in the chain along the way, before Leavitt, Cannon, and others interpreted and synthesized (computed) these into the pithy but important results that ensued: “A straight line can readily be drawn among each of the two series of points corresponding to maxima and minima, thus showing that there is a simple relation between the brightness of the variables and their periods. The logarithm of the period increases by about 0.48 for each increase of one magnitude in brightness.”

Plate 1 and Figure 1 from Leavitt and Pickering’s 1912 article “Periods of 25 Variable Stars in the Small Magellanic Cloud.”
Where is the image in this diagram, in such a description? On the photographic plates or in their analysis, in the moments of exposure in the Peruvian landscapes, in the logistics of transporting those images to Cambridge, in the trained analysis of composite photographs, or somewhere else along the way of transforming light? Or perhaps the notion of the operational ties together a multitude of such material events, sites, and their abstractions, and assembles them into a useful notion of the operational image: a term invented much later and for a different purpose, but one that might itself become useful for speaking of infrastructure, logistics, and images that transform from visual to invisual, from ways of seeing to ways of calculating.
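To make concrete how the Leavitt Law became an operational tool rather than an image to be looked at, the sketch below turns a Cepheid's pulsation period into a distance estimate. This is a hedged illustration: the slope and zero point are approximate modern V-band calibrations, not Leavitt's original 1912 relation, which was expressed only in apparent magnitudes, and the function name is mine.

```python
import math

def cepheid_distance_pc(period_days, apparent_mag,
                        slope=-2.43, zero_point=-4.05):
    """Estimate the distance to a Cepheid variable from the Leavitt Law.

    The slope and zero point are illustrative modern calibrated values
    (approximate V-band figures), standing in for the calibrations that
    later astronomers added to Leavitt's period-luminosity relation.
    """
    # Period-luminosity relation: absolute magnitude from log period
    abs_mag = slope * (math.log10(period_days) - 1.0) + zero_point
    # Distance modulus: m - M = 5 * log10(d / 10 pc)
    return 10 ** ((apparent_mag - abs_mag + 5) / 5)
```

The point of the exercise is the chain itself: a fluctuation on a photographic plate becomes a period, the period a magnitude, the magnitude a distance — the image is operationalized as measurement.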
Machine Learned Futures
Abelardo Gil-Fournier and I are writing a text or two on questions of temporality in contemporary visual culture. Our specific angle is on (visual) forms of prediction and forecasting as they emerge in machine learning: planetary surface changes, traffic and autonomous cars, etc. Here’s the first bit of an article on the topic (forthcoming, we hope, in both German and English).
“’Visual hallucination of probable events’, or, on environments of images and machine learning”
I Introduction
Contemporary images come in many forms but also, importantly, in many times. Screens, interfaces, monitors, sensors and many other devices that are part of the infrastructure of knowledge build up many forms of data visualisation in so-called real time. While data visualisation might not be that new a technical form of organising information as images, it takes a particularly intensive temporal turn with networked data, as has been discussed for example in contexts of financial speculation.[1] At the same time, these imaging devices are part of an infrastructure that does not merely observe the microtemporal moment of the “real” but unfolds in the now-moment. In terms of geographical, geological and, broadly speaking, environmental monitoring, the now-moment expands into near-future scenarios where other aspects, including the imaginary, are at play. Imaging becomes a form of nowcasting, exposing the importance of understanding change changing.
Here one thinks of Paul Virilio and how “environment control” functions through the photographic technical image. In Virilio’s narrative, the connection of light (exposure), time and space is bundled up as part of the general argument about the disappearance of the spatio-temporal coordinates of the external world. From real space we move to the ‘real-time’ interface[2] and to an analysis of how visual management detaches from the light of the sun, the time of the seasons, and the longue durée of planetary qualitative time, toward the internal mechanisms of calculation that pertain to electric and electronic light. Hence the captured photographic image prescribes for Virilio the exposure of the world: it is an intake of time and an intake of light. Operating on the world as active optics, these intakes then become the defining temporal frame for how environments are framed and managed through operational images, to use Harun Farocki’s term, which then operationalize how we see geographic spaces too. The time of photographic development (Niépce), the “cinematographic resolution of movement” (Lumière), or for that matter the “videographic high definition of a ‘real-time’ representation of appearances”[3] are all part of Virilio’s broad chronology of time in technical media culture.
But what is at best implied in this cartography of active optics is attention to the mobilization of time as predictions and forecasts. Operations of time and the production of times move from meteorological forecasting to computer models, and from computer models to a plethora of machine learning techniques that have become another site of transformation of what we used to call photography. Joanna Zylinska names this generative life of photography its nonhuman realm of operations, one that rearranges the image further from any historical legacy of anthropocentrism to include a variety of other forms of action, representation and temporality.[4] These techniques of time and images push further what counts as operatively real, and what forms of technically induced hallucination – or, in short, in the context of this paper, machine learning – are part of current forms of the production of information.
In information society and digital culture, images persist. They persist as markers of time in several senses that refer not only to what the image records – the photographic indexicality of a time passed, or the documentary status of images as used in various administrative and other contexts – but also to what it predicts. Techniques of machine learning are one central aspect of the reformulation of images and their uses in contemporary culture: from video prediction of the complexity of multiple moving objects we call traffic (cars, pedestrians, etc.) to satellite imagery monitoring agricultural crop development and forest change. Such techniques have become one central example of how the earth’s geological and geographical changes become understood through algorithmic time, and also of how, for instance, very rapidly changing vehicle traffic is treated in the same way as the much slower durations of crops on the earth’s surface. In all cases, a key aspect is the ability to perceive potential futures and fold them into real-time decision-making mechanisms.
Computational microtemporality takes a futuristic turn; algorithmic processes of mobilizing datasets in machine learning become activated in different institutional contexts as scenarios, predictions and projections. Images run ahead of their own time as future-producing techniques.
Our article is interested in a distinct technique of imaging that speaks to the technical forms of time-critical images: Next Frame Prediction and the forms of predictive imaging employed in contemporary environmental images (such as in agriculture and climate research). While questions about the “geopolitics of planetary modification”[5] have become a central aspect of how we think of the ontologies of materiality and the Earth, as Kathryn Yusoff has demonstrated, we are interested in how these materialities are also produced on the level of images.
Real-time data processing renders the Earth not as a single-view entity but as an intensively mapped set of relations that unfold in real-time data visualisations; this becomes a central way of continuing earlier, more symbolic forms of imagery such as the Blue Marble.[6] Perhaps not deep time in the strictest geological terms, agricultural and other related environmental and geographical imaging is nevertheless one central way of understanding the visual culture of computational images that do not only record and represent but predict and project as their modus operandi.
This text will focus on the temporality of the image that is part of these techniques, from the microtemporal operation of Next Frame Prediction to how it resonates with contemporary space imaging practices. While the article is itself part of a larger project in which we elaborate the visual culture of environmental imaging with theoretical humanities and artistic research methods, we are unable in this restricted space to engage with the multiple contexts of this aspect of visual culture. Hence we will focus on the question of computational microtime, the visualized and predicted Earth times, and the hinge at the centre of this: the predicted time that comes out as an image. The various chrono-techniques[7] that have entered the vocabulary of media studies are particularly apt in offering a cartography of the analytical procedures at the back of producing time. Hence the issue is not only what temporal processes are embedded in media technological operations, but what sounds like a merely tautological statement: what times are responsible for a production of time. What times of calculation produce imagined futures, statistically viable cases, predicted worlds? In other words, what microtemporal times are, in our case, at the back of a sense of futurity that is conditioned in a calculational, software-based and dataset-determined system?
[1] Sean Cubitt, Three Geomedia, in: Ctrl-Z 7, 2017.
[2] Paul Virilio, Polar Inertia, London/Thousand Oaks/New Delhi 2000, p. 55.
[3] Ibid., p. 61.
[4] See Joanna Zylinska, Nonhuman Photography, Cambridge (MA) 2017.
[5] Kathryn Yusoff, The Geoengine: Geoengineering and the Geopolitics of Planetary Modification, in: Environment and Planning A 45, 2013, pp. 2799–2808.
[6] See also Benjamin Bratton, What We Do Is Secrete: On Virilio, Planetarity and Data Visualisation, in: John Armitage/Ryan Bishop (eds.), Virilio and Visual Culture, Edinburgh 2013, pp. 180–206, here pp. 200–203.
[7] Wolfgang Ernst, Chronopoetics: The Temporal Being and Operativity of Technological Media, London/New York 2016.
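As a minimal numerical sketch of what Next Frame Prediction means as an operation: the code below predicts the next image in a sequence by per-pixel linear extrapolation. This is not the learned models the excerpt above discusses (those draw their statistics from large training datasets via neural networks); it is only a naive baseline that makes the temporal logic visible, and the function name is illustrative.

```python
import numpy as np

def predict_next_frame(frames):
    """Naive next-frame prediction by per-pixel linear extrapolation.

    `frames` is an array of shape (T, H, W): a short time-lapse.
    Learned video-prediction models replace this extrapolation with
    statistics drawn from training data; the baseline only shows the
    temporal operation: the output image runs one step ahead of the
    recorded sequence.
    """
    frames = np.asarray(frames, dtype=float)
    # Frame-to-frame change estimated from the last two observations
    delta = frames[-1] - frames[-2]
    # Extrapolate that change one step into the future
    return frames[-1] + delta
```

Even in this crude form, the output is an image of a moment that has not (yet) happened — the "predicted time that comes out as an image" discussed above.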
Surface Prediction
I am giving a talk in Paris at the École Normale Supérieure and using it as an opportunity to present some new work. This writing stems from collaborative work with the artist Abelardo Gil-Fournier, with whom I ran a collective workshop at transmediale on Surface Value. The practice-led workshop was set in the context of our larger discussion on surfaces, media and forms of valuation that pertain to both military and civilian spheres of images (such as aerial imaging), continuing it in relation to contemporary forms of machine learning and neural networks that take their data from geographical datasets. Hence we are working on this question of prediction as it pertains to geographical and geological surfaces, and on how these forms of images (from time-lapse to prediction) present a special case both for the financial uses of such predictive services and for their experimental angle as forms of moving image – experimental “video” art on a large scale.
Here’s a further excerpt from the talk that also draws on work by Giuliana Bruno, Lisa Parks, Caren Kaplan, Ryan Bishop and many others:
What I want to extract from the research platform that Gil-Fournier’s work offers are some speculative thoughts. At the basis of this is the idea that we can experiment with the correlation of an “imaged” past (the satellite time-lapses) with a machine-generated “imaged” future, and test how futures work: how do predicted images compare against the historical datasets and time-lapses, presenting their own sort of video of temporal landscapes meant to run just a bit ahead of its time? Naturally, this would easily risk naturalising things that are radically contingent: mining operations, capital investments, urban growth and financial valuations, geopolitical events, and such. But instead of proposing this as naturalisation, it works to expose some of the techniques through which landscapes are flattened into a surface not only of the inscription of data but also of images in movement. Here, the speculative is not some sort of radically distinguished practice that stands out as a unique aberration but increasingly the modus operandi and the new normal of things (Bratton 2016, 2017). What’s interesting is that it spreads out to a variety of fields: the image becomes a speculative one, with interesting implications for how we start to think of video; it is also a financial one, as such data-feed mechanisms are also part of what Cubitt describes as one of the forms of geomedia; and it is about landscapes, as they are part of the longer lineage of how we read them as informational signs.
It’s here that the expanded image of a landscape is also embedded in a machine learning environment, which in turn feeds into financial environments. There are multiple ways in which the ecology of images in machine learning works with time – the form of moving image that is the time-lapse is also faced with the temporal image of predictions. The technical basis of digital video becomes one reference point from which to start unfolding the other sides of AI as machine learning: this is post-digital culture in this sense too, where not only do images of earth surfaces change in view of data analytics, but so do the aesthetic contexts of analysis – namely, moving image and video that feed forward (cf. Hansen).
[Image from Abelardo Gil-Fournier’s workshop materials].
Narratives of a Near Future: Air
In December 2017, I gave one of the invited talks at the Geneva art school, HEAD. Under the main rubric of Narratives of Near Future, we were invited to address the Anthropocene. Mousse magazine wrote a review of the event, and my talk on air (featuring a bit of Talking Heads) is now online and can be found here:
The Mediocene And the Lab Scene
The Mediocene conference takes place later this month, in May, in Weimar. Organized by the IKKM, the conference picks up on the Anthropocene from a specific media-focused vantage point. In the organizers’ words, “The concept of the Mediocene […] sees media and medial processes as epoch-making. As a determining force, they leave their permanent imprint on the world, affecting animate and inanimate nature alike — human existence, technology, society, and the arts as well as the shape, organization and history of the global habitat itself.”
My take draws on our current laboratory project, and below is a short (draft!) text of the beginning of the talk, still in the process of writing and without a full range of references. The idea of the talk is to set up the laboratory as a particular term, an imaginary and a fever around which multiple scales of planetary media come to the fore. It will also discuss topics in the art-and-technology nexus, including briefly the emergence of art labs in the Cold War institutions of technical media (a topic that will be well covered by Ryan Bishop and John Beck in their new work), as well as experimental work in the arts about the lab, including Bureau D’Etudes on the Laboratory Planet and probably such work as Neal White’s on the post-studio. Any further thoughts, tips and ideas are warmly appreciated.
The Lab is the Scene
One could be forgiven for thinking that the world’s nothing but a lab. After the factory’s role as a key site for understanding modernity – the site of production, material transformation, commodity culture, labour relations, pollution and what not – the laboratory seems in some accounts to have taken a similar role. It speaks to a range of topics of media and culture: historically, a central place for inventing and engineering technical media; thematically, one crucial vantage point for the multi-scalar operations that define the tie between the planetary (dis)order and its situated practices. It does, however, come with a legacy that is only partly about the science lab. Indeed, the other important lineage relates to the technology, engineering and design/art labs that throughout the 20th century started to offer a parallel narrative: experimentation, a “demo or die” attitude (at the MIT Media Lab; see for example Halpern 2015), prototyping, and more. Hence this lab story of experimental culture is not restricted to the science lab as if it were a separate entity from the arts and humanities; and in any case, science labs of many kinds have already had their fair share of attention from social scientists and humanities scholars, even post-studio artists, up to the present day with the continuing enthusiasm for CERN residencies.
The proliferation of laboratories outside the strict confines of the science lab seems to have taken place with the entry of a range of labs of different kinds: design labs, maker labs, hack labs, media archaeology labs, studio-labs, digital humanities labs, critical humanities labs, media labs and critical media labs – and then fashion labs, brew labs, coffee labs, gadget labs, creativity labs; the list goes on. The usual thought would be that this is part of the metaphoric inflation of the meaning, site and scientificity of the laboratory that brands a particular attitude of postmodern culture. Of course, as Henning Schmidgen, echoing the likes of Peter Galison and others, points out, “the laboratory is undergoing a process of dissolution and dispersal,” with massive distributed networks constituting the laboratory now (think of the Human Genome Project, think of CERN); but this dissolution and dispersal happens on other levels too, as the examples pertaining to humanities and media labs demonstrate. There’s almost nothing that could not be a lab. But perhaps the lab is itself a symptom more than an answer, and as such a trigger for considering issues of the mediocene in art and technology: issues such as scales of data, infrastructure and different methodologies. It becomes a rather fluidly moving term, not merely designating a particular specialist place but also a particular project about the lab imaginary. Here, the notion of the project is crucial due to its future-oriented sense.
A focus only on the most recent developments would miss how the laboratory was a contested term from early on – especially in the pre-scientific laboratories and their heterogeneous sets of spaces and practices, which resist being pinned down as mere steps towards the perfection of a form – and the problem with the term persisted later, during the emergence of the science laboratory.
As historians of science have noted, the lab as elaboratory was one formative way of understanding what went on in the early modern spaces preceding labs. Elaborating materials for medicine and chemistry, working with a variety of materials in ways that were not merely under human control: the elaboratory was a place where things were let go their own way, even if offered a stage by way of thermomedia control (see Nicole Starosielski’s work on temperatures and media) that allowed the transformations to be accelerated from earth time to lab time. Interestingly enough, such a broader understanding of labs and elaboration in relation to natural formations persisted; Sir Humphry Davy voiced in 1815 that “the soil is the laboratory in which the food is prepared.” In 1860, in a very different scientific context, that of the physical geography of the sea, Matthew Fontaine Maury spoke of the sea as “a laboratory in which wonders by processes the most exquisite are continually going on”, even as a sort of model for understanding atmospheric movements.
Indeed, rewind from our current laboratory fever a bit more than 100 years, and shift the focus to Bangor in Wales and Sir William Thomson, 1st Baron Kelvin – of, indeed, the kelvin fame of temperature measurement, but also someone who had worked on maritime compasses, among many other things crucial for planetary media. Thomson, opening the new physics and chemistry labs at the University of Bangor in 1885, seemed to offer a rather extended way of understanding the topic. Let’s quote him:
“The laboratory of a scientific man is his place of work. The laboratory of the geologist and naturalist is the face of this beautiful world. The geologist’s laboratory is the mountain, the ravine, and the seashore. The naturalist and the botanist go to foreign lands, to study the wonders of nature, and describe and classify the results of their observations.”
Thomson was no mere romantic fool, of course, but a man of modern science. He was not haunted by a romantic longing for a past of gentleman travels across the planet observing the beauty of nature; he was more of a pragmatist. Field research, too, must be tightly linked to the possibilities of the lab, its equipment and its techniques, so as to ensure a tight connection between the insides and the outsides (Gooday 790). A properly equipped lab is what ensures that the field itself becomes an extended part of the technical apparatus, a laboratory that spans the territories of the planet. A lab is where scales meet, to recall the ways in which Bruno Latour spoke of Louis Pasteur, for example.
For a long period, medicine, chemistry and metallurgy, and then of course physics, remained the central disciplines of the laboratory (see Gooday; Schmidgen 2011). The 20th century brought technological laboratories onto the scene: engineering and materials labs and electronics labs, among the variety of other centralised facilities that systematised the production of engineered culture. Much before there were things called media labs, labs were essential for media to become what they became, in relation to the actual apparatuses as well as their impact on the thresholds of perception. Labs were one sort of condition for much of the work that came to be called media. Many of the engineering labs were institutions central to the backbone of various national and international infrastructures, such as Bell Labs; here one thinks also of the centrality of “innovation labs” from Menlo Park onwards, and of course of the art and technology labs of the Cold War, themselves the grounding of so much of what we now call “media arts”, where the particular techniques of speculative, experimental use meet the other sort of speculation attached to forms of value creation.
The lab as place, invention and extension of “media” is part of the continuum between technological work in labs and artistic practices, as one background to the notion of experimentation. The media and arts approaches produce a particular discourse, a particular stance on the experiment, but also in some cases a corporate take on a speculative mapping of scales that reaches out spatially to planetary infrastructures as much as to local scales, as well as to the future-oriented dimension. Here, I believe, there is a way in which it resonates with the question of the Anthropocene as one of scales that map out the lab as an epistemological and medial arrangement spanning further than its space. This happens both discursively and in terms of its objects of knowledge: emerging from the Cold War-period art-technology labs, or the studio-lab, it also becomes a scene where the continuum between technological culture and its creative practices is put into conversation, creating the particular scene and fantasy of visionary, future-oriented experimental work inventing the media worlds to come. The Mediocene is this particular aesthetic-technological framing of scales (temporal, spatial, potential, not-yet-actualised, speculative), and quite often, in this arts-technology nexus too, it happens through the hinge of the lab. In using the term, as is already clear, I am forced to ignore many current examples that use it in other ways than the ones I will narrate in this talk. The term has multiple uses, and as such my version does not do justice to the full plethora of labs of critical, experimental practice so much as it connects the term to a rather different sort of genealogy. Hence, bear with me as I sketch some ideas.
On Lunacy – a radio piece
I was one of the Nordic artists (even if as a theorist) commissioned for the 2017 Works for Radio for the radio station The Lake. Based in Copenhagen, and transmitting via the Internet, the station asked the four commissioned artists to produce a piece between one and eight minutes in length and to relay the invite to four other Nordic artists.
The works are launched on Saturday 14 January in Copenhagen and can then be listened to as part of the programme flow at http://www.thelakeradio.com/.
You can download my piece, On Lunacy, here.
The original call for artists described the idea behind the commission:
“Radio art as a genre has a long tradition in the European public service institutions. Especially in the 1960s, the different national broadcasters commissioned new works from artists, writers, and composers made specifically for radio. This practice has declined over the years, and in Denmark it is almost lost. As a radio station The Lake wants to revive this tradition! Furthermore we want to bring more sounds into the aether, that are not necessarily music. How can art for radio sound? Through the project Works for Radio, The Lake is commissioning eight new sound pieces from Nordic artists.”
A short context for my piece is described below.
On Lunacy
The piece is a speculative theory performance for radio. Jussi Parikka’s text, read together with artist and researcher Dr Jane Birkin (Southampton), starts with a reference to the German media theorist Walter Benjamin’s (1892-1940) radio play Lichtenberg (1933). On Lunacy is less a commentary on Benjamin’s play than an attempt to bring some of its themes into discussion with contemporary issues of politics, technology and ecology. It starts with the roar of approval at the Tory conference in October 2016, after the prime minister Theresa May dismisses the work of human rights lawyers and activists. This roar is chilling, and it resonates across many countries as a wave of populist, destructive contempt that takes different, varying forms in Europe, the USA, Russia, Turkey, etc.
On Lunacy discusses the media technological conditions of politics of voice and lack of voice, of what is heard and what is too painful to listen to. It enters into a discussion with a range of current debates about media technological transmission and interception, as well as nods to many relevant contexts in the history of radio too – from Orson Welles’ War of the Worlds (1938) to the satellite broadcasts since the 1960s. I also had in mind the various sonic and artistic voices of past decades; from Gil Scott-Heron’s Whitey on the Moon (1970) to the various sounds of Afrofuturism since the 1970s to name some even if they do not feature as explicit references in this work.
Radio is approached not merely as a medium of entertainment but as one of military communication, as well as of the tactics of misinformation, confusion and mind control. Radio persists and is constantly reinvented, and the signal worlds persist in and out of the planet. The narrative trope of the moon and the interplanetary plays a key role in this theoretical voice piece, but it also offers a way to resurface back to contemporary politics, which features the return to mainstream acceptance of xenophobia, racism and the politics of violence against particular ethnicities, as well as the ecocide that haunts the contemporary moment.
On Lunacy ends with the recurring, burning question of politics (that also for example the French philosophers Gilles Deleuze and Felix Guattari posed): why do people desire their own enslavement?
Stephen Cornford offered assistance in editing and post-production of the sound. Thank you also to Ryan Bishop, the co-director of AMT research centre, who offered key thoughts on the cold war contexts of satellites and sound transmission.
The work stems from the new research and practice platform Archaeologies of Media and Technology (AMT) at Winchester School of Art.
Cue in Gil Scott-Heron, Whitey on the Moon.
And the Earth Screamed, Alive
Emma Charles’ exhibition opens in London. It includes a multiscreen version of White Mountain, for which I wrote the text (and performed it live at the Miracle Marathon just recently at the Serpentine in London). Please find more information below. The exhibition runs from 21 October to 12 November, with the PV on 20 October.
South Kiosk is pleased to present And the Earth Screamed, Alive*, a solo exhibition by Emma Charles, featuring a multi-screen expanded installation of her 16mm film White Mountain. This fictional documentary focuses on the Pionen Data Center in Stockholm. In 2008, this former Cold War-era civil defense bunker was redesigned by architect Albert France-Lanord as a data center to house servers for clients, which at one point included Wikileaks and The Pirate Bay. By revealing these unseen spaces and people, Charles’s work explores an understanding of how contemporary life is structured, managed and secured.
Starting by surveying the rough topography of the surrounding Södermalm landscape, Charles gradually pushes beneath the surface, illuminating the ordinarily concealed network infrastructure. As the camera idles on the fluorescent-lit server stacks, issues of privacy, surveillance and digital sovereignty inevitably emanate. Located 30 meters under the granite rocks of Vita Bergen Park in Stockholm, the hydrogen-bomb-proof subterranean hub was constructed with direct references to science fiction films such as Silent Running, and to the classic Ken Adam-designed Bond-villain lairs.
Playing on the science fiction aesthetic, White Mountain uncovers the varying forms of temporality brought about through an exploration of data space and geology. After a summer punctuated by a constant stream of high-profile hacks, the impenetrable steel door and fortified walls of Pionen now seem like outmoded, symbolic defenses, ineffective at curbing the all-pervading data anxiety brought about by the relentless assault of cybercriminals, spammers and clandestine state agents.
South Kiosk has invited Emma Charles for And the Earth Screamed, Alive to transform its space and take the viewer on a journey through the concealed and protected architecture of the data center, via an immersive projection of White Mountain and the display of a further collection of her artwork. This solo presentation focuses on the handling of digital information, the aesthetic that arises from its protection, and the engagement with and critique of these architectures that art can sustain.
For images and further information please contact Toby Bilton info@southkiosk.com
*“And the Earth Screamed, Alive” Jussi Parikka, A Geology of Media, University of Minnesota Press (2015).
The Earth (for the Posthuman Glossary)
I am writing some entries (“Anthropocene”, “Medianatures”, “The Earth”) for the forthcoming Posthuman Glossary, edited by Rosi Braidotti and Maria Hlavajova. The project and some of the entries were the topic of a series of seminars in Utrecht during May and June, and the book is, I believe, expected to be out later in 2016.
Here’s one of the texts, although in draft form (and not copy edited): a short piece on the Earth. Topical, one can say, for many reasons: issues of climate change and disaster, as well as the perhaps related enthusiasm over the discovery of Earth-like planets outside our solar system, a recurring theme in our current public discourse about space and science.
The Earth
The Earth is a planet, about 4.54 billion years old, defined by its geological formations, density, biosphere, hydrosphere, and an atmosphere that sustains life. It is more than a world for humans: an Earth defined by its life-sustaining conditions and its planetary relations (Woodard 2015). On a planetary level, it is one complex dynamic system in which the biosphere, the atmosphere, and many of the geological spheres interact; on an extra-planetary level it is just as dynamic, part of gravitational pull, periodic rotation, cosmic rays, and the radiation of the sun. Buckminster Fuller dubbed it “Spaceship Earth”, marking the speculative beginnings of post-planetary design: ‘We are all astronauts’ (Fuller 1969: 14) who spin through space at 60,000 miles an hour, in the midst of rich non-human life as well as intensive relations to other planets and the sun.
The Earth is also a complex ecosystem in which one should never mistake humans for the centre of action; they are merely one part in a larger loop of processes. One way to refer to it is as a ‘holarchy arisen from the self-induced synergy of combination, interfacing, and recombination’ (Margulis and Sagan 1995: 18).
Besides the life of the organic and the inorganic spheres, it is also a mediasphere, by which we don’t have to think only of the Jesuit fantasies of the immaterial reality of cognition, as Teilhard de Chardin did (or what cyberculture later rehashed with a dose of Silicon Valley excitement), but of the different visualisation systems that give us operational representations of the planet. This is the view of the Earth since the Vostok 1 spaceflight in 1961: the first human orbiting the planet, able to describe the ground-detached view. It is the Earth that features on the cover of the first Whole Earth Catalog in 1968, and in the inside pages hailing the imagery of the satellite era: the necessary coffee table book of 243 NASA images, in full color, from the Gemini flights in 1965, for only $7. The Earth furnishes the home.
Our understanding of the Earth is mediated by a variety of representational techniques and is itself a product of the technological era. ‘They alone shall possess the earth who live from the powers of the cosmos’, wrote Walter Benjamin (2008: 58) in his short text ‘To the Planetarium’ from 1928, analysing technological ways of organising the physis: both the gaze upwards and, from up there, back downwards. The satellite-based images of the Earth since the 1960s, leading to the famous Blue Marble of 1972 (the Apollo 17 flight), mark subsequent examples in the series of images that define the Earth from space. The escape velocity (Virilio 1997) that allows accelerating objects, from airplanes to spaceships, to leave the Earth’s gravity-bound surface is also what allows us to see the Earth from above. The old etymology of the Earth as eorþe, referring to something different from the heavens and the underground, gives way to a dynamic of vectors in which the Earth becomes defined from the heavens. The energetic powers of acceleration transform into the visual survey from above. As Fuller puts it, writing in the late 1960s, ‘However, you have viewed more than did pre-twentieth-century man, for in his entire lifetime he saw only one-millionth of the Earth’s surface.’ This media-enhanced understanding of the Earth seeps even into the biological work of Margulis and Sagan, when they narrate the new metamorphosis of visual epistemology that this technological thrusting and imaging brings about. It brings forth an imaginary of the orbital shared by satellites and astronauts: ‘As if floating dreamily away from your own body, you watch the planet to which you are now tied by only the invisible umbilicus of gravity and telecommunication’ (Margulis and Sagan 1995: 18). They use such images and narratives to contribute to the idea of a holarchic view in which the human is part of the micro- and macrocosms.
For them, the event is a sort of planetary-level mirror image that carries Jacques Lacan’s concept from babies to space: to perceive ‘the global environment’ as the ‘mirror stage of our entire species’ (ibid.).
Much more than an echo of the psychoanalytic stage for planetary design, the mediated vision turned back on the Earth itself was instrumental to a range of political, scientific, and military considerations. Seeing the Earth from space had an effect on climate research (also impacted by nuclear testing; see Edwards 2010). It had an effect on military planning and geopolitical evaluation. It opened up again a holistic view of the planet as one whole, although at the same time as a complex, non-linear system. It contributed to a variety of cultural moods and movements. Even the gaze toward the otherworldly, away from the Earth, was a way to sharpen the focus on the planet. But the technological gaze toward deep space with telescopes such as Hubble was never just about space and the interplanetary worlds. Geographical surveys benefited from the developed lenses and image processing of satellite-enabled remote sensing (Cubitt 1998: 45-49). The perspective back to the Earth has enabled the fine-tuned accuracy of corporate digital maps such as Google Earth, and a massive military surveillance system too.
The Earth is constantly targeted by satellites and remote sensing systems such as those of the Planetary Skin Institute. The Institute is one among many that offer a polyscalar view of a multiplicity of processes for analysis. It boasts the ideal of reading these as “scalable big data” that benefits communities and can “increase food, water, and energy security and protect key ecosystems and biodiversity” (quoted in Bishop 2016). Alongside systems such as Hewlett-Packard’s Central Nervous System for the Earth (CeNSE), it creates real-time surveillance systems that intend more than mere observation. As Ryan Bishop (ibid.) argues, these are massive systems for the constant data-based interpretation of the various scales of the Earth, and they define a specific corporate-security angle on a planetary scale.
Our relations with the Earth are mediated through technologies and techniques of visualization, sonification, calculation, mapping, prediction, simulation, and so forth: it is primarily through operationalized media that we grasp the Earth as an object of analysis. Even the surface of the earth and its geological resources used to be mapped through surveys and field observation; now this work advances through remote sensing technologies (see also Parikka 2015). One can argue that these are in a way extensions of Leibniz’s universal calculus, which offered one way to account for the order of the earth, including its accidents, such as earthquakes (the infamous one of 1755 in Lisbon, for instance). But as the architect-theorist Eyal Weizman argues, this calculation of the Earth is now less organized according to the divine order of a Christian deity and more about the “increasingly complex bureaucracy of calculations that include sensors in the subsoil, terrain, air, and sea, all processed by algorithms and their attendant models” (Weizman, Davis, and Turpin 2013: 64). Practices of meteorology, too, are to be understood as cultural techniques and media operations that order the dynamics of the sky as analyzable data. The terrestrial opens up through what circulates above it; the atmosphere becomes a way to understand the ground, and the orbit is where the understanding of the Earth begins, by way of massive data-driven remote sensing systems. The nomos of the Earth that defines its geopolitics is one that reaches out to the heavenly spheres as much as to multi-scalar data-intensive operations (see Bratton 2015).
References
Benjamin, W. (2008) ‘To the Planetarium.’ In Walter Benjamin, The Work of Art in the Age of its Technological Reproducibility and Other Writings on Media. Cambridge, MA: Harvard University Press, 58-59.
Bishop, R. (2016) ‘Felo de se: The Munus of Remote Sensing’. Boundary2, forthcoming (estimated 2016).
Bratton, B. (2015) The Stack. On Software and Sovereignty. Cambridge, MA: The MIT Press.
Cubitt, S. (1998) Digital Aesthetics. London: Sage.
Edwards, P. (2010) The Vast Machine. Computer Models, Climate Data, and the Politics of Global Warming. Cambridge, MA: The MIT Press.
Fuller, B. (1969) Operating Manual for Spaceship Earth. Online at http://designsciencelab.com/resources/OperatingManual_BF.pdf (originally published in 1968).
Margulis, L. and Sagan, D. (1995) What is Life? New York: Simon & Schuster.
Parikka, J. (2015) A Geology of Media. Minneapolis: University of Minnesota Press.
Virilio, P. (1997) Open Sky. Trans. Julie Rose. London: Verso.
Weizman, E., Davis, H. and Turpin, E. (2013) ‘Matters of Calculation: Eyal Weizman in Conversation with Heather Davis and Etienne Turpin’, in Architecture in the Anthropocene: Encounters among Design, Deep Time, Science, and Philosophy, ed. Etienne Turpin. Ann Arbor, MI: Open Humanities Press, 63-82.
Woodard, B. (2015) ‘Less World to be Ourselves. A Note on Postapocalyptic Simplification’ E-Flux Supercommunity, August 6, http://supercommunity.e-flux.com/texts/less-world-to-be-ourselves-a-note-on-post-apocalyptic-simplification/.