Interview with Wolfgang Ernst

Archive Rumblings

German media theorist Wolfgang Ernst (b. 1959) is a member of the Berlin circle inspired by Friedrich Kittler and is currently building up the Seminar for Media Studies at Humboldt University. He contributes to the ‘media archaeology’ school, in which new media are traced back to earlier concepts. Following this methodology, one reads traces of digital technologies into history, not the other way round. The idea is that there is no teleology in which media unfold themselves in time. Against the usual chronological reading of media, from photography and radio to television and the Internet, Wolfgang Ernst employs the Foucauldian ‘archaeological’ approach, which aims to unveil active power relationships. But whereas Foucault looked into social formations, today’s media archaeologists are primarily interested in the (hidden) programs of storage media. Following McLuhan, Ernst posits that “cyberspace is not about content, but rather a transversive performance of communication. Without the permanent re-cycling of information, there is no need for emphatic memory.”

In his 2002 book ‘Das Rumoren der Archive’ (Archive Rumblings) Wolfgang Ernst points out that archives are no longer forgotten, dusty places. The archive as a concept has gained universal attention and acquired a metaphorical glow. In this era of storage mania, everything is on record. Repositories are no longer final destinations but turn into frequently accessed, vital sites. For instance, the East German secret police archives, opened after 1989 and frequently visited, show how contested data collections can become. Wolfgang Ernst signals a shift from the political-military (secret) meaning of (national) archives towards a broader cultural understanding in which the archive stands for ‘collective memory’. For Ernst, archives are defined by their ‘holes’ and ‘silent’ documents. His annals look like crashing operating systems and should not be taken at face value. In short: archives are cybernetic entities. These days everyone is painfully aware that archiving equals careful selection. Chronicles are anything but neutral collections; they reflect the priorities and blind spots of the archivists and the Zeitgeist they operate in. By now that’s common sense. What can we expect from twenty-first-century archive theory, beyond digitization and database architectures? Will the elites establish safeguarded ‘islands in the Net’ where essential knowledge is stored, leaving the wired billions floating in their own data trash? Have tactical silence and an aesthetics of forgetfulness become all-too-obvious responses to storage mania?

GL: One would associate the theoretical interest in archives with Foucault, Derrida and other French authors. You make many references to them. Is that the destiny of our generation, to get stuck in the postmodern canon? Or is there another, more personal reason for your interest in archives and the ‘French’ approach? Do you keep an archive yourself, and which archive is your favorite?

WE: When Peter Gente and Heidi Paris of the Berlin-based Merve publishing house asked me to write an essay on archives with special regard to French theories, I took that chance, since it gave me a possibility to work through my own intellectual past. Having been extremely affected by French post-structuralist theories in the 80s, and actually trying to de-construct the notion of text-based history myself, my research year at the German Historical Institute in Rome made me “convert” not to Catholicism, but to the acknowledgement of real archives. I discovered that no place can be more deconstructive than archives themselves, with their relational but not coherent topology of documents, which wait to be reconfigured, again and again. The archival subject thus offers a way out of the one-way postmodern aesthetics of arbitrary “anything goes”, without a return to authoritarian hermeneutics (a point made as well by the “new historicists” in literary studies, e.g. Stephen Greenblatt). The simple fact is that archives do not only exist in metaphorical ways, as described by Foucault and Derrida, but as part of a very real, very material network of power over memory.

Do I keep an archive myself? Have a look at my homepage (www.verzetteln.de/ernst) …
In fact, I keep nothing but an archive at home: no book-shelves, no library, but a modular system of textual, pictorial and even auditory information in movable boxes. That is, among other things, fragments of books, distributed according to diverse subjects and liberated from their restrictive covers.

GL: How would you describe the methodology of media archaeologists? Is it useful to speak of a school in this context? Media archaeologists can be found in places such as Cologne (KHM), Berlin (Humboldt University) and Paris. Then there is, for instance, Lev Manovich, who ‘reads’ film history as an episode in the coming-into-being of new media. How do you look at the field, and what interesting approaches have you come across lately?

WE: I owe the term to Siegfried Zielinski, who, as the former director of the Academy of Media Arts in Cologne, once hired me for a research and teaching job called “Theory and Archaeology of Media in the Context of the Arts” (a world-wide premiere as an academic field?). Zielinski himself, of course, owes the term to Michel Foucault’s “Archaeology of Knowledge”, but has given it a technological turn in cultural analysis, with his brilliant work on the video recorder (Berlin 1986). In his most recent work, literally called “Media Archaeology” (2002), Zielinski advocates an an-archical history of forgotten or neglected media approaches. Different from that libertarian approach, my version of media archaeology tries to carry Foucault’s approach further (see my book “M.edium F.oucault”, 2000). My media archaeology is an archaeology of the technological conditions of the sayable and thinkable in culture, an excavation of evidence of how techniques direct human or non-human utterances, without reducing techniques to mere apparatuses (it encompasses, for example, the ancient rules of rhetoric as well).

Media archaeology is a critique of media history in the narrative mode. When Lev Manovich (whose writings I appreciate a lot) reads film history as an episode in the coming into being of the new media story, his approach is already trapped by the linear approach of media history. Having been trained as a historian and a classicist (and partly even as a “real” classical archaeologist in the disciplinary sense), I have always felt uneasy with the predominance of narrative as the uni-medium of processing our knowledge of the past. It takes a new infrastructure of communicating realities, the impact of digital media itself, to put this critique of historical discourse into media-archaeological terms and practice. But I have to confess: even when I claim to perform media-archaeological analysis, I sometimes slip back into telling media stories. The cultural burden of giving sense to data through narrative structures is not easy to overcome.

The archaeology of knowledge, as we have learned from Foucault, deals with discontinuities, gaps and absences, silence and ruptures, in opposition to historical discourse, which privileges the notion of continuity in order to re-affirm the possibility of subjectivity. “Archives are less concerned with memory than with the necessity to discard, erase, eliminate” (Sven Spieker). Whereas historiography is founded on teleology and narrative closure, the archive is discontinuous, ruptured. Like all kinds of data banks, it forms relationships not on the basis of causes and effects but through networks; the archive, according to Jacques Lacan, leads to an encounter with the real of script-directed culture.

Media archaeology describes the non-discursive practices specified in the elements of the techno-cultural archive. Media archaeology is confronted with Cartesian objects, which are mathematisable things, and let us not forget that Alan Turing conceived the computer in 1937 basically as a paper machine (paper being the most classical archival carrier). Media archaeology is driven by a certain “Berlin school of media studies” obsession with approaching media in terms of their logical structure (informatics) on the one hand and their hardware (physics) on the other, and thus differs from British and U.S. cultural studies, which analyze the subjective and social effects of media.
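
Turing’s machine is indeed a paper machine in the strict sense: a finite table of rules that reads and rewrites discrete symbols on a tape. A minimal sketch, assuming nothing beyond that textbook definition (the rule table here, a binary increment, is a purely hypothetical example, not any historical machine):

```python
# A one-tape Turing machine reduced to its archival essence: a rule
# table rewriting discrete symbols, executable on paper or in code.

def run(tape, rules, state="scan", head=0):
    """Step the machine until it enters the 'halt' state."""
    cells = dict(enumerate(tape))        # sparse tape: position -> symbol
    while state != "halt":
        symbol = cells.get(head, "_")    # '_' stands for a blank cell
        state, write, move = rules[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# (state, symbol read) -> (next state, symbol to write, head movement)
rules = {
    ("scan", "0"): ("scan", "0", "R"),    # walk right to the number's end
    ("scan", "1"): ("scan", "1", "R"),
    ("scan", "_"): ("carry", "_", "L"),   # end reached: start adding one
    ("carry", "1"): ("carry", "0", "L"),  # 1 + 1 = 0, carry travels left
    ("carry", "0"): ("halt", "1", "R"),   # carry absorbed
    ("carry", "_"): ("halt", "1", "R"),   # carry overflows into a new cell
}

print(run("1011", rules))  # binary 11 + 1 -> "1100"
```
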
The real multi-media archive is the arché of its source codes; multi-media archaeology is the storage, re-reading and re-writing of such programs. Media history is not the appropriate medium for confronting such an archive. Consider two examples from current media research. The volume Renaissance Computers, edited by Neil Rhodes and Jonathan Sawday, expressly draws a parallel between the media revolution from manuscript to print in Europe, enabled by Johann Gutenberg in 1455 and exploited by Martin Luther for the distribution of Protestant messages (theses) in 1517, and the present era of digital technology. The symbolic machines of the sixteenth-century “methodizer” Peter Ramus are presented as a counterpart to the computer of today. This claim still thinks media from the vantage point of alphabetical texts, but audio-visual data banks make all the difference. Against such analogies, media archaeology insists on differences. Computing is not about imagination and texts, but rather the alliance of engineering and mathematics. The coupling of machine and mathematics that enables computers occurs as a mathematization of the machine, not as a machinization of mathematics. While the book has, for half a millennium, been the dominant medium for storing and transmitting knowledge, the computer is able, for the first time, to process data as well.

The second example: in 1999, the Frankfurt Literaturhaus organized a conference on ‘book machines.’ On this occasion, Friedrich Kittler argued that analogue broadcast media, which are linear-sequential and base their storage on the principle of the tape, will be swallowed by the Internet. Books, however, according to Kittler, share with the computer the deep quality of being discrete media. That is why “Internet archaeology” is necessary (Denis Scheck). But who is responsible for this kind of documentation? Classical archives and libraries undertake it only exceptionally; the new kind of memory might not be captured by institutions, but rather grow rhizomatically within the net itself.

GL: Michel Foucault made a distinction between archaeology and genealogy. Is that also useful within the media theory context? I have never heard of media genealogy. Do you have any idea what it could be? Would it be a useful term?

WE: It indeed makes sense to differentiate between media archaeology and a genealogy of media. Referring back to Friedrich Nietzsche’s Genealogy of Morals, Michel Foucault used the term “genealogy” to describe a cultural counter-memory, unfolding a different index, a different rhythm of temporality: mediated timing, I would say (the time of “time-based media”). Instead of looking for origins, genealogy looks for events in unexpected places and at unexpected moments, without supposing individual agencies, teleology or finality. But the exact relation between genealogy and archaeology in Foucault’s work has been the source of much dispute, or even confusion. With regard to media theory, let us put it this way: media archaeology is not a method of analysis separate from genealogy, but complementary to it. Genealogy examines process, while archaeology examines the moment, however temporally extended that moment might be (reflecting “analogue” versus “digital” analysis). Genealogy offers us a processual perspective on the web of discourse, in contrast to an archaeological approach, which provides us with a snapshot, a slice through the discursive nexus (as Phil Bevis, Michèle Cohen and Gavin Kendall once put it).

GL: There is the often-heard criticism that media archaeology, in its obsessive search for the Laws of Media, ends up as a cynical, technical determinism that glorifies scientists and the military while explicitly fading out economic, political and cultural aspects. How do you respond to such remarks? Do you see a debate here?

WE: You hit exactly on a recent, ongoing dispute within the “Berlin School” of media studies itself. When we sat down to analyse the “branding” of our group, we realized that, from an outside perspective, we are being reduced to hardware-maniac, assembler-devoted and anti-interface ascetics, fixated on a (military) history of media without regard to present media culture (which is “software culture”, as described by Lev Manovich, and is moving from the computer to the Net, as expressed by Wolfgang Hagen). With my new chair in Media Theory at Humboldt University, I want to work for a re-entry of economic, political and cultural aspects into this media-archaeological field, without capitulating to Cultural Studies, though, which has too much neglected a precise analysis of technologies. On a couple of evenings, for example, some of our media studies academics (like Stefan Heidenreich) went to an experimental media lab (Bootlab in Berlin) to discuss with non-university people who practice media theory (like Pit Schultz) topics like surfaces/interfaces, the aesthetics of programming, economics, ownership and copyright, and computer games. The next co-operative event might address media definitions and media terms themselves.

GL: The popular management discourse of ‘knowledge management’ makes no explicit reference to archives. Instead, according to certain business gurus, knowledge is stored in people, in organizations, in ever-transforming networks, in, let’s say, ‘living’ entities rather than dead documents. In this hegemonic ideology, knowledge only exists if it is up-to-date and can operate strategically, not hidden somewhere in a database. Only then can it be segmented into ‘intellectual property rights.’ How do you read this tendency?

WE: Intellectual property rights were in fact developed within the context of archives, libraries to be exact; the legal notion of copyright emerged around 1800 from the need to protect authors and publishers against plagiarism. As to knowledge management, a current trend is the so-called “warehouse” approach, which takes for granted that implicit knowledge is always already there in humans and in systems, just waiting to be excavated, triggered, extracted by agencies. I have a lot of sympathy for this trans-archival notion of ‘organizational’ instant memory. But leaving the neurological metaphors aside, this approach dissimulates the existence of material memory agencies, both hardware and institutions, which still govern what can be stored legally and technically, and what will be forgotten. Let us, memory-politically, not underestimate the ongoing impact of traditional paper archives or of present audio-visual archives; the quest for access to such archives makes us feel immediately that they are still real. With digital archives, though, there is in principle no more delay between memory and the present, but rather the technical option of immediate feedback, turning every present datum into an archival entry and vice versa. The economy of timing becomes a short-circuit.

GL: Over the past few years you have worked on a research project on the history of Russian computing. Could you tell us something about the ‘mystery’ of Soviet cybernetics? It is well known that the strength of the Eastern bloc computer industry, military secrecy, also led to its demise. I suppose it is wrong to state that this is a history of ‘losers’, but to some extent it is. There is some irony involved. How did the project deal with this? Did you stumble on interesting differences compared to the US-led computer development?

WE: The genealogy of the computer and the computing sciences associated with names such as Charles Babbage, Alan Turing, Norbert Wiener, Heinz von Foerster, Claude Shannon and John von Neumann has been the object of an impressive number of publications in the German-speaking and Anglo-American worlds, but this media archaeology is confined to the Western hemisphere. In general, the historiography of computing is, even a decade after the fall of the Iron Curtain, still blind with respect to Eastern Europe. The art of computing in the former Soviet Union, immediately after World War II, developed some remarkable alternatives to Western machines which remain attractive even today.

The alternative computing culture in the former Soviet Union was stimulated by a weird and ever-changing re-configuration between inventive improvisation on the engineering side and ingenious mathematics on the other, e.g. Viktor Glushkov’s idea of the “language-based” development of computers alongside Sergey Lebedev’s more “electronic” approach. The activity in these directions was shared between the Kiev Institute of Cybernetics and the Institute of Precise Mechanics and Computer Engineering in Moscow. As a scientist, Sergey Lebedev was a professional and (maybe even more importantly) a ‘born’ electronic engineer, while Viktor Glushkov was primarily a mathematician, more interested in cybernetic problems.

The paradox is that precisely in a communist country, material deficiencies in hardware and software, owing to the very absence of standardized mass production, created highly original and most individual technical solutions. This flourishing, pluralistic techno-culture, though, tragically came to an end when Moscow decided in 1972 to copy the IBM production line in order to get cheap software running. Promising efforts to combine Russian computing with Germany’s Siemens and Britain’s ICL in a joint European venture collapsed, since Walter Ulbricht, too, had already opted for IBM standards in the GDR and convinced Brezhnev in Moscow. With these decisions in the early 1970s, not only did the option of an independent European computer standard die; I would also call this the beginning of the Decline and Fall of the Russian Empire in favor of what we today call the Microsoft global player.

One of the heroes of computing in the former Soviet Union, professor Zinovy Rabinovich, told us during the recent Transmediale media arts festival in Berlin about the construction of the first “European” electronic computer in Kiev more than half a century ago (1948-1950). This computer architecture was developed independently of the von Neumann model, putting the emphasis on parallel rather than sequential computing. Engineers and mathematicians in the former Soviet Union came together in ways different from the Western context, exactly because the mass-economic uses of computing were almost nil, so that the concentration was less on universal than on special-purpose computers. At 84, Rabinovich fervently argued for re-thinking the options of a European computer to fill the gaps left by the American model. Thus Rabinovich proved to be an “old European” (Rumsfeld) in the best sense. As an alternative to software versus hardware, he proposed his engineering philosophy under the name of “middleware” (though this sounded familiar to Western ears: we know it as micro-programming).
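
The architectural contrast at stake, one instruction stream versus several working at once, can be sketched in a few lines of present-day code. This is only an illustration of sequential versus parallel reduction under an assumed four-worker pool; it reproduces nothing of the Kiev machine itself:

```python
# Sequential versus parallel computation of the same reduction.
from multiprocessing import Pool

def partial_sum(chunk):
    # One worker's share of the reduction: a sum of squares.
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))

    # Sequential: a single instruction stream walks the whole data set.
    sequential = partial_sum(data)

    # Parallel: four workers reduce interleaved slices simultaneously;
    # the partial results are combined afterwards.
    with Pool(4) as pool:
        parallel = sum(pool.map(partial_sum, [data[i::4] for i in range(4)]))

    assert parallel == sequential
```
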

GL: The American cyber-conservative George Gilder is a ‘storewidth’ guru who has for decades been promising infinite computer storage, unlimited bandwidth and computational power. For economic reasons Moore’s law may be out of order for a while, due to the implications of the ‘techwreck’. Yet, by and large, capacity has indeed risen incredibly. Ours is a society that cannot implement its own technological progress. What does that tell you, as a theorist who deals with archives?

WE: When the talk is of maximized computer memory capacities, this discourse still continues an old occidental obsession: that culture depends on storage (historic architectures, libraries, museums). My media analysis tells me that the future cultural emphasis will rather be on permanent transfer, not storage (without undoing storage, though). There is already an implosion of storage mania into processual data flows, a different economy of the archive as a dynamic agency “online”. The notion of immediate data feedback replaces the data separation that makes all the archival difference.

GL: German history throughout the twentieth century has always struck me as incredibly well documented, which forms the basis for books, TV documentaries, exhibitions and museums. Despite war and destruction, so much is left that is still waiting to be classified and analyzed. Orderly file-keeping has resulted in an overwhelming practice of detailed historical research. The Nazi period and the Holocaust are of course well-known examples. Communist East Germany has produced food for historians for many decades to come. One could also say that this is a guilt-driven industry. Henryk Broder once used the phrase: “There is no business like Shoah business.” How do you look at the present storage-driven memory cult? This whole industry is obviously based on archives, and continuously creates new archives.

WE: My thesis is that the rhythm of historical memory is directed and triggered by the opening of formerly inaccessible archives and the waves of documents which then disseminate, feeding an endless production of new texts and books. The Prussian system of State Archives (which became something of a model for both the former Soviet Union and the US National Archives) provided a perfectly organized memory of official records in politics and culture. In the twentieth century, a unique constellation hit the German archives: while normally state-related documents stay classified for a long period of time, the collapse of the Nazi regime in 1945 led to the immediate opening of the German State Archives (for the Nuremberg trials, for example), a unique opportunity for historians and the public to know the archives almost in real time, without the usual delay. At least two successive generations of Germans were thus permanently confronted with this open archival evidence of war crimes, the Nazi involvement of parents, and so on. A similar event happened when the Berlin Wall came down in 1989: all of a sudden, the most secret archives of the former GDR State Security were open to the public, immediately revealing the system of observation to its subjects.

With Holocaust memory in Germany, the case is different. Much of what happened during this genocide is not only documented in files but also firmly fixed in the memory of the victims, or else remains entirely undocumented (in the case of the victims who died). At present, we are observing the transition from living memory (the survivors) to mediated memory, fixed in paper or audiovisual records only in order to transmit it to the future.

One more word on the future of archives in Germany. Post-war Germany (after 1945) had a discontinuous relation to German history; I myself, having grown up in West Germany, remember that German history before 1945 was something alien to me. Instead, the historical consciousness of the post-war generation in Germany that grew up with radio and television now coincides with its media archives: public broadcast archives that are no longer paper-based but exist in audio-visual form. The present and future problem is: how do we get access to this new kind of archive in a non-proprietary mode? While the state always cared for public education, manifested in the public library network, and for memory agencies like the State Archives, the audio-visual memory of post-war Germany stays with companies that might sell these media archives to private investors. Memory will be commodified; let us be political on this. There is a glimmer of hope, though: with the retro-conversion of analog magnetic tapes (radio, TV) to digital storage for preservation reasons, there will be different ways to hack into these digital memories, since digital archives, once online, are no longer separated from the “present”. In a way, of course, this means the disappearance of the emphatic notion of the “archive”; it dissolves into electronic circuits, data flow.

Wolfgang Ernst, Das Rumoren der Archive, Berlin: Merve Verlag, 2002.
See also: Wolfgang Ernst, Archive Phantasms, nettime, December 21, 2000.
http://amsterdam.nettime.org/Lists-Archives/nettime-l-0012/msg00115.html (English).
Wolfgang Ernst, M.edium F.oucault. Weimarer Vorlesungen über Archive, Archäologie, Monumente und Medien, Weimar: Verlag & Datenbank für Geisteswissenschaften, 2000.
Georg Trogemann, Alexander Nitussov and Wolfgang Ernst (eds.), Computing in Russia: The History of Computer Devices and Information Technology Revealed, Braunschweig: Vieweg Verlag, 2001.
Forthcoming: Wolfgang Ernst, Im Namen von Geschichte: Sammeln – Speichern – (Er)Zählen. Infrastrukturelle Konfigurationen des deutschen Gedächtnisses (1806 bis an die Grenzen zur mechanischen Datenverarbeitung), Munich: Fink, 2003.