On Paratactics, Bots and Post-Colonial Finance – A Muhabbet between Ebru Yetiskin and Geert Lovink

Original on the ISTS website: https://stsistanbul.org/2018/08/30/on-paratactics-bots-and-post-colonial-finance-a-muhabbet-between-ebru-yetiskin-and-geert-lovink/#more-777

In 2017 Ebru Yetiskin, a member of IstanbuLab who is also an Istanbul-based sociologist and an independent curator, visited the Institute of Network Cultures in Amsterdam and met Geert Lovink, who works there as a researcher. A while later, Geert Lovink’s 2016 book Social Media Abyss came out in a Turkish translation with Otonom publishers. In April 2018 it was time for a second encounter, this time in Istanbul, where Ebru Yetiskin organized the book launch of Social Media Abyss at the Akbank Cultural Centre as the inaugural event of Science, Technology and Society Talks, a public conversation series run by IstanbuLab. The book tour was initiated by Andreas Treske and Ahmet Gurata (Bilkent University/Ankara). After an update on the Social Media Question by Geert Lovink for the hundred people who turned up that Saturday afternoon, a conversation took place, based on questions by Ebru Yetiskin and followed by a discussion with the audience. The dialogue below is neither a reconstruction nor a transcript of the event but a follow-up e-mail exchange between Yetiskin and Lovink which opens up a different type of conversation: muhabbet.

This online INC-STS muhabbet, a friendly but critically engaging conversation between Yetiskin and Lovink, walks you through the stages of identifying the problems, threats, and possibilities that media technologies have brought to the table, not only for users but also for social science researchers who mostly remain outside the current modes of technical knowledge production. It is a frank discussion of the biases of the code and algorithms upon which social media platforms have been built, the lack of technological awareness in academia, and how this lack reinforces centralizing governance structures. Yetiskin and Lovink not only elaborate on the embeddedness of the political, economic and cultural in the technical but also on the embeddedness of technical means in the way we think and express ourselves. They also discuss ways of politicizing contemporary technologies, making them a matter of concern for both users and social scientists, and opening their black boxes, i.e. making the invisible visible: media infrastructures, biased filters, the firewalls against the democratization of technologies. Enjoy the INC-STS muhabbet!

Ebru Yetiskin: In Social Media Abyss, you ask: what’s the social in social media? Such a useful question to begin with. According to you, the social organizes the self as a techno-cultural entity, a special effect of software. Real-time feedback features prove addictive for many users. However, in today’s Internet debate, the social makes no reference to the Social Question, nor does it contain any hidden reminder of socialist thinking or socialism as a political program. Programmers glue everything together, connecting users to data objects and users as data objects. You say, “that’s the social today.”

Geert Lovink: We’re acutely aware that the “pathologies of social media” are a product of an époque-specific standard. In terms of “class composition,” the apps express the anxieties and desires of the precariat, as described by Alex Foti. The social type would be the walking multi-tasking urban dweller looking down at her/his smartphone, always elsewhere. However, there’s not yet a catchy name for this Gestalt. The technical network structures are synonymous with our social life. The idea that there is a so-called ‘real’ social life outside or beside the networked devices is questionable. I’m following Axel Honneth here. What happens when society falls short of the ‘objectively’ already possible rationality? To translate this into our context: what happens when the objective development of network technologies fails to produce decentralized, federated structures and instead regresses into centralized power structures, aka early 20th-century monopoly capitalism? Should we consider social media as “organized self-realization”? This is a short exercise to prove that social research can be so much more than the Data Question. Part of the quest will be to link the growing social inequality with the Technical Question.

Ebru Yetiskin: The majority of social scientists, humanities scholars, activists, artists, and curators, though, are still software-illiterate. This is a problem for transdisciplinary research and critical knowledge production about contemporary digital cultures. Both the social sciences and the humanities lack technical knowledge. Niklas Luhmann once remarked on what happens when those interested in theory return to the classical authors: their task becomes one of dissecting, criticizing, and recombining already existing texts. This type of knowledge/power production in the social sciences and humanities seems to be a disowned and covered-up conservatism within the academy. Yes, the few technical approaches in the humanities come from ‘software studies’, an emerging field that tries to understand the role of algorithms and bots in contemporary media power. But that doesn’t change the fact that we have a huge knowledge gap in social research.

Geert Lovink: Will it help if we make a proper diagnosis of this gap? What’s the use and abuse of our own technology fable? As a true Kittlerian, I believe that our technical writing tools are formative in our thinking. However, the Zeitgeist has been neglecting this basic insight for many decades. With each generation entering the scholarly stage I became hopeful that change might be on the horizon. Such a self-deception… It was perhaps our generation that failed to implement crucial educational reforms in terms of media literacy and coding skills. Computers and their networks have not been democratized. Instead, we’ve been drugged with the comfort of centralized cloud services. We can expect conservatives, nationalists, and neo-liberals to embrace Facebook and academia.edu. However, critical urban scholars, feminists, cultural studies types and post-colonial scholars should know better. While technology further accelerates, the overall intellectual climate becomes insulated and backward-looking, as Luhmann indicated. How do you explain the blatant lack of tech awareness in academia?

Ebru Yetiskin: This also concerns our understanding of techno-science. If technology is not understood as a mode of organization, in which the production of technological infrastructure embodies political, economic and cultural biases, it will continue to be considered mainly in terms of its various impacts. The lack of techno-science awareness in critical studies then becomes a facilitator, an accelerator or a mediator for the reproduction and control of dominant knowledge/power nexuses. We need to think about how we can build a dialogue as well as a research infrastructure with software programmers and coders, because most are not aware of the conceptual distinctions embedded in what they make. Those who have critical knowledge are also faced with making coercive (dualistic) categorizations in their practice. In order to change how things work and how artifacts are produced, we need to explore emerging ways of education and collaboration.

You argue that we should go beyond the pitiful bourgeois defense of ‘liberal arts’ and ‘humanities’ and demonstrate that there is no software without concepts, and no concepts without mediation. If we fail, techno-scientific knowledge production will be dominated by the values of market/state/platform players.

We lack the voices of technically attuned public intellectuals. Even when they speak, they are not well heard by decision makers. The conception of the social has shifted, but this is not well acknowledged in public. The Social Question, the object of study and the matters of concern in the social sciences and humanities have disappeared and/or mutated. Can you elaborate on what you refer to as the “hermeneutics crisis”?

Geert Lovink: Thinking expresses itself through technical means. There is no pure contemplation. Arguably, it has always been like that. The capture of thoughts, dialogue, and information has always run through material channels and interfaces. The problem we need to address is one of fake comfort. We’re drifting away from the issues and entering a collective state of amnesia.

Let’s politicize the current modes of knowledge production. Complex invisible databases, algorithms and search engines define our worldview. They define what’s news and what’s knowledge. In response, we should not demand that these filters become ‘unbiased’. Instead, we should raise our own technical knowledge and that of the coming generations. We should demand that the ‘black boxes’ be opened. Today’s social media platforms are too big and too closed for anyone to do proper independent research on them outside of existing proprietary organizations of data. It’s one thing to formulate a ‘black box’ theory to study the algorithmic cultures of such social networking websites, but what happens if the algorithms indeed remain a black box for decades to come?

This issue will only accelerate once we get used to talking to devices and take the pseudo-human feedback of AI systems for granted. Who’s programming and controlling the bots? Is it an illusion to ever get to the bottom? How realistic is this Will to Deconstruct in a technical age in which even experts only have temporary, partial knowledge? Isn’t the urge to construct the new a better answer? We have not yet seen things such as an Arbeit an der Plattform à la Hans Blumenberg. Apart from Benjamin Bratton, who works on his higher-level concept of The Stack, where are our fellow philosophers? 3.5 billion people walk around this planet with a smartphone. That’s not a marginal issue. Yet we cannot even start to address the related issues, as we do not have an academic and intellectual platform for this. In order to get there, we need to do more than create a sense of urgency along the lines of: look how important this all is! What do we demand? What’s our Question of the Faculties? How do we want (higher) education to be reformed? Is it all just a matter of ‘media literacy’? Internet culture is what defines this planetary age, yet our field is as marginal as ever. The internet, IT and the cloud are not media, let alone tools. How can we convince philosophers and sociologists that this is a primary planetary event?

You have studied bots much more than I have. AI is a secondary level that relates to automation and the elimination of work; it is not directly related to dynamic networks, which I consider a hybrid of human and tech. Bots simulate. AI and algorithms are computer science topics that exist perfectly well outside the messy world of networks. They need neither us nor connected computers. For AI and algorithms, networks and humans are sources of ‘big data’, mere input. That’s why bots are boring to me. Are they for you?

Ebru Yetiskin: No, machine-learning algorithms are not boring at all. They bring up interesting questions and speculative possibilities. They make us rethink the basics. How can we describe the ways of seeing, hearing, acting and knowing today? What are the basic rights? While talking about smartness and intelligence, how can we reflect on the various functions of artificial stupidity, absurdity, and failure? While curating Bager Akbay’s Deniz Yılmaz, a machine-learning algorithm built around the narrative of a bot seeking recognition both as an artist and as a citizen in Turkey, I learned that we, humans and non-humans, can collaborate and collectively demand that the black boxes be opened. By way of the distributed actions of hybrid entities, it becomes possible to explore the inner workings of algorithmic governmentality.

For this, we would do better to explore how algorithmic architectures, neural networks, and deep learning are being developed for various uses, such as for the commons, as David Bollier put it. Today, governments, corporations, and platforms use them. Their protocols control and obfuscate how data is collected, shared and used. This cut-off from open data and knowledge production is a wall, or rather a firewall, against collective thinking and democratic participation in decision-making mechanisms. To deal with this obstacle, though, we need to shift the perspective away from the dominant view that is locked into the privilege and autonomy of the Human and Humanity. I don’t think social scientists and humanities scholars have acknowledged the importance of hybrid entities yet. It is probably because they lack imagination. They cannot know ‘how’ to imagine, again.

As an initial step, rather than equality, I suggest we call for a rethinking of the “equivalence” of entities. Along with blockchain-based works, today we are at an early stage of developing algorithmic agencies, such as Terra0, HARVEST, Kitty AI, The Shepherd and Plantoid, that can control and manage capital autonomously. The slower we learn how to use and work with algorithmic agencies, the faster markets, governments, and platforms will govern and control them/us.

Look at how the Facebook media spectacle, publicly known as the Cambridge Analytica scandal, emerged as a political instrument for the public understanding of techno-capitalist power production. Until then, millions of users were not aware of algorithmic governance. Looking at Zuckerberg’s responses during the U.S. Senate hearing about Facebook’s intention to improve its platform with machine-learning algorithms, it is not hard to predict the upcoming regulations and surveillance that will result from close collaboration between governments and such platforms. In this way, nation states can eventually increase their sanctioning power to control the data-capital market as well. Via the live broadcast, the public is informed about the reasoning behind the more severe and totalitarian practices of the near future. Who benefits from this, taking into consideration that we are so slow to understand this governance model?

Geert Lovink: We may be slow, but can we perhaps accelerate our critical understanding? This can be done by taking the ‘alternatives’ shortcut and building ‘parallel’ utopian structures that are based on radically different communication premises. I don’t see this happening through a nationalization or break-up of the platforms. The push for ‘regulation’ is understandable, and should be supported, with the aim of putting corporate platforms on the defensive. Closing down social media services and replacing them with sanitized national versions might work in authoritarian countries, but is this the way to go? Where do you see openings? In artworks? Policy proposals? Starting our own think tanks?

Ebru Yetiskin: Let me begin with what you refer to as “authoritarian” countries. We need to update our thinking about the various complex ways in which censorship and exploitation are executed today. Algorithmic governance is authoritarian by virtue of its infrastructure, and it is executed in a more or less similar fashion according to different local circumstances. Different versions of the same program are downloaded and run in different countries. We need to think and act transnationally. Hence the derivatives of censorship in various transnational contexts require critical attention today: how about exploring the emerging ways of exploitation and censorship which are not (yet) recognized as a problem? Ecological censorship, algorithmic censorship, gender censorship… Let us not distance ourselves from the problem by linking censorship and exploitation merely to so-called “authoritarian” nations, thereby adopting an orientalist position.

Comparative analysis can be useful to enhance our critical understanding of the reproduction and exploitation of techno-capitalist authority. Otherwise, we wouldn’t know what and how to change/build, right?

The political-economic control of techno-scientific controversies constrains the emergence of open governance. It prioritizes the interests of knowledge/power elites. This is also a derivative of censorship, because users do not fully understand how social media, or any other kind of technology, is organized and developed, and how it affects life on Earth. Not only in digital cultures, but also at a planetary scale… As a result, the lack of knowledge and action capacity limits the potential for intervention and leads to the development of techno-science within restricted networks of techno-capitalists. As you also reminded us, the collected stack of data (capital) is in the hands of governments, corporations, and platforms, and it is closed off to critical analysis, imagination and action.

You also mentioned building parallel utopian structures. Let us think about the control of political imagination by the utopic/dystopic futures suggested by the creative industry of design fiction, such as Google’s Creative Lab. Techno-capitalist ideology is distributed and popularized by the cultural production of this industry. “Robots will either make us unemployed or they will save the world!” Interactive exhibitions, product demonstrations, behind-the-scenes consulting work and prototypes are produced by a loose network of artists, scholars, engineers, and designers who shape the public understanding of technology and society within such utopic/dystopic future discourse. The design fiction industry suspends belief and disbelief about the ways the world is changing, and its cultural products demonstrate the futures to come by creating a popular culture. In this way, it becomes possible to regulate, manage and control the workings of action and imagination towards the goals of techno-capitalist investment.

We need to make clear distinctions and conceptualize contemporary forms of cultural production. It’s important to distinguish between ‘alternatives’ and ‘parallels’. Instead of adopting utopic/dystopic and revolutionary/destructive dualisms by suggesting alternatives to be adopted by the majority, we should tactically work towards forming and multiplying minor equivalents and counterparts through dynamic collaborative collectives. Is that what you mean?

Geert Lovink: We can lament our marginal position as artists, activists, and critics in society, but this is not quite the case in our field of internet and technology. We’re also educators, curators, opinion makers and, even more importantly, our ideas will inevitably get translated into code and interface designs, implemented by start-ups or the IT departments of larger firms. Maybe we’re not in command when it comes to ‘culture’ and traditional media such as radio and TV. But we should not underestimate that technology has a long-term indirect influence on society, for instance in terms of information flows and human behavior. We should become more aware, and more confident, about our collective concept-making (and shaping) abilities. We are reshaping society, and the powers that be do not like that, as we often withdraw from official forms of representation. We consciously defer our energies and avoid tiresome open conflicts. The question then becomes whether this long-term undermining will not, inevitably, burst out into large-scale open conflict.

Ebru Yetiskin: As Tarleton Gillespie emphasizes in “The Relevance of Algorithms”, algorithms need not be software. We need to explore how state and market actors adopt algorithmic governmentality. By coding, the capture apparatus of the state controls the capacity of non-state actors to position, determine, intercept and model within its own program. To achieve its goals, the state also secures the gestures, behaviors, opinions, and discourses that have the potential to create a threat or a risk.

Let me come back to my point about the updated versions of censorship, because this also reveals how algorithms work and govern. In the last few years, the word ‘parallel’ has been decoded and highly politicized by the state in Turkey. Today this word has become a code for a parallel state structure, which refers to a terrorist organization. If a social sciences and humanities scholar or any cultural worker uses this word within popular political discourse, s/he is decoded and categorized in the blacklist of the state as a (potential) terrorist to be dismissed, arrested and dispossessed. Thus those who work on revealing hidden operations and suggesting ways of action extra-logical to (the codes of) the dominant are either ignored or not considered a surplus value. I have to underline that I am not providing a rationale for self-censorship.

In the name of resistance and critical analysis, the mechanical repetition of existing forms in texts, concepts, actions, and methods appears to be a safe, user-friendly and conformist choice for a great number of people. I’d say: why not create new concepts? Let’s cause a temporary autonomous flow and decode both state and market. Stars have become extinct in academia. Why repeat without searching for another way of thinking? I am following Deleuze here, who wrote in 1968 in Difference and Repetition: “We do not repeat because we repress fear, disgust, and anxiety. We repress and forget because we repeat.”

In our era, one of the problems is that we forget how we (are coerced to) repeat (by consent) the existing concepts and thinking of star theorists. This conservative knowledge production problem emerges as a derivative of censorship as well. The systemic functionality of stupidity, as Bernard Stiegler puts it in States of Shock, requires more of our attention as an object for the critical analysis of contemporary knowledge and power production. What do you think about the systematic use of stupidity within the current state of the social sciences and humanities?

Geert Lovink: Concerning stupidity, I have been influenced by Avital Ronell, who wrote a study about it, and Matthijs van Boxsel, a Dutch genius who has been working on his Encyclopedia of Stupidity for the past thirty years. Nicholas Carr’s famous 2008 essay in The Atlantic, “Is Google Making Us Stupid?”, arguably kicked off the wave of American internet criticism that continues to this day. The stupidity notion is rarely used in the internet context. Rather, people speak about distraction by design, online addiction, information overload, depression, and burnout. The recent rise of ‘fake news’ might carry an implicit reference to the argument that infotainment and clickbait make people stupid and keep them ignorant. The decline of the Western youth can be noticed on many levels, but this was already happening when I grew up in the 1960s: we know less by heart, read less, no longer read the classics and so on. Perhaps a tiny elite still provides its kids with such an education, but that’s no longer a significant, and visible, group. But is the mainstream really all that more stupid? Are they still able to ask the hard questions? Is their so-called increased speed in terms of media literacy compensating for the decreased ability to remember facts? Maybe we should forget all these nonsensical considerations and instead focus on the issue of indifference. Here I am referring to the shift that we see from “I do not know” to “I do not care”.

Ebru Yetiskin: In Social Media Abyss you emphasize that an IT-informed post-colonial theory of embodied networks and organization is long overdue. On the other hand, in a 2016 United Nations white paper, “How Can Cryptocurrency and Blockchain Technology Play a Role in Building Social and Solidarity Finance?”, financial anthropologist and blockchain activist Brett Scott argues that the bitcoin community has a tendency towards “techno-colonial solutionism” and “techno-libertarian evangelism”, which propose the digital currency as a solution to issues in the developing world. He states, “[…] escaping weak local institutions might help individual people, but does little to empower the broader social majority who remain reliant on the existing systems. Those who are most likely to seek escape are social elites with high education with access to technology and capital to protect them. The rhetoric of cryptocurrency superiority—often articulated by cryptocurrency start-ups—even has neocolonial traces.” Local elites in fragile countries are encouraged to buy into a “forget your local systems, rely on our technology” narrative. Social elites with high education and access to technology and capital become mediators of such an ideology, evangelizing libertarian political ideals and promoting profitable ‘solutionism’ from above.

Such a postcolonial theory is long overdue because the existing one considers the Social Question on the basis of identity-oriented and human-centered notions rather than the equivalence of humans and non-humans. The exploitation of humans is no different from the exploitation of forests, bees, robots, rivers, and minerals. How can we rethink colonialism so that we can trace the ways of action of those who are exploited?

Geert Lovink: One obvious way to do this would be to quantify all products and services so that we can easily find out where they are from and who produced them, including a breakdown of who earned what percentage of the price you paid. However, I am not sure if this ‘awareness through data’ is going to have a lasting impact. It may as well lead to the next level of indifference. In my view, post-colonial theory should move away from identity politics and moral policing and formulate much more radical programs, such as education, fair trade and ultimately a global redistribution of wealth (which goes way further than the demand for compensation). Let’s add to your broad economics an element of financialization. Let’s integrate the radical agendas of initiatives such as MoneyLab, The Economic Space Agency and Commonfare with the potential of a sovereign ‘post-development’ aka ‘post-globalization’ agenda of the former Global South (as already suggested here and there). A first gesture should be to launch the post-colonial as an economic program, and not merely see it as an attitude or gesture.
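To give this ‘awareness through data’ idea a concrete shape, here is a minimal sketch of what such a provenance record could look like, breaking a product’s retail price down by contributor and share. Every name, field and figure is invented for the example; it does not refer to any existing labeling standard or platform.

```python
# Hypothetical sketch: a machine-readable provenance record for a product,
# breaking down who earned what share of the retail price. All data is invented.
from dataclasses import dataclass

@dataclass
class Share:
    actor: str       # who earned this part of the price
    role: str        # e.g. grower, manufacturer, shipper, brand, retailer
    country: str     # where that work happened
    fraction: float  # portion of the retail price (0.0 - 1.0)

@dataclass
class ProductProvenance:
    product: str
    retail_price_eur: float
    shares: list

    def breakdown(self):
        """Return (actor, role, country, amount in EUR) for each declared share."""
        return [(s.actor, s.role, s.country, round(self.retail_price_eur * s.fraction, 2))
                for s in self.shares]

    def is_complete(self):
        """Check that the declared shares account for the full retail price."""
        return abs(sum(s.fraction for s in self.shares) - 1.0) < 1e-6

# Example: an entirely fictional t-shirt.
tshirt = ProductProvenance(
    product="t-shirt",
    retail_price_eur=20.0,
    shares=[
        Share("cotton cooperative", "grower", "IN", 0.05),
        Share("garment factory", "manufacturer", "BD", 0.10),
        Share("logistics firm", "shipper", "NL", 0.05),
        Share("brand", "design/marketing", "US", 0.45),
        Share("retailer", "sales", "TR", 0.35),
    ],
)

if tshirt.is_complete():
    for actor, role, country, amount in tshirt.breakdown():
        print(f"{actor} ({role}, {country}): EUR {amount}")
```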

Ebru Yetiskin: Although you are well known as one of the pioneers of tactical media, surprisingly, in Social Media Abyss you refer to strategies more than to tactics and the tactical. The word strategy is used or referred to on 46 pages, whereas tactic-related issues are mentioned on only 15 pages. Of course, numbers do not always speak, but I’d like to question your focus on strategies. You write: “In the post-Snowden age, it is no longer sufficient to call for open source alternatives that merely copy the corporate premises of the dominant platforms.” Since 2012, together with Amber Platform, I’ve been researching what we conceptualize as paratactical works.

Geert Lovink: Can you say more about this concept of the ‘paratactical’? Is it similar to post-digital? Is it ‘new media arts’ that no longer identifies with its own label and rebels against the confinements of its own ghetto?

Ebru Yetiskin: Tactical media worked fine in the late 1990s and early 2000s, but today tactical media activism, such as flash mobs, memes, and hoaxes, has evolved into popular entertainment, marketing and propaganda tools for governments, platforms and corporations. If dominant power structures have appropriated tactics, what else could we do? That was the fundamental provocation for us. As para-, taken as a prefix, means alongside, beside or beyond, the paratactic refers to imaginative collaborative actions and interventions alongside, beside or beyond tactics and tactical media activisms. Paratactic works do not suggest alternatives or substitutes, but they demonstrate how other things can be realized by using similar tools, in parallel.

Paratactical works re/use and reveal the tactical/hidden operations of corporations/governments/platforms and subvert invisible/algorithmic domination mechanisms. They repeat, learn from and participate in DIY and P2P analogue low-tech cultures, design collaborative networks for the interaction and transaction of values, and produce knowledge.

By exploring and reusing the tactical operational processes of algorithmic governmentality, paratactic media works realize subversive actions and creative interventions which not only represent and imitate the task-specific nature of algorithmic systems but also produce performative and intervening compositions against ignorance, stupidity, extinction, degeneration, corruption, and destruction. Rather than being merely infected and pacified by the conditions and predications of their medium, paratactic media works produce background information about the medium as such, its obfuscated operational process, its users and their patterns of action.

In collaboration with politically minded artists, programmers, architects, makers, scholars, designers, engineers and algorithmic agencies, curating transdisciplinary research has become a critical tool for the development of paratactical works. This conversation is also part of such research. Curating should not be considered merely in a narrow sense here. It is about caring, curing and creating ways to build critical examination, collective thinking, and collaborative action. It opens temporary autonomous zones for speculative imaginations.

As we have experienced in Burak Arıkan’s Networks of Dispossession, I believe we can make things visible and create a critical mass to exhaust the bourgeois mind with a never-ending stream of revelations, so that we can challenge the dominating conditions of knowledge/power production and lack of imagination.

One of your other concerns is the development of alternative forms of money and finance outside of the mainstream banking system as a possible answer to the current financial crisis. The latest ubiquitous technology is financialization itself. Debt strike/forgiveness projects do not question the dominant definition of money and how it functions. Most users now understand the cynical logic of the free in which they’re caught. You say that this is the age of monetary experimentation. Alternatives are useful mirrors through which we can study the technics of the mainstream. Micro-credits and barter, crowdfunding, peer-to-peer (P2P) banking, time banks, mobile money, and crypto-currencies are examples of these parallel strategies. How do you think these financial strategies can be positioned in relation to the broader critique of global finance, and is it possible to operate autonomous systems outside of the influence of national banks, fiat currencies, and credit card companies?

Geert Lovink: The dying political left has been caught up in grand statements about the political: that we have to aim for the entire system, not just for a piece of the cake. This has resulted in a top-down, policy-driven agenda, defined by (invisible) professional experts who are unapproachable, surrounded by NGOs and think tanks that work on the level of socio-economic issues. In this fast-moving, disruptive, social media-driven world, that dull meta-level policy world is no longer backed up by social movements (as was still the case 15, 20, 25 years ago). There is no alter-globalization movement anymore. This is particularly visible in the arena of global finance. Remember the resistance against the IMF and the World Bank in Latin America in the 1980s? What’s left is DiEM25, the movement-turned-political-party that grew out of the Euro crisis in Greece a decade ago, which I support.

The austerity doctrine is as strong as ever, everywhere you go. Privatization and financialization rule. By now, these measures have been identified as ideologies. But that insight has so far not led to a neo-liberal legitimacy crisis. What worries me most, in response to what you raised, is the fact that experiments to develop alternative or additional forms of money are all in the hands of an aggressive, right-wing libertarian technical class of geek-entrepreneurs. There is hardly a counter-movement (we try to organize one with MoneyLab). It’s not sufficient to say that it is all a bubble, a Ponzi scheme about to implode. Nor do I believe it is sufficient to take control of the national bank and the ministry of finance. Money’s gone digital; it’s gone to heaven.

Ebru Yetiskin: Drawing on Smari McCarthy’s Engineering Our Way Out of Fascism, you discuss contemporary fascism as “the perfect union of state and business.” You argue that “today’s questions of political organization are technological in nature.” As we’ve recently experienced, blockchain apparently facilitated elections in Sierra Leone. With the upcoming presidential elections in Turkey in June, let’s discuss the use of blockchain in the making of fair elections. The Swiss-based blockchain startup Agora obtained permission from the National Electoral Commission to act as “an international observer” at 280 of roughly 11,000 polling stations. Sierra Leone election officials recorded the paper votes as they would in any other election. Then Agora’s team recorded those same votes on their blockchain and later published the results on their website. As it turned out, Agora’s involvement with the Sierra Leone election was merely a ‘proof-of-concept’ experiment. In other words, they proved that they could record an election and get the same result as government officials. It wasn’t a ‘blockchain election’; it was an experiment in market research. Having said that, would it be possible to safeguard elections not through a private company but through a collective of non-profit civil society organizations, such as Vote and Beyond in Turkey? How can peer-to-peer solidarity be more effective in direct political action?
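In its simplest reading, such a ‘proof-of-concept’ amounts to keeping an independent, tamper-evident record of the per-station tallies and checking that it matches the officially announced results. The sketch below assumes nothing about Agora’s actual software; station names and vote counts are invented for illustration.

```python
# Hypothetical sketch of an election observer's record: an append-only, hash-chained
# log of per-station tallies, checked against the officially announced results.
import hashlib
import json

def chain_entries(entries):
    """Link tally entries so that altering any earlier entry changes every later hash."""
    chained, prev_hash = [], "0" * 64
    for entry in entries:
        payload = json.dumps(entry, sort_keys=True)
        digest = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
        chained.append({"entry": entry, "prev_hash": prev_hash, "hash": digest})
        prev_hash = digest
    return chained

def verify_chain(chained):
    """Recompute the hashes; any tampering with a recorded tally breaks the chain."""
    prev_hash = "0" * 64
    for block in chained:
        payload = json.dumps(block["entry"], sort_keys=True)
        if block["prev_hash"] != prev_hash:
            return False
        if hashlib.sha256((prev_hash + payload).encode()).hexdigest() != block["hash"]:
            return False
        prev_hash = block["hash"]
    return True

# Observer's record of what was counted at each station (invented numbers).
observer_tallies = [
    {"station": "PS-001", "candidate_a": 312, "candidate_b": 287},
    {"station": "PS-002", "candidate_a": 198, "candidate_b": 401},
]
ledger = chain_entries(observer_tallies)

# Officially announced results to compare against (invented numbers).
official_tallies = {
    "PS-001": {"candidate_a": 312, "candidate_b": 287},
    "PS-002": {"candidate_a": 198, "candidate_b": 401},
}

assert verify_chain(ledger), "observer ledger has been tampered with"
for block in ledger:
    e = block["entry"]
    match = official_tallies[e["station"]] == {k: v for k, v in e.items() if k != "station"}
    print(e["station"], "matches official result" if match else "DISAGREES with official result")
```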

Geert Lovink: I am particularly proud of the early Dutch hackers’ campaigns against electronic voting machines. Before we continue this debate, let’s make sure that we have ‘our’ hackers on board to permanently test these systems, even the ones that ‘we’ design and promote. Counting votes is one issue. Constitutional reform would be another. Laws to rein in campaign financing would be yet another (think of the work Lawrence Lessig did in 2015-2016).

I am not concerned with ‘fake news’. This is a non-issue for me, as the whole debate leaves out the role of think tanks, big donors, advertorials, ministries, secret services, PR and marketing firms in the ‘agenda setting’ process. Fake news from outside ‘disrupts’ their business-as-usual, and this is why the political class is so nervous about it. Instead of talking about others, let’s focus on our own role: how can we set the election agenda? Where are our forces of change? Why bother about fair elections if we’re not even playing a role in that game? Let’s first build a strong and diverse movement and then discuss our employment of the blockchain. Let’s not start there. First of all, blockchain needs to be decoupled from the money aspect so that it can show its full strength as a secure, distributed database.
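To give a sense of what a blockchain ‘decoupled from the money aspect’ might amount to in practice, here is a minimal sketch of an append-only log replicated across a few independent organizations, where agreement on a single hash implies agreement on the whole history. All names and records are invented; a real system would add peer-to-peer replication and consensus on top of this.

```python
# Minimal sketch of a shared, tamper-evident log with no coins or tokens involved.
# Several independent organizations each keep a copy; if their head hashes agree,
# they agree on the entire recorded history. All names and records are invented.
import hashlib

class SharedLog:
    def __init__(self):
        self.records = []

    def append(self, record: str):
        self.records.append(record)

    def head(self) -> str:
        """Fold the whole history into one hash; any edit anywhere changes it."""
        h = "0" * 64
        for record in self.records:
            h = hashlib.sha256((h + record).encode()).hexdigest()
        return h

# Three independent copies of the same log.
replicas = {name: SharedLog() for name in ["NGO-A", "NGO-B", "university"]}
for record in ["2018-06-24 polling stations opened", "PS-001 tally published"]:
    for log in replicas.values():
        log.append(record)

# All copies agree on the history, so their head hashes match.
print({name: log.head()[:12] for name, log in replicas.items()})

# If one copy is silently edited, its head no longer matches the others.
replicas["NGO-B"].records[0] = "2018-06-24 polling stations stayed closed"
print({name: log.head()[:12] for name, log in replicas.items()})
```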

Ebru Yetiskin: In order to start from there, we have to deal with a serious problem.  How can critical creative artists/scholars/activists/geeks survive? How do we generate value now, in an economy that is designed against us? After the copyright regime lost its legitimacy, ages ago, how will creative workers make a living? The Personal is Financial. How can we shape the art of living by critically using technology?

Geert Lovink: This is the MoneyLab question. For us this became urgent in the Dutch austerity years of 2011-2013, when the cultural budget was cut in half and the cracks in the ‘economy of the free’ became visible. We believe in parallel experiments in order to create additional money flows. A structural solution may come from the ‘universal basic income’. I enjoyed that from 1984-1992 and it made possible all the work that I did from the 1990s till now. What happens to talent when housing, health care and daily costs such as clothing and food are taken care of? I am in favor of a massive intervention in this debate from the side of artists. The current art market does not work for the vast majority of them. They are not even remotely participating in the global economy of collectors, curators, and galleries. Add to this the reality that many artworks are now digital and there you have a systemic crisis that only gets worse. That’s the analysis. Our alternatives are not there yet. We only have the collective imagination to think out loud, conduct our money experiment and take back the discursive space from the right-wing libertarians.

Ebru Yetiskin: Contemporary art market actors are faster than independent cultural workers in using blockchain for the reproduction of their authority. Meanwhile, we are dealing with tech-ignorance, software illiteracy and the vanishing of the humanities. It will also become much easier for blockchain-based platforms to track and control the works of digital artists. Although much emphasis is given to surveillance and punishment, you stress the importance of Foucault’s later work on the ethical care of the self. How to shape the ‘art of living’ with so much going on simultaneously? How to minimize domination and shape the new technologies of the self? Any good examples of recent work that you might want to share with us on this?

Geert Lovink: Let’s not be pedagogical and instead stay close to the messy everyday of the connected billions. Let’s reduce complexity and start to build tools, apps and platforms that are user-friendly and not focused on the requirements of the Western male geek. The eco-feminist, post-colonial critique of technology should urgently move to the next stage and start building applications, programming languages, smartphones. Why not? Ever since the experiences in Amsterdam with Fairphone, we can see it’s possible. From a cultural/gender/post-colonial perspective, the free software and open source movement has proven to be a dead-end street, a complete and utter disaster. However, we cannot simply walk away from it all. We do not merely need more women etc. involved; the entire industry needs to be rebuilt, from scratch.

Ebru Yetiskin: It’s also critical for me to work with those who build stuff, so that we can learn from each other about how to govern autonomously and act collectively, most likely in a distributed manner. Building collective curatorial research paratactically is a way of staying close to the messy everyday of the connected billions.
