Interview with Geert Lovink by Taras Nazaruk
This conversation was recorded online on March 14, 2018, at the very beginning of the Facebook Cambridge Analytica scandal. It was recently published in Ukrainian by the cultural magazine Korydor: http://www.korydor.in.ua/ua/opinions/geert-lovink-pomylky-majut-buty-vepravleni.html. I later edited and shortened the original English text.
TN: In preparation for this conversation I read various materials of yours, including your Twitter posts. One of your tweets laid out three steps to fix the internet: separate the social from the news, empower users with collaborative tools, and merge protocols with p2p revenue channels. Is the internet broken?
GL: The phrase “the internet is broken” has a history and became prominent in June 2013 with Edward Snowden’s revelations. For my generation, this came as a shock. Not because of the content but because of the realization that so many things we had been suspecting were suddenly right in our face. It was the shock of the evidence, which differs from the shock of the new, or from traumatic shock. The revelations did not come as a relief, as in “I told you so, this was the case”, when you feel that the Truth is on your side. The Snowden evidence was more sinister, casting a long shadow into the future. It is the realization that all movements, words and clicks, digital or not, can and will be recorded, even if we have no clue. Earlier on, my generation believed that engineers, the people who built the internet and maintain its software and infrastructure, were in charge. In short, that the internet is a human construction. In that case one can take it apart, deconstruct it and understand how it works. This world view was shattered.
Until recently, control happened through panoptic, visible forms that were internalized. Surveillance cameras were visible. The fact that we can’t really see them anymore leads to a culture of general uncertainty. The invisible, microscopic methods that Snowden revealed make it much harder for us to understand the ideological premises. We’re thrown back into Plato’s cave, condemned to stare at mesmerizing interfaces. What Snowden proved was that there are much larger forces at play that we don’t see and that operate at minuscule levels: even tiny parts that can be inserted inside a USB stick, something we would never think of or notice. These surveillance systems are the complete opposite of old-school surveillance à la Orwell, in which visible elements like “we are watching over you” always have to be manifested. Deleuze did not anticipate this either in his notion of the ‘control society’. In the past, power put in considerable effort to show that it was in control. What do the new forms of self-control look like? Related to this is the tactical question: will we be able to trace these devices, this software, with our own counter-instruments? Or should we massively ignore the invisible collection of our behavior and social life?
When we say that the internet is broken, we are talking about this new culture of uncertainty, in which we have no real idea how, and by whom, the data we generate is going to be used. We can only survive the digital era if we can ignore it and overcome its ephemeral nature. If we know that everything will be stored and used against us, we stop living. I am not sure if the autonomous pathos of ‘smashing’ the ‘technological violence’ (as theorized by Detlef Hartmann and his followers) is a desirable strategy. I much prefer the Italian credo that we are ‘incalculable’. Algorithms cannot be besieged. Instead, we should prove that their ethical and cultural value is zero, that human ingenuity is far more clever and complex. Algorithms should be rendered useless. In that sense I do not believe in the distinction between good and bad algorithms.
TN: The materials Snowden published were collected by governments.
GL: I disagree. Look at the evidence that Snowden came up with. There is such a high level of involvement of companies like Google, Facebook, Apple and so on. They are complicit in and intertwined with ‘surveillance capitalism’ (Shoshana Zuboff). This is a new form of what, in the past, was called the “military-industrial system”. We now have a surveillance-capitalist system, and it works in a similar way to the military one. There is no way to separate the state from the big IT players.
TN: Are people aware of how the data centres owned by Google and Facebook are being used? Do people agree to this, or do they just not know?
GL: There are different degrees of awareness. We can say that five years ago most Facebook users had little or no idea. These days people have at least an idea of what is going on. We don’t understand the precise way algorithms work because they are closed objects; we can’t study them (this is what Frank Pasquale called the ‘black box society’). We have no access to the software that tracks us. However, we are developing a sense of what they do. And a lot of people these days are collecting evidence. We could call this ‘algorithmic governance from below’.
An example. Perhaps you’re aware that in metropolitan areas across the globe, Chinese rental bikes from rival companies are being dumped on the streets. The containers arrived overnight and cities were flooded. A little later people realized what the business model behind this was. It is not that these companies rent out the bikes and make a profit from the rent of each one. No, these companies collect data on the people who use the bikes: how long they ride around, where they go. All of this is gathered into profiles and matched with data such as gender, age, financial situation, profession, living area and so on. The app the riders install is a huge data collection machine, built to sell these data to third parties. On the side of academia and civil society we finally see progress in the critical understanding of how these new data regimes operate.
TN: Facebook users are finally starting to realize that their data is being collected. However, they still use the service, knowing that their private information is stored in data centres. Is there some kind of agreement, do people do this intentionally? Why do people still agree to use Facebook when they know they no longer control their data?
GL: This is the classic problem Slavoj Žižek deals with. We know Facebook is bad, yet we continue to use it. People thus bring the idea of ‘false consciousness’ to a higher level. We should apply this notion to the world of social media. Users know their data is being collected and that this can potentially be used against them. We should add to this that Facebook and Google are large advertising companies; 85% of their revenue comes from advertising. So we should look at this whole phenomenon from that perspective. They are in the business of selling us advertisements and products. This is the cost of the Silicon Valley model, the economy of the free, in which I get their services for free and in exchange give them my data and am exposed to ads. That is the social contract of the 21st century, which is no longer between citizens and the state but between users and social media platforms.
TN: Speaking of public awareness and the social agreement between company and user… will the recent Cambridge Analytica case influence people’s attitudes?
GL: What CA shows are the techniques of micro-targeting, based on a detailed awareness of what’s going on in certain populations, parts of the city, even down to street level. If there are sentiments one can read, quantify and analyse, you can then start to manipulate them. You can serve certain areas with other types of information in a way that no medium has been able to do before. Imagine that in Ukraine you could feed one specific part of town with Russian propaganda and another part with anti-Russian education. This is a completely amoral operation because, in the end, the customer is paying. If you want a civil war, we can produce one for you. After the info-war, weapons will be sent, and soon after that people will start killing each other. It’s not all that hard. That’s the problem. There is no general information anymore. Everything is contextualized, down to the level of the individual user.
TN: Last year the Ukrainian government banned social networks such as VKontakte and Odnoklassniki, considering them a threat to national security because they came from Russia. This was one of the measures to protect Ukraine (according to the government). The responses were mixed, as VKontakte and Odnoklassniki were among the most popular social media in Ukraine. After that measure, Facebook started growing.
GL: The same can happen to Facebook as well. One of the ironies in America is that Zuckerberg himself was a supporter of Hillary Clinton, and yet the analytic tools of his company helped Trump win the election. How contradictory is that? To use the old-school reading of what in the past was called objective interest: is it in the objective interest of Facebook to share data with just anyone? The underlying reason for this is the automation of all these processes. Facebook consciously keeps only a handful of employees. They have delegated all these sensitive issues to their software protocols and then, retrospectively, apologize for their mishaps.
TN: Let’s talk about China’s social credit system, in which every citizen has their own social credit rating, based among other things on their online activities. This information is provided by private companies, and the government uses it to build a ranking of its citizens. Can we say that this is how a government uses technology and private enterprises in order to build a society based on a social credit system?
GL: Yes, but the situation is not so clear outside of China. The question is whether Russia, Iran, Turkey and other countries we could call authoritarian states will follow the Chinese example. Maybe they will prefer another form of control, and of hegemony, to use the term of Antonio Gramsci. The Chinese model is oppressive and pedagogical. Maybe the Russian state is less interested in raising and uplifting its citizens? In Russia, what can you do with high social credit? Probably nothing. In China maybe it means that you have access to more money, or insurance, or a house, or something like that. But is the Russian state going to provide you with that? Putin might be interested in the repressive side. In China they have to manage 1.4 billion people, whereas the Russian population is shrinking rapidly.
TN: You also mentioned that we need to take some steps back in order to develop a long-term strategy for decentralized and sustainable collaborative networks.
GL: I strongly believe in this. The Chinese government may not really allow such experiments. But there are other societies in which we still have an opportunity to build decentralized, federated networks and local initiatives around the digital commons, through concrete forms of collaboration and exchange.
TN: Do we need to go back to Fidonet or Usenet to see how we can develop a solution? What is the turning point where we can fix this broken internet?
GL: I do believe in local networks because I have experienced them first hand, in Amsterdam and Berlin, but also look at Spain and Italy. We know what these networks meant. We know the people we are communicating with, our communication is task- and goal-oriented, and it creates the social in real life. I think of my son’s soccer club. Current social media are not well equipped for that. What they produce are endless streams of interpersonal micro-exchanges. Collective decision making is absent in the dominant interfaces. There, everything is focused on updating and ‘news’. This is a corporate strategy, to focus our attention on the latest, but that’s not the point at a local level. Think of the old Twitter phrase “What are you doing today?”
Social networks should no longer be profile-centric. Everything now circles around the gateway of the profile. Without a profile it becomes more difficult for companies like Facebook, Google, Apple, Amazon and so on to analyze what we’re doing. Take the toys from the boys: disassemble the addictive distraction techniques. We could focus on new forms of dialogue and integrate tools for getting things done. But first we need to start with a critique of the old Web 2.0 premises. One of the problems of the blogging logic, for instance, is that there is a statement, and then there’s some space below for individuals to respond. If you look at the structure of social forums, it is not focused on this one digital object; it is designed much more like a flow.
TN: Speaking of future architectures… you mentioned a more egalitarian way of communicating. Is blockchain technology such a thing?
GL: Yes, some of its principles are good. The reality is not so decentralized, and we need to ask why. Distributed databases appeal to values that I indeed share and promote. Needless to say, the internet has become very centralized. The blockchain is not really meant for discursive processes per se; it is far too boring for that. Nonetheless, in the end it is something that we can experiment with and change. And please join the debate about that. Because yes, we are in an exciting new time in which a lot of the mistakes made by previous Silicon Valley generations will have to be repaired, and our generation will have to stand up and say we don’t want those tools anymore. We need alternative architectures. For instance, architectures in which we as content generators, journalists, creative people, will be able to make a living, in which our financial concerns are an integral part of the architecture. Here we are talking about cryptocurrencies such as Bitcoin. This is maybe still a primitive way of doing that, but nonetheless it is a step in the right direction.
TN: What’s your opinion about the dark web and what happens there?
GL: You just have a number of services that cannot easily be accessed by crawlers, search engines and authorities. This is in itself interesting. Why is this still a possibility? It means that the development of the medium has not yet finished. And that is why there is hope. Why are the two of us having this conversation? Because we know that we can still repair, that there is still a tiny window of opportunity to steer internet culture in another direction. That’s the good side of the dark web. Whether the content is interesting, I doubt. I don’t find it very exciting. But the principle is exciting.
TN: We were talking mostly about global networks from a global perspective. If we scale down to the local level, we suddenly encounter the ideology of the smart city.
GL: Smart city solutions are the opposite of the local. They are global solutions, rolled out by companies such as IBM, to solve traffic and parking problems, control the flow of citizens in order to optimize capacity, conduct surveillance on the streets, monitor the speed of cars, see who is out walking, and do crowd control. These are global software packages that municipalities and private infrastructure companies purchase, along with expensive consultants connected to one of the global auditing firms. One of the first things I would do is take away the cameras and sensors in public spaces, to bring freedom back to the city.
TN: Would that make the city local again?
GL: That’s a good question. How can a city become idiosyncratic again? Is there something like urban serendipity? We all know the global developments, such as growing economic inequality, expulsions (Saskia Sassen) and global warming. We face a real dilemma: how can we break away from the global logic without conflict and… war? How can we imagine taking matters into our own hands, becoming autonomous and creating a new form of ‘isolation’ that is productive and creates beautiful local cultures? We need to open up that new space. At the moment this is done under the banner of terrible forms of nationalism, ethnocentrism and racism, in which the local is being recreated. That will never be productive, in my view. Let’s overcome the regression, and come together.