This party never stops, time is dead and meaning has no meaning. Existence is upside down and I reign supreme.
My daughter loves reciting whole dialogues from Disney’s animated series Gravity Falls, but this particular quote, by the show’s demonic antagonist, has really stuck with me as a prophetic description of our post-postmodern time: “This party never stops, time is dead and meaning has no meaning”.
I remember how, back in early 2014, I was a guest on Dutch television, arguing that King Willem-Alexander and Queen Máxima shouldn’t go to Russia and drink champagne with Putin. It was just a few weeks before the Russian invasion of Crimea. The host asked me something along the lines of “Why should we be careful with Putin? What danger, what threat does he pose to the West?”, to which I answered: “Russia’s main export is not oil and gas, but the idea that there is no truth.”
Year after year, the Kremlin has poured hundreds of millions of dollars into this particular export. Putin signed a decree forbidding budget cuts in this area and handed out military awards to the heads of the media corporations producing this brain fog.
Whether it was the idea of Russia’s grey cardinal, Putin’s former advisor and postmodernist writer Vladislav Surkov, or one of Putin’s own psychological manipulation strategies, at some point in Russian foreign politics this whole post-postmodern surrealist tactic, the notion that nothing is what it seems, everybody lies, everybody has their own truth, started spilling abroad.
It wasn’t about censorship; it was about deflating once meaningful words such as ‘war’ or ‘nazi’ into empty sounds. It wasn’t about keeping secrets, but about creating so many wild and vastly different versions of every event, in an already informationally saturated environment, that the audience would start thinking the truth was unknowable. It wasn’t about expressing an opposing view, but about attacking what is called “capillarity”, the organic, meaningful interconnectedness and discussion on the web, by fabricating swarms of fake identities at troll farms that would break up meaningful connections and make discussion futile.
The Oxford Internet Institute, which studies state-controlled media, published a report in 2020 stating that one of RT’s objectives was to encourage conspiracy theories about media institutions in the West in order to discredit and delegitimize them.
Of course, this gaping abyss between image and meaning is hardly Putin’s invention. Jean Baudrillard’s theory of empty simulacra has become a classic in media studies, and the condition it describes has only been exacerbated by digital technologies.
Back in 2006, I wrote my MA thesis at the Media department of the University of Amsterdam about how the image of Islam in the media had lost its meaning. I must apologize for the graphic allusion, but the sinister staging of the terror attacks of the early 2000s was like a postmodern reenactment of René Magritte, especially Golconda, the painting depicting well-off men in suits suspended in mid-air. Everything was for show, for the horrific theatrical effect. The political leaders behind fundamentalist suicide teams manufactured their own simulacra of Islamic ideology and disinformation not because they believed in them, but because it was their strategy and it produced a terrifying effect.
The Kremlin is doing something very similar, but taking it a whole lot further: Ceci n’est pas une pomme / pipe. “This is not a war”, “These are not pregnant women being shelled”, “This is not a trade center being targeted”, it says, while waging a bloody war and targeting civilians in broad daylight, its main goal being to spread terror. The writer Vladimir Sorokin has even suggested that Putin may not be hoping for victory, but doing everything purely for the horrific effect, for the media, for historic attention. Like a suicide terrorist. And the Russian film director Sergei Loznitsa has observed how, in reaction to hopeless, desperate situations, humans rely on surrealism as a coping mechanism.
In November 2021, Surkov penned yet another essay, openly stating that Russia should be pushing its tension and “social entropy” (read: exporting terror and chaos) across its borders, relieving itself of it in a strange expression of the second law of thermodynamics.
It is in this uncertainty (created, of course, not only by orchestrated disinformation campaigns but also by the sheer pace of change in today’s world) that many people, even in the democratic world, become extremely susceptible to conspiracy theories and manipulation: these at least offer simple explanations, and culprits to blame for their misfortunes, in an otherwise chaotic sea of information.
In the ‘Gravity Falls’ animated series, the diabolic antagonist says he liberates the world this way, by stripping it of meaning. Indeed, if facts no longer matter, if the borderline between war and peace is smudged, there is no grip, no future. Time is dead. WW2 never ends, fascists keep coming back. Nothing you do matters anymore. No crime. No mass murder.
So what should be our coping mechanisms in the face of this, as disinformation researcher Peter Pomerantsev put it, postmodern politics of “radical relativism”?
I was quite taken aback to hear that a few of my European colleagues from the Freedom of Information Coalition had lodged a complaint about the legitimacy of the EU-wide blockade of Russian state channels such as RT and Sputnik, “calling the proportionality of the measure into question”. In May 2022, they signed a “petition that seeks to preserve access to all information” and sent it to the European Court in Luxembourg.
At a time when everyday political decisions about military support of Ukraine literally decide the fate not only of Ukraine but of Europe and the West, entities that still have meaning, albeit fragile, such as European NGOs, are spending their resources on trying to help the Kremlin’s main instruments of manipulation back into the EU, and hence to empower them to influence those very political decisions.
Cybersecurity experts identify two main components of the field. The first is protection from coordinated disinformation and harassment campaigns, especially, one must add, at a time of unprecedented aggression.
The second is putting internet users in charge of their private data, to make sure that the likes of Cambridge Analytica never get hold of it. This cannot be achieved as urgently, but it is being very actively discussed now under the banner of self-sovereign identity: DIDs (decentralized identifiers) and decentralized networks, technologies that should enable all of us to store our data in our own wallets and nodes instead of on social media accounts.
The same technology would also make it easy to verify social media accounts as genuine, as opposed to automated bots or fake identities.
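The verification idea behind DIDs can be sketched in a few lines: an account proves it controls an identifier by answering a random challenge with a cryptographic proof that only the holder of the wallet’s key can produce. The following toy sketch illustrates that challenge-response flow; all names are invented, and a stdlib HMAC stands in for the asymmetric signatures (e.g. Ed25519) that real DID methods use, where only a public key would ever leave the wallet.

```python
import hashlib
import hmac
import secrets

class Wallet:
    """Toy user wallet holding a secret key that never leaves the device."""

    def __init__(self, did: str):
        self.did = did
        self._key = secrets.token_bytes(32)  # stays on the user's device

    def public_record(self) -> dict:
        # In a real DID document this would be a *public* key; sharing the
        # secret here is only a simplification for the stdlib-only sketch.
        return {"id": self.did, "key": self._key}

    def sign(self, challenge: bytes) -> bytes:
        # Prove control of the identifier by keying a MAC over the challenge.
        return hmac.new(self._key, challenge, hashlib.sha256).digest()

def verify(record: dict, challenge: bytes, proof: bytes) -> bool:
    """Platform-side check: does the proof match the registered key?"""
    expected = hmac.new(record["key"], challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, proof)

wallet = Wallet("did:example:alice")
challenge = secrets.token_bytes(16)   # freshly generated by the platform
proof = wallet.sign(challenge)        # computed inside the user's wallet
print(verify(wallet.public_record(), challenge, proof))  # prints: True
```

A bot farm fails this check because each fake account would need its own persistent wallet key, which is exactly the costly, verifiable anchor that fabricated identities lack.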
Independent researchers are also busy developing verification AIs, as well as strategies to train social media algorithms and to hold tech giants accountable.
A recent MIT Media Lab study published in Nature found that “the current design of social media platforms—in which users scroll quickly through a mixture of serious news and emotionally engaging content, and receive instantaneous quantified social feedback on their sharing—may discourage people from reflecting on accuracy”. The researchers suggest “interventions that social media platforms could use to increase users’ focus on accuracy. For example, platforms could periodically ask users to rate the accuracy of randomly selected headlines, simultaneously generating useful crowd ratings that can help to identify misinformation. Such an approach could (in the long run) potentially increase the quality of news circulating online without relying on a centralized institution to certify truth and censor falsehood”.
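The crowd-rating mechanism the researchers describe is simple to picture: collect many users’ accuracy scores per headline, then flag headlines whose average rating falls below some threshold, with no central arbiter of truth involved. The sketch below illustrates that aggregation step; the function name, the threshold, and the minimum-rater cutoff are invented for illustration, not taken from the study.

```python
from collections import defaultdict
from statistics import mean

def aggregate_ratings(ratings, min_raters=3, flag_below=0.4):
    """Flag headlines whose mean crowd accuracy rating is low.

    ratings: iterable of (headline, score) pairs, score in [0, 1],
             as gathered from periodic "rate this headline" prompts.
    Requires at least `min_raters` ratings before flagging, so a single
    hostile rater cannot bury a headline on their own.
    """
    by_headline = defaultdict(list)
    for headline, score in ratings:
        by_headline[headline].append(score)
    return [
        headline
        for headline, scores in by_headline.items()
        if len(scores) >= min_raters and mean(scores) < flag_below
    ]

sample = [
    ("Scientists confirm water is wet", 0.9),
    ("Scientists confirm water is wet", 0.8),
    ("Scientists confirm water is wet", 0.95),
    ("Secret cabal controls the weather", 0.1),
    ("Secret cabal controls the weather", 0.2),
    ("Secret cabal controls the weather", 0.15),
]
print(aggregate_ratings(sample))  # ['Secret cabal controls the weather']
```

The study’s deeper point survives even this crude version: the signal comes from distributed readers reflecting on accuracy, not from a centralized institution certifying truth and censoring falsehood.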
Yet, the most powerful antidote to manipulation in the free world is media literacy. This is why we should stop enforcing the stereotype of the web as a dark, dangerous place and instead see it as a tool for digital democracy, meaningful connections, and knowledge. The algorithms reinforce what we make of it.
While collecting signatures in support of independent Russian journalists and building (with the help of my children) a Discord server for those journalists, I was shocked by the low levels of tech literacy among my colleagues.
It is essential that kids and teens, currently subject to inhumane ageist discrimination and a screen-control culture imposed by regulators who lag years behind in their understanding of technological processes, get better access to the web and the emerging metaverse. It is those who have been highly regulated, and unable to teach themselves basic media literacy, who most easily fall prey to manipulators.
Those who have only been allowed a couple of hours online per day come to that time exhausted, used to regarding it as unwind time for meaningless scrolling. Let’s enable a new generation of meaningful creators, respected in their right to all the time they need to get to the bottom of things in the world’s pool of knowledge. Let’s be there for them and hear them out, instead of controlling them.
In July, INC published the 44th volume of its Theory on Demand series, Dispatches from Ukraine: Tactical Media Reflections and Responses, which includes this piece. Order a physical copy or download the whole publication free of charge here.
Cover image: still from Guido van der Werve, Nummer Vier, 2005.
 Elswah, M., & Howard, P. N. (2020). “Anything that Causes Chaos”: The Organizational Behavior of Russia Today (RT). Journal of Communication, 70(5), 623–645. https://doi.org/10.1093/joc/jqaa027.
 Chazan, G. (2022, June 24). Writer Vladimir Sorokin: ‘I underestimated the power of Putin’s madness.’ Financial Times. https://www.ft.com/content/1f4bd315-7753-4e7a-be4e-0ea7e31522b9.
Pomerantsev, P. (2019). This Is Not Propaganda: Adventures in the War Against Reality. Faber & Faber.
Applebaum, A., & Pomerantsev, P. (2022, February 16). How to Put Out Democracy’s Dumpster Fire. The Atlantic. https://www.theatlantic.com/magazine/archive/2021/04/the-internet-doesnt-have-to-be-awful/618079/.
Pennycook, G., Epstein, Z., Mosleh, M., Arechar, A. A., Eckles, D., & Rand, D. G. (2021). Shifting attention to accuracy can reduce misinformation online. Nature, 592, 590–595. https://doi.org/10.1038/s41586-021-03344-2.