Interview with Emily Rosamond and Arthur Röing Baer
by Cristina Ampatzidou and Ania Molenda
Amateur Cities for Moneylab #3 Failing Better
CA: Could you introduce yourselves – what do you do and what are your main interests?
ER: I’m Emily Rosamond. I’m an artist, writer and lecturer in Fine Art Theory at the Arts University Bournemouth. I have a solo practice as an artist and I am part of the art collective the School of the Event Horizon, with Kate Pickering and Steven Levon Ounanian. I also write about contemporary art and finance in various ways.
ARB: My name is Arthur Röing Baer. I originally come from a design background, including its commercial side and advertising, from which I pivoted into design theory and contemporary political design practices and their possibilities. From there I turned to digital urbanism, where I see huge disruptions coming, paired with an equally huge responsibility for designers to come up with alternatives. That’s my current focus.
AM: You participated in two different panels at Moneylab #3. Emily in ‘Big Pocket Is Watching You’, centring on financial surveillance, and Arthur in ‘Cooperatives and Commons’, where among other topics platform cooperativism and distributed ownership were discussed. Distributed or collective forms of ownership make large-scale data collection indispensable. What do you think is the price we might have to pay for developing such forms of collaboration?
ARB: The conversation about privacy has been going on for a long time and it’s important, but it is also often based on privilege. Being able to step out of such platforms is a privilege, and most people can’t do it because their livelihood depends on using them; for example, Facebook is used as a third-party identity validator on gig platforms. So instead of leaving the platforms, we should focus on how we could use data’s unique quality – that it increases in value with quantity – to build new models where our data is used for shared advantage. And then on how these models in turn enable or inhibit commercial exploitation of that data. Who has access, and why?
ER: My PhD is titled ‘Economies of Character (or, Character in the Age of Big Data)’ and one of the things I have been focusing on is the fact that we face both a massive aggregation of data and a state of vast inequality with regard to who can access it and under what circumstances. This has obviously become apparent to the public through the 2013 Snowden revelations. Paradoxically, the passing of the ‘USA Freedom Act’ in 2015, which limited the NSA’s ability to hold dossiers on people, was considered a win for the NSA and the intelligence community, because it was too expensive for them to store that data anyway. Because of this act, they can mandate private corporations to store the data and still access it whenever they want, with just a few extra steps. So, although the NSA’s privileged powers seem to have been limited, in practice they have not been.
Stickiness in platforms such as Facebook is a huge deal. These platforms are so geared towards habituating users that people won’t leave even if they become dissatisfied with the conditions. For instance, I dislike Facebook, but I’m still on it because everyone I know is and it seems to be the default. So, a massive inequality as to who can own data, who has enough power and money to store it, and who manages and controls the infrastructure is coupled with rampant individualism that is predicated on these platforms. I’m interested in theorizing reputation capital – how clicks, likes, friend counts and all these metrics of sociality encourage people to stay isolated and individualistic on platforms where data is being mined on a massive scale. I am thinking about this irony a lot. I don’t have a good answer on how to break out of it, but I’m trying to think through the implications for art discourses. That’s what we are up against – we are trying to tip the scale away from massive aggregation and massive individualism, and that has to do with the conditions of the platforms we are on.
CA: We are surveilled on an individual level even though our data makes sense only when it’s compared or aggregated in bunches. We have personal ways to protect ourselves – such as deleting our profiles from Facebook, but are there also collective ways of protection we can come up with?
ARB: The ‘stepping out’ answer and the idea of starting your own platform are bullshit, because most platforms rely on a network effect; if you start a platform with 50 of your friends, they have to be really powerful people for it to make sense. What I’m interested in is changing the underlying logic of how these platforms work. I’m not opposed to them in any way and I think many are great inventions, but their underlying principles are problematic. Looking at changing these is where it gets really interesting. I think that is the most hopeful aim for real change in this area.
ER: For a lot of people, withdrawing from social media networks is not necessarily an option – or at least it’s not a sustainable one. In fact, given that many people feel they must participate in online forums, privacy is coming to be understood as a luxury that can be bought and sold. Michael Fertik, the founder of reputation.com, argues that there is an economy of privacy. If privacy is something that people like, maybe it’s something that needs to be sold at a higher price.
ARB: Buy back your data you mean?
ER: Exactly, buy it back, even though it was given away for free. Despite some very good activist projects like Wages For Facebook, people aren’t really used to the idea that they should get paid for their data because they are generating value for somebody else just by clicking on stuff online. Of course this is just an extension of unpaid labour.
ARB: At the same time it is interesting to see how little data is actually worth. One Google account is worth about 100 euros, though that varies drastically depending on which demographic segment you belong to. So if Google opened it up, many people could buy their data back. Individually, each of us contributes relatively little value.
Wages for Facebook (ep. 2) by Inhabitants
ER: What is interesting is that it is not really about individual data. It’s about how this data fits into an aggregate picture. Ian Hacking, in his book ‘The Taming of Chance’, unfolds the long history of understanding aberrant, abnormal and unusual behaviour as normal, once filtered through a meta layer of statistics. Hacking traces this back to the end of the Napoleonic era, when suicides, for instance, began to be tallied. This allowed people to observe a high degree of regularity in the rates of such abnormal behaviours per year – and thus to understand irregular behaviours as somehow also regular. In our time, many individuals are beholden to the power of statistical aggregates. Annie McClanahan’s research on character and creditworthiness gives the example of AMEX customers who received letters telling them: because customers who shopped at establishments similar to yours showed a lesser propensity to pay back their line of credit, your credit score will go down. There is a correlative logic that comes along with this aggregation. So it is not that your data or my data have lots of value for Google (for instance). It’s the ability to aggregate and process it in a sophisticated fashion that gives data its great value.
ARB: It is the tyranny of correlation as something one cannot do anything about.
ER: Absolutely! Jaron Lanier is really great on differentiating between two different understandings of what big data is. For science the challenge is to understand what scientific process looks like in an era of big data; progress is necessarily slow on this front. On the other hand, there is big data for businesses and finance, where any correlation that’s discovered before everyone else catches on becomes profitable – at least for a short time – simply because it is acted on, and perceived as valuable.
ARB: Which creates feedback loops, because these correlations often reinforce behaviours or patterns of the system within which they are observed; high-frequency trading is one such example.
Jaron Lanier on The Surveillance Economy and Extreme Income Inequality: You Can’t Have One Without the Other
CA: Do you think that distributed ownership could mitigate issues of inequality of access and ubiquity of surveillance? How could the necessity of transparency be balanced with the protection of privacy?
ARB: Even if we manage to create cooperatively owned distributed databases, replace an exploiting third party with a transparent protocol (such as blockchain) and share the productivity gains from the pooling of data, the problem will persist. Despite eliminating the internal third party, an external company can still mine that data for various correlations and profit from them.
ER: I totally agree. It’s exactly one of the big problems that people like Trebor Scholz are doing really good work on. There is a lot to be said on cutting income inequality and addressing what George Monbiot recently called klepto-remuneration in the Guardian. This refers to CEOs who pay themselves huge bonuses while cutting everyone else’s wages. I think Jaron Lanier points out really well that as cooperative or collaborative platforms seem to rely on volunteer labour, there are a lot of problems related to the fact that they operate within an informal economy. He asked people who haven’t been included in the formal economy what they think of the sharing economy, and they said it’s crazy, because there are good reasons to be part of the formal economy; you might be sick or wish to retire at some point. It’s just not the case that the sharing economy can break capitalism in a simple way. As Silvia Federici points out, unremunerated labour – and particularly women’s labour – has always been at the heart of capital accumulation. It’s hard to think through these complexities of sharing with respect to the big picture of valuation. Of course, volunteering can be wonderful; and yet in many cases, it is by no means ‘outside’ of capitalist regimes of accumulation, since it is capitalized on by platforms or managers.
CA: Mediators, as Brett Scott pointed out very clearly, have the power to surveil. Do you think there is a way of inverting the gaze in the cooperative models where the surveilled could monitor the surveillant?
ER: I was really intrigued when Evgeny Morozov asked a really simple question: ‘What if people understood email as public infrastructure?’ Instead of putting a stamp on an envelope and sending it, Gmail, for instance, sticks a couple of adverts in our email so we can send it for free – that’s the cost, right? What if we thought about email as part of public infrastructure, so that citizens would have more power to talk back?
I have absolutely no idea what it would take to think of email as part of public infrastructure or as a common, but I think it’s an interesting thought experiment. There is also a history of complex relationships here between public research and private gain. DARPA (the Defense Advanced Research Projects Agency), for instance, funded a lot of internet developments, which eventually went to a handful of Silicon Valley CEOs, resulting in an enormous public expropriation. Even though I cannot wrap my head around what it would take to make email, AirBnb or Uber public, it’s a really important point to think through.
AM: Arthur, your project on infrastructure seems to be related exactly to this question.
ARB: Yes, I am very interested in this idea of digital platforms being public utilities. Why for example is Uber not a part of our public transport system, gaining efficiency through pooled data and helping the project of mobility for all?
And one problem here with digital platforms run by the state is that they can be dissolved easily, and privatization is always an easier narrative than nationalization. This is the switchback where the left always loses. I would love to see Uber nationalized, but I am afraid that nationalization could be exercised as a form of state-driven violence against the entrepreneur – used to attract voters and then dissolved instantly after seizing power, with no physical infrastructure left except a database that could be appropriated for any purpose. Intermediaries such as the Commune system I am proposing could be seen as a guarantee that public infrastructure will not be dissolved when another government with intent to privatize comes to power.
AM: That makes me think of the notions of risk and responsibility. How can they be distributed in cooperative structures? Is there a way, except for nationalization, to ensure continuity?
ARB: Physical infrastructure can have a large social impact and is simply really expensive to remove. The big problem with digital infrastructure is that it can easily be flipped – you change the code and it belongs to somebody else.
ER: Interesting, isn’t it? Data storage is phenomenally expensive but actual platforms are not. Yet again Trebor Scholz pointed to this really beautifully when he said that Uber, for instance, is practically nothing; it’s just a bit of code, and that gives it the potential to flip, to lose its enormous monopoly. On the other hand, just like you said, if platforms were somehow nationalized, the minute a new government comes in, everything could disappear.
I subscribe to Jonathan Nitzan and Shimshon Bichler’s theory of capital as power, set out in ‘Capital As Power: A Study of Order and Creorder’, in which they theorize capital as power and power as confidence in obedience. This confidence in obedience allows big players to get away with not bearing the same risks that smaller players do. In 2008, with the US financial bailout, we saw that risks were placed on individuals; they were expected to bear responsibility for the failures of those at the top. Banks and CEOs were not expected to bear responsibility at all. There has been so much good work done on pointing this out, but the question of how to respond to this massive aggregation of power remains more difficult. The authors of ‘Capital as Power’ are in fact incredibly pessimistic; there is not a happy ending to their story.
CA: Do you expect similar phenomena to occur within structures that are cooperatively managed? Forms of representation and management are needed there too. Why would accountability be expected to work differently in these scenarios, what is the factor that makes the difference?
ER: Tranches of management might be quite interesting here, as the interaction between people in different management levels tends to be extremely limited. Within any of those management layers, let’s call them, all kinds of power dynamics can develop and there is a really interesting ethical practice involved in being attentive to these dynamics. But that’s not necessarily enough to make it translate upstream. As Michel Feher puts it, that might have something to do with understanding one’s position as an investee, and trying to think through what can be done with that – to withdraw if necessary, or to intervene in what is being invested in you, as a worker or as a beneficiary.
A blockchain is a distributed data store that maintains a continuously growing list of data records that are hardened against tampering and revision, even by operators of the data store’s nodes. The most widely known application of a blockchain is the public ledger of transactions for cryptocurrencies, such as bitcoin. This record is enforced cryptographically and hosted on machines running the software. (definition by Aeze Soo via P2P Foundation) Recommended reading: https://aeon.co/essays/how-blockchain-will-revolutionise-far-more-than-money
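The tamper-resistance mentioned in this definition comes from hash-chaining: each block includes the hash of the previous one, so every block commits to the entire history before it, and changing any old record breaks all subsequent links. The following is a minimal illustrative sketch of that principle in Python (not any real blockchain implementation; it omits the distributed consensus between nodes that actual systems rely on):

```python
import hashlib
import json

def block_hash(contents):
    # Hash the block's contents, which include the previous block's hash,
    # so each block commits to the entire chain before it.
    payload = json.dumps(contents, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain, records):
    # Link the new block to the hash of the last block (or zeros for the first).
    prev = chain[-1]["hash"] if chain else "0" * 64
    contents = {"prev_hash": prev, "records": records}
    block = dict(contents, hash=block_hash(contents))
    chain.append(block)
    return chain

def verify(chain):
    # Recompute every hash; any edit to an earlier block breaks a link.
    prev = "0" * 64
    for block in chain:
        contents = {"prev_hash": block["prev_hash"], "records": block["records"]}
        if block["prev_hash"] != prev or block["hash"] != block_hash(contents):
            return False
        prev = block["hash"]
    return True

chain = []
append_block(chain, ["alice pays bob 5"])
append_block(chain, ["bob pays carol 2"])
assert verify(chain)

# Tampering with an old record invalidates the whole chain.
chain[0]["records"] = ["alice pays bob 500"]
assert not verify(chain)
```

In real blockchains the same idea is combined with a consensus mechanism across many nodes, which is what makes the ledger hard to revise even for the operators of individual machines.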
You can read more about this topic in Michel Feher’s paper ‘Self-Appreciation; or, The Aspirations of Human Capital’, or watch a lecture he gave at Goldsmiths titled ‘Lecture 8. Investee Activism: Another Speculation is Possible’.