Dividual Privacy in the Era of Algorithmic Profiling

This review was first published in the journal Global Media and Communication.

Review: John Cheney-Lippold, We Are Data: Algorithms and the Making of Our Digital Selves, New York: New York University Press, 2017, 315 pp., $27.73 (hardback), ISBN: 978-1-4798-5759-3.

by Patricia de Vries

One might say, in the spirit of Langdon Winner, that ‘algorithm’ is a word whose time has come. The claim that algorithms shape, organise and co-produce everyday life has given impetus to anxieties about the present and the future. It seems ‘algorithmic culture’ (Dourish, 2016; Galloway, 2006; Striphas, 2015) has become a shorthand for a nexus of concerns about the entanglement of the social and the algorithmic. Over the past decade, these concerns have manifested in a growing field of study: algorithmic studies. Drawing on software studies, race and gender studies, philosophy of technology, ethics, media studies, STS, and the social sciences, thorough critical research has been conducted on how algorithmic systems are deployed in a wide range of domains affecting the access, opportunities and livelihoods of populations (e.g. Browne, 2010; Bucher, 2018; Chun, 2009; Diakopoulos, 2014; Hansen, 2015; Kitchin, 2016; MacKenzie, 2006; Nakamura, 2009). Anxieties about algorithmic culture found a larger audience with popular titles such as Automate This: How Algorithms Came to Rule Our World (Steiner, 2012), The Black Box Society: The Secret Algorithms that Control Money and Information (Pasquale, 2015), and Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy (O’Neil, 2016).

John Cheney-Lippold’s We Are Data: Algorithms and the Making of Our Digital Selves is part of this expanding field. He contends that ‘the knowledge that shapes both the world and ourselves online is increasingly being built by algorithms, data, and the logic therein’ (p. xiii). The digital data we produce and leave behind is algorithmically interpreted, categorised, and employed by, amongst others, state security programs and for-profit corporations, on their own terms, for their own purposes and according to their own private and proprietary algorithmic logics. The digital selves we get assigned by the algorithmic systems of capital and state powers emerge in the constant and dynamic interplay between the data we produce and the different state and corporate powers interpreting that data. These datafied versions of ourselves are marketing, administrative and security categories, used for concomitant purposes; they are not ‘who we really are or who we choose to be’ (p. 25). Numerous (if not countless) actors constantly track, mine and categorise every (inter)action online, all according to their own rubrics, resulting in competing, modulating interpretations of data that determine who we are according to an algorithmic logic (p. 29). Insofar as the gap between ‘who we really are’ and our ‘digital selves’ cannot be fitted into zeros and ones, insofar as our embodied individualities cannot be translated into abstract formulas, they are irrelevant (p. 197). As a consequence, ‘we increasingly lose control not just over life, but over how life is defined’, Cheney-Lippold claims (p. 5). We Are Data enquires into the making of these so-called ‘digital selves’, how they are used to market, surveil and control us, and what can be done about it.

In the first part of the book, Cheney-Lippold explains how multifaceted and multilayered embodied experiences and individualities such as gender, race, and age are flattened out and simplified so that they can be algorithmically interpreted and turned into measurable types. In the second part, he describes how these measurable types form the backbone of what he calls soft biopolitics, and how they are rendered useful to corporate and state powers. Cheney-Lippold does not forgo appealing to emotions: ‘We kill people based on metadata’, reads the opening line of the first chapter, quoting the former director of the National Security Agency (NSA), Michael Hayden. Who is considered or categorised as a ‘terrorist’ by the NSA is, in part, the result of the algorithmic categorisation of metadata. What occurs is a sort of if-the-shoe-fits typecasting in which one’s data is fitted into the algorithmic mould of the measurable type ‘terrorist’. We do not account for ourselves in relation to who we are algorithmically made to be; it is algorithms that both interpret and speak for us (p. 192). These measurable types, or moulds, are not set in stone, he explains. Because of modulation they are dynamic, allowing different actors – who themselves remain in the dark – to define different measurable types according to different parameters, at different times and in different contexts, all in service of their capacity to control. These datafied selves may, in exceptional cases, result in a ‘kiss of death’ carried out by a US drone.

The last chapter is dedicated to how we can throw sand in the machine, how we can ‘mess with algorithmic assessment’ (p. 199), and more specifically how the concept of privacy can be remobilised and revitalised to respond to the regulating power of the algorithmic classifications wielded by state and corporate actors. Here, Cheney-Lippold returns to the moment in history in which privacy was first formulated as the right to be let alone, in the context of newspaper journalists using the then-new technology of photography to peer into and take pictures of the homes of the rich. He shows awareness of the extent to which privacy has historically been representative of the liberal-bourgeois legal traditions of privileged white men, protecting them against intrusion by external parties. And despite critical reflections on privacy by socialist thinkers (who reject the concept’s proprietary, possessive and individualist premises), by feminist thinkers (who reject its functioning as a sanctioning of the patriarchal organisation of the domestic sphere), and by race and decolonial thinkers (who reject the concept’s racist premises of who counts as a person deserving of a right to privacy and who does not), Cheney-Lippold still proposes a ‘dividual’ conception of privacy as the way forward, because of its alleged potential to ‘cut across the different gendered, racial and classed layers of our social world’ and hold power in check (p. 209). Dividual privacy, he contends, provides a ‘productive breathing space’ by disconnecting us from identification, and disrupts the attempts by state and capital to classify us (p. 235). As an example of such an obfuscation strategy he points to TrackMeNot, a browser plug-in that issues decoy search queries in order to confound pattern analysis and render unique identifiers nonsensical. Here, We Are Data loses its bite.

Although I agree that personal data should be seen as an ‘extension of the self’ and that we should ‘treat it with the same respect we would a living individual’ (p. 235), his notion of dividual privacy frames the problem as an informatic condition to which non-identifiability or non-traceability offers a partial answer. In this framing, privacy is still thought of in traditional terms: as protection against intrusions by external actors. In this context, TrackMeNot merely provides the ability to throw sand into the machinery of four different search engines. The concerns Cheney-Lippold raises throughout We Are Data are severe – the loss of ‘self-determination’, of ‘possibility’, even of life itself. What is more, these losses affect populations unevenly. The gibberish and obfuscation strategies Cheney-Lippold hails dehistoricise and depoliticise the socio-cultural, economic and epistemic backbone that fuels the production and exploitation of some digital selves more than others. The obfuscation provided by such browser plug-ins is tied and limited to the digital domain, and thereby remains rooted in the problematic liberal-bourgeois tradition of coding space, whereas the lived consequences of algorithmic profiling cut across divisions such as private and public, digital and real selves. Further, dividual privacy disrupts neither the positivist belief that data ‘speaks’ nor the capitalist belief in ‘growth’, and it maintains the liberal-bourgeois belief that information equals power. It is these very notions of data-power and values such as ‘growth’ that give rise to the measurable types and digital selves Cheney-Lippold is concerned with, and yet they remain virtually undisturbed in this book.

References 

Browne S (2010) Digital epidermalization: Race, identity and biometrics. Critical Sociology, 36(1): 131–150.

Bucher T (2018) If … Then: Algorithmic Power and Politics. Oxford: Oxford University Press.

Chun WHK (2009) Introduction: Race and/as technology; or, how to do things to race. Camera Obscura, 24(1(70)): 7–35. doi: https://doi.org/10.1215/02705346-2008-013 

Diakopoulos N (2014) Algorithmic Accountability Reporting: On the Investigation of Black Boxes. New York, NY: Columbia Journalism School, Tow Center for Digital Journalism. 

Dourish P (2016) Algorithms and their others: Algorithmic culture in context. Big Data & Society, 3(2): 2053951716665128. 

Galloway A (2006) Gaming: Essays on Algorithmic Culture. Minneapolis, MN: University of Minnesota Press.

Hansen MBN (2015) Feed-Forward: On the Future of Twenty-First-Century Media. Chicago, IL: University of Chicago Press.

Kitchin R (2016) Thinking critically about and researching algorithms. Information, Communication & Society, 20(1): 14–29.

MacKenzie D (2006) An Engine, Not a Camera: How Financial Models Shape Markets. Cambridge, MA: The MIT Press.

Nakamura L (2009) The socioalgorithmics of race: Sorting it out in Jihad worlds. In: Gates K and Magnet S (eds) The New Media of Surveillance. New York, NY: Routledge.

O’Neil C (2016) Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. New York, NY: Crown.

Pasquale F (2015) The Black Box Society: The Secret Algorithms that Control Money and Information. Cambridge, MA: Harvard University Press.

Steiner C (2012) Automate This: How Algorithms Came to Rule Our World. New York, NY: Penguin Group.

Striphas T (2015) Algorithmic culture. European Journal of Cultural Studies, 18(4–5): 395–412.
