Killing Intelligence: Death by Tech and Other Ordinary Horrors in Gaza

The Gospel, the latest AI gadget deployed by the IDF in Gaza, paves the way for a new mode of warfare that can further encourage the dehumanization of human beings behind a narrative of technological progress, efficiency, and accuracy. But Gaza shows us another danger too: even when AI is used to “rehumanize,” who decides who deserves to be treated as a human and who should remain in oblivion?

Illustration by Jafar Safatli

‘Stop, do not proceed’, the red flashing light suggests. But if the green light pops up, then it’s ‘a go’ and you may move on. Much like the familiar traffic lights accompanying the ordinary routine of our urban lives, ‘Gospel’ unfolds its lethal narrative of destruction in vibrant shades of red and green, transitioning mechanically from one color to another, indifferent to what those colors signify. However, unlike its urban counterpart, the objective of this machine learning system is not to safeguard life but to decree death.

The Gospel, or Habsora in Hebrew, is the latest tech gadget deployed by the IDF to wage war on Gaza. Building on available data, this AI identifies potential targets within the Strip and calculates the civilian casualties that might theoretically result from striking those locations. Essentially, the system offers an estimation of probable fatalities and evaluates the strategic value of targeting specific areas. This process relies on probabilistic inference, which involves analyzing previously gathered data to discern patterns and make recommendations based on probability. In stark contrast to the Greek etymology of its name, euangelion, meaning good news, the Gospel has instead unleashed unparalleled death and devastation upon Palestinian soil. Yet, despite its biblical namesake suggesting absolute truth and an infallible message, Habsora, like all machine learning systems, operates on correlation rather than causation and likelihood rather than certainty.

In addition to reinforcing the misleading human inclination to view machines as supposedly impartial and objective – which is a key factor shaping contemporary AI ideology and the ‘hype’ behind it – the integration of such technologies within military frameworks precipitates a drastic escalation in the pace of warfare and, consequently, in the dispensation of lethal force at an unprecedented speed. In what is perceived as a God-like manner, this AI possesses the capacity to analyze immense volumes of data and formulate decisions at rates far surpassing human cognition. As indicated by a former head of the IDF, while human intelligence analysts typically identify around 50 bombing targets annually, Habsora can generate up to 100 targets per day, accompanied by real-time recommendations regarding which targets should be prioritized. “We use an algorithm to evaluate how many civilians are remaining. It gives us a green, yellow, or red, like a traffic signal,” a former senior Israeli military source emphasized in an interview with the Guardian.
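To make that ‘traffic signal’ concrete: the sketch below is a purely hypothetical illustration, written for this article, of how any correlation-based model can be flattened into a colored label by comparing an estimated likelihood against fixed cutoffs. None of the function names, features, weights, or thresholds come from Habsora or any real military system; the point is only that behind the reassuring color there is nothing more than a statistical estimate and a threshold someone chose.

```python
# Purely hypothetical sketch: how a probability becomes a "traffic light".
# Nothing here is drawn from Habsora or any real system; the features,
# weights, and thresholds are invented to illustrate the general idea.

def estimated_likelihood(features: dict) -> float:
    """A stand-in for a statistical model: a score derived from past
    correlations, not a statement of causal fact or certainty."""
    score = 0.4 * features.get("pattern_match", 0.0) \
          + 0.6 * features.get("signal_overlap", 0.0)
    return max(0.0, min(1.0, score))

def traffic_light(probability: float, red: float = 0.7, yellow: float = 0.4) -> str:
    """Collapse a likelihood into a color; the cutoffs are arbitrary choices,
    which is exactly what the color display conceals."""
    if probability >= red:
        return "red"
    if probability >= yellow:
        return "yellow"
    return "green"

# A correlation-based score of 0.56 is displayed simply as "yellow".
print(traffic_light(estimated_likelihood({"pattern_match": 0.5, "signal_overlap": 0.6})))
```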

In promotional videos featured on the IDF website, the Gospel is depicted offering its identified targets to the military. Accompanied by a crescendo of dramatic music, the sequences evoke a video game-like atmosphere, with a viewfinder scanning, locking onto, and ultimately destroying targets. Buildings, appearing desolate and isolated, are methodically and triumphantly demolished. These promotional visuals echo the rhetoric of precision, accuracy, and effectiveness consistently emphasized by the Israeli military in public statements. They portray Palestinian territory as a mere array of targets to be aimed at and fired upon, rendering it a vacant landscape devoid of its inhabitants. Essentially, an empty wasteland. AI orchestrates war efforts with cold efficiency, speed, and a facade of ‘rationality’, transforming the battlefield into a clinical checklist, much like an assembly line in a manufacturing plant. At the same time, everything feels familiar and non-threatening, resembling the innocuousness of an urban setting with traffic lights, as if it were unrelated to the killing of humans and the annihilation of civilian infrastructure.

Within the Gospel’s psychogeography, where red, yellow, and green markers emerge on the Gaza map, the Palestinian people have been rendered into “power targets”. These targets extend beyond purely military objectives to include private homes, high-rise buildings, public facilities, and critical infrastructure. According to an investigation carried out by +972 Magazine and Local Call, since the early days of the current war on Gaza, the number of these power targets, alongside “operative homes” suspected of housing a single resident affiliated with Hamas, has outnumbered purely military targets. As outlined by various sources referenced in the article, long-standing military protocols in Israel typically mandate that strikes on power targets occur only when the buildings are unoccupied at the time of the attack. However, a notable departure from this norm has occurred since October 7, 2023, as evidenced by a multitude of news reports and social media snippets documenting the unprecedented annihilation of civilian infrastructure – and of civilian lives – in Gaza.

An Emerging Mode of Warfare?

Technologies like the Gospel act as tools for reinforcing an ideology promoting the dehumanization or inhumanization of the Palestinian people, a dynamic that has dramatically accelerated in the post-October 7 scenario.

Following Hamas’s attack on October 7, 2023, technologies such as the Gospel have become key in advancing a warfare strategy that emphasizes remote engagement. This approach is deeply rooted in a global ideology of surveillance and automated warfare, prioritizing notions of efficiency and faultlessness. This global trend of conducting warfare ‘from a distance’ further exacerbates the long-standing process of dehumanizing or inhumanizing Palestinians, a pattern established over many years that is closely associated with the indiscriminate killing of civilians.

The strategy of unleashing extensive harm upon the civilian population, in fact, is not novel for the IDF. Rather, it constitutes an expansion and intensification of the so-called Dahiya Doctrine, established in the 2006 conflict with Hezbollah. The underlying logic is that by employing disproportionate force against civilian targets, Israel aims to compel the populace to pressure the guerrilla group into halting its offensive. At the same time, this has served as a propaganda device targeting the Israeli public, emphasizing the extensive damage inflicted on the enemy and showcasing the military’s accomplishments in an effort to attract as many likes and shares as possible and win the public’s hearts and minds.

However, a fundamentally new dynamic is at play concerning power targets and the decimation of civilian life and infrastructure in Gaza post-October 7, 2023. There is an underlying belief that Gaza’s population is inherently complicit in the actions of Hamas, making it impossible to disentangle the two entities. In an op-ed for the Wall Street Journal, IDF Spokesman Rear Adm. Daniel Hagari asserts that “Hamas has systematically embedded its terror infrastructure inside and under civilian areas in Gaza as part of its human-shield strategy.” Similarly, a former intelligence official remarked that “Hamas is everywhere in Gaza; there is no building that does not have something of Hamas in it, so if you want to find a way to turn a high-rise into a target, you will be able to do so.” As Israeli President Isaac Herzog has unequivocally stated, “It is an entire nation out there that is responsible.” This logic of ‘embedment’ provides the rationale behind the unprecedented violence unfolding on the ground, a situation UN Secretary-General Guterres characterized as “a collective punishment of the Palestinian people.”

Palestinians have been insulted, degraded, and subjected to dehumanizing language in Israel’s official rhetoric. Israeli Defense Minister Yoav Gallant infamously declared as early as October 10, 2023, “We are fighting human animals, and we are acting accordingly.”

This degrading language is not new, though. Arabs are like “drugged cockroaches in a bottle,” former Israeli army chief of staff Gen. Rafael Eitan reportedly declared during an Israeli parliamentary committee hearing in 1983. Such comparisons can engender a primal impulse toward violence, similar to the visceral reaction to a vermin infestation, prompting efforts to eradicate the perceived threat. From cockroaches to the “microbes” used by the Khmer Rouge to label its opponents, genocidal history is replete with dehumanizing language that incites the violent expulsion and complete annihilation of those it targets.

Recognizing the colonial origins of these violent dynamics is essential, as emphasized by physicians Ghassan Abu Sitta and Rupa Marya in their article “The Deep Medicine of Rehumanizing Palestinians”. Naming, labeling, and categorizing are integral to the process of ‘othering’, which normalizes violence, particularly when the latter becomes metaphorically associated with eradicating a virus or infection. Palestinians have not only been labeled as “human animals.” They have also been portrayed as immune to trauma, depicted as individuals “whose bodies are not modern or civilized enough to experience trauma,” accustomed to living in a “culture of death,” as Lamia Moghnieh observes. Palestinian trauma is often dismissed, even when extensively documented in a vast array of visual imagery exposing settler violence, land encroachments, and various forms of abuse endured by the local population on a daily basis. However, as Antony Loewenstein asserts in his meticulously researched work, “The Palestine Laboratory: How Israel Exports the Technology of Occupation Around the World,” the media visibility of Israeli atrocities against Palestinians fails to resonate with those who do not perceive Palestinians as human beings but, rather, as “a racial group who deserve punishment and death.”

Using tech tools to further enhance this dehumanization process is a trajectory that began long before the current conflict, with Israel extensively employing AI-powered tools for security purposes in the Occupied West Bank since 2018. For instance, a facial recognition system named Red Wolf was utilized to unlawfully collect and categorize biometric data of the Palestinian population, enabling the monitoring, control, and restriction of their movements. This surveillance tool, disguised in the familiar and user-friendly form of a mobile app, has been used by the Israeli military at checkpoints, frequently without the knowledge of the Palestinians waiting in line. This practice aligns with what Amnesty International has characterized as “automated apartheid.”

Rehumanizing Through Technology

By contrast, Israeli companies have employed AI technology to (re)humanize Israeli civilians held captive by Hamas, aiming to garner support for their cause on the global stage. In touching videos translated into multiple languages, a 3-year-old blonde child is seen calling out for her mother. “Why are we here?” she asks, her eyes fixed directly on the camera. “I want to go home,” she adds, her gaze appealing directly to the audience. “I miss my friends,” a young boy’s voice echoes. “I miss playing soccer, Mom. Do my friends even think about me?” The closing scene is especially poignant: “I keep telling myself it’s just a story,” he says, “but it’s such a bad one. Who would write this kind of story?” he finally asks, gazing into the camera with his innocent yet sorrowful eyes. “Someone in Gaza needs to talk to you,” the opening caption of both videos reads. Featured are Yael and Nave Shoham, both abducted by Hamas on October 7, 2023, and held captive since. The closing credits underline, “To hear their voices, we needed to use AI,” subtly suggesting that the only voices considered legitimate from Gaza are those of the Israeli hostages. This leaves the local Palestinian population voiceless, entirely omitted from the narrative.

Shiran Mlamdovsky Somech, the founder of Generative AI for Good, the Israeli company behind the ‘Be their voice’ campaign, emphasizes the potential of machine learning technology “to effect real social change”. In particular, she asserts that generative AI “can foster empathy and offer human-like experiences…to improve and even save lives.” The company’s website showcases other initiatives aligned with this approach, such as “Listen to my voice,” where the AI-powered voices of victims of domestic violence advocate for societal change from beyond the grave, or “March of the Living,” which uses AI to bring Holocaust survivors back to life, allowing them to share their stories of resistance against Nazism to inspire future generations.

Here, the blending of archival footage with synthetic imagery breathes new life into historical documentation, infusing the narrative with a contemporary resonance. The voices of the Holocaust survivors, proudly referred to as ‘fighters’ on the company’s website, are imbued with a tone that emphasizes their acts of resistance. Their direct gaze into the camera serves to evoke a deeper sense of empathy and to inspire feelings of humanity and dignity. Additionally, the inclusion of a female fighter in the video further strengthens this narrative, indirectly enhancing the portrayal of Israel as a bastion of justice and equality. This parallels the IDF’s social media strategy that, before the current war on Gaza, sought to associate the government’s image with ideals perceived as Western, such as feminism and LGBTQIA+ inclusivity.

The “Be Their Voice” campaign fits within this ideological structure, both through its visual style and the message it conveys. It urges the international community to elevate the voices of the silenced, particularly the children held captive by Hamas. Unlike the Gospel, with its inherently dehumanizing and killing mechanism, here AI is utilized as an inventive platform to nurture empathy among viewers. The direct appeal of the AI-generated, childlike eyes seeks to evoke compassion, with the goal of rallying public support for their plight.

The only voices we hear from Gaza are those of these AI-generated Israeli children, further removing Palestinians from the narrative and implicitly endorsing the view that the area is solely under Hamas’ control, portrayed as a domain of death and violence, empty of both people and humanity. The genuine voices of Gaza’s population are strikingly missing, even in the synthetic realm of this seemingly benevolent generative AI ‘for good’.

Postscriptum

I wanted to test AI biases on this topic. When I input this text for proofreading, ChatGPT categorized it under the label ‘AI empowering Gaza’. It also suggested replacing expressions like ‘the killing of humans’ with ‘human casualties’, and ‘annihilation of civilian facilities and civilian bodies’ with ‘loss of civilian lives’.

Donatella Della Ratta

Donatella Della Ratta teaches Communications and Media Studies at John Cabot University in Rome. She holds a PhD from the University of Copenhagen and is a former Affiliate of the Berkman Klein Center for Internet and Society at Harvard University. She is the author of Shooting a Revolution: Visual Media and Warfare in Syria (Pluto Press, 2018).

This article was first published at UntoldMag here: https://untoldmag.org/category/dossiers/a-i-from-a-counter-hegemonic-perspective/
