Israel has been using artificial intelligence technologies to murder Palestinians in its genocidal campaign in Gaza, relying on both Israeli and American technology companies to do so. But this moment is the product of a long history of surveillance practices across Occupied Palestine.
Artificial intelligence (AI) is broadly defined as machines performing tasks that typically require human intelligence. AI systems can make decisions and analyze surveillance data far faster than humanly possible. Antony Loewenstein, author of The Palestine Laboratory, said during the annual Palestine Digital Activism Forum in early June: “I fear in years to come it’s [artificial intelligence that is] going to be blamed. It wasn’t us, it was the machine. No… a machine itself was not making the decision to press the button to bomb a house. It was made by a human. An Israeli.”
Gaza
Reports on the ‘Lavender,’ ‘Gospel,’ and ‘Where’s Daddy’ AI-enabled data processing systems developed and used by the Israeli Occupation Forces (IOF) in their genocidal campaign against Gaza have attracted widespread attention, prompting journalists to call Gaza the site of the first AI-powered genocide. AI technology was reportedly first used in Gaza during Israel’s 11-day assault in 2021. During the ongoing genocide, it is being used for the first time to kill Palestinians at an unprecedented scale and at much faster rates. These three known systems identify “targets” for airstrikes based on Israeli mass surveillance records of Palestinians in Gaza, collected for years by the IOF under the racist framework of monitoring what it deems “threats” to the Israeli regime. The ‘Gospel’ “recommends” buildings and structures to strike, while the ‘Lavender’ and ‘Where’s Daddy’ systems “recommend” people to kill and track their locations to determine when a strike should be carried out. These “recommendations” are approved by the Israeli military for airstrikes on densely populated civilian urban areas with practically no review. A few Israeli intelligence agents told +972 Magazine that they “personally” take only 20 seconds to review and approve an airstrike recommendation, using that time solely to confirm that the “target” is male. It is unclear if this is actual policy. In August, however, the UN High Commissioner for Human Rights released a statement revealing that the majority of those killed in Gaza are women and children.
Israel isn’t trying to ethnically cleanse one particular gender or age group: it is targeting all Palestinians. The false narrative and justifications surrounding the functionality and “precision” of these AI-powered military technologies are intended to pacify and “wow” the colonial Western world. This was clear when the terrorist Israeli pager attacks maimed and blinded thousands of people and killed dozens in Lebanon, while countless journalists from legacy Western outlets praised Israel’s technological prowess.
These AI systems collect, search, and process data acquired through Israel’s existing mass surveillance of Palestinians. Google Photos facial recognition, WhatsApp group memberships, social media connections, phone contacts, cellular information, and more are analyzed by the AI systems to create a kill list. Israeli soldiers then approve it, bombing entire families and neighborhoods again and again, at a much faster rate than previously possible. Because of advancements in weapons technology, these tools are supposedly extremely “precise,” in that the IOF knows exactly whom it is designating for murder. In practice, however, it is deliberately targeting the Palestinian people as a whole for mass murder. The AI system can spit out targets far more rapidly than humans can, and the Israeli military has so much ammunition (primarily provided by the U.S.) that it has enacted a killing spree on Palestinians in Gaza. The Israeli military chooses exactly who to kill, knowing that entire neighborhoods will be reduced to dust.
For decades, Israel has held control over telecommunications infrastructure used by Palestinians in Gaza (and across Occupied Palestine), including phone lines and mobile and internet networks. Palestinian media studies scholar Helga Tawil-Souri calls this control a “digital occupation,” because Palestinian telecommunications are surveilled, slowed down, and shut down by Israel, toward colonial and genocidal ends. As of 2021, an Israeli intelligence source from the Israeli intelligence unit 8200 told Middle East Eye that Israel can listen to every telephone conversation taking place in the West Bank and Gaza. This information likely feeds into the AI-enabled killing machines being used for the genocide in Gaza.
According to anonymous Israeli intelligence agents who spoke to +972 Magazine, “Nothing happens by accident…We know exactly how much collateral damage there is in every home.” Israeli surveillance estimates how many civilians will be killed in order to assassinate an alleged Hamas or Palestinian Islamic Jihad member before airstrike “targets” are approved. As an Israeli soldier told +972 Magazine, “The IDF bombed them in homes without hesitation, as a first option. It’s much easier to bomb a family’s home. The system is built to look for them in these situations.” AI and surveillance scholar Sophia Goodfriend emphasizes Israeli responsibility for these decisions in +972 Magazine, writing, “Israel is not relying on fully autonomous weapons in the current war on Gaza; rather, intelligence units use AI-powered targeting systems to rank civilians and civilian infrastructure according to their likelihood of being affiliated with militant organizations. This rapidly accelerates and expands the process by which the army chooses who to kill, generating more targets in one day than human personnel can produce in an entire year.” When describing the ‘Where’s Daddy’ system, an anonymous source told +972 Magazine, “You put hundreds [of people] into the system and wait to see who you can kill.”
Amazon Web Services is confirmed to be providing cloud services and servers used to store massive amounts of surveillance data on Palestinians in Gaza. This surveillance data is the input for the AI murder “recommendation” systems described above. Amazon and Google signed a $1.2 billion “Project Nimbus” cloud technology contract with the Israeli government in April of 2021; through it, Israel has significantly increased its purchases of data storage and AI services from these companies during the ongoing genocide in Gaza. An Israeli military commander also shared publicly in July of 2024 that the military is using civilian cloud infrastructure from Amazon, Google, and Microsoft to expand its genocidal military capacities in Gaza. The American data-mining company Palantir is also operating AI systems for the IOF, as reported by The Nation. In January of 2024, the company entered a new “strategic partnership” to provide the Israeli regime with artificial intelligence systems to process surveillance data on Palestinians and target them in current “war-related missions,” which undoubtedly includes the genocidal military campaign in Gaza. That same month, the Palantir Board of Directors held its first meeting of the year in Israel, where Alex Karp, co-founder and CEO of the company, signed the updated agreement with the Israeli Ministry of Defense immediately afterward at its military headquarters. Palantir is an American company with deep ties to U.S. “counterterrorism,” policing, and military operations. Its CEO boasts: “We are very well known in Israel. Israel appreciates our product…I am one of the very few CEOs that’s publicly pro-Israel.”
Surveillance is Nothing New
Surveillance, a term that has entered the public lexicon today with its digital manifestations explained above, is nothing new. It is a system designed to monitor, control, and dispossess people, with or without digital technology. Under the British Occupation of Palestine before the Nakba and then under the Israeli Occupation, Palestinians endured “low-tech” surveillance through mandatory identity cards, population registries, checkpoints, curfews, watchtowers, artificial borders, imprisonment, walls, and more.
Surveillance has its roots in colonial enterprises: across the world, the colonized are very familiar with being watched for the purposes of control and dispossession, and many biometric surveillance practices were developed and used in colonial contexts. As Palestinian scholar Fayez Sayegh wrote in 1965, “For the Zionist settler-state, to be is to prepare and strive for territorial expansion.” Surveillance technology aids this expansion both by oppressing Palestinians and by extending the reach of what foremost Palestinian surveillance scholar Elia Zureik called the “Israeli gaze” on Palestinian life. Surveillance is rarely just watching: it is a tool that enables movement control, physical violence, forced displacement, and genocide. Zureik emphasized that surveillance is a tool of power “intimately linked to the process of Othering, whereby the colonizer’s self-affirmation and identity construction are configured based on stigmatizing and denigrating the identity of the Other, the colonized.” Palestinians are meant to feel constantly watched and surveilled as a tool of Zionist settler-colonialism. This is a racializing phenomenon: Palestinians are ascribed inferior racial identities compared to Israeli settlers, who are meant to oppress and colonize them through the practice of surveillance.
Today’s high-tech AI-enabled surveillance regime enacts these same goals, but with even more unjustly collected data and an increased technological capacity to enact violence against more Palestinians more quickly through artificial intelligence tools. Today’s surveillance relies on the “Big Data” ecosystem, meaning the collection of vast amounts of information about occupied Palestinians towards genocidal colonial aims of dispossession. These huge AI-enabled data processing systems such as ‘Lavender’ are a direct reflection of today’s “Big Data” era. Governments, enabled by private technology companies, seek to extract and scrape as much data as possible about a population in order to oppress it. Palestinian surveillance studies scholar Mira Yaseen connects the big data paradigm to its colonial history. She tells Palestine Square that “Surveillance studies often overlook the colonial roots of surveillance, and when they acknowledge them, they seem to not give much attention to the big data paradigm, which is reconfiguring surveillance today.” She notes that the Israeli regime is “increasingly incorporating home-grown big data technologies in population management,” which could not be clearer through the use of ‘Lavender,’ ‘Where’s Daddy,’ ‘Gospel,’ and many other AI-enabled technologies in use across Occupied Palestine. With unlimited access to huge amounts of surveillance data and the rapidly expanded ability to analyze it towards oppressive ends, Israel has an unprecedented ability to commit genocide and forced displacement in Palestine, which we are witnessing right now in Gaza.
In a panel on “AI in Times of War: Gaza, Automated Warfare, Surveillance, and the Battle of Narratives” at the Palestine Digital Activism Forum, scholar Matt Mahmoudi discussed how Palestinians were unknowingly subjected to facial recognition technology by the Israeli military at checkpoints along Salah Al-Din Road in Gaza. The IOF designated this road a “safe route” in the early months of the genocide for Palestinians being forcibly displaced from the North to the South of the Strip, who traveled it with everything they could carry, sometimes barefoot, past martyred bodies along the way. This facial recognition technology — created by the notorious Israeli cyber-intelligence Unit 8200 — is operated partly through technology from the Israeli private company Corsight and the American company Alphabet via Google Photos. These tools help analyze faces from grainy photos and crowded spaces. Displaced Palestinian men and young boys “identified” and designated as Hamas militants by the technology are then taken hostage by Israelis and detained in secret locations in Gaza and the Al-Naqab (Negev) desert, where they are subjected to horrific torture, including sexual violence, medical torture, and murder.
Experimental weapons testing is steeped in this colonial context. Many Israeli surveillance technologies are first used in Gaza and the West Bank, where Israel’s military rule and genocidal disregard for Palestinian life allow private companies to prototype and refine their products through contracts with the IOF, before exporting them abroad. Palestinian legal scholar Samera Esmeir commented on the eight-day-long deadly Israeli attack on Gaza in 2012, which killed over 165 Palestinians, including 42 children, while making the siege, enacted since 2006, even more constrictive. She wrote that “the transformation of Gaza into a laboratory for colonial and imperial hegemony in the region is made in Israel.” This assault was the first documented use of the Iron Dome anti-missile technology provided by the United States. Elia Zureik wrote extensively on the experimental nature of the surveillance of Palestinians, building on Aimé Césaire’s idea of the “boomerang effect” to argue that “beyond the circulation of technologies and strategies of surveillance from one colonial space to another, those methods adopted to monitor marginal and minority groups perceived to threaten the state are eventually extended to the majority, and those developed in the colonies make their way back to the metropole.” We see this in colonial surveillance tools like Pegasus, created by the Israeli NSO Group, now in use globally to hack journalists, human rights defenders, and activists.
West Bank
In the occupied city of al-Khalil (Hebron) in the Occupied West Bank, where Israeli settlers have taken over parts of the city — particularly the Old City and its surroundings, which are under full Israeli military rule — AI-enabled control of Palestinians is ubiquitous. Closed-circuit television (CCTV) cameras enabled with facial recognition technology face the homes of Palestinians in al-Khalil, and some are forcibly installed on the roofs of Palestinian homes. These AI technologies expand the reach and violence of already existing surveillance infrastructures, such as Israeli checkpoints, watchtowers, identity cards and permits, and segregated roads.
According to The Washington Post, in 2020 the Israeli military rolled out a new surveillance program in which Israeli soldiers were directed and incentivized to forcibly capture as many images of Palestinians in Occupied al-Khalil as possible, constructing a database that a former soldier called a “Facebook for Palestinians.” A military-run smartphone technology called Blue Wolf is used to conduct this surveillance, and the captured photos are matched against that database. The app indicates, using the colors red, yellow, or green, whether the person should be allowed to pass, be detained, or be arrested immediately, allowing Israeli soldiers to terrorize Palestinians with impunity. Blue Wolf is reportedly a smartphone-compatible version of a larger military-run database called Wolf Pack, which stores the images, family histories, security ratings, license plate information, permit information, and education records of Palestinians across the West Bank. The purpose of these military tools is to collect and manage information about every Palestinian in the West Bank, in order to maintain and expand Israeli settler-colonial control.
In 2023, Amnesty International released a report titled “Automated Apartheid” that expanded on previous knowledge of these AI-enabled technologies to document a surveillance system called Red Wolf in use in al-Khalil and Jerusalem, allegedly linked to Blue Wolf and Wolf Pack. Red Wolf operates at IOF checkpoints within Occupied al-Khalil (Hebron), where it automatically scans the faces of Palestinians crossing and registers them in extensive surveillance databases without their knowledge or consent, with no need for an IOF soldier to take their picture, as the Blue Wolf smartphone application requires. The program then determines whether a Palestinian should be denied passage or detained at the checkpoint. Scholars such as Dr. Rema Hammami have written extensively about how Israeli checkpoints — which control the movement of Palestinians throughout Occupied Palestine — function as “settler colonial technologies” to facilitate the “biopolitical management of native populations [that] are fundamentally driven by the necropolitical logics of elimination.”
In Occupied Jerusalem, AI-enabled surveillance is everywhere. Amnesty International has documented thousands of CCTV cameras across the Old City that are part of an IOF-controlled video surveillance system called “Mabat 2000,” which is equipped with facial recognition technology and operational in Occupied East Jerusalem. The system launched in 2000 and was upgraded in 2018 to include biometric surveillance capabilities, such as object and facial recognition. Amnesty International found one to two CCTV cameras every five meters in Occupied East Jerusalem, and Euro-Med Monitor reports that CCTV cameras cover 95% of that area. In the neighborhoods of Sheikh Jarrah and Silwan, the number of CCTV cameras has expanded rapidly over the past three years as the Israeli government seeks to forcibly evict Palestinians from their homes so that settlers can take over their property. A 2021 report by the Palestinian digital rights organization 7amleh, based on interviews with Palestinians in Occupied Jerusalem, documented common sentiments of feeling constantly surveilled, even inside their homes. Interviewees attributed this surveillance to CCTV cameras as well as to social media monitoring by the IOF, specifically the Israeli Cyber Unit, which was created in 2015 and works with social media companies to remove Palestinian content.
In 2021, a young Palestinian from Silwan told 7amleh that, along with physical siege and surveillance, “we are under electronic siege here, too.” This electronic siege has only tightened during the ongoing genocide in Gaza, with escalating arrests of Palestinians across Occupied Palestine for making social media posts about the genocide. Automated, AI-powered content moderation algorithms disproportionately remove Palestinian content globally; Jillian York called these algorithms a “black box” while speaking at the PDAF hosted by 7amleh in early June of 2024. Palestinian digital rights expert Mona Shtaya documents that, during the ongoing genocide in Gaza, Meta “actively suppresses Palestinian content that seeks to document human rights violations in Palestine, such as recorded evidence pertaining to the al-Ahli Arab Hospital bombing” and egregiously and disproportionately shadowbans content about Palestine.
Generative AI — a rapidly emerging and expanding technology that uses training data to prompt AI-generated outputs such as images, text, conversations, and more — has also been discriminatory against Palestinian content. In a panel titled “Generative AI: Dehumanization & Bias Against Palestinians & Marginalized Communities” at the PDAF, Shtaya shared that AI-generated outputs by chatbots — such as ChatGPT or WhatsApp AI-generated stickers — “manipulate public opinion” with negative content about Palestinians, fueling disinformation campaigns that enable genocide. Shtaya pressed that “social media and tech platforms are not investing enough resources to combat disinformation… to limit or prohibit tech harms being caused by generative AI content.”
Shahd Qannam and Jamal Abu Eisheh write in the Jerusalem Quarterly about how Israeli surveillance of Palestinians in Jerusalem in particular is key to the settler-colonial vision to eliminate Palestinians from the entirety of Jerusalem and colonized Palestine. They write, “The digital and online realms are such spaces that become sites of struggle between the eliminatory settler-colonial logic and Indigenous resistance to erasure. After all, it is territory that is central to settler colonialism, and insomuch as digital space is a territory, it is critical to examine Israel’s practices of domination of it.” By eliminating Palestinian voices from online spaces, Israel seeks to eliminate Palestinians altogether. Israeli companies have even created an algorithm used by the IOF to surveil and detain Palestinians based on their Facebook posts. This is a predictive algorithm, meaning that it searches for posts that supposedly “incite violence,” which includes pictures of martyrs or jailed Palestinians.
Qannam and Abu Eisheh also discuss how Israel rolled out the “Israeli Biometric Project” in 2009, a move to require digital identification cards to create a database with the biometric information of residents of Israel, including Palestinian residents of Occupied Jerusalem. These digital IDs have to be renewed with the Israeli Ministry of Interior (MoI) and can be revoked to deprive Palestinians of their residency in Jerusalem. Mass data collection is a means of surveillance and control that is used to escalate settler-colonial elimination of Palestinians.
The Deadly Exchange of Technology
ShotSpotter, a faulty U.S.-made AI-powered acoustic gunshot detection system marketed as a “precision policing technology,” signed a contract with the Israeli police in 2021, revealing a transnational colonial desire to automate oppression. It functions to expand the reach of carceral, colonial control against Palestinians, both from the land and the sky. This tool, originally created to surveil and control Black Americans, was also seen as useful for oppressing Palestinians. ShotSpotter’s December 16, 2021, press release quotes President and CEO Ralph A. Clark: “ShotSpotter is excited to partner with Airobotics to develop a new market and help Israeli law enforcement respond more quickly and precisely to incidents of gunfire.”
In this highly militarized context, the addition of drones through ShotSpotter’s partnership with the company Airobotics is notable. The drones provide “critical visual information to first responders that are en route” to gunshot locations determined by the acoustic sensors on the ground, exacerbating the weaponization of space through visual surveillance.
The Israeli regime is the first in the world to allow large unmanned aerial vehicles to fly in civilian airspace. This was approved in 2022 by the Israeli Transportation Ministry, when a drone created by the Israeli weapons manufacturer Elbit Systems was permitted to fly in civilian airspace. The move highlights the hyper-militarized, paranoid nature of Israeli colonial society. Airobotics markets itself as the first company globally to develop an “unmanned drone solution” that requires no human pilots or manual oversight, a capability particularly useful in Occupied Palestine.
ShotSpotter is set to be deployed in “urban Israeli areas,” a term that is vague and normalizes Israel’s ongoing colonization and expansion of settlements on the rubble of Palestinian neighborhoods in the occupied West Bank. There is no publicly available information about what cities ShotSpotter is used in.
The Colonial Israeli Economy Is Tied to AI-Enabled Violence
Race and technology scholar Ruha Benjamin writes that the containment of racialized and colonized groups enables technological innovation. Gaza has been under complete siege since 2006, turning the 25-mile-long strip into a concentration camp. The West Bank has been under military occupation for decades, and the surveillance, control, and systematic murder of occupied Palestinians there has increased significantly over the past year. At the same time, as of 2016, Israel held the largest number of surveillance companies per capita in the world, with at least 27 such companies, including the notorious NSO Group, creator of the invasive Pegasus spyware; the sector had exported over $6 billion in products and services as of 2014.
Because Israel is a highly militarized settler-colonial regime, surveillance is a key investment of both the state and the private sector. Unlike many other countries, Israel’s surveillance sector relies on both local demand and exports. In the early 2000s, during the telecommunications boom and the U.S.-led ‘War on Terror,’ Israel’s intelligence community consulted with U.S. security experts and technology CEOs to significantly expand and privatize Israel’s intelligence apparatus across the Occupied West Bank, East Jerusalem, and Gaza. Unit 8200 developed the ‘Lavender’ system and the facial-recognition program in use at checkpoints in Gaza, and likely many more AI tools and algorithms. There are also examples of the exchange of carceral tactics between the United States and Israel, with U.S. police, military, and border patrol officers training in Israel. This is known as the “deadly exchange.” And, at the U.S.-Mexico border, Elbit’s surveillance systems are being deployed. The United Arab Emirates has also invested billions of dollars in Israeli security technology and advanced weapons systems companies, particularly in artificial intelligence.
According to scholar Sophia Goodfriend, there has always been an inextricable relationship between the private tech sector and the IOF, one highly beneficial to the colonial goals of the Israeli regime. As noted above, private companies prototype and refine their surveillance products on Palestinians in Gaza and the West Bank through contracts with the IOF before exporting them abroad. Some American companies are also directly implicated in oppressing Palestinians, as documented in this article, making them complicit in Palestinian dispossession. Yet there is extremely little reporting on these AI tools and how they are used against Palestinians, particularly toward genocidal ends. The development and deployment of these tools in a genocide against Palestinians are logical outcomes of a Zionist settler-colonial project. We must name them as such.