The role of the Lavender system in the bombing of the Gaza Strip

  Veronica Grazzi
  25 April 2024

Translated by Angela Tagliafierro


The Lavender system is an artificial intelligence (AI)-powered database created to identify suspected members of Hamas and Palestinian Islamic Jihad (PIJ) as possible bombing targets. The investigation conducted by +972 and Local Call collects the testimonies of six Israeli intelligence officers, who explain how Lavender was used, especially in the first weeks of the war, to speed up and systematise the targeting process. It is estimated that the system identified 37,000 Palestinians as suspected Hamas-linked militants and designated their homes as targets for air strikes.

How the artificial intelligence system used by Israel works

Lavender was developed by Unit 8200, the elite intelligence division of the Israel Defense Forces (IDF), to refine attack strategies against the population of the Gaza Strip. Before Lavender was deployed, selecting a human target (a senior military operative or commander) was a lengthy process: it was necessary to verify that the person really was a high-level member of Hamas's military wing, find out where he lived and identify his contacts, all entirely by hand.

However, after 7 October, the army decided to designate all operatives of Hamas's military wing as possible human targets, regardless of their rank or importance. The technical problem of identifying a large number of targets in a short time was solved with Lavender, which was used to systematise the detection of individuals who could be considered Hamas and PIJ militants.

The Lavender software analyses information gathered through the mass surveillance system active in the Strip and assigns a score from 1 to 100 to almost every single Palestinian, expressing the estimated probability that they are a militant. Lavender was trained to recognise the characteristics of known Hamas and PIJ operatives, then learnt to find the same characteristics in the general population. A person who shares several of these characteristics receives a high score and automatically becomes a potential target.

The result was that the role of human personnel in incriminating Palestinians as military operatives was sidelined. Artificial intelligence did most of the work, sometimes making mistakes: it even identified as targets police workers, relatives of militants, or residents who simply had the same name or nickname as a militant.

Violations of International Humanitarian Law

Israeli army sources explained how the 'pre-authorised tolerances' worked, that is, the number of civilians who could be killed in an attack on a target identified by Lavender. For example, especially in the first weeks of the war, the killing of 15 or 20 civilians was permitted for a low-ranking militant, while the limit was raised to 100 for high-ranking ones.

The investigation's sources reported that the system had achieved 90% accuracy. Although this left a 10% margin of error, the system was approved for large-scale use in the Strip. If Lavender decided that someone was a Hamas militant, Israeli intelligence officers acted without any obligation to independently verify the system's conclusion or examine the raw intelligence data on which it was based. One source stated that personnel simply did what the system said, usually spending only '20 seconds' on each target before authorising a bombing, just to make sure the target was male.
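To put that margin of error in perspective: if, as a purely illustrative assumption, the reported 10% error rate applied uniformly to the roughly 37,000 people the system flagged, it would amount to 0.10 × 37,000 = 3,700 people wrongly marked as militants. This back-of-the-envelope figure is not stated in the investigation; it only indicates the scale implied by the numbers reported.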

Moreover, when it came to striking allegedly low-ranking militants, the army preferred to use only unguided munitions, so-called 'dumb' bombs, which can destroy entire buildings and cause a significant number of casualties. The reason was the limited supply of 'smart' precision bombs, which, according to Israel's official statements, are reserved for conducting precise attacks and safeguarding civilians when Hamas forces the civilian population to act as human shields.

The sources explain that Lavender was also paired with Gospel, another AI system, used to locate the buildings to be bombed once Hamas operatives had been identified. Since every resident of Gaza had a private home associated with them, the army's surveillance systems could easily and automatically 'link' individuals to their houses, pinpointing their location far more easily than when they were engaged in military activity.

All this reflects the level of surveillance and control to which the population of Gaza is subjected. Automating a task as delicate as the selection of human targets leads to more innocent civilian casualties through false positives, which in a probabilistic system like Lavender can occur in 10 per cent of cases. Furthermore, the limited human oversight and the 'pre-authorisation' to accept a certain number of collateral civilian casualties provide sufficient evidence to constitute a war crime and a very serious violation of International Humanitarian Law. The Lavender system helps to explain why there are so many civilian deaths in Gaza and, according to Ben Saul, UN Special Rapporteur on Human Rights and Counter-Terrorism, if the details of the report prove true, 'many Israeli attacks in Gaza would constitute the war crime of launching disproportionate strikes'.

Israel's statements

Artificial intelligence has been used as a means of making decisions that would previously have been made by a human. In an article in The Guardian, an Israeli intelligence officer explains that he has 'more confidence in a statistical mechanism than in a grieving soldier'. Most of the testimonies speak in favour of the system, describing it as objective, immune to human emotion and effective in optimising time during the targeting phase.

In response to the publication of the testimonies by +972 and Local Call, the IDF said in a statement that its operations were conducted in accordance with the rules of proportionality under international law. The Israeli army also denied that the technology is a 'system', describing it instead as 'a simple database' whose purpose is to cross-reference intelligence sources in order to produce information on the military operatives of terrorist organisations.

Mondo Internazionale APS - Reproduction Reserved ® 2024


The Author

Veronica Grazzi

Veronica Grazzi comes from a small town near Trento, in Trentino-Alto Adige, and was born on 10 December 1999.

She graduated in International and Diplomatic Sciences at the University of Bologna, and it was during this period that she developed a passion for writing thanks to an internship at the Milan-based newspaper Il Post. She then enrolled in an English-taught Master's degree in European and International Studies at the School of International Studies of the University of Trento.

Thanks to the Erasmus+ programme, she spent six months in Estonia, where she focused her studies on the relationship between human rights and technology. She then moved to Hungary for an internship at the Italian Embassy in Budapest under the MAECI-CRUI programme, where she further deepened her interest in European politics and border policies.

Veronica is now in Vienna, where she is doing an internship at the UN specialised agency for sustainable industrial development. It is in this context that she developed her interest in humanitarian aid and human rights, later taking part in various training opportunities in the field.

At Mondo Internazionale Post, Veronica is an Author for the Human Rights thematic area.
