Israeli media outlets claim the IDF uses AI to choose targets for its strikes

Israel disputes this, maintaining that the program in question serves only to compile information. The UN Secretary-General has expressed his concern.


Palestinians walk among the rubble of buildings destroyed by Israeli bombings in Khan Younis, in the Gaza Strip, March 7, 2024. (AFP)

How does the Israeli army choose the targets for its bombings in the Gaza Strip? In an investigation published Wednesday, April 3, two independent Israeli media outlets, +972 and Local Call, claim that it uses a program based on artificial intelligence, called Lavender. Israel counters that it is simply a database for cross-referencing information collected by its services. On Friday, UN Secretary-General Antonio Guterres said he was “deeply disturbed” by the investigation’s assertions.

The online media outlet +972, which cites six anonymous sources within Israeli intelligence, describes Lavender as “a program based on artificial intelligence” designed to identify all people suspected of belonging to the military branches of Hamas and Islamic Jihad and mark them as “potential targets to bomb”. The Israeli site claims that 37,000 people have been flagged in this way.

According to testimonies collected by +972, the Israeli army relied heavily on this system, particularly at the start of its military intervention. “I spent 20 seconds on each target” flagged by Lavender, explained one of the officers interviewed, who said he had “zero added value as a human, except to validate without discussion”. This even though the system produced erroneous analyses in nearly 10% of cases, according to +972.

Israel denies, the UN is worried

According to the same outlets, Lavender is distinct from “The Gospel”, another algorithm used by the Israeli army to recommend buildings and structures to target (rather than people), in order to “produce targets at a rapid rate”, according to an Israeli military spokesperson. A previous investigation by +972 and Local Call had already described that system as a “mass murder factory” favoring “quantity over quality”.

Israel did not deny the existence of Lavender, but disputed the media’s characterization of it. “[Lavender] is not a ‘system’, but simply a database used to cross-reference intelligence sources, to produce up-to-date information on military members of terrorist organizations,” the Israeli army said in a statement published by the Guardian.

The army also maintains that it respects international law, that it “targets only military targets and soldiers”, and that “analysts must conduct independent reviews” before carrying out a strike to assess its proportionality and collateral risks.

For his part, UN Secretary-General Antonio Guterres said he was “deeply disturbed” by these reports. “No portion of life or death decisions that impact entire families should be delegated to the cold calculation of algorithms,” the UN chief insisted.
