Israeli Military Utilizes Artificial Intelligence for Target Selection in Gaza, Reveals Report

The Israeli military has been using artificial intelligence to help identify bombing targets in Gaza, according to an investigation by +972 Magazine and Local Call. Six Israeli intelligence officials who were involved in the alleged program said human review of the suggested targets was cursory at best.

The officials, whose accounts appear in an in-depth investigation by the online publication, which is run by Palestinian and Israeli journalists, said the AI-powered tool was called "Lavender" and had a known error rate of 10%.

Asked about +972 Magazine's report, the Israel Defense Forces (IDF) acknowledged the tool's existence but denied that artificial intelligence is being used to identify suspected terrorists, saying instead that such information systems are merely aids that help analysts in the target identification process. The IDF also said it seeks to reduce harm to civilians as much as feasible in the operational circumstances at the time of a strike.

The IDF added that analysts are required to conduct independent examinations to verify that identified targets meet the relevant definitions under international law and the additional restrictions set out in IDF directives.

One official who spoke to +972 said that human personnel often served only as a "rubber stamp" for the machine's decisions, typically spending about 20 seconds on each target, just long enough to confirm the target was male, before approving a bombing.


The investigation comes amid growing international scrutiny of Israel's military campaign, including an inquiry launched after air strikes killed foreign aid workers in Gaza. The Gaza Ministry of Health says more than 32,916 people have been killed during Israel's offensive and siege, which have caused a severe humanitarian crisis, with nearly three-quarters of the population in northern Gaza facing extreme hunger, according to a United Nations report.

Yuval Abraham, the author of the investigation, previously told CNN that the Israeli military was relying heavily on artificial intelligence to generate targets for assassination, with minimal human oversight.

In a statement Wednesday, the Israeli military said it does not use an artificial intelligence system that identifies terrorist operatives or tries to predict whether a person is a terrorist. Rather, it said, analysts use a database to cross-reference intelligence sources and produce up-to-date information on the military operatives of terrorist organizations.

Once that information is gathered, the military said, human officers are responsible for confirming that identified targets meet the definitions set out in international law and the restrictions specified in IDF directives. This verification process was also described in the +972 report.

Night attacks

The magazine also reported that the Israeli army “systematically attacked” targets in their homes, usually at night when entire families were present.

As a result, the sources said, thousands of Palestinians, most of them women, children, or people not involved in the fighting, were killed by Israeli airstrikes, particularly in the initial weeks of the conflict, because of decisions made by the AI program.

The report also said that when the military targeted suspected junior militants, it chose to use unguided munitions known as "dumb bombs," which can cause extensive destruction.

Palestinians inspect the damage to a residential building in the Maghazi refugee camp in the central Gaza Strip on Friday, March 29, 2024.

CNN reported in December that nearly half of the 29,000 air-to-surface munitions used in Gaza last fall were dumb bombs, which can pose a greater danger to civilians, especially in densely populated areas like Gaza.

The IDF said it does not carry out strikes when the expected harm to civilians outweighs the military advantage, and that it strives to minimize harm to civilians as far as possible under the specific operational conditions.

The IDF added that it carefully assesses targets before launching strikes and selects the appropriate munition based on operational and humanitarian considerations, including the structural and geographical features of the target, the surrounding environment, the possible effect on nearby civilians, and critical infrastructure in the vicinity.

Israeli officials have repeatedly said that heavy weaponry is essential to defeating Hamas, whose October 7 attack killed more than 1,200 people in Israel and saw scores of hostages taken, triggering the current war.

This report includes contributions from CNN’s Vasco Cotovio and Mohammed Tawfeeq.

Editor's P/S:

The reliance on AI in the Israeli military's targeting process raises grave concerns about the potential for civilian casualties. The minimal human oversight described by the sources, combined with the tool's reported 10% error rate, underscores how easily noncombatants could be misidentified as targets.