Israel used “Lavender” to identify 37,000 Hamas targets. First revelations about an AI system waging war in uncharted territory (The Guardian)

The Israeli military’s bombing campaign in the Gaza Strip relied on an artificial intelligence system called Lavender, a previously undisclosed database that at one point identified 37,000 potential targets based on their apparent links to Hamas, according to intelligence sources involved in the war, The Guardian reports, cited by News.ro.

Beyond describing the use of the Lavender artificial intelligence system, the sources also claim that Israeli military officials permitted the killing of large numbers of Palestinian civilians, particularly in the early weeks and months of the conflict.

Their unusually candid testimony offers a rare insight into the first-hand experience of Israeli intelligence officials who used machine learning systems to help identify targets during the six-month war.

Israel’s use of powerful artificial intelligence systems in its war against Hamas represents uncharted territory in advanced warfare, raising a number of legal and moral questions and transforming the relationship between military personnel and machines, The Guardian adds.

“This is unprecedented, as far as I can remember,” said an intelligence officer who used Lavender, adding that they had more faith in a “statistical mechanism” than in a grieving soldier. “Everyone there, including me, lost people on October 7. The machine did it coldly. And that made it easier,” the officer explained.

Another user of the Lavender system confirmed that the human role in the selection process was insignificant. “I would invest 20 seconds per target at this stage and make dozens of such selections every day. I had zero added value as a human being, other than being a stamp of approval. It saved a lot of time,” he confessed.

LAVENDER HAD A CENTRAL ROLE IN THE WAR

The testimonies of six intelligence officers, all involved in using artificial intelligence systems to identify Hamas and Palestinian Islamic Jihad (PIJ) targets during the war, were recorded by journalist Yuval Abraham for the Israeli-Palestinian publication +972 Magazine and the Hebrew-language outlet Local Call, The Guardian notes. Their accounts were shared exclusively with The Guardian ahead of publication.

All six stated that Lavender played a central role in the war, processing large volumes of data to quickly identify potential “junior” operatives to target. Four of the sources said that at one point early in the war, Lavender listed up to 37,000 Palestinian men whom the AI system linked to Hamas or the PIJ.

Lavender was developed by the elite intelligence service division of the Israel Defense Forces, Unit 8200, comparable to the US National Security Agency or Britain’s GCHQ.

CIVILIANS KILLED BASED ON “QUOTAS”

Several sources described how, for certain categories of targets, the Israeli military applied pre-authorized quotas for the estimated number of civilians who could be killed in a strike.

Two sources said that in the first weeks of the war, they were allowed to kill 15 or 20 civilians during airstrikes on lower-ranking militants.

Attacks on such targets were usually carried out with the help of unguided munitions, known as “dumb bombs”, which were able to destroy entire houses and kill all their occupants, the sources said.

“You don’t want to waste expensive bombs on unimportant people – it’s very expensive for the country and there is a shortage (of such bombs),” said an intelligence officer. Another officer said the main question they faced was whether the “collateral damage” on civilians warranted an attack.

“Even if an attack is missed, you don’t care – you move straight on to the next target. Because of the system, the targets never end. You still have 36,000 waiting for you,” said an officer.

According to conflict experts, if Israel used dumb bombs to destroy the homes of thousands of Palestinians who were linked, with the help of AI, to militant groups in Gaza, this could help explain the shockingly high death toll in the war. The Health Ministry in the Hamas-ruled territory says 32,000 Palestinians have been killed in the conflict in the past six months. UN figures show that in the first month of the war alone, 1,340 families suffered multiple losses, with 312 families losing more than 10 members.

WHAT THE ISRAELI ARMY SAYS

Responding to the publication of the testimonies in +972 and Local Call, the Israeli army (IDF) said in a statement that its operations were carried out in accordance with the rules of proportionality stipulated by international law. It stated that dumb bombs are a “standard weapon” used by its pilots in a manner that ensures “a high level of accuracy”.

The army statement described Lavender as a database used “to cross-reference sources of information in order to produce up-to-date layers of information about military operatives of terrorist organizations”.

“The IDF does not use an artificial intelligence system that identifies terrorist agents or attempts to predict whether a person is a terrorist,” the Israeli military claims. “Computer systems are only tools for analysts in the target identification process,” the IDF added.

TENS OF THOUSANDS OF PEOPLE ARE IN “LAVENDER”

In previous IDF military operations, identifying human targets was often a more labor-intensive process. The decision to “incriminate” a person or identify them as a legitimate target would be discussed and then signed off by a legal adviser.

In the weeks and months after October 7, this pattern of approving strikes on human targets accelerated dramatically, according to the sources. As the IDF’s bombardment of Gaza intensified, they said, commanders demanded a steady stream of targets. “We were constantly being pressured: bring us more targets. They were really yelling at us,” said an intelligence officer. “We were told: now we have to take down Hamas, no matter what the cost. Whatever you can, bomb,” the source said.

To meet this demand, the IDF came to rely heavily on the Lavender system to generate a database of individuals deemed to have the characteristics of a PIJ or Hamas militant.

Details about the specific types of data used to train Lavender’s algorithm or how the program reached its conclusions are not included in the reports published by +972 or Local Call. However, sources stated that during the first weeks of the war, Unit 8200 refined Lavender’s algorithm and adjusted the search parameters.

After random sampling and cross-checking its predictions, the unit concluded Lavender achieved a 90 percent accuracy rate, the sources said, prompting the IDF to approve its widespread use as a target recommendation tool.
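The reporting does not say how that figure was calculated. Purely as an illustration of the general technique, the sketch below estimates a flagging system’s precision by having reviewers audit a random sample of its positive outputs; every name and number in it (flagged_ids, review_label, the sample size) is hypothetical and not drawn from the reporting.

    import random

    def estimate_precision(flagged_ids, review_label, sample_size=400, seed=0):
        # Audit a random sample of flagged IDs and return the share that a
        # human reviewer confirms. Both inputs are hypothetical stand-ins:
        # flagged_ids is whatever the system marked positive, review_label
        # is a callable giving the reviewer's verdict for one ID.
        rng = random.Random(seed)
        sample = rng.sample(flagged_ids, min(sample_size, len(flagged_ids)))
        confirmed = sum(1 for item in sample if review_label(item))
        return confirmed / len(sample)

    # Synthetic demo: a system that is correct about 90% of the time.
    ids = list(range(10_000))
    truth = {i: random.random() < 0.9 for i in ids}
    print(f"estimated precision: {estimate_precision(ids, truth.get):.1%}")

A sample-based check of this kind measures only precision among those flagged; it says nothing about how many people are flagged in the first place, which is exactly the lever the sources describe below.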

LAVENDER, USED ALONGSIDE GOSPEL

Lavender created a database of tens of thousands of people who were marked as predominantly low-ranking members of Hamas’ military wing. It was used alongside another AI-based decision support system called Gospel, which recommended buildings and structures as targets rather than people.

The accounts include credible testimony about how intelligence officers worked with Lavender and how the reach of its net could be adjusted. “At its peak, the system managed to generate 37,000 people as potential human targets,” said one of the sources. “But the numbers changed all the time, because it depends on where you set the bar for what a Hamas operative means,” the source explained.

“There were times when a Hamas operative was defined more broadly, and then the machine started bringing us all kinds of civil defense members, police officers, on whom it would be a shame to waste bombs. They help the Hamas government, but they don’t really endanger the soldiers,” the source said.
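What the source describes is, in effect, moving a decision threshold over a scored population: set the bar lower and the flagged set can only grow, sweeping in more marginal cases. The sketch below illustrates that general mechanism only; the scores, IDs, and threshold values are invented and imply nothing about how Lavender actually works.

    def flag_above_threshold(scores, threshold):
        # Return every ID whose score meets or exceeds the bar. Lowering
        # `threshold` monotonically enlarges the result, which is how a
        # broader definition of "operative" sweeps in marginal cases.
        return sorted(i for i, s in scores.items() if s >= threshold)

    # Invented scores for four hypothetical records.
    scores = {"a": 0.95, "b": 0.80, "c": 0.55, "d": 0.30}
    for bar in (0.9, 0.7, 0.5):
        print(bar, flag_above_threshold(scores, bar))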

Before the war, the US and Israel estimated the military wing of Hamas at around 25,000-30,000 members.

One of the sources said that after the October 7 attacks by Hamas, the atmosphere in the IDF was “painful and vindictive.” “There was a dissonance: on the one hand, people here were frustrated that we weren’t attacking enough. On the other hand, you see at the end of the day that another thousand Gazans have died, most of them civilians,” the source said.

Regardless of the legal or moral justification for Israel’s bombing strategy, some of its intelligence officers now appear to be questioning the approach set by their commanders, The Guardian reports.

