Revelations about ‘Lavender’, the artificial intelligence system Israel used against Hamas: ‘It’s unprecedented’

Israeli intelligence sources reveal the use of the “Lavender” system in the Gaza war and claim that permission was given to kill civilians in pursuit of lower-ranking militants.

War in Israel. PHOTO: Profimedia

The Israeli military’s bombing campaign in Gaza used a previously undisclosed artificial intelligence database that at one point identified 37,000 potential targets based on their apparent ties to Hamas, according to intelligence sources involved in the war, The Guardian reports.

In addition to describing their use of the artificial intelligence system, called Lavender, the intelligence sources claim that Israeli military officials permitted the killing of large numbers of Palestinian civilians, particularly in the early weeks and months of the conflict.

Their unusually candid testimony provides a rare insight into the first-hand experiences of Israeli intelligence officials who used machine learning systems to help identify targets during the six-month war.

Israel’s use of powerful artificial intelligence systems in its war against Hamas has entered uncharted territory for advanced warfare, raising a number of legal and moral questions and transforming the relationship between military personnel and machines.

“This is unprecedented, as far as I can remember,” stated an intelligence officer who used Lavender, adding that they had more confidence in a “statistical mechanism” than in a grieving soldier. “Everyone there, myself included, lost people on October 7th. The machine did it coldly. And that made things easier.”

Another Lavender user questioned whether the role of humans in the selection process was significant. “I would invest 20 seconds per target at this stage and make dozens of such selections every day. I had zero added value as a human being other than a stamp of approval. It saved a lot of time.”

The testimony of the six intelligence officers, all involved in the use of artificial intelligence systems to identify Hamas and Palestinian Islamic Jihad (PIJ) targets during the war, was given to journalist Yuval Abraham for a report published by the Israeli-Palestinian publication +972 Magazine and by the Hebrew-language publication Local Call.

Their accounts were shared exclusively with The Guardian ahead of publication. All six said Lavender played a central role in the war, processing masses of data to quickly identify potential “junior” agents to target. Four of the sources said that at one point early in the war, Lavender listed up to 37,000 Palestinian men who were linked by the AI system to Hamas or PIJ.

Lavender was developed by the elite intelligence division of the Israel Defense Forces, Unit 8200, comparable to the US National Security Agency or Britain’s GCHQ.

Several sources described how, for certain categories of targets, the IDF applied pre-authorized quotas for the estimated number of civilians likely to be killed before a strike was authorized.

Two sources said that in the first weeks of the war, they were allowed to kill 15 or 20 civilians during airstrikes on lower-ranking militants. Attacks on such targets were usually carried out with unguided munitions known as “dumb bombs”, the sources said, destroying entire houses and killing all their occupants.

“You don’t want to waste expensive bombs on unimportant people – it’s very expensive for the country and there is a shortage of these bombs,” an intelligence officer said. Another officer said the main question they faced was whether the “collateral damage” to civilians warranted an attack.

“Because we usually carried out the attacks with dumb bombs, and that meant literally dropping the whole house on its occupants. But even if an attack is avoided, you don’t care – you immediately move on to the next target. Because of the system, the targets never end. You’ve got another 36,000 waiting for you.”

According to conflict experts, if Israel used dumb bombs to bomb the homes of thousands of Palestinians who were linked, with the help of AI, to militant groups in Gaza, this could help explain the shockingly high death toll in the war.

The Health Ministry in the Hamas-ruled territory says 32,000 Palestinians have been killed in the conflict in the past six months. UN figures show that in the first month of the war alone, 1,340 families suffered multiple losses, with 312 families losing more than 10 members.

Responding to the publication of the testimonies in +972 and Local Call, the IDF said in a statement that its operations were carried out in accordance with the rules of proportionality laid down by international law. It stated that dumb bombs are “standard weaponry” that are used by IDF pilots in a manner that ensures “a high level of precision”.

The statement described Lavender as a database used “to cross-reference information sources in order to produce updated layers of information on military operatives of terrorist organizations”. According to the statement, it is not a list of confirmed military operatives cleared for attack.

“The IDF does not use an artificial intelligence system that identifies terrorist operatives or attempts to predict whether a person is a terrorist,” it added. “Information systems are only tools for analysts in the process of identifying targets.”

Lavender has created a database of tens of thousands of people

In previous military operations conducted by the IDF, producing human targets was often a more labor-intensive process. Several sources who described to The Guardian the development of targets in previous wars stated that the decision to “incriminate” a person or to identify them as a legitimate target was discussed and then signed off by a legal adviser.

In the weeks and months after October 7, this pattern of approving strikes on human targets accelerated dramatically, according to the sources. As the IDF bombardment of Gaza intensified, they said, commanders demanded a continuous pipeline of targets.

“We were constantly being pressured, ‘Bring us more targets’. They were really yelling at us,” said one intelligence officer. “We were told: now we have to take out Hamas, whatever the cost. Bomb anything you can.”

To meet this demand, the IDF came to rely heavily on Lavender to generate a database of individuals deemed to have the characteristics of a PIJ or Hamas militant.

Details about the specific types of data used to train Lavender’s algorithm or how the program reached its conclusions are not included in the reports published by +972 or Local Call. However, the sources stated that during the first weeks of the war, Unit 8200 refined Lavender’s algorithm and adjusted the search parameters.

After random sampling and cross-checking its predictions, the unit concluded that Lavender achieved a 90 percent accuracy rate, the sources said, prompting the IDF to approve its widespread use as a target recommendation tool.

Lavender created a database of tens of thousands of people who were marked as predominantly low-ranking members of Hamas’ military wing, they added. It was used alongside another AI-based decision support system called Gospel, which recommended buildings and structures as targets rather than people.

The testimonies include first-hand accounts of how intelligence officers worked with Lavender and how the reach of its net could be adjusted. “At its peak, the system was able to generate 37,000 people as potential human targets,” said one of the sources. “But the numbers were changing all the time, because it depends on where you set the bar for what constitutes a Hamas agent.”

They added: “There were times when a Hamas agent was more broadly defined, and then the machine started bringing us all kinds of civil defense members, police officers, on whom it would be a shame to waste bombs. They help the Hamas government, but they don’t really put the soldiers at risk.”

Before the war, the US and Israel estimated the number of members of Hamas’ military wing at around 25,000-30,000.

In the weeks following the October 7 Hamas-led attack on southern Israel, in which Palestinian militants killed nearly 1,200 Israelis and kidnapped about 240 people, the sources said there was a decision to treat Palestinian men who had ties to the military wing of Hamas as potential targets, regardless of their rank or importance.

The IDF’s targeting processes during the most intense phase of the bombing were also relaxed, they said. “There was a completely permissive policy regarding the victims of the bombing operations,” a source said. “A policy so permissive that, in my opinion, it had an element of revenge.”

Another source, justifying the use of Lavender to help identify low-ranking targets, stated that “when it’s a low-ranking militant, you don’t want to invest personnel and time in him”. They said that in wartime there is not enough time to “carefully incriminate each target”.

“So you’re willing to take the margin of error of using AI, risking collateral damage and civilian deaths, and risking an attack by mistake and living with it,” they added.

Regardless of the legal or moral justification for Israel’s bombing strategy, some of its intelligence officers now appear to be questioning the approach laid down by their commanders. “No one has thought about what will happen afterwards, when the war is over, or how it will be possible to live in Gaza,” one of them said.

Another said that after the October 7 attacks by Hamas, the atmosphere in the IDF was “painful and vindictive.” “There was a dissonance: on the one hand, people here were frustrated that we weren’t attacking enough. On the other hand, you see at the end of the day that another thousand Gazans have died, most of them civilians.”
