
Israel is reportedly using AI to pick Gaza targets to assassinate at their family home

By Augusta Saraiva

New York: United Nations Secretary-General Antonio Guterres said he was “deeply troubled” by reports that Israel used artificial intelligence (AI) to help identify bombing targets in Gaza, saying the practice put civilians at risk and blurred accountability.

In a story on April 3, the Tel Aviv-based +972 Magazine and Local Call reported that Israel was using an AI program called “Lavender” to come up with assassination targets – Palestinians suspected to be Hamas militants. The report, which Israel has disputed, said the AI system had played a “central role in the unprecedented bombing of Palestinians”.

Members of the Abu Draz family inspect their house after it was hit by an Israeli airstrike in Rafah on Thursday. Credit: AP

“No part of life-and-death decisions which impact entire families should be delegated to the cold calculation of algorithms,” Guterres said. “AI should be used as a force for good to benefit the world, not to contribute to waging war on an industrial level, blurring accountability.”

The military use of AI has emerged as a growing concern as powers including the US and China rush to incorporate it into their armed forces.

The Israeli military has said it has relied on new technologies in its campaign to wipe out Hamas since the October 7 attacks by the proscribed terrorist group, in which 1200 people were killed and 250 abducted. Israel’s retaliation campaign since then has left more than 32,000 Palestinians dead, according to the Hamas-run health ministry.

In his article for +972 Magazine, Yuval Abraham quoted six Israeli intelligence officers – who have served in the army during the current war and had firsthand involvement with the use of AI to generate targets for assassination – as saying Lavender has played a central role in the unprecedented bombing of Palestinians, especially during the early stages of the war.

Members of the Abu Draz family mourn their relatives killed in the Israeli bombardment of the Gaza Strip, at their house in Rafah. Credit: AP

“In fact, according to the sources, its influence on the military’s operations was such that they essentially treated the outputs of the AI machine as if it were a human decision,” he wrote.

Abraham reported that officers had approval to adopt the software’s kill lists, which were only “rubber-stamped” by humans.


“This was despite knowing that the system makes what are regarded as ‘errors’ in approximately 10 per cent of cases, and is known to occasionally mark individuals who have merely a loose connection to militant groups, or no connection at all,” he wrote, adding that many targets were attacked at their family homes at night because it was “easier to locate” them there.


“The result, as the sources testified, is that thousands of Palestinians – most of them women and children or people who were not involved in the fighting – were wiped out by Israeli airstrikes, especially during the first weeks of the war, because of the AI program’s decisions.”

Quoting his sources, Abraham said Lavender had picked some 37,000 Hamas militants, “most of them junior, for assassination”.

Guterres’ remarks came after Israel’s decision earlier this week to dismiss two officers over a missile strike that killed seven aid workers, including Australian Zomi Frankcom, in what it described as a “grave mistake stemming from a serious failure”. Israel hasn’t indicated whether AI was used in that attack.

The United Nations General Assembly last month adopted a nonbinding resolution to promote “safe, secure and trustworthy” AI systems. The US-led proposal, which was co-sponsored by more than 110 countries including China, didn’t cover the military use of AI.

Asked earlier whether the Biden administration would raise military uses of AI at the Security Council, Ambassador Linda Thomas-Greenfield said the US didn’t have “any intention” of doing so. She said the resolution “lays down the foundation” for how to address AI in future.

People inspect damage and recover items in Rafah following Israeli air strikes on March 26. Credit: Getty

White House national security spokesperson John Kirby told CNN the US had not verified the content of the media report.

The Israel Defence Forces (IDF) denied that AI was used to identify suspected extremists and targets.

“The IDF does not use an artificial intelligence system that identifies terrorist operatives or tries to predict whether a person is a terrorist. Information systems are merely tools for analysts in the target identification process,” the IDF said in a statement.

It added that IDF directives mandated analysts to conduct independent examinations to verify that the identified targets meet the relevant definitions in line with international law and Israeli guidelines.

Bloomberg, Reuters


Original URL: https://www.smh.com.au/world/middle-east/israel-is-reportedly-using-ai-to-pick-gaza-targets-to-assassinate-at-their-family-home-20240406-p5fhue.html