Israel's AI targeting of Gaza criticised for potential analytical errors

Experts identify systemic flaw as tech conference is told 'no one will be able to escape digital surveillance'

The aftermath of Israeli bombing in Rafah. The military's use of AI systems to target Hamas has come under question. AFP

Technology experts have warned of potential “extreme bias error” in the Israeli military's reliance on Big Data and artificial intelligence programmes to target people in Gaza.

The Israeli military is reported to be using two AI systems, Gospel and Lavender, to track down Hamas operatives and speed up missile strikes. The systems are controversial, with some suggesting they have contributed to the high number of civilian casualties in a war that has killed more than 34,800 Palestinians.

Big Data, defined as large, diverse sets of information that grow at ever-increasing rates, has become so widespread and powerful with the rise of AI that, if not already, then “in the not-too-distant future, no one will be able to escape digital surveillance”, Dr Miah Hammond-Errey told a webinar held by the Rusi think tank.

Israel’s use of powerful AI systems has taken its military into territory for advanced warfare between soldiers and machines that has not previously been witnessed at such a scale.

The Lavender system is understood to have processed huge amounts of personal data from Gaza, allowing it to quickly build profiles of suspected militants, with up to 37,000 Palestinian men linked by the system to Hamas or Palestinian Islamic Jihad.

It is also alleged that Israeli strike operators using AI are permitted to kill up to 20 civilians per attack if the target is deemed to be of sufficiently high rank.

Unverified reports say the AI systems had “extreme bias error, both in the targeting data that's being used, but then also in the kinetic action”, Dr Hammond-Errey said in response to a question from The National. Extreme bias error can occur when a device is calibrated incorrectly, causing it to miscalculate measurements.

The AI expert and director of emerging technology at the University of Sydney suggested broad data sets “that are highly personal and commercial” mean that armed forces “don't actually have the capacity to verify” targets and that was potentially “one contributing factor to such large errors”.

She said it would take “a long time for us to really get access to this information”, if ever, “to assess some of the technical realities of the situation”, as the fighting in Gaza continues.

Prof Sir David Omand, former head of Britain’s GCHQ intelligence agency, cautioned against “jumping to conclusions” over Israel’s AI use, as its military has not given independent observers access to its systems.

“We just have to be a bit careful before assuming these almost supernatural powers to large data sets on what has been going on in Gaza, and just remember that human beings are setting the rules of engagement,” he said.

“If things are going wrong, it’s because human beings have the wrong rules, not because the machines are malfunctioning.”

Israel’s use of Lavender and Gospel “would likely form a test case for how the international community and tech companies respond to the use of AI”.

Dr Hammond-Errey, author of Big Data, Emerging Technologies and Intelligence, argued that for national security agencies the “Big Data landscape offers the potential for this invasive targeting and surveillance of individuals”, not only by states but also by actors not governed by rules.

“If we aren't there already, in the not-too-distant future no one will be able to escape digital surveillance.”

Big Data could give armies “military dominance”, as it offers “imperfect global situational awareness but on a scale previously not considered”, especially when connected to space targeting systems.

Aligned with AI, Big Data can compile “comprehensive profiles” of people, institutions, political groups and nation states that “can be made remotely and very quickly”.

Dr Hammond-Errey also warned that Big Data had been used around the world to target individuals and specific groups, exploiting “individual psychological weaknesses” as well as interfering with elections.
