AI-assisted targeting in the Gaza Strip
These tools include the Gospel, an AI system that automatically reviews surveillance data looking for buildings, equipment and people thought to belong to the enemy and, upon finding them, recommends bombing targets to a human analyst who may then decide whether to pass them along to the field. Critics have argued that the use of these AI tools puts civilians at risk, blurs accountability, and results in militarily disproportionate violence in violation of international humanitarian law.[8]

Bianca Baggiarini, lecturer at the Australian National University's Strategic and Defence Studies Centre, wrote that AIs are "more effective in predictable environments where concepts are objective, reasonably stable, and internally consistent."[15] Heidy Khlaaf pointed out that such a system's decisions depend entirely on the data it is trained on,[b] and are based not on reasoning, factual evidence or causation, but solely on statistical probability.[17]

In a France 24 interview, Yuval Abraham of +972 Magazine characterized this as enabling the systematization of dropping a 2,000 lb bomb into a home to kill one person and everybody around them, something that had previously been done only to a very small group of senior Hamas leaders.[27] NPR cited a report by +972 Magazine and its sister publication Local Call as asserting that the system is being used to manufacture targets so that Israeli military forces can continue to bombard Gaza at an enormous rate, punishing the general Palestinian population.[16][29] Richard Moyes, researcher and head of the NGO Article 36, pointed to "the widespread flattening of an urban area with heavy explosive weapons" to question claims of precise, narrowly targeted strikes,[29] while Lucy Suchman, professor emeritus at Lancaster University, described the bombing as "aimed at maximum devastation of the Gaza Strip".

Six Israeli intelligence officers interviewed by +972 Magazine and Local Call said another AI system, Lavender, had played a central role in the war, rapidly processing data to identify potential junior operatives to target, at one point listing as many as 37,000 Palestinian men linked by AI to Hamas or Palestinian Islamic Jihad (PIJ).[41] Experts in ethics, AI, and international humanitarian law have criticized the use of such systems, arguing that they violate basic principles of international humanitarian law, such as military necessity, proportionality, and the distinction between combatants and civilians.[42] The Guardian cited the intelligence officers' testimonies published by +972 and Local Call as saying that Palestinian men linked to Hamas's military wing were considered potential targets regardless of rank or importance,[43] and that low-ranking Hamas and PIJ members would be preferentially targeted at home, with one officer saying the system was built to look for them in these situations, when attacking would be much easier.[45] Citing unnamed conflict experts, The Guardian wrote that if Israel has been using dumb bombs to flatten the homes of thousands of Palestinians who were linked with AI assistance to militant groups in Gaza, it could help explain what the newspaper called the shockingly high death toll of the war.[47]

The IDF's response to the publication of the testimonies said that, unlike Hamas, it is committed to international law and only strikes military targets and military operatives, does so in accordance with proportionality and precautions, and thoroughly examines and investigates exceptions;[48] that a member of an organized armed group or a direct participant in hostilities is a lawful target under international humanitarian law and the policy of all law-abiding countries;[49] that it "makes various efforts to reduce harm to civilians to the extent feasible in the operational circumstances ruling at the time of the strike"; that it chooses the proper munition in accordance with operational and humanitarian considerations; that aerial munitions without an integrated precision-guidance kit are developed militaries' standard weaponry; that onboard aircraft systems used by trained pilots ensure high precision of such weapons; and that the clear majority of munitions it uses are precision-guided.[61]

United Nations Secretary-General António Guterres said he was "deeply troubled" by reports that Israel used artificial intelligence in its military campaign in Gaza, saying the practice puts civilians at risk and blurs accountability.[62] Speaking about the Lavender system, Marc Owen Jones, a professor at Hamad Bin Khalifa University, stated, "Let's be clear: This is an AI-assisted genocide, and going forward, there needs to be a call for a moratorium on the use of AI in the war".[63] Ben Saul, a United Nations special rapporteur, stated that if reports about Israel's use of AI were true, then "many Israeli strikes in Gaza would constitute the war crimes of launching disproportionate attacks".