In the wake of the brutal October 7, 2023, attack by Hamas, the Israel Defense Forces (IDF) launched a relentless bombing campaign on Gaza, using a database of home addresses, tunnels, and critical infrastructure linked to the militant group.
However, according to The Washington Post, as the conflict dragged on and the target bank dwindled, the IDF turned to an advanced artificial intelligence tool named Habsora, or "the Gospel," to swiftly generate additional targets.
This AI-driven approach enabled the military to sustain its operations at a rapid pace.
The Post said that for the past decade, the IDF has been working to integrate various AI tools into its operations. People with knowledge of the IDF's practices, including soldiers who served in the conflict, assert that the military has significantly increased the number of acceptable civilian casualties, a shift some attribute to the automation of target generation. This automation enabled the rapid identification of a vast number of targets, including low-level militants involved in the October 7 attacks.
The IDF, in a statement to The Post, claimed that AI tools have in fact reduced collateral damage and improved the accuracy of the human-led process. It emphasised that the AI systems do not make autonomous decisions: each recommendation from the "big data processing" systems requires approval from an officer before a senior officer adds it to the target bank.
Another AI tool, known as Lavender, uses a percentage score to estimate the likelihood of a Palestinian being a militant. This system allowed the IDF to quickly produce a large volume of potential human targets.
However, the rule mandating two pieces of human-derived intelligence to verify Lavender's predictions was reduced to one at the war's outset.
In some instances within the Gaza division, soldiers inadequately trained in the technology attacked human targets without first verifying Lavender's predictions.
Steven Feldstein, a senior fellow at the Carnegie Endowment who studies the use of AI in warfare, said the accuracy and accelerated speed of these systems were resulting in a heavy body count.
"The ethical implications of AI in modern warfare demand urgent scrutiny and robust safeguards," Feldstein said.