LONDON: Kill lists drawn up from vast databases, facial recognition cameras tracking targets, quadcopter drones mounted with machine guns. The Israeli military’s artificial intelligence-powered systems played a central role in the Gaza war.
The ruthless efficiency with which AI programs processed data and produced bombing targets, combined with reports of limited human oversight, has been blamed in part for the extremely high number of civilian casualties.
The scale at which automation and machine learning were used and developed during the war has led military experts to conclude that the world is at a turning point in how future wars will be fought.

An Israeli soldier prepares an Elbit Systems Skylark I unmanned aerial vehicle (UAV or drone) for take-off near the border with the Gaza Strip in southern Israel on August 21, 2020, as part of monitoring operations in the area. (AFP)
For Palestinians, the legacy of this AI-driven conflict goes beyond the immediate trail of death and destruction. These technologies will very likely be channeled back into the occupation, entrenching what many describe as an “automated apartheid.”
Israel has long been accused of using the occupied Palestinian territories as a laboratory to develop sophisticated military and surveillance technologies.
It has spent years gathering vast quantities of intelligence and data from Gaza and the occupied West Bank. Despite this, during previous wars with Palestinian militant groups in Gaza in 2014 and 2021, the Israeli Air Force actually ran out of targets to hit.
“They were hitting everything they had and that they could identify during the war, and they would run out of targets,” Noah Sylvia, a research analyst at the UK-based Royal United Services Institute, told Arab News.
“So they created a bank of targets in the event of the next war … a database of tens of thousands of targets to strike as needed.”

An Israeli soldier launches a drone near the Israel-Gaza border, amid the ongoing conflict between Israel and the Palestinian Islamist group Hamas, in southern Israel, on January 15, 2024. (REUTERS/File)
A book published in 2021 by Yossi Sariel, at the time commander of Unit 8200, Israel’s elite cyber-intelligence unit, offered a chilling indication of the role AI would play in creating such a target bank.
“Imagine 80,000 relevant targets that are produced before combat and 1,500 new targets created every day during a war,” he wrote in “The Human-Machine Team,” which described how AI could transform the way wars are fought.
Human beings, he wrote, were “the bottleneck” preventing the creation and approval of those targets. “A team consisting of machines and investigators can blast the bottleneck wide open.”
When Israel launched its military campaign in Gaza in retaliation for the Hamas-led attack of Oct. 7, 2023, this vision became a reality.
During the first week of the war, the Israeli military reported dropping 1,000 bombs a day. By early December 2023, it reported 10,000 airstrikes.

An action shot of XTEND’s XTENDER drone, used for tactical indoor reconnaissance. (XTEND)
Studies estimated that the initial months of the conflict amounted to one of the most intensive bombing campaigns in history, with levels of destruction comparable to the Allied bombing of the German cities of Dresden, Hamburg and Cologne during the Second World War.
Details of the AI-powered systems driving the campaign started to emerge in reports from the Israeli magazine +972.
An investigation published in November 2023 revealed that the Israeli military was using a system called “The Gospel,” which selected buildings as targets far faster than had previously been possible.
One former Israeli intelligence officer described the system as a “mass assassination factory.”
Five months later, +972 revealed the existence of the “Lavender” program, which selected people rather than structures as targets. The system made even the lowest-ranked members of Hamas and Islamic Jihad targets for the air force’s bombs.
Sources told the magazine that the system selected 37,000 suspected militants and their homes as possible airstrike targets in the early days of the conflict.

An Israeli drone drops tear gas canisters to disperse gatherings in the vicinity of Ofer military prison located between Ramallah and Beitunia in the occupied West Bank on October 13, 2025. (AFP)
The article revealed another system named “Where’s Daddy?” that could simultaneously track the thousands of individuals flagged by Lavender and send a signal when they reached their family homes.
According to the report, the military preferred to bomb them in their homes, usually at night in the presence of their families, because it was easier to locate them there.
Sources also told +972 that during the early weeks of the war, the army decided it was acceptable to kill 15 to 20 civilians for every low-ranking Hamas militant and, on occasions, 100 civilians for a senior commander.
There were also alarming details about the level of human oversight of the targets supplied by the AI programs, which are known as decision support systems.

Palestinians tend to an injured man as he speaks on his mobile phone following an Israeli drone strike in Jabalia, in the northern Gaza Strip on May 23, 2025. (AFP)
Sources described a “rubber stamp” approach to the targets flagged by the systems, with a mere 20 seconds spent on each one before a bombing was authorized.
“That is going to make it near impossible for meaningful, substantive human input in a targeting selection process,” Asaf Lubin, an associate professor at Indiana University Maurer School of Law and a former Israeli intelligence analyst, told Arab News.
Israel insists that the AI systems it uses are merely tools to help identify targets and that all targets are independently verified by an intelligence analyst as legitimate to attack.
But Lubin suggests that the quantity of targets being generated would have made it impossible for a human to carry out proper verification or to challenge the information.
“Our entire legal frameworks in international humanitarian law and the laws of war are rooted on an understanding that there will be a process, an iterative process, for those involved in the targeting decision making,” he said.
“Automation bias and technology played a role in loosening that process, that iterative process, that would have allowed for more review, scrutiny, analysis, questioning.”

Palestinians run for cover during an Israeli drone strike in Jabalia, in the northern Gaza Strip on May 23, 2025. (AFP)
Anwar Mhajne, an associate professor of political science at Stonehill College in Massachusetts, said the AI systems gave Israel’s military a “facade of confidence” in their ability to select targets.
She said this led to a “confirmation bias” based on the Israeli forces’ existing view of Palestinians in Gaza.
“If the person approving the targets has a bias of his own, then it’s easy to just say, ‘oh, well, the data told me that it is, so it’s not my fault’.”
Mhajne added: “The way the weapons have been used, all these weapons, all these systems helped facilitate genocide in Gaza.”
Experts agree that while AI played a part, the ultimate responsibility for the huge civilian death toll rests with Israel and the conduct of its forces.
“There’s no question that AI systems were utilized and they generated kill lists and target lists in ways not seen before in the previous confrontations between the Israelis and Palestinians,” Lubin said.
However, he added, the scale of the civilian death toll “can also be tied to decisions that have nothing to do with technology, that have everything to do with policy changes and the way the military decided to operate in the context of this operation and in the wake of Oct. 7.”
Israel has so much information on Gaza that its forces “know exactly what they’re doing, at any given time,” RUSI’s Sylvia said. “They know how many casualties there are going to be.
“What artificial intelligence does is that it takes your existing operating procedures and allows you to do them more quickly and at a greater scale.
“How ethically you use artificial intelligence correlates largely with your existing operating procedures.”
Another legacy of the Gaza war is the involvement of major international technology companies in Israel’s military AI systems.
Microsoft said in September it had cut off some of its services to Unit 8200 after media reports that the Azure cloud was being used to store intercepted phone calls made by ordinary Palestinians.

Pro-Palestinian demonstrators protest outside the Microsoft Build conference at the Seattle Convention Center in Seattle, Washington on May 19, 2025. (AFP)
It is those ordinary Palestinians who will now live under the shadow of the new AI-powered military and surveillance technologies developed by Israel during the conflict.
“Automated apartheid is not only becoming more real, it is accelerating,” Jalal Abukhater, policy manager for 7amleh, a non-profit organization that advances the digital rights of Palestinians, told Arab News.
“The AI systems refined during the war, whether for predictive analytics, biometric monitoring, automated targeting, or mass data extraction, will entrench and expand the occupation and apartheid regime.
“These tools don’t disappear after conflict — they become part of everyday governance.”
Mhajne, who is a Palestinian citizen of Israel, said the use of AI is both entrenching the occupation and making it “more and more sophisticated.”
“Mainly why Israel is able to produce all of this is because of the massive institutions of occupation,” she said.

A THOR vertical takeoff and landing (VTOL) micro-unmanned aerial system (UAS), developed as a military tactical mule platform and part of the "Legion-X" line of robotic and autonomous combat solutions produced by the Israel-based international defense electronics company Elbit Systems, is pictured during a press demonstration at their headquarters in Ramat HaSharon in central Israel on July 10, 2023. (AFP)
The development of AI military systems raises fears that the future will be dominated by killer robots — autonomous drones or vehicles making decisions on who lives and who dies.
The Gaza war showed that the automated systems working in the background to make decisions on who or what is a target are already in place.
When used with the same recklessness that was apparent in Gaza, the consequences are catastrophic — nearly 70,000 Palestinians were killed in two years, according to Gazan health authorities.
With the rapid development of automated warfare, at a time when international legal norms and rules are coming under strain, Lubin sees Gaza as a “pivotal moment” that will “redefine wartime activity.”
“It’s not just about the targeting cycle. It’s about every aspect of the war machine,” he said.
“It’s a very dangerous time we’re entering into.”