
How AI tells Israel who to bomb

AI is supposed to help militaries make precise strikes. Is that the case in Gaza?

Rajaa Elidrissi is a researcher and producer on the Vox video team, where she works on Vox Atlas and other videos that focus on global issues.

Israel’s war with Hamas, launched in response to the attacks of October 7, 2023, has led to more fatalities than any previous Israeli war, with at least 34,000 Palestinians killed as of May 7, 2024. By comparison, just over 1,400 Palestinians were killed in Israel’s 2014 war in Gaza. One factor in that difference is the use of artificial intelligence.

Israel’s incorporation of AI into warfare has been public for years, in both defensive and offensive weapons. But in this war, AI is being deployed differently: It’s generating bombing targets. The promise of AI in a military context is to make strikes more precise and accurate. Yet over the past few months, the Israeli outlets +972 Magazine and Local Call have revealed that the multiple AI systems helping the IDF select targets in Gaza have contributed to an unprecedented toll of Palestinian civilian deaths and injuries.

In our video, we interview multiple experts to understand how two specific systems, Gospel and Lavender, operate, and we explore the broader implications of current and future AI use in warfare.
