“This technology is our future threat,” warns Serhiy Beskrestnov as he inspects a newly intercepted Russian drone. It is no ordinary drone, he explains. Assisted by artificial intelligence, this unmanned aircraft can find and attack targets by itself.
Beskrestnov, a consultant for Ukraine’s defence forces, has examined many drones during the war. Yet this one stands out. Unlike earlier models, it neither sends nor receives signals, which makes it impossible to jam.
Both Russian and Ukrainian forces are testing AI on the battlefield. In some areas, they already use it for finding targets, gathering intelligence and clearing mines.
Artificial intelligence becomes a battlefield necessity
For Ukraine’s army, artificial intelligence has become essential. “Our military receives more than 50,000 video streams from the front every month,” says Deputy Defence Minister Yuriy Myronenko. “AI analyses this footage, processes the data quickly, identifies targets and places them on a map.”
AI-powered technology is seen as a vital tool that enhances strategy, optimises resources and saves lives. But in unmanned weapon systems, it is also changing the nature of combat. Ukrainian troops already use AI-driven software that allows drones to lock onto targets and then fly autonomously during the final stretch of a mission.
Such drones cannot be jammed and are extremely hard to shoot down. Experts expect these systems to evolve into fully autonomous weapons capable of finding and destroying targets without direct human control.
Drones that fight on their own
“All a soldier will need to do is press a button on a smartphone app,” says Yaroslav Azhnyuk, chief executive of the Ukrainian tech firm The Fourth Law. The drone will handle everything else—finding the target, dropping explosives, assessing the damage and returning to base. “And it would not even require piloting skills from the soldier,” he adds.
Interceptor drones with this level of automation could strengthen Ukraine’s air defences against long-range Russian attack drones, such as the Shaheds. “A computer-guided autonomous system can outperform a human in many ways,” Azhnyuk explains. “It can detect a target faster and move with greater agility.”
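The one-button mission Azhnyuk describes — find the target, strike, assess the damage, return to base — is, at its core, a state machine. The sketch below is purely illustrative: the phase names, events and transitions are this article's invention, not The Fourth Law's actual software.

```python
from enum import Enum, auto

class Phase(Enum):
    """Mission phases, as loosely described by Azhnyuk."""
    SEARCH = auto()   # scan for a target
    TRACK = auto()    # hold a lock on the target
    STRIKE = auto()   # deliver the payload
    ASSESS = auto()   # evaluate the damage
    RETURN = auto()   # fly back to base

def next_phase(phase, target_found=False, lock_held=False,
               strike_done=False, target_destroyed=False):
    """Advance the mission one step based on hypothetical sensor events."""
    if phase is Phase.SEARCH:
        return Phase.TRACK if target_found else Phase.SEARCH
    if phase is Phase.TRACK:
        # A lost lock sends the drone back to searching.
        return Phase.STRIKE if lock_held else Phase.SEARCH
    if phase is Phase.STRIKE:
        return Phase.ASSESS if strike_done else Phase.STRIKE
    if phase is Phase.ASSESS:
        # Re-attack if the target survived; otherwise head home.
        return Phase.RETURN if target_destroyed else Phase.SEARCH
    return Phase.RETURN
```

The soldier's "press a button" maps to entering SEARCH; everything after that is the loop deciding its own transitions, which is why no piloting skill is needed.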
Deputy Minister Myronenko admits such a system does not fully exist yet but insists Ukraine is close to completing development. “We have partly implemented it in some devices,” he says. Azhnyuk predicts thousands of these systems could be deployed by the end of 2026.
The limits and dangers of full autonomy
Ukrainian developers remain cautious about relying entirely on AI for defence. The greatest danger lies in recognition errors. “AI might fail to tell a Ukrainian from a Russian soldier, since they wear similar uniforms,” says Vadym, a developer who asked to withhold his surname.
His company, DevDroid, makes remotely operated machine guns that use AI to detect and track people automatically. Yet because of the risk of friendly fire, they have disabled automatic firing. “We could enable it,” Vadym says, “but we need more experience and feedback from ground forces to know when it’s safe to use.”
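The safeguard Vadym describes — the AI may detect and track, but only a human may authorise fire — amounts to a gate between the perception loop and the trigger. This is an illustrative sketch only; the function, threshold and defaults are assumptions made for this example, not DevDroid's code.

```python
def may_fire(confidence: float, human_authorised: bool,
             auto_fire_enabled: bool = False,
             confidence_floor: float = 0.95) -> bool:
    """Gate the trigger behind both detection quality and authority.

    auto_fire_enabled defaults to False, mirroring the decision
    described in the article: because Ukrainian and Russian uniforms
    look alike to the model, automatic firing stays disabled until
    ground forces report it is safe to use.
    """
    if confidence < confidence_floor:
        return False  # never fire on a weak detection, whoever asks
    return human_authorised or auto_fire_enabled
```

With the default settings, even a high-confidence track fires only on explicit human authorisation; flipping `auto_fire_enabled` to True is the single switch that would make the gun autonomous.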
There are also ethical fears that automated systems might breach the laws of war. How would they recognise civilians, or soldiers trying to surrender? Myronenko insists that final decisions should remain in human hands, even if AI can help make them faster. Still, there is no guarantee that every state or armed group will respect humanitarian norms.
A new kind of global arms race
As AI becomes a key weapon, countermeasures grow crucial. How do you stop a swarm of drones when jamming, tanks or missiles no longer work?
Ukraine’s successful “Spider Web” operation last June, when 100 drones struck Russian air bases, likely relied on AI support. Many in Ukraine fear Moscow will copy that approach both at the front and beyond it.
President Volodymyr Zelensky recently told the United Nations that AI was driving “the most destructive arms race in human history.” He urged world leaders to establish global rules for AI in warfare, warning that the issue was “as urgent as preventing nuclear proliferation.”