KYIV, Ukraine (AP) — Drone advances in Ukraine have accelerated a long-anticipated technology trend that could soon bring the world’s first fully autonomous fighting robots to the battlefield, inaugurating a new age of warfare.
The longer the war lasts, the more likely it becomes that drones will be used to identify, select and attack targets without help from humans, according to military analysts, combatants, and artificial intelligence researchers.
That would mark a revolution in military technology as profound as the introduction of the machine gun. Ukraine already has semi-autonomous attack drones and counter-drone weapons endowed with AI. Russia also claims to possess AI weaponry, though the claims are unproven. But there is no confirmed instance of a nation sending into combat robots that have killed entirely on their own.
Experts say it may be only a matter of time before either Russia or Ukraine, or both, deploy them.
“Many states are developing this technology,” said Zachary Kallenborn, a George Mason University weapons innovation analyst. “Clearly, it’s not all that difficult.”
The sense of inevitability extends to activists, who have tried for years to ban killer drones but now believe they must settle for trying to restrict the weapons’ offensive use.
Ukraine’s digital transformation minister, Mykhailo Fedorov, agrees that fully autonomous killer drones are “a logical and inevitable next step” in weapons development. He said Ukraine has been doing “a lot of R&D in this direction.”
“I think that the potential for this is great in the next six months,” Fedorov told The Associated Press in a recent interview.
Ukrainian Lt. Col. Yaroslav Honchar, co-founder of the combat drone innovation nonprofit Aerorozvidka, said in a recent interview near the front that human war fighters simply cannot process information and make decisions as quickly as machines.
Ukrainian military leaders currently prohibit the use of fully independent lethal weapons, although that could change, he said.
“We have not crossed this line yet – and I say ‘yet’ because I don’t know what will happen in the future,” said Honchar, whose group has spearheaded drone innovation in Ukraine, converting cheap commercial drones into lethal weapons.
Russia could obtain autonomous AI from Iran or elsewhere. The long-range Shahed-136 exploding drones supplied by Iran have crippled Ukrainian power plants and terrorized civilians but are not particularly smart. Iran has other drones in its evolving arsenal that it says feature AI.
Without a great deal of trouble, Ukraine could make its semi-autonomous weaponized drones fully independent in order to better survive battlefield jamming, their Western manufacturers say.
Those drones include the US-made Switchblade 600 and the Polish Warmate, both of which currently require a human to choose targets over a live video feed. AI finishes the job. The drones, technically known as “loitering munitions,” can hover for minutes over a target, awaiting a clean shot.
“The technology to achieve a fully autonomous mission with Switchblade pretty much exists today,” said Wahid Nawabi, CEO of AeroVironment, its maker. Doing so will require a policy change removing the human from the decision-making loop, a change he estimates is three years away.
Drones can already recognize targets such as armored vehicles using cataloged images. But there is disagreement over whether the technology is reliable enough to ensure that the machines don’t err and take the lives of non-combatants.
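At its core, the recognition step described above amounts to comparing features extracted from a sensor image against a catalog of known target signatures and acting only when the match is confident enough. The sketch below illustrates that logic in miniature; the catalog entries, feature vectors, and threshold are all invented for the example and bear no relation to any fielded system.

```python
import math

# Hypothetical catalog of target signatures, reduced to toy feature vectors.
CATALOG = {
    "armored_vehicle": [0.9, 0.1, 0.8],
    "civilian_truck":  [0.7, 0.6, 0.3],
}

# Below this similarity score, the system defers rather than acts.
CONFIDENCE_THRESHOLD = 0.95

def cosine_similarity(a, b):
    """Similarity between two feature vectors, from -1 (opposite) to 1 (identical)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

def classify(observation):
    """Return the best catalog match for an observation and its confidence score."""
    best = max(CATALOG, key=lambda name: cosine_similarity(observation, CATALOG[name]))
    return best, cosine_similarity(observation, CATALOG[best])

def engage_decision(observation):
    """Act autonomously only above the threshold; otherwise keep a human in the loop."""
    label, confidence = classify(observation)
    if confidence >= CONFIDENCE_THRESHOLD:
        return f"autonomous engage: {label}"
    return "defer to human operator"
```

The disagreement among experts is, in effect, about whether any such threshold can be set high enough: a cluttered, jammed battlefield produces ambiguous observations that can still score above the cutoff against the wrong catalog entry.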
The AP asked the defense ministries of Ukraine and Russia if they have used autonomous weapons offensively – and whether they would agree not to use them if the other side similarly agreed. Neither responded.
If either side were to go on the attack with full AI, it might not even be a first.
An inconclusive UN report suggested that killer robots debuted in Libya’s internecine conflict in 2020, when Turkish-made Kargu-2 drones in full-automatic mode killed an unspecified number of combatants.
A spokesperson for STM, the manufacturer, said the report was based on “speculative, unverified” information and “should not be taken seriously.” He told the AP that the Kargu-2 cannot attack a target until the operator tells it to do so.
Fully autonomous AI is already helping to defend Ukraine. Utah-based Fortem Technologies has supplied the Ukrainian military with drone-hunting systems that combine small radars and unmanned aerial vehicles, both powered by AI. The radars are designed to identify enemy drones, which the UAVs then disable by firing nets at them — all without human assistance.
The number of AI-endowed drones keeps growing. Israel has been exporting them for decades. Its radar-killing Harpy can hover above an anti-aircraft radar for up to nine hours, waiting for it to power up.
Other examples include Beijing’s Blowfish-3 unmanned weaponized helicopter. Russia has been working on a nuclear-tipped underwater AI drone called the Poseidon. The Dutch are currently testing a ground robot with a .50-caliber machine gun.
Honchar believes Russia, whose attacks on Ukrainian civilians have shown little regard for international law, would have used autonomous killer drones by now if the Kremlin had them.
“I don’t think they’d have any scruples,” agreed Adam Bartosiewicz, vice president of WB Group, which makes the Warmate.
AI is a priority for Russia. President Vladimir Putin said in 2017 that whoever dominates that technology will rule the world. In a December 21 speech, he expressed confidence in the Russian arms industry’s ability to embed AI in war machines, stressing that “the most effective weapons systems are those that operate quickly and practically in an automatic mode.”
Russian officials already claim their Lancet drone can operate with full autonomy.
“It’s not going to be easy to know if and when Russia crosses that line,” said Gregory C. Allen, former director of strategy and policy at the Pentagon’s Joint Artificial Intelligence Center.
Switching a drone from remote piloting to full autonomy might not be perceptible. To date, drones capable of working in both modes have performed better when piloted by a human, Allen said.
The technology is not particularly complicated, said University of California-Berkeley professor Stuart Russell, a top AI researcher. In the mid-2010s, colleagues he polled agreed that graduate students could, in a single term, produce an autonomous drone “capable of finding and killing an individual, let’s say, inside a building,” he said.
An effort to lay international ground rules for military drones has so far been fruitless. Nine years of informal United Nations talks in Geneva made little headway, with major powers including the United States and Russia opposing a ban. The last session, in December, ended with no new round scheduled.
Washington policymakers say they won’t agree to a ban because rivals developing drones cannot be trusted to use them ethically.
Toby Walsh, an Australian academic who, like Russell, campaigns against killer robots, hopes to achieve a consensus on some limits, including a ban on systems that use facial recognition and other data to identify or attack individuals or categories of people.
“If we are not careful, they are going to proliferate much more easily than nuclear weapons,” said Walsh, author of “Machines Behaving Badly.” “If you can get a robot to kill one person, you can get it to kill a thousand.”
Scientists also worry about AI weapons being repurposed by terrorists. In one feared scenario, the US military spends hundreds of millions writing code to power killer drones, only for the code to be stolen and copied, effectively giving terrorists the same weapon.
To date, the Pentagon has neither clearly defined “an AI-enabled autonomous weapon” nor authorized a single such weapon for use by US troops, said Allen, the former Defense Department official. Any proposed system must be approved by the chairman of the Joint Chiefs of Staff and two undersecretaries.
That’s not stopping the weapons from being developed across the US. Projects are underway at the Defense Advanced Research Projects Agency, military labs, academic institutions and in the private sector.
The Pentagon has emphasized using AI to augment human warriors. The Air Force is studying ways to pair pilots with drone wingmen. A booster of the idea, former Deputy Defense Secretary Robert O. Work, said in a report last month that it “would be crazy not to go to an autonomous system” once AI-enabled systems outperform humans — a threshold that he said was crossed in 2015, when computer vision eclipsed that of humans.
Humans have already been pushed out in some defensive systems. Israel’s Iron Dome missile shield is authorized to open fire automatically, although it is said to be monitored by a person who can intervene if the system goes after the wrong target.
Multiple countries, and every branch of the US military, are developing drones that can attack in deadly synchronized swarms, according to Kallenborn, the George Mason researcher.
So will future wars become a fight to the last drone?
That’s what Putin predicted in a 2017 televised chat with engineering students: “When one party’s drones are destroyed by drones of another, it will have no other choice but to surrender.”