Nations meet at UN for ‘killer robot’ talks as regulation lags

By Olivia Le Poidevin

GENEVA (Reuters) - Countries are meeting at the United Nations on Monday to revive efforts to regulate the kinds of AI-controlled autonomous weapons increasingly used in modern warfare, as experts warn time is running out to put guardrails on new lethal technology.

Autonomous and artificial intelligence-assisted weapons systems are already playing a greater role in conflicts from Ukraine to Gaza. And rising defence spending worldwide promises to provide a further boost for burgeoning AI-assisted military technology.

Progress towards establishing global rules governing their development and use, however, has not kept pace. And internationally binding standards remain virtually non-existent.

Since 2014, countries that are part of the Convention on Conventional Weapons (CCW) have been meeting in Geneva to discuss a potential ban on fully autonomous systems that operate without meaningful human control, and the regulation of others.

U.N. Secretary-General Antonio Guterres has set a 2026 deadline for states to establish clear rules on AI weapon use. But human rights groups warn that consensus among governments is lacking.

Alexander Kmentt, head of arms control at Austria’s foreign ministry, said that must quickly change.

“Time is really running out to put in some guardrails so that the nightmare scenarios that some of the most noted experts are warning of don’t come to pass,” he told Reuters.

Monday’s gathering of the U.N. General Assembly in New York will be the body’s first meeting dedicated to autonomous weapons.

Though not legally binding, diplomatic officials want the consultations to ramp up pressure on military powers that are resisting regulation due to concerns the rules could dull the technology’s battlefield advantages.

Campaign groups hope the meeting, which will also address critical issues not covered by the CCW, including ethical and human rights concerns and the use of autonomous weapons by non-state actors, will push states to agree on a legal instrument.

They view it as a crucial litmus test on whether countries are able to bridge divisions ahead of the next round of CCW talks in September.

“This issue needs clarification through a legally binding treaty. The technology is moving so fast,” said Patrick Wilcken, Amnesty International’s researcher on military, security and policing.

“The idea that you wouldn’t want to rule out the delegation of life or death decisions … to a machine seems extraordinary.”

ARMS RACE

The New York talks come after 164 states supported a 2023 U.N. General Assembly resolution calling for the international community to urgently address the risks posed by autonomous weapons.

While many countries back a binding global framework, the United States, Russia, China and India prefer national guidelines or existing international laws, according to Amnesty.

“We have not been convinced that existing law is insufficient,” a U.S. Pentagon spokesperson told Reuters, adding that autonomous weapons might actually pose less risk to civilians than conventional weapons.

The governments of India, Russia, and China did not respond to requests for comment.

In the absence of regulation, autonomous systems are proliferating.

Weapons experts at the Future of Life Institute think tank have tracked the deployment of roughly 200 autonomous weapon systems across Ukraine, the Middle East and Africa.

Russian forces, for example, have deployed some 3,000 Veter kamikaze drones – capable of autonomously detecting and engaging targets – in Ukraine, according to its data.

Ukraine has, meanwhile, used semi-autonomous drones in the conflict. The Ukrainian government declined to comment.

Israel has used AI systems to identify targets in Gaza. Its mission in Geneva said it supported multilateral discussions and uses data technologies in full accordance with international law.

Human Rights Watch, however, said crucial questions of accountability under international law remain unresolved and warned in a report last month that unregulated autonomous weapons present a range of threats to human rights and could provoke an arms race if unchecked.

And campaigners like Laura Nolan of Stop Killer Robots worry there is currently little to ensure defence firms will develop AI-driven weapons responsibly.

“We do not generally trust industries to self-regulate … There is no reason why defence or technology companies should be more worthy of trust,” she said.

(Reporting by Olivia Le Poidevin; Editing by Joe Bavier)
