VIENNA: The world should establish a set of rules to regulate AI weapons while they are still in their infancy, a global conference said on Tuesday, calling the issue this generation's "Oppenheimer moment".

Like gunpowder and the atomic bomb, artificial intelligence (AI) has the capacity to revolutionise warfare, analysts say, making human disputes unimaginably different — and a lot more deadly.

“This is our generation’s ‘Oppenheimer moment’ where geopolitical tensions threaten to lead a major scientific breakthrough down a very dangerous path for the future of humanity,” read the summary at the end of the two-day conference in Vienna.

US physicist Robert Oppenheimer led the development of nuclear weapons during World War II. Austria organised and hosted the conference, which brought together some 1,000 participants, including political leaders, experts and members of civil society, from more than 140 countries.

A final statement said the group “affirms our strong commitment to work with urgency and with all interested stakeholders for an international legal instrument to regulate autonomous weapons systems”.

“We have a responsibility to act and to put in place the rules that we need to protect humanity... Human control must prevail in the use of force”, said the summary, which is to be sent to the UN secretary general.

Using AI, all sorts of weapons can be transformed into autonomous systems, thanks to sophisticated sensors governed by algorithms that allow a computer to "see". This would enable the locating, selecting and attacking of human targets — or targets containing human beings — without human intervention.

Most weapons are still in the idea or prototype stages, but Russia’s war in Ukraine has offered a glimpse of their potential. Remotely piloted drones are not new, but they are becoming increasingly independent and are being used by both sides.

“Autonomous weapons systems will soon fill the world’s battlefields,” Austrian Foreign Minister Alexander Schallenberg said on Monday when opening the conference.

He warned now was the “time to agree on international rules and norms to ensure human control”. Austria, a neutral country keen to promote disarmament in international forums, in 2023 introduced the first UN resolution to regulate autonomous weapons systems, which was supported by 164 states.

‘Uncorrectable errors’

A Vienna-based privacy campaign group said it would file a complaint against ChatGPT in Austria, claiming the “hallucinating” flagship AI tool has invented wrong answers that creator OpenAI cannot correct.

NOYB (“None of Your Business”) said there was no way to guarantee the programme provided accurate information. “ChatGPT keeps hallucinating — and not even OpenAI can stop it,” the group said in a statement.

The company has openly acknowledged it cannot correct inaccurate information produced by its generative AI tool and has failed to explain where the data comes from and what ChatGPT stores about individuals, said the group.

Published in Dawn, May 1st, 2024
