Laws for LAWS

The writer is an international lawyer at the Conflict Law Centre.

IN August 2020, America’s Defense Advanced Research Projects Agency held a simulated dogfight between F-16s in which one jet was piloted by an artificial intelligence algorithm and the other by an experienced human pilot. The AI won 15-0. It fought better than human pilots, in that it was faster, more precise and more aggressive, but it also fought differently, moving in ways humans never had and breaking the rules pilots follow in close combat to avoid the risk of collision. The need to harness this new technology is fuelling an ‘AI arms race’ between states.

Meanwhile, the UN, as always, moves only as fast as countries wish it to move, which is to say barely at all. Negotiations between states to create legally binding rules applicable to ‘lethal autonomous weapons systems’ (LAWS) are now in stalemate for the tenth year running. LAWS are weapons systems that think for themselves, in that they select and engage targets without further human intervention. While many states, Pakistan included, favour prohibitions and regulations on their use in battle, others (notably Russia and India) are staunchly against it. They argue instead that the existing rules of war are enough to regulate AI in armed conflict, and have been successfully kicking the can down the road.

Autonomous weapons systems have already been used in hostilities. In March 2020, a Turkish-made drone attacked retreating soldiers loyal to Gen Khalifa Haftar in Libya. A year later, in June 2021, Israel used an AI-guided drone swarm in Gaza to locate, identify and attack Hamas militants. The borders of Israel, Russia and South Korea are currently patrolled by partially autonomous weapons systems, and Israel has organised the assassination of an Iranian nuclear scientist through an automated machine gun without any of its personnel present at the scene of the attack. Most worrying for Pakistan is the fact that India has also acquired its first AI-enabled swarming drone system, which can attack targets up to 50 kilometres away.


LAWS are already here, but the laws of war are not enough to regulate them. Those rules were made at a time when armies still faced each other across a battlefield; the drafters of the Geneva Conventions did not foresee combat by robots without human involvement. It may be possible for AI to comply with the clearer rules, such as the principle that combatants may be targeted while civilians may not. But where the rules become fuzzy even for humans to decipher (eg a civilian may be targeted when directly participating in hostilities, though what counts as ‘direct participation’ is debated), it is difficult to see how an AI agent can comply. Moreover, as AI outpaces us, it becomes less predictable and controllable. The ‘black box’ problem means the software’s workings will be opaque to us, and we may not be able to tell a system error from good tactics. The question then arises of who should be held accountable for violations of the laws of war. Although a robot has been granted citizenship in Saudi Arabia of all places, holding one accountable for war crimes would be like putting a Roomba on trial for running over an insect.

Those in the pro-AI camp counter that ‘inhumanity’ in war is only due to, well, humans. Humans feel fear, get hungry, want vengeance for fallen comrades, and forget orders. Robots may do a better job on the battlefield as they lack the emotions that cloud human judgement. After all, robots do not rape.

Though there remains the ethical conundrum that the taking of a human life should weigh on a human conscience, this argument is becoming less and less convincing, as we have already removed soldiers far from the battlefield and from having to feel the weight of someone’s death. Peter Singer, in his book Wired for War, quoted an unnamed drone pilot who said: “The truth is, it isn’t all I thought it was cracked up to be. I mean, I thought killing somebody would be this life-changing experience. And then I did it, and I was like ‘All right, whatever…’.” War has already, in so many ways, been reduced to a video game.

As AI becomes more acceptable to us, states will inevitably use it in war. Even if it is originally employed outside of combat, such as using a robot to check an IED on the side of the road, its use may eventually creep into the battlefield.

Lethal autonomous weapons have been described as the “third revolution in warfare”, after gunpowder and nuclear weapons. When Oppenheimer witnessed the first detonation of a nuclear weapon, he famously said: “Now I am become Death, the destroyer of worlds.” Less well known is the perhaps more apt follow-up from the test director, Kenneth Bainbridge: “Now we are all sons of b******.” As we usher in a new era of AI warfare, it remains for us to decide whether we will allow a computer to delete us.


Published in Dawn, June 4th, 2023
