The ‘human’ machine

Published July 29, 2014
The writer is an associate professor of public international law at LUMS.

US DRONE strikes have resumed in Fata in the wake of Operation Zarb-i-Azb. There is anger over this campaign, because it leads to violations of Pakistan’s territorial sovereignty and political independence.

However, many individuals, including government officials, feel that if drones, their delivery system and operating technology were transferred to Pakistan, the state’s use of such weaponry to target militias in Fata would comply with national and international law. For them, the use of drones is preferable to other modes of military engagement with militant outfits.

Yet one must approach this proposition with caution. Drones are not legally and ethically problematic only because of their challenge to state sovereignty. They are problematic for other reasons too. Most notably, their use can undermine fundamental rights.

Extrajudicial killings in the form of drone strikes contravene international human rights law, irrespective of whether such targeting is undertaken by a host or a third state. If drone attacks qualify as violations of peremptory norms of international law — norms that prohibit torture, crimes against humanity and war crimes — then Pakistan can neither use, nor consent to the use of, drone strikes on its soil.


Legitimising drone warfare will set a dangerous example.


Drone technology, inclusive of the workings of the drone delivery platform, is constantly being upgraded by militarily advanced nations employing these weapons. Effective targeting requires real-time data communication, which depends on specialised computers, satellites and an assemblage of sophisticated hardware and software.

Because drone technology evolves rapidly, it requires continuous R&D expenditure and the capacity of a state to indigenously produce all the components required for a successful operation. Thus, if drone warfare is accepted as a legal form of engaging in armed conflict, the asymmetrical advantage enjoyed by developed and militarily advanced states will prove decisive in warfare.

Pakistan cannot compete in a drone race with states it perceives as hostile, such as India, because of their advantage in developing advanced combative drone programmes. Pakistan should, therefore, resist the push for legitimising the use of combative drones under international law.

One should also note the ethical and humanitarian challenges posed by drone warfare. Combative drone systems are not static. Technology is driving this form of warfare with the objective of lowering the military costs of wars through increased automation. In fact, the US Defence Department reported in 2009 “that the technological challenges regarding fully autonomous systems will be overcome by the middle of the century”.

The logical conclusion of such developments in drone technology is an unmanned system that will coordinate and direct an attack solely on the basis of a code or algorithm, without direct human involvement.

A machine cannot qualify as a lawful combatant under the law of war and hence cannot be tried or held responsible for war crimes. Even with limited human involvement, it is becoming increasingly hard to allocate responsibility, given the layers of human decision-making combined with the role machines play in conducting even a single drone strike. Social psychology experiments seem to confirm that human reliance on machines, and the transference of responsibility for decision-making during drone strikes, result in the disproportionate and unnecessary use of lethal force.

Another issue is that drone technology relies on signatures to target suspected militants. These signatures are based on pre-identified patterns of behaviour that allow machines to make probabilistic assessments of whom to target and when. However, social and cultural differences are not accommodated. Hence, drones often end up targeting civilians. Signatures such as having a beard and carrying weapons in Fata are not useful indicators for determining who is a militant actively engaged in hostilities.

Ethically, the question arises whether the human race can accept machines making qualitative decisions about the value of life during armed conflict. What level of civilian casualties is acceptable ‘collateral damage’ when striking a military target, and what is the proportional use of force, with regard to human suffering? The determination of a military target is itself a qualitative judgement. Such assessments are difficult to make, but we accept them if made by human combatants acting reasonably.

While currently machines cannot make these assessments, with improvements in artificial intelligence, will they be able to substitute for human decision-making to measure the value of life in conflict? Are these assessments not the sole prerogative of humans? Can fully automated drones ever be capable of being coded to think emotively about the repercussions of actions that result in death? A terminator-like scenario as far as drones are concerned is not a distant reality. Hasta la vista, human rights.


Published in Dawn, July 29th, 2014
