WASHINGTON: The United States is deploying missile-laden remotely piloted aircraft to kill enemies in six countries, scientists are working on ever more sophisticated military robots, and there are a host of unanswered questions on the future of warfare. Some of the more intriguing ones are asked abroad.
Such as: “Is the Reaper operator walking the streets of his home town after a shift a legitimate target as a combatant? Would an attack (on him) by a Taliban sympathiser be an act of war under international law or murder under the statutes of the home state? Does the person who has the right to kill as a combatant while in the control station cease to be a combatant on his way home?”
This comes from a study (http://tinyurl.com/6cj99wa) by Britain's Ministry of Defence and refers to the air war waged by US pilots who operate, from bases in the United States, heavily armed drones flying over Afghanistan or Pakistan 7,500 miles away. The Reaper is the workhorse of the drone fleet, which has grown from around 50 a decade ago to more than 7,000 today. It is increasing at a fast clip, unaffected by defense spending cuts in other areas.
Most of the drone missions for the military are flown from Creech Air Force base near Las Vegas. The Central Intelligence Agency (CIA) has a separate, covert, program that critics see as targeted assassinations. The CIA's drones are operated from northern Virginia. The pilots, sitting in cockpits in front of television monitors, run no physical risks whatever, a novelty for men engaged in war.
Debate over the remote-control air wars - drones are now in action over Afghanistan, Pakistan, Yemen, Iraq, Libya and Somalia - has been largely confined to academia and think tanks, both civilian and military. But reports this week that the CIA had extended drone strikes to Somalia have prompted calls for a closer examination of where war ends and assassinations begin.
It is not an issue, however, that strikes a chord with the public, and US politicians are largely in favor of drone strikes. They are seen as an inexpensive way of targeting enemies, with no risk to the lives of American personnel. The downside to the seemingly risk-free elimination of Taliban fighters, al Qaeda militants and assorted other anti-American elements is of little apparent concern in the US.
What downside? High technology and precision weapons notwithstanding, the “surgical strikes” drone enthusiasts like to talk about are on occasion anything but, resulting in “collateral damage”, the euphemism for dead civilians.
Collateral damage tends to create more recruits to anti-American causes. Even without civilian casualties, remote-control warfare tarnishes the image of the United States, and the few close allies who use drones, in the countries where they are fighting.
“The West ... is seen as a cowardly bully that is unwilling to risk his own troops but is happy to kill remotely,” the British study noted.
Science And Wisdom
Such sentiments are unlikely to sway public opinion in the West, nor will they stop weapons developments that bring to mind an observation by the late science fiction writer Isaac Asimov more than four decades ago: “The saddest aspect of life right now is that science gathers knowledge faster than society gathers wisdom.”
Which brings us to aspects of 21st century war that go beyond the pros and cons of unmanned aerial vehicles (UAVs), as drones are also known. While they are frequently referred to as “killer robots,” they are “human-in-the-loop” weapons, so named because a human being navigates the aircraft and pushes the button that fires the missile.
If and when to cut the human out of the loop - and open a new era of warfare - is a matter of debate among scientists.
“It ... would be only a small technical step to enable an unmanned aircraft to fire a weapon based solely on its own sensors, or shared information, without recourse to a higher, human authority,” according to the British study.
That would mean, in effect, outsourcing life-and-death decisions to computer programs controlling both aerial and ground-based robots. Questions yet to be answered are complex and varied: How do you get a robot to tell an insurgent from an innocent? Can you program the Laws of War and the Rules of Engagement into a robot? Can you imbue a robot with its country's culture?
If something goes wrong, resulting in the death of civilians, who will be held responsible? The robot's manufacturer? The designers? Software programmers? The commanding officer in whose unit the robot operates? The US president who gives the green light?
A number of scientists alarmed by such unanswered questions last September formed a group, the International Committee for Robot Arms Control, that is pressing for an international debate on the regulation and control of armed military robots. The prospect of that happening looks remote.