SCIENCE fiction fans will be familiar with the works of Isaac Asimov, one of the most prolific writers ever to have lived. Author of over 500 books, he is perhaps best remembered for his Zeroth Law of Robotics: “A robot may not injure humanity, or, by inaction, allow humanity to come to harm.”

Unfortunately, the robots being designed to fight tomorrow's wars will not be programmed with this law. Quietly, out of public view, a debate is being conducted about the morality of using completely autonomous machines to fight the battles of the near future.

Already, unmanned aircraft such as the Predator and Reaper flying in Pakistani skies to hunt down terrorists are making it possible for their controllers to distance themselves, physically and morally, from the death and destruction they cause from thousands of miles away. For these men and women, warfare has become a video game.

But it is still a human hand that guides the aircraft and releases their munitions. In the new generation of weapons systems now on the drawing boards, machines will be programmed to move across difficult terrain, carrying a mix of weapons to engage the enemy in a designated area, and to take life-and-death decisions in microseconds.

Where human soldiers might hesitate, these robotic warriors will not pause to distinguish between friend and foe: anybody not carrying electronic markers declaring his or her identity will automatically be a target.

The advantages of using robots are evident: they will be tireless, able to carry heavy equipment, unsentimental and deadly accurate. Above all, casualties in their ranks will not make headlines back home or lead to calls for their recall. These are the arguments of the supporters of autonomous battle-bots.

On the other side are those who question the ethics of using fully autonomous machines that make no distinction between soldiers and civilians. While a soldier might try to determine who the target is before firing, robots will simply shoot at anybody in the killing zone.

There will thus be no accountability if civilians are targeted. Currently, soldiers can be (and occasionally are) held responsible for causing civilian deaths. But who will court-martial a machine? Nevertheless, it is tempting for military planners to prepare for this future.

Terminator

Of course, we are a long way from creating the malevolent intelligence evident in the movie series in which virtually indestructible human-looking androids from the future wreak havoc in our world. Artificial Intelligence is still in its early days, although the growing power and miniaturisation of computers have now placed the Holy Grail of autonomous machines in the realm of possibility.

This evolution towards automated warriors has implications other than purely moral considerations. Research and development costs render such futuristic weapons systems prohibitively expensive. The technology required also restricts the number of countries that can acquire these battle-bots to a handful.

Already, the cutting-edge weaponry in the American arsenal has left the rest of the world far behind. Only its closest allies have access to aircraft such as the F-35. As we saw from the ease with which Saddam Hussein's forces were routed by the Americans, the technological gap between the US and other forces is now virtually insuperable. America's annual military budget is nearly as large as that of the rest of the world combined. And despite the recession, Washington is still pouring vast sums into its armed forces.

Autonomous robots are not as distant as we once thought. Already, commercial machines perform simple domestic tasks. For years, they have been a mainstay of assembly plants, where they are programmed to perform repetitive tasks. And as they enter homes to assist the elderly, they are acquiring greater sophistication and complexity.

Currently, static defence is the area that promises the most rapid advances. Soldiers on sentry duty, being human, succumb to boredom and sleep. Robots armed with sensors and weapons are now being tested. Perimeter-defence systems that use automated vehicles to carry out fixed or random patrols are also being designed.

Robotic interceptor aircraft would have the big advantage of not having to carry the expensive and heavy life-support systems needed by human pilots. In addition, they could be made more manoeuvrable, as they would be immune to the high gravitational forces (G-forces) that would render humans unconscious.

All this is not in the realm of science fiction: already, a group called the International Committee for Robot Arms Control has been formed to frame rules and limits to govern the development of robots designed for the battlefield. The ICRAC includes Artificial Intelligence specialists, military officers, lawyers and human rights experts. Whatever the outcome of the deliberations of this body, it seems that the world is likely to witness wars fought between armies of largely autonomous machines, with diminishing human intervention and responsibility. Driven by technological advances as much as by political considerations, these developments will make it possible for advanced countries to wage wars without moral qualms about casualties.

However, asymmetric wars will still have a place in human conflict. As we saw from the deployment of the Stuxnet worm that attacked computers controlling Iranian uranium-enrichment facilities, increasingly networked equipment is vulnerable to cyber-attacks. Here, the playing field is more level, and countries without the money or the technology to develop the next generation of weapons systems can still hack into sophisticated command and control software, and inflict huge damage.

Underwater, dolphins are being trained to detect enemy submarines. In space, nations are prepared to destroy communication satellites. As technology drives the development of ever more lethal weapons, human ethics have not kept pace. The Geneva Conventions that regulate the rules of warfare were last updated in 1949, and are largely silent on the kind of armed conflict we have seen over the last 60 years.

The Art of War

As more and more non-state players wage war against states, and conflicts multiply, it is unlikely that we will be able to control the spread of sophisticated weapons. However, no matter how much weapons change, Sun Tzu's words, written 2,600 years ago in The Art of War, remain relevant:

“For to win 100 victories in 100 battles is not the acme of skill. To subdue the enemy without fighting is the acme of skill.”

irfan.husain@gmail.com
