I just read an excellent article at io9.com about the future of robotic warfare – especially the autonomous kind. Believe it or not, the technology exists and is being used right now.
Autonomous killing machines aren’t anything new. We already have various levels of autonomy in a number of weapons systems, including cruise and Patriot missiles. The Aegis Combat System, found aboard naval ships, has an autonomous mode in which it uses powerful computers and radars to track and guide weapons to destroy enemy targets.
But these are largely defensive systems with limited intelligence. They typically find their way to a target, or take certain actions without human oversight – but only after being launched or triggered by a human.
As time passes, however, these systems are getting more sophisticated, and their potential for increased autonomy is growing. Take Samsung Techwin’s remote-operated sentry bot, for example, which works in tandem with cameras and radar systems. Deployed in the Korean DMZ, the device can detect intruders with heat and motion sensors and confront them with audio and video communications. It can also fire on its targets with machine guns and grenade launchers. As of right now, the robot cannot automatically fire on targets, requiring human permission to attack. But a simple change to engagement policy could override all that.
Another example is the PackBot used by the U.S. military. These devices have an attachment called REDOWL, which uses sensors and software to detect the location of a sniper. Once it detects the threat, it shines a red laser light on it, indicating its presence to human soldiers, who can then choose to take it out. It wouldn’t take much to modify this system so that the REDOWL could act on its own – and with its own weapons.
Beyond the obvious reasons this technology is terrifying, there are others – namely how cheap and easy it is for anyone to use. This isn’t like the Cold War, where a significant technological barrier stood in the way of building nuclear weapons – anyone with basic know-how can weaponize a robot. For example, right now I can go into most hobby shops and buy a robot kit for my son.
Here’s more from io9.com:
Wallach told io9 that these systems aren’t very complicated and that virtually any country has the potential to develop their own versions. “The larger question,” asks Wallach, “is whether or not the U.S. military is producing such weapons – and other countries.” He suspects that there are more than 40 countries now involved in developing unmanned vehicle programs similar to the ones deployed by the United States, including drones.
Complicating the issue are ever-increasing levels of autonomy in military machines. The U.S. Air Force is starting to change the language surrounding its engagements, shifting from humans “in the loop” to humans “on the loop” to describe the level of future human involvement. By being “on the loop”, humans are largely outside of the process, but can intervene if the weapons system is about to do something inappropriate. The trouble, says Wallach, is that the speed of modern warfare may preclude human involvement. “It’s dubious to think that a human can always react in time,” he says.
And take REDOWL, for example. Once the system points out an enemy sniper, the question emerges: Who is in whose loop? Is the soldier in the REDOWL’s loop, or vice-versa?
“A common concern among some military pundits is that it lowers the barriers to starting new wars,” says Wallach, “that it presents the illusion of a quick victory and without much loss of force – particularly human losses.” It’s also feared that these machines would escalate ongoing conflicts and use indiscriminate force in the absence of human review. There’s also the potential for devastating friendly fire.
And once developed, the systems are likely to proliferate widely. The fear is that their presence would introduce a serious, unpredictable element in future conflicts. Just because, say, the United States adheres to international laws and restraints doesn’t mean that other state actors and interests will, too. It could very well instigate an arms race.
What is also missing from the discussion is the question of who the new casualties are in this kind of warfare. How do you win a war in which both sides are hiding in bunkers and controlling – or NOT controlling – autonomous robots? How do you determine the winner? By innocent civilian casualties?
Wallach is alarmed at how little this issue is being discussed, which is something that he’s hoping to change. “There are various policy makers, military thinkers, and academics who suggest that autonomous killing machines are science fiction and that no one is moving in that direction,” he notes. Wallach cites the work of Werner Dahm, chief scientist of the Air Force, who he feels is not taking the issue seriously enough – and even potentially downplaying the threat.
Quite understandably, some military thinkers see the tremendous advantage that these systems could bring. Unmanned smart weapons could increase capabilities, reduce collateral damage through greater precision, decrease loss of personnel, lower manpower costs, and enable the projection of lethal force in a future where manpower resources will be far more limited.
And for better or worse, these rationales point to a future in which wars are fought by robots pitted against each other. “This is not just the concern of futurists or nay-sayers,” says Wallach, “but also of both retired generals and active military leaders who are very concerned that this could lead to a robust lack of control and undermine the human levels of engagement.”
And we haven’t even discussed the potential for these systems to be hacked. Remember, this is no longer sci-fi. It’s happening right now.
In his conversation with io9, Wallach seemed frustrated that some people see this issue as something that’s too futuristic to care about. “We’re at a potential inflection point in the development of autonomous weaponry,” he said. “That inflection point won’t last for a long period of time, and if we wait too long, other vested interests will take over that prospect.”
For the full article go here.