Pentagon exploring robot killers that can fire on their own
by Robert S. Boyd, McClatchy Newspapers | Thursday, Mar 26, 2009, 10:37am

WASHINGTON — The unmanned bombers that frequently cause unintended civilian casualties in Pakistan are a step toward an even more lethal generation of robotic hunter-killers that operate with little, if any, human control.


The Defense Department is financing studies of autonomous, or self-governing, armed robots that could find and destroy targets on their own. On-board computer programs, not flesh-and-blood people, would decide whether to fire their weapons.

"The trend is clear: Warfare will continue and autonomous robots will ultimately be deployed in its conduct," Ronald Arkin, a robotics expert at the Georgia Institute of Technology in Atlanta, wrote in a study commissioned by the Army.

"The pressure of an increasing battlefield tempo is forcing autonomy further and further toward the point of robots making that final, lethal decision," he predicted. "The time available to make the decision to shoot or not to shoot is becoming too short for remote humans to make intelligent informed decisions."

Autonomous armed robotic systems probably will be operating by 2020, according to John Pike, an expert on defense and intelligence matters and the director of the security Web site GlobalSecurity.org in Washington.

This prospect alarms experts, who fear that machines will be unable to distinguish between legitimate targets and civilians in a war zone.

"We are sleepwalking into a brave new world where robots decide who, where and when to kill," said Noel Sharkey, an expert on robotics and artificial intelligence at the University of Sheffield, England.

Human operators thousands of miles away in Nevada, using satellite communications, control the current generation of missile-firing robotic aircraft, known as Predators and Reapers. Armed ground robots, such as the Army's Modular Advanced Armed Robotic System, also require a human decision-maker before they shoot.

As of now, about 5,000 lethal and nonlethal robots are deployed in Iraq and Afghanistan. Besides targeting Taliban and al Qaida leaders, they perform surveillance, disarm roadside bombs, ferry supplies and carry out other military tasks. So far, none of these machines is autonomous; all are under human control.

The Pentagon's plans for its Future Combat Systems program envision increasing levels of independence for its robots.

"Fully autonomous engagement without human intervention should also be considered, under user-defined conditions," said a 2007 Army request for proposals to design future robots.

For example, the Pentagon says that air-to-air combat may happen too fast to allow a remote controller to fire an unmanned aircraft's weapons.

"There is really no way that a system that is remotely controlled can effectively operate in an offensive or defensive air-combat environment," Dyke Weatherington, the deputy director of the Pentagon's unmanned aerial systems task force, told a news conference on Dec. 18, 2007. "The requirement for that is a fully autonomous system," he said. "That will take many years to get to."

Many Navy warships carry the autonomous, rapid-fire Phalanx system, which is designed to shoot down enemy missiles or aircraft that have penetrated outer defenses, without waiting for a human decision-maker.

At Georgia Tech, Arkin is finishing a three-year Army contract to find ways to ensure that robots are used in appropriate ways. His idea is an "ethical governor" computer system that would require robots to obey the internationally recognized laws of war and the U.S. military's rules of engagement.

"Robots must be constrained to adhere to the same laws as humans or they should not be permitted on the battlefield," Arkin wrote.

For example, a robot's computer "brain" would block it from aiming a missile at a hospital, church, cemetery or cultural landmark, even if enemy forces were clustered nearby. The presence of women or children also would trigger a robotic veto.
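Arkin's published work describes the governor conceptually rather than as code. Purely as an illustrative sketch, such a veto layer might amount to a set of hard constraints checked before any firing decision; every name and rule below is hypothetical, not taken from his system.

# Hypothetical sketch of a rule-based "ethical governor" veto check.
# All identifiers here are invented for illustration; they do not
# come from Arkin's actual system.

from dataclasses import dataclass

PROTECTED_SITES = {"hospital", "church", "cemetery", "cultural_landmark"}

@dataclass
class Target:
    classification: str          # e.g. "enemy_combatant", "unknown"
    nearby_site: str | None      # protected site within the weapon's effect radius, if any
    noncombatants_present: bool  # women, children or other civilians detected

def governor_permits_fire(target: Target) -> bool:
    """Permit firing only if every constraint derived from the laws of
    war and the rules of engagement is satisfied; otherwise veto."""
    if target.classification != "enemy_combatant":
        return False  # never fire on unidentified or civilian targets
    if target.nearby_site in PROTECTED_SITES:
        return False  # protected structure too close to the aim point
    if target.noncombatants_present:
        return False  # civilians present: hold fire
    return True

# Enemy forces clustered near a hospital are still off-limits.
print(governor_permits_fire(
    Target("enemy_combatant", nearby_site="hospital",
           noncombatants_present=False)))  # -> False

In this toy version the veto is absolute, which matches the article's examples; a real system would also have to solve the far harder problem, raised by Sharkey below, of reliably producing those classifications in the first place.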

Arkin contends that a properly designed robot could behave with greater restraint than human soldiers in the heat of battle and cause fewer casualties.

"Robots can be built that do not exhibit fear, anger, frustration or revenge, and that ultimately behave in a more humane manner than even human beings in these harsh circumstances," he wrote.

Sharkey, the British critic of autonomous armed robots, said that Arkin's ethical governor was "a good idea in principle. Unfortunately, it's doomed to failure at present because no robots or AI (artificial intelligence) systems could discriminate between a combatant and an innocent. That sensing ability just does not exist."

Selmer Bringsjord, an artificial intelligence expert at Rensselaer Polytechnic Institute in Troy, N.Y., is worried, too.

"I'm concerned. The stakes are very high," Bringsjord said. "If we give robots the power to do nasty things, we have to use logic to teach them not to do unethical things. If we can't figure this out, we shouldn't build any of these robots."

© 2009 McClatchy Newspapers

PDF: Study claiming robots will be operating in the field by 2020

COMMENTS



Military killer robots 'could endanger civilians'
by staff report, Telegraph UK | Monday, Aug 3, 2009, 10:19am

Action on a global scale must be taken to curb the development of military killer robots that think for themselves, a leading British expert said.

"Terminator"-style machines that decide how, when and who to kill are just around the corner, warns Noel Sharkey, Professor of Artificial Intelligence and Robotics at the University of Sheffield.

Far from helping to reduce casualties, their use is likely to make conflict and war more common and lead to a major escalation in numbers of civilian deaths, he believes.

"I do think there should be some international discussion and arms control on these weapons but there's absolutely none," said Prof Sharkey.

"The military has a strange view of artificial intelligence based on science fiction. The nub of it is that robots do not have the necessary discriminatory ability. They can't distinguish between combatants and civilians. It's hard enough for soldiers to do that."

Iraq and Afghanistan have both provided ideal "showcases" for robot weapons, said Prof Sharkey.

The "War on Terror" declared by President George Bush spurred on the development of pilotless drone aircraft deployed against insurgents.

Initially used for surveillance, drones such as the Predator and larger Reaper are now armed with bombs and missiles.

The US currently has 200 Predators and 30 Reapers and next year alone will be spending 5.5 billion dollars (£3.29 billion) on unmanned combat vehicles.

Britain had two Predators until one crashed in Iraq last year.

At present these weapons are still operated remotely by humans sitting in front of computer screens. RAF pilots on secondment were among the more experienced controllers used by the US military, while others had only six weeks' training, said Prof Sharkey. "If you're good at computer games, you're in," he added.

But rapid progress was being made towards robots which took virtually all their own decisions and were merely "supervised" by humans.

These would be fully autonomous killing machines reminiscent of those depicted in the "Terminator" films.

"The next thing that's coming, and this is what really scares me, are armed autonomous robots," said Prof Sharkey speaking to journalists in London. "The robot will do the killing itself. This will make decision making faster and allow one person to control many robots. A single soldier could initiate a large scale attack from the air and the ground.

"It could happen now; the technology's there."

A step on the way had already been taken by Israel with "Harpy", a pilotless aircraft that flies around searching for an enemy radar signal. When it thinks one has been located and identified as hostile, the drone turns into a homing missile and launches an attack - all without human intervention.

Last year the British aerospace company BAe Systems completed a flying trial with a group of drones that could communicate with each other and select their own targets, said Prof Sharkey. The United States Air Force was looking at the concept of "swarm technology", which involved multiple drone aircraft operating together.

Flying drones were swiftly being joined by armed robot ground vehicles, such as the Talon Sword, which bristles with machine guns, grenade launchers, and anti-tank missiles.

However, it was likely to be decades before such robots possessed a human-like ability to tell friend from foe.

Even with human controllers, drones were already stacking up large numbers of civilian casualties.

As a result of 60 known drone attacks in Pakistan between January 2006 and April 2009, 14 al Qaida leaders had been killed, along with 607 civilians (more than 40 civilians for each leader), said Prof Sharkey.

The US was paying teenagers "thousands of dollars" to drop infrared tags at the homes of al Qaida suspects so that Predator drones could aim their weapons at them, he added. But often the tags were thrown down randomly, marking out completely innocent civilians for attack.

Prof Sharkey, who insists he is "not a pacifist" and has no anti-war agenda, said: "If we keep on using robot weapons we're going to put civilians at grave risk and it's going to be much easier to start wars. The main inhibitor of wars is body bags coming home.

"People talk about programming the 'laws of war' into a computer to give robots a conscience, so that if the target is a civilian you don't shoot. But for a robot to recognise a civilian you need an exact specification, and one of the problems is there's no specific definition of a civilian. Soldiers have to rely on common sense.

"I'm not saying it will never happen, but I know what's out there and it's not going to happen for a long time."

Matthew Knowles, from the aerospace, defence and security trade association SBAC, said: "Scare stories such as this are not helpful contributions to what is an important debate. The convention is that any decision to take a life using unmanned vehicles, which is of course a very serious choice to make, is carried out by a properly trained military human operator."

© 2009 Telegraph Media Group Limited


 
 

© 2005-2024 Cleaves Alternative News.
Unless otherwise stated by the author, all content is free for non-commercial re-use, reprint, and rebroadcast, on the net and elsewhere.
Opinions are those of the contributors and are not necessarily endorsed by Cleaves Alternative News.