Robert S. Boyd
McClatchy Newspapers 
Thursday, March 26, 2009
WASHINGTON — The unmanned bombers that frequently cause unintended civilian casualties in Pakistan are a step toward an even more lethal generation of robotic hunter-killers that operate with limited, if any, human control.
The Defense Department is financing studies of autonomous, or self-governing, armed robots that could find and destroy targets on their own. On-board computer programs, not flesh-and-blood people, would decide whether to fire their weapons.
“The trend is clear: Warfare will continue and autonomous robots will ultimately be deployed in its conduct,” Ronald Arkin, a robotics expert at the Georgia Institute of Technology in Atlanta, wrote in a study commissioned by the Army.
“The pressure of an increasing battlefield tempo is forcing autonomy further and further toward the point of robots making that final, lethal decision,” he predicted. “The time available to make the decision to shoot or not to shoot is becoming too short for remote humans to make intelligent informed decisions.”
Autonomous armed robotic systems probably will be operating by 2020, according to John Pike, an expert on defense and intelligence matters and the director of the security Web site GlobalSecurity.org in Washington.
This prospect alarms experts, who fear that machines will be unable to distinguish between legitimate targets and civilians in a war zone.
“We are sleepwalking into a brave new world where robots decide who, where and when to kill,” said Noel Sharkey, an expert on robotics and artificial intelligence at the University of Sheffield in England.