G.I. robot reporting for duty: robot soldiers are a crucial part of the Army's plans for a 21st-century fighting force. But are we ready to trust machines to make life-and-death decisions on the battlefield?
Robot soldiers are coming.
The Pentagon predicts that within a decade, robots will be hunting and killing enemies in combat. These machines will be crucial to the Army's effort to rebuild itself as a 21st-century fighting force, and a $127 billion project called Future Combat Systems has become the biggest military contract in American history.
Military planners say robot soldiers will increasingly think, see, and react like humans. At first, they will be remote-controlled, looking and acting like lethal toy trucks. As their intelligence grows, so will their autonomy.
The robot soldier has been a dream at the Pentagon for 30 years. And some involved in the work say it may take at least 30 more years to realize in full. But well before then, they say, the military must answer tough moral questions if it intends to trust robots with the responsibility of distinguishing friend from foe, combatant from bystander.
Already, hundreds of robots are digging up bombs in Iraq and scouring caves in Afghanistan. This spring, an armed robot built on a bomb-disposal chassis, capable of firing a thousand rounds a minute, is scheduled for deployment in Baghdad. The robot is the first of its kind to take up a frontline infantry position, ready to kill enemies.
As the first lethal robots begin to arrive in Iraq, the role of the robot soldier as a killing machine has barely been debated.
"I have been asked what happens if the robot destroys a school bus rather than a tank parked nearby," says Gordon Johnson, who leads robotics efforts at the Joint Forces Command. "We will not entrust a robot with that decision until we are confident they can make it."
Pentagon officials say their ultimate goal is combat without casualties; they plan to assign as many dangerous missions as possible to the robots. Saving money is also a major goal: The median lifetime cost to the military of a human soldier is now about $4 million, according to a Pentagon study. Robots could cost less than a tenth of that.
But the history of warfare suggests that every new technological leap, from the longbow to the tank to the atomic bomb, outraces the strategy to control it. "There is a lag between technology and doctrine," says Robert Finkelstein, president of Robotic Technology in Potomac, Md. "If you could invade other countries bloodlessly, would this lead to a greater temptation to invade?"
It will also be a major challenge to build a soldier that looks and acts human, like the I, Robot model imagined by science-fiction author Isaac Asimov and featured in the recent movie of the same name. Bart Everett, of the Space and Naval Warfare Systems Center in San Diego, hopes to create "an android-like robot" that can perform humanlike tasks. A four-foot-tall prototype with a gun for a right arm can take aim and fire at a soda can, performing the basic tasks of hunting and killing.
RULES OF ENGAGEMENT
Decades ago, Asimov posited three rules for robots: Do not hurt humans; obey humans unless that violates Rule 1; defend yourself unless that violates Rules 1 and 2.
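Asimov's rules form a strict priority ordering: each rule yields to the ones above it. A toy sketch in Python (the function and its inputs are hypothetical illustrations, not anything from the Pentagon program) might encode that ordering as:

```python
def permitted(harms_human: bool, is_human_order: bool, risks_robot: bool) -> bool:
    """Decide whether an action is allowed under Asimov's three rules,
    checked in strict priority order."""
    # Rule 1: never hurt humans. This overrides everything below.
    if harms_human:
        return False
    # Rule 2: obey human orders, since Rule 1 is already satisfied.
    if is_human_order:
        return True
    # Rule 3: self-preservation is allowed only when it conflicts with
    # neither of the rules above; an action that endangers the robot
    # for no reason is refused.
    if risks_robot:
        return False
    return True
```

The point of the sketch is the ordering itself: a lawful order to harm a human is refused at Rule 1 before Rule 2 is ever consulted, which is exactly the hierarchy Asimov described.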
Colin M. Angle, chief executive of iRobot, a private company that has sold more than $70 million worth of its Roomba robotic vacuum cleaners, says the calculus of money, morals, and military logic will result in battalions of robots being sent into combat in the near future.
Will the Asimov rules still apply to these robot soldiers? "We are a long ways," says Angle, "from creating a robot that knows what that means."
Tim Weiner is a business reporter for The New York Times.
Publication: New York Times Upfront
Date: May 9, 2005