
Robots: hammers or horses?

Is an autonomous robot more like a hammer or a horse?

The question sounds like the set-up to a joke. Yet the men and women arguing were not comedians. They were lawyers and technologists.

The venue was University of Miami School of Law's We Robot conference in April, the first meeting ever devoted to the legal and policy implications of autonomous robots.

According to conference organizer A. Michael Froomkin, autonomous robots are a transformative technology fast approaching takeoff.

How fast? When the Defense Advanced Research Projects Agency held its first driverless car grand challenge in 2004, the smartest vehicle traveled just seven miles along the 150-mile-long course.

Eight years later, Nevada licensed Google's driverless car to operate on state roads. And the Google car is not an outlier. Autonomous robots have begun to work their way into factories, warehouses, hospitals, and war.

Froomkin has been here before. He got involved in Internet law in the early 1990s. "That was early for most people, but the key standards had already been deployed," he told the conference.

"Some early design choices involved the domain system, privacy, and security issues. We could have avoided a significant fraction of later problems if the engineers who made technology choices had thought about these issues."

Froomkin wants lawyers and policy makers to scout out the broader issues in robotics now, before engineers set standards. That involves defining how autonomous robots differ from other tools, and drawing analogies that illuminate how the law should think about them.

In other words, are they more like hammers or horses?

The question touches on a much broader concept: liability. If an autonomous robot is like a hammer, the person wielding it is responsible for any damage it does. If it is like a horse, capable of independent action, there are limits on liability. After all, the owner is not always liable if a well-trained horse bolts and injures someone during a storm.

William Smart, co-director of the engineering master's program in robotics at Washington University, believes today's robots are like hammers. "People build and program robots," he said.

Yet this is bound to change. Humans, Smart explained, have agency. They act independently to accomplish their ends. Tools do not.

Robots lie somewhere between the two. "They are tools so sophisticated, they seem to have agency," he said.

Smart thinks robots will grow less hammer-like as they become more sophisticated.

"It's a hammer if you can tell me what's going to happen based on the input," Smart said. That is already not the case with autonomous robots.

"It is possible that two people with the same robot can give it the same instructions in different environments, and the robots will act differently," he said.

In fact, a single autonomous robot in the same environment may behave differently at different times. As one member of the audience noted, sensors are never 100 percent accurate, so robots always act on imperfect information. Moreover, the next generation of robots will learn from experience and reprogram themselves.

The result is a robot that is deterministic in some ways and not in others. "You can tell a robot to go to another room, and that's deterministic. But you cannot predict what path it will take," Smart said. In other words, it will begin to act more like a horse than a hammer.
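A minimal sketch, not from the article, of the distinction Smart draws: the instruction is deterministic (reach a fixed goal), but each step depends on a noisy sensor reading, so the same robot given the same order takes a different path every run. The corridor scenario and the go_to_other_room function are illustrative assumptions, not anything presented at the conference.

```python
import random

def go_to_other_room(goal=10, sensor_noise=0.3, seed=None):
    """Walk a 1-D corridor toward `goal`, backing up whenever the sensor
    (wrongly, at random) reports an obstacle ahead."""
    rng = random.Random(seed)
    position = 0
    steps = 0
    while position != goal:
        if rng.random() < sensor_noise:   # noisy sensor: phantom obstacle
            position -= 1                 # back up and try again
        else:
            position += 1                 # move toward the goal
        steps += 1
    return steps

# Same robot, same instruction, same starting point -- a different path each run.
for trial in range(3):
    print("trial", trial, "took", go_to_other_room(), "steps")
```

The destination never changes, but the number of steps does, which is exactly why "tell me what's going to happen based on the input" stops being possible.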

Which brings us back to liability law. After all, if a knife-wielding, sandwich-making robot suddenly malfunctions, who is responsible?

Is it the person who ordered it to pick up the knife? The person who sold it? The company that programmed it? The team that developed its AI system? The half-dozen companies that made its sensors? The dozens of people who contributed to its open-source operating system?

In liability suits, lawyers typically go after the parties with the deepest pockets. But many smaller firms and even independent researchers could become collateral damage as lawsuits move through the courts.

By tackling whether robots are more like hammers or horses, Froomkin hopes to provide legal guidance for robotics researchers.

The questions posed by robots are complex, and likely to evolve as rapidly as the robots themselves. It is a good thing Froomkin's fellow lawyers are addressing these issues now, rather than waiting for the first robotic sandwich maker to fail.

Alan S. Brown is an associate editor of Mechanical Engineering.
