Underwater robots get touchy-feely.

Remotely controlled robots can go where humans cannot. Underwater, they can clear biohazards, disarm mines, help in the exploration of gas and oil, or conduct environmentally sensitive research.

WHETHER IT IS DISASSEMBLING A WEAPON or muscling closed a valve, these jobs are all things people would do with their hands. Now a University of Washington startup aims to give operators of underwater robots this same sense of touch.

The company, BluHaptics, was founded by three UW professors, electrical engineer Howard Chizeck, commercialization fellow and research associate Fredrik Ryden, and applied physicist Andy Stewart.

BluHaptics wants to build robots that can do complex tasks more naturally. As in all haptics systems, sensors in the robot's grippers and effectors deliver force feedback through the handle of a controller. This sensory information can help operators avoid obstacles and guide the robot through a task.

"The sense of human touch is key to human dexterity," Ryden said. "If you don't think so, just look at what happens when people lacking a sense of touch try to tie their shoe laces."

A control system that anticipates contact lies at the heart of the robot's haptic capabilities. As the robot closes in on a target, non-contact sensors, such as lasers and sonar, ping the structures around them. This creates a cloud of points that the software fuses into a video map of the environment. Users can rotate and manipulate the map, like a 3-D CAD image, to plot how they will navigate the robot to the target.
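The fused map described above is, at its core, a set of 3-D points that the operator can rotate like a CAD model. A minimal sketch of that idea (the sensor values and function names here are hypothetical, not BluHaptics' software):

```python
import numpy as np

def rotate_cloud(points, yaw):
    """Rotate an N x 3 point cloud about the z-axis by `yaw` radians,
    as when an operator spins the fused sensor map to plan a path."""
    c, s = np.cos(yaw), np.sin(yaw)
    R = np.array([[c, -s, 0.0],
                  [s,  c, 0.0],
                  [0.0, 0.0, 1.0]])
    return points @ R.T  # apply R to each row (point)

# Fuse returns from two hypothetical sensors into one cloud.
laser_pts = np.array([[1.0, 0.0, 0.0], [2.0, 0.0, 1.0]])
sonar_pts = np.array([[0.0, 1.0, -1.0]])
cloud = np.vstack([laser_pts, sonar_pts])

# Rotate the whole map 90 degrees for a new viewing angle.
view = rotate_cloud(cloud, np.pi / 2)
```

In practice the sensor streams would first be registered into a common coordinate frame; the stacking here simply stands in for that fusion step.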

They can also surround objects with 'virtual fixtures,' essentially force fields that constrain spatial motion. The simplest virtual fixture acts like a virtual straightedge, helping operators move the robot in a straight line. When the operator deviates from this path, the fixture sends force feedback to the joystick or haptic controller to resist the motion and guide the robotic arm back into the proper alignment.
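The straightedge fixture can be sketched as a simple spring law: any deviation perpendicular to the desired line produces a restoring force on the controller. This is an illustrative sketch, with assumed names and gains, not the company's actual control law:

```python
import numpy as np

def guidance_force(pos, line_point, line_dir, stiffness=50.0):
    """Spring-like force pulling the tool back toward a straight path.
    `line_dir` must be a unit vector; force is proportional to the
    perpendicular deviation from the line (a proportional fixture)."""
    offset = pos - line_point
    along = np.dot(offset, line_dir) * line_dir   # component on the path
    deviation = offset - along                    # perpendicular error
    return -stiffness * deviation                 # push back toward the line

# A tool 0.1 m off a path running along x feels a force back in -y.
f = guidance_force(np.array([0.5, 0.1, 0.0]),
                   np.array([0.0, 0.0, 0.0]),
                   np.array([1.0, 0.0, 0.0]))
# f points in -y with magnitude stiffness * 0.1
```

Motion along the line produces no force at all, which is what makes the fixture feel like sliding along a ruler rather than being dragged to a point.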

Virtual fixtures can also prevent operators from crashing robots into objects they want to avoid. This helps them navigate the robot through a complex environment, such as the pipelines under an offshore platform, while minimizing unintended damage. Like their physical counterparts, virtual fixtures could guide a robotic welder to a subsea pipe while keeping it away from sensitive valves.
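A keep-away fixture works the other way around: instead of pulling the tool onto a path, it pushes back once the robot enters a safety zone around an object. Again a hedged sketch with assumed parameters, not BluHaptics' implementation:

```python
import numpy as np

def repulsive_force(pos, obstacle, radius=0.5, gain=20.0):
    """Forbidden-region fixture sketch: zero force outside a safety
    radius around `obstacle`; inside it, a force that grows linearly
    with penetration pushes the tool straight back out."""
    d_vec = pos - obstacle
    d = np.linalg.norm(d_vec)
    if d >= radius or d == 0.0:
        return np.zeros(3)
    return gain * (radius - d) * (d_vec / d)

# 0.3 m from a valve with a 0.5 m safety zone: pushed away along +x.
f_near = repulsive_force(np.array([0.3, 0.0, 0.0]), np.zeros(3))
# Well outside the zone: no force at all.
f_far = repulsive_force(np.array([2.0, 0.0, 0.0]), np.zeros(3))
```

Rendering this force through the haptic controller lets the operator feel the valve's safety zone as a soft wall before any contact occurs.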

Ryden and Chizeck did not set out to build underwater robots. Instead, in 2010, they began working on a haptics system for a surgical robot under development in Chizeck's Biorobotics Lab. They saw that they could apply many of their surgical innovations to underwater robotic control.

BluHaptics now seeks to build on its first robot, which combines its proprietary control technology with an out-of-the-box robotic manipulator. For example, it is working with companies that already deploy subsea robots to build templates of common underwater tasks into its software. This will make it easier for less experienced operators to manage the robots effectively.

Ryden is also building a new interface that fuses data from several sensors into a single head-mounted display. It will enable robot pilots and supervising engineers to collaborate more closely on complex subsea tasks, and put their new sense of touch to better use.


WHAT IT IS: Haptic-feedback system for underwater robots.

DEVELOPER: BluHaptics.

HOW IT WORKS: Laser and sonar data create a virtual force field to help the operator stay away from critical areas.
COPYRIGHT 2014 American Society of Mechanical Engineers

Article Details
Title Annotation: TECH BUZZ
Author: Pero, James
Publication: Mechanical Engineering-CIME
Date: Aug 1, 2014