
United States: Study Puts More Natural Movement for Artificial Limbs within Reach.

In new research that brings natural movement by artificial limbs closer to reality, UC San Francisco scientists have shown that monkeys can learn simple brain-stimulation patterns that represent their hand and arm position, and can then make use of this information to precisely execute reaching maneuvers.

Goal-directed arm movements involving multiple joints, such as those we employ to extend and flex the arm and hand to pick up a coffee cup, are guided both by vision and by proprioception, the sensory feedback system that provides information on the body's overall position in three-dimensional space. Previous research has shown that movement is impaired when either of these sources of information is compromised.

The most sophisticated artificial limbs, which are controlled via brain-machine interfaces (BMIs) that transmit neural commands to robotic mechanisms, rely on users' visual guidance and do not yet incorporate proprioceptive feedback. "These devices, though impressive, lack the fluidity and accuracy of skilled natural reaching movements," said Philip Sabes, PhD, senior author of the new study, published November 24, 2014, in the Advance Online Edition of Nature Neuroscience.

"State-of-the-art BMIs generate movements that are slow and labored; they veer around a lot, with many corrections," said Sabes, whose research to improve prosthetics has been funded by the REPAIR (Reorganization and Plasticity to Accelerate Injury Recovery) initiative of the Defense Advanced Research Projects Agency (DARPA). "Achieving smooth, purposeful movements will require proprioceptive feedback."

Many scientists have believed that solving this problem requires a biomimetic approach: understanding the neural signals normally employed by the body's proprioceptive systems, and replicating them through electrical stimulation. But theoretical work by Sabes' group over the past several years has suggested that the brain's robust learning capacity might allow for a simpler strategy.

"The brain is remarkably good at looking for temporal coincidence, things that change together, and using that as a clue that those things belong together," said Sabes. "So we've predicted that you could deliver information to the brain that's entirely novel, and the brain would learn to figure it out if it changes moment-by-moment in tandem with something it knows a lot about, such as visual cues."

In the new research, conducted in the Sabes laboratory by former graduate student Maria C. Dadarlat, PhD, and postdoctoral fellow Joseph E. O'Doherty, PhD, monkeys were taught to reach toward a target, but their reaching arm and the target were obscured by a tabletop.
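The "temporal coincidence" principle Sabes describes can be illustrated with a toy calibration problem (this is a hedged sketch, not the study's actual method or data): a hypothetical one-dimensional stimulation signal co-varies with a visually observed hand position, and a simple least-squares fit over their shared history lets the position later be decoded from the stimulation signal alone. All variable names and parameters below are invented for illustration.

```python
import random

random.seed(0)

# "Known" signal: hand position over time, assumed available from vision.
positions = [random.uniform(-1.0, 1.0) for _ in range(500)]

# "Novel" signal: a stimulation intensity that co-varies with position.
# The true linear relationship (a_true, b_true) is unknown to the learner.
a_true, b_true = 2.0, 0.5
stims = [a_true * x + b_true + random.gauss(0.0, 0.05) for x in positions]

# Learning by temporal coincidence: fit stim ≈ a*x + b by least squares
# over the moments when both signals were observed together.
n = len(positions)
mean_x = sum(positions) / n
mean_s = sum(stims) / n
cov = sum((x - mean_x) * (s - mean_s) for x, s in zip(positions, stims)) / n
var = sum((x - mean_x) ** 2 for x in positions) / n
a_hat = cov / var
b_hat = mean_s - a_hat * mean_x

def decode(stim):
    """Recover an estimated hand position from the novel signal alone."""
    return (stim - b_hat) / a_hat

# Mean absolute decoding error on the observed history.
err = sum(abs(decode(s) - x) for x, s in zip(positions, stims)) / n
```

After the co-variation is learned, `decode` needs only the novel signal, mirroring the idea that the brain could eventually use stimulation patterns as a stand-in for proprioceptive feedback.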

© 2014 Al Bawaba (Provided by SyndiGate Media Inc.).


Article Details
Publication: Mena Report
Article Type: Report
Date: Nov 25, 2014
