Controlling your own limbs.
Cornell University researcher Maryam Shanechi, PhD, assistant professor of electrical and computer engineering, working with Ziv Williams, MD, assistant professor of neurosurgery at Harvard Medical School, is taking brain-machine interfaces to the next level. Instead of using brain signals to direct an external device, Shanechi hopes to help people who are paralyzed move their own limbs just by thinking about it.
The Power of Thought
When paralyzed patients plan a movement, neurons in the brain's motor cortical areas still activate even though the communication link between the brain and muscles is broken.
Sensors implanted in these brain areas can record that neural activity, which a mathematical transformation called a decoder translates into the patient's desired movement. Such interfaces allow patients to generate movements directly with their thoughts.
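The idea of a decoder can be illustrated with a minimal sketch. The weights, signal model, and linear form below are hypothetical stand-ins, not the algorithms used in the study, which relied on more sophisticated real-time decoding.

```python
import numpy as np

rng = np.random.default_rng(0)

n_neurons = 16
# Hypothetical decoding weights; in practice these would be fit from
# calibration data relating recorded activity to observed movements.
W = rng.normal(size=(2, n_neurons))

def decode(firing_rates):
    """Map a vector of recorded firing rates to a 2-D movement estimate."""
    return W @ firing_rates

rates = rng.poisson(lam=5.0, size=n_neurons)  # simulated spike counts
velocity = decode(rates)
print(velocity.shape)  # (2,)
```

Real decoders are trained on recorded data and run continuously, but the core mapping from neural activity to intended movement has this shape.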
In a paper published online Feb. 18 in Nature Communications (nature.com/ncomms), Shanechi, Williams and colleagues describe a cortical spinal prosthesis that directs "targeted limb movement" in paralyzed primates.
The research team developed and tested a prosthesis that connects two primates, enabling one to use its recorded neural activity to control the limb of a second, temporarily sedated animal. The demonstration is a step toward brain-machine interfaces that would let paralyzed humans control their own limbs using their brain activity alone.
The brain-machine interface is based on a set of real-time decoding algorithms that process neural signals to predict the intended, targeted movement.
In the experiment, one animal acted as the controller of the movement: it "decided" which target location to move to and generated the neural activity that was decoded into this intended movement. The decoded movement was then used to control the other animal's limb directly by electrically stimulating its spinal cord.
"The problem here is not only that of decoding the recorded neural activity into the intended movement, but also that of properly stimulating the spinal cord to move the paralyzed limb according to the decoded movement," Shanechi says.
The scientists focused on decoding the target endpoint of the movement as opposed to its detailed kinematics (motion). This allowed them to match the decoded target with a set of spinal stimulation parameters that generated limb movement toward that target.
They demonstrated that the alert animal could produce two-dimensional movement in the sedated animal's limb, a breakthrough in brain-machine interface research.
"By focusing on the target endpoint of movement as opposed to its detailed kinematics, we could reduce the complexity of solving for the appropriate spinal stimulation parameters, which helped us achieve this 2-D movement," Williams says.
More Research Needed
Part of the experimental setup's novelty was using two different animals, rather than just one with a temporarily paralyzed limb.
That way, the scientists contend, they had a true model of paralysis: the master animal's brain and the sedated animal's limb shared no physiological connection, just as in a paralyzed patient.
Shanechi's lab will continue developing more sophisticated brain-machine interface architectures with principled algorithmic designs and use them to construct high-performance prosthetics. These architectures could be used to control an external device or the native limb.
"The next step is to advance the development of brain-machine interface algorithms using the principles of control theory and statistical signal processing," Shanechi says. "Such brain-machine interface architectures could enable patients to generate complex movements using robotic arms or paralyzed limbs."
For more information, visit news.cornell.edu or hms.harvard.edu.
ANNE JUE/CORNELL UNIVERSITY