
A Nimbler Mind-Controlled Robotic Hand

This is an image showing one of four new hand movements from the 10D control of the robotic arm. Photo courtesy of Journal of Neural Engineering/IOP Publishing.

The quadriplegic woman who can control a robotic arm by thinking about the action can now perform far more complicated movements. According to the researchers at the University of Pittsburgh involved in the study, the woman has gone from giving high fives to giving the thumbs-up sign.

This represents a significant improvement: from seven dimensions of control to ten in two years. The additional dimensions involve finger abduction, a scoop, thumb extension and pinching. This allows the woman, Jan Scheuermann, to perform a wider range of activities and make more precise movements. A study outlining the improvements was published recently in the Journal of Neural Engineering.

According to the study, the “results show that individual motor cortical neurons encode many parameters of movement.” The study also found that objects, even virtual ones, matter. Calibration of the prosthetic hand, which was done using virtual reality software, improved if the hand closed around a virtual object.

“We hope to repeat this level of control with additional participants and to make the system more robust, so that people who might benefit from it will one day be able to use brain-machine interfaces in daily life,” said study co-author Dr. Jennifer Collinger. She says they also plan to study whether haptic feedback to the user can further improve control of the prosthetic.
