MnRI Seminar: Zhi Yang
A High-Performance Brain-Machine Interface for Upper Limb Amputees
While prosthetic arms with independently actuated digits have become commercially available, there is a lack of high-performance human-machine interfaces (HMIs) that can decode the brain's complex motor intentions. Current HMIs merely support sequential selection among a small set of preprogrammed hand grasp patterns. As a result, amputee users often report insufficient improvement in their daily activities to find a prosthesis useful.
In this talk, Professor Yang will present an HMI based on work done at the University of Minnesota. A pilot clinical study with upper limb amputees has been carried out, in which the proposed HMI captures weak neural signals from the residual nerves in the amputated limb while decoding algorithms infer the complex human motor intentions. So far, the HMI has survived the body environment for more than a year and has allowed transradial amputees to dexterously mind-control a prosthetic hand with 15 degrees of freedom.
The results suggest that an unprecedented amount of information can be decoded from the nerves, establishing a conduit that connects the brain to machines through the peripheral neural pathways.
About Zhi Yang
Zhi Yang is an Associate Professor of Biomedical Engineering at the University of Minnesota. Professor Yang’s research interests include implantable devices, neuro-artificial intelligence, and neuromodulation therapies. Over the past ten years, he has developed a cortical technology to treat patients with Parkinson’s disease and a peripheral nerve technology to help amputees. His work has been used to establish an advanced brain-machine interface in the DARPA HAPTIX program. He received the NSF CAREER Award in 2019. More recently, he has begun working with medical device companies to commercialize his lab's research.