First-Hand Experience: Deep Learning Lets Amputee Control Prosthetic Hand, Video Games

Using GPUs, a group of researchers trained an AI neural decoder able to run on a compact, power-efficient NVIDIA Jetson Nano system on module (SOM) to translate 46-year-old Shawn Findley's thoughts into individual finger motions.
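The paper itself details the decoder's design; as a rough illustration of the idea, a trained neural decoder takes a window of recorded nerve-signal samples and maps it to per-finger commands. The sketch below is hypothetical: the layer sizes, weights, and function names are illustrative placeholders, not the authors' actual model.

```python
import numpy as np

def decode_window(weights, biases, signal_window):
    """Run a tiny feed-forward decoder over one window of nerve-signal samples.

    signal_window: flattened array of recent electrode samples.
    Returns one activation per finger (5 outputs), e.g. flexion commands.
    """
    x = signal_window
    for w, b in zip(weights[:-1], biases[:-1]):
        x = np.maximum(w @ x + b, 0.0)      # hidden layers with ReLU
    return weights[-1] @ x + biases[-1]      # linear output: 5 finger values

# Toy parameters: 64 input samples -> 16 hidden units -> 5 fingers.
rng = np.random.default_rng(0)
weights = [rng.standard_normal((16, 64)) * 0.1,
           rng.standard_normal((5, 16)) * 0.1]
biases = [np.zeros(16), np.zeros(5)]

commands = decode_window(weights, biases, rng.standard_normal(64))
print(commands.shape)  # one command value per finger
```

In the real system, this inference loop would run continuously on the Jetson Nano, turning each new window of nerve data into prosthetic finger motions in real time.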

And if that breakthrough weren't enough, the team then connected Findley to a PC running Far Cry 5 and Raiden IV, where he made his game avatar move, jump, and even fly a virtual helicopter using only his mind.

It's a demonstration that not only promises to give amputees more natural and responsive control over their prosthetics, but could one day give users almost superhuman capabilities.

The effort is detailed in a pre-print paper titled 'A Portable, Self-Contained Neuroprosthetic Hand with Deep Learning-Based Finger Control,' which describes an extraordinary cross-disciplinary collaboration behind a system that, in effect, lets humans control just about anything digital with their thoughts.

'The idea is intuitive to video gamers,' said Anh Tuan Nguyen, the paper's lead author and now a postdoctoral researcher at the University of Minnesota advised by Associate Professor Zhi Yang.

Read the full article on Market Screener.
