HIGHLAND HEIGHTS, Ky. — Inside a lab at Northern Kentucky University, students and professors from multiple disciplines are working on a project to make life easier, faster and less intrusive for arm amputees.
The robotic arm is 3D-printed and controlled using WiFi and a user's brain waves.
"Our solution was to develop a hybrid brain-computer interface and a mind-control prosthetic arm," said Mahdi Yazdanpour, assistant professor of mechatronics engineering technology.
The Amputee Coalition reports that 185,000 new amputations occur in the U.S. each year, and current commercially available robotic arms cost tens of thousands of dollars.
Yazdanpour saw an opportunity for a better solution. He has been working on the project for almost three years. His team received a National Institutes of Health research grant and started designing what he calls the new-generation prosthetic arm.
"Our approach is completely noninvasive, so we don't need any medical procedures or any surgery," he said.
He said many robotic arms rely on electromyography-based (EMG) systems that require implanted sensors to operate. The NKU team instead works with electroencephalography (EEG), which reads brain activity noninvasively from the scalp. Just thinking about making a fist or opening the hand can make the bionic arm move that way.
"Brain-computer interface means you can have connection between the human brain and any machines or devices, any electromechanical devices," Yazdanpour said. "It's a very cutting-edge research topic."
Assistant professor of data science Yangyang Tao is part of the project, too. He brings artificial intelligence and computing ideas to the team.
Right now, the user wears an EEG cap connected to a computer, which captures the brain signals and decodes them. Once the system has learned the user's signals, the user, still wearing the cap, simply thinks about the hand's motion to trigger the mechanics inside.
"If you're thinking about like open and close arm, the arms can move in real-time very fast just like you're controlling your real arm," he said.
The goal is to have sensors built into everyday hats and, once the arm has learned brain signals, to use WiFi to move the arm.
"Imagine, you have a baseball cap. At the same time, you have small electrodes, and we can capture your EEG signals," Yazdanpour said.
The team is working on ways to take the design beyond hand movements. The next phase of the research is to add pressure and heat sensors to the fingertips, which would transmit readings to a sensor worn on the skin so the user can feel pressure and hot and cold sensations.
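The article does not say how those readings would be translated into sensations, but one simple approach is to threshold each fingertip measurement and drive a small actuator on the skin accordingly. The snippet below is a hypothetical illustration of that mapping; the sensor names, units and thresholds are invented for the example.

```python
# Hypothetical illustration: turning fingertip sensor readings into coarse
# feedback cues for a skin-contact actuator. Thresholds are invented.
from dataclasses import dataclass

@dataclass
class FingertipReading:
    pressure_kpa: float      # pressure sensor value (assumed unit)
    temperature_c: float     # temperature sensor value (assumed unit)

def feedback_cues(reading: FingertipReading) -> list[str]:
    """Translate one fingertip reading into simple cues the wearer can feel."""
    cues = []
    if reading.pressure_kpa > 50.0:
        cues.append("strong-pressure")
    elif reading.pressure_kpa > 5.0:
        cues.append("light-pressure")
    if reading.temperature_c >= 45.0:
        cues.append("hot")
    elif reading.temperature_c <= 10.0:
        cues.append("cold")
    return cues

print(feedback_cues(FingertipReading(pressure_kpa=60.0, temperature_c=48.0)))
# ['strong-pressure', 'hot']
```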
"I watched Terminator as a middle school student, so I feel like this is fascinating," Tao laughed. "I feel like I'm actually doing something like the Terminator and we're creating a robot and am doing some smart stuff."
The NKU team is already pursuing another NIH grant to continue the work while it fine-tunes the current robotic hand.