
Paralyzed man moves robotic arm with his thoughts


Researchers at UC San Francisco helped a paralyzed man control a robotic hand using a device that relayed signals from his brain to a computer.

By imagining himself performing actions, he was able to grab, move and drop items.

This device, called a brain-computer interface (BCI), worked for seven months without any adjustments. Until now, such devices have only worked for a few hours.

The BCI relies on an AI model that can adapt to the small changes that occur in the brain as a person repeats a movement, or in this case an imagined movement, and learns to perform it more precisely.

“This blending of learning between humans and AI is the next phase for these brain-computer interfaces,” said Karunesh Ganguly, MD, PhD, a neurologist and professor at the UCSF Weill Institute for Neurosciences. The study was funded by the National Institutes of Health and appeared on March 6.

The key discovery was that activity in the brain changes from day to day as a participant repeatedly imagines performing specific movements. Once the AI was programmed to account for those shifts, it continued to work for months.

Location

Ganguly examined how brain activity patterns in animals represented specific movements. He found that these representations changed from day to day as the animal learned. He suspected that the same thing happened in humans, and that this was why their BCIs so quickly lost the ability to recognize these patterns.

Ganguly, along with neurology researcher Nikhilesh Natraj, PhD, worked closely with a study participant who had suffered a stroke years earlier that left him unable to speak or move. The participant had tiny sensors embedded on the surface of his brain, which could detect brain activity when he pictured moving.

Ganguly asked his participant to imagine moving various parts of his body such as his hands, feet, or head.

Even though the participant couldn’t move, his brain still produced signals for a motion when he imagined doing it. The BCI recorded his brain’s representations through the sensors in his brain.

Ganguly’s team found that the representations in the brain kept the same shape, but their locations shifted slightly from day to day.

From virtual to real
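The article does not describe how the AI compensates for this drift, but the core idea, a representation that keeps its shape while its location shifts, can be illustrated with a toy sketch. The Python example below is purely hypothetical (the movement names, channel count, and translation-only drift model are all assumptions, not the study's actual method): a few labeled calibration trials are used to estimate the day's shared offset, which is then subtracted before simple nearest-template decoding.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical day-1 "templates": one mean activity pattern (64 channels)
# per imagined movement. Names and dimensions are illustrative only.
movements = ["grasp", "release", "rotate"]
templates = {m: rng.normal(size=64) for m in movements}

def decode(pattern, templates):
    """Nearest-template decoding: return the movement whose stored
    pattern is closest to the observed activity."""
    return min(templates, key=lambda m: np.linalg.norm(pattern - templates[m]))

# Simulate day-to-day drift: each representation keeps its shape, but
# every pattern is shifted by the same unknown offset.
drift = rng.normal(scale=2.0, size=64)

# A brief "tune-up": the participant imagines each known movement once,
# and the shared offset is estimated from those labeled trials.
calibration = {m: templates[m] + drift + rng.normal(scale=0.1, size=64)
               for m in movements}
estimated_drift = np.mean(
    [calibration[m] - templates[m] for m in movements], axis=0)

# Subtracting the estimated offset re-centers a new drifted pattern
# so the day-1 templates can still decode it.
observed = templates["rotate"] + drift + rng.normal(scale=0.1, size=64)
print(decode(observed - estimated_drift, templates))  # prints "rotate"
```

In this sketch the per-channel noise is small relative to the distances between templates, so a single calibration trial per movement recovers the offset well; the study's actual model presumably tracks richer, ongoing changes than a single translation.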

Ganguly asked the participant to imagine making simple movements using his fingers, hands, or thumbs for two weeks while the sensors recorded the brain activity.

Next, the participant attempted to control a robotic arm and hand, but his movements were still imprecise.

Ganguly then had the participant practice with a virtual robotic arm that gave him feedback on his visualizations, and he eventually got the virtual arm to do what he wanted. After a few sessions of practice with the real robotic arm, the participant was able to transfer those skills to the real world.

The participant was able to make the robotic arm pick up blocks, turn them around, and move them to new positions. He could even open a cabinet, take out a glass, and place it in front of a water dispenser.

Months later, the participant was still able to control the robotic hand after a 15-minute “tune-up” to adjust for day-to-day shifts.

Ganguly is now refining the AI models to make the robotic hand move faster and more smoothly, and plans to test the BCI in a home setting.

The ability to feed oneself or drink water would be life-changing for people with paralysis, and Ganguly believes this is possible.

www.roboticsobserver.com
