Decoding hand gestures using data from noninvasive brain imaging

Researchers from the University of California, San Diego, have come up with a way to distinguish between hand gestures that people make by examining only data from non-invasive brain imaging, without getting information from the hands themselves. The findings are an early step in developing a non-invasive brain-computer interface that may one day allow patients with paralysis, amputated limbs, or other physical challenges to use their minds to control a device that assists with daily tasks.

The research, recently published online ahead of print in the journal Cerebral Cortex, represents the best results to date in distinguishing single-hand gestures using a completely non-invasive technique, in this case, magnetoencephalography (MEG).

“Our goal was to bypass invasive components,” said senior author Mingxiong Huang, PhD, co-director of the MEG Center at the Qualcomm Institute at UC San Diego. Huang is also affiliated with the Department of Electrical and Computer Engineering at the UC San Diego Jacobs School of Engineering and the Department of Radiology at the UC San Diego School of Medicine, as well as the Veterans Affairs (VA) San Diego Healthcare System. “MEG provides a safe and accurate option for developing a brain-computer interface that could ultimately help patients.”

The researchers emphasized the advantages of MEG, which uses a helmet with an embedded array of 306 sensors to detect the magnetic fields produced by neuronal electrical currents moving between neurons in the brain. Alternative brain-computer interface technologies include electrocorticography (ECoG), which requires surgical implantation of electrodes on the surface of the brain, and scalp electroencephalography (EEG), which locates brain activity less accurately.

“With MEG, I can see the brain thinking without taking off the skull and putting electrodes on the brain itself. I just have to put the MEG helmet on their head. There are no electrodes that could break while implanted inside the head; no expensive, delicate brain surgery; no possible brain infections.”


Roland Lee, MD, study co-author, director of the MEG Center at UC San Diego Qualcomm Institute, professor emeritus of radiology at UC San Diego School of Medicine, and physician at VA San Diego Healthcare System

Lee likens the safety of MEG to taking a patient’s temperature. “MEG measures the magnetic energy your brain puts out, much like a thermometer measures the heat your body puts out. That makes it completely safe and non-invasive.”

Rock paper scissors

The current study evaluated the ability to use MEG to discriminate between hand gestures made by 12 volunteers. Volunteers were equipped with a MEG helmet and randomly instructed to perform one of the gestures used in the Rock Paper Scissors game (as in previous studies of this type). MEG functional information was superimposed on MRI images, which provided structural information about the brain.

To interpret the data generated, Yifeng (“Troy”) Bu, an electrical and computer engineering doctoral student in the UC San Diego Jacobs School of Engineering and first author of the paper, wrote a high-performance deep learning model called MEG-RPSnet.

“The special feature of this network is that it combines spatio-temporal features simultaneously,” Bu said. “That’s the main reason it performs better than previous models.”
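
The article does not spell out MEG-RPSnet’s architecture, but “combining spatio-temporal features simultaneously” typically means pairing a temporal convolution along each sensor’s time series with a spatial convolution across the whole sensor array. The sketch below is a generic decoder of that kind, not the authors’ implementation; the layer sizes and the 250-sample epoch length are assumptions, with 306 input channels taken from the sensor count mentioned above and three output classes for rock, paper, and scissors.

```python
# A minimal sketch only: the article does not detail MEG-RPSnet, so this is
# a generic spatio-temporal CNN of the kind widely used for MEG/EEG decoding.
# All layer sizes and the 250-sample epoch length are assumptions.
import torch
import torch.nn as nn

class SpatioTemporalNet(nn.Module):
    """Classify rock/paper/scissors from MEG epochs shaped
    (batch, 1, n_channels, n_times), e.g. 306 sensors x 250 time samples."""

    def __init__(self, n_channels=306, n_times=250, n_classes=3):
        super().__init__()
        self.features = nn.Sequential(
            # Temporal convolution: filters slide along the time axis of
            # every sensor independently.
            nn.Conv2d(1, 16, kernel_size=(1, 25), padding=(0, 12)),
            nn.BatchNorm2d(16),
            nn.ELU(),
            # Spatial convolution: spans all sensors at each time point, so
            # spatial and temporal structure are learned in one pipeline.
            nn.Conv2d(16, 32, kernel_size=(n_channels, 1)),
            nn.BatchNorm2d(32),
            nn.ELU(),
            nn.AvgPool2d(kernel_size=(1, 5)),  # downsample along time
            nn.Dropout(0.5),
        )
        self.classifier = nn.Linear(32 * (n_times // 5), n_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(start_dim=1))

model = SpatioTemporalNet()
epochs = torch.randn(8, 1, 306, 250)  # dummy batch of 8 MEG epochs
logits = model(epochs)                # -> shape (8, 3)
```

Because the spatial convolution spans all 306 sensors at once, every learned feature mixes information across the whole helmet rather than treating sensors independently.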

The researchers found that their technique could distinguish hand gestures with more than 85% accuracy. These results were comparable to those of previous studies that used the invasive ECoG brain-computer interface with much smaller sample sizes.

The team also found that MEG measurements from only half of the brain regions sampled could still generate results with only a small (2-3%) loss of accuracy, indicating that future MEG helmets may require fewer sensors.
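
The article does not describe how the half-array comparison was run; a common way to test it is a sensor-ablation check like the hypothetical sketch below, in which train_and_score stands in for whatever training-and-evaluation routine is used and is not from the paper.

```python
# Hypothetical sketch of the sensor-ablation comparison described above.
# `train_and_score(epochs, labels) -> accuracy` stands in for whatever
# training-and-evaluation routine is used; it is not from the paper, and a
# random half of the channels is a simplification of the study's
# region-based split.
import numpy as np

def half_array_ablation(epochs, labels, train_and_score, seed=0):
    """epochs: array of shape (n_trials, n_channels, n_times)."""
    rng = np.random.default_rng(seed)
    n_channels = epochs.shape[1]

    # Baseline: train and evaluate on the full sensor array.
    full_acc = train_and_score(epochs, labels)

    # Keep a random half of the sensors and retrain from scratch.
    keep = rng.choice(n_channels, size=n_channels // 2, replace=False)
    half_acc = train_and_score(epochs[:, keep, :], labels)

    print(f"full array: {full_acc:.1%}  half array: {half_acc:.1%}  "
          f"drop: {full_acc - half_acc:.1%}")
    return full_acc, half_acc
```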

Looking to the future, Bu noted, “This work builds a foundation for the development of a MEG-based brain-computer interface.”

Source: University of California, San Diego

Journal reference:

Bu, Y., et al. (2023) Magnetoencephalogram-based brain-computer interface for decoding hand gestures using deep learning. Cerebral Cortex. doi.org/10.1093/cercor/bhad173.
