Untangling mixed (neural) signals

But how these neurons communicate between seeing and acting is a complex and important question. New research led by the Cognition and Sensorimotor Integration Lab at the University of Pittsburgh Swanson School of Engineering has uncovered how neurons encode and decode that information and differentiate between motor and sensory signals.

“We wanted to figure out how a decoder knows exactly when to initiate a movement if it is also getting signals when a movement isn’t desired,” said Uday K. Jagadisan, lead author and former graduate student in the Cognition and Sensorimotor Integration Lab. “We not only were able to uncover a reliable temporal pattern in the neuron activity that was tied to movement, but we were also able to replicate it with microstimulation.”

The researchers studied how decoding happens when signals lead to movement and how that process differs from the way information is encoded during visual processing. In other words, if the neurons are receiving both sensory and motor signals, how do they tell them apart? How does the brain know when to make the body move?

“The same groups of neurons can communicate information about sensations and movement, and the brain knows which signal is which. We found it’s as if groups of neurons encode the same information in one ‘language’ to send messages about sensation and in another ‘language’ to send information about movement,” explained Neeraj Gandhi, professor of bioengineering who leads the Cognition and Sensorimotor Integration Lab at Pitt. “The receiving groups of neurons only act on one of the languages — that’s the key.”

The research is the first to both pinpoint the encoding and decoding process and verify the findings using microstimulation. The researchers were able to reproduce the movement-related pattern of neural activity in non-human primate brains and elicit the intended motor response.

This discovery is vital for applications like brain-computer interfaces and neuroprosthetics. These artificial systems can assist people who have suffered brain injuries or other disorders that affect motor or sensory processes, but in order to work reliably, they need to decode brain activity and understand the intentions behind the patterns of activity.

“For neuroprosthetics, this research could create a way to put the brakes on and inhibit response when you don’t need it, and release when actually needed, all based on neuron chatter,” said Jagadisan. “Current technology is just delivering a pulse every few milliseconds. If you have the ability to control the time when each pulse is delivered, you can select the patterned microstimulation to achieve the effect that you want.”
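To make the distinction Jagadisan draws concrete, here is a minimal sketch contrasting fixed-interval pulse delivery ("a pulse every few milliseconds") with a temporally patterned pulse train whose timing follows a movement-related template. This is illustrative only, not the authors' method; the function names, template times, and jitter parameter are all hypothetical.

```python
import numpy as np

# Illustrative sketch only -- not the study's actual stimulation protocol.
# It contrasts fixed-interval pulses with a hypothetical temporally
# patterned train whose pulse times mimic a movement-related pattern.

def fixed_rate_train(duration_ms: float, interval_ms: float) -> np.ndarray:
    """Pulse times at a constant interval, e.g. one pulse every 3 ms."""
    return np.arange(0.0, duration_ms, interval_ms)

def patterned_train(template_times_ms: np.ndarray, jitter_ms: float = 0.0) -> np.ndarray:
    """Pulse times copied from a movement-related temporal template,
    optionally jittered (hypothetical parameter for robustness checks)."""
    jitter = np.random.uniform(-jitter_ms, jitter_ms, size=template_times_ms.shape)
    return np.sort(template_times_ms + jitter)

# Example: a 60 ms window before movement onset.
fixed = fixed_rate_train(duration_ms=60.0, interval_ms=3.0)

# Hypothetical template of pulse times shaped like a premovement pattern.
template = np.array([2.0, 5.0, 9.0, 14.0, 20.0, 27.0, 35.0, 44.0, 54.0])
patterned = patterned_train(template, jitter_ms=0.5)

print(f"fixed-rate pulses: {len(fixed)} pulses, uniform 3 ms spacing")
print(f"patterned pulses:  {len(patterned)} pulses, spacing follows the template")
```

In this toy picture, both trains deliver pulses over the same window; only the second carries the temporal structure that, per the study's finding, is what the downstream circuitry reads as a movement command.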
