Study IDs what brings our senses and thoughts together
Scientists have long known that the neocortex integrates what are called feedforward and feedback information streams. Feedforward information is relayed by the brain’s sensory systems from the periphery (our senses) to the neocortex’s higher-order areas. These high-level brain regions then send feedback to refine and adjust sensory processing. This back-and-forth communication allows the brain to pay attention, retain short-term memories, and make decisions.
“A simple example is when you want to cross a busy road,” said corresponding author Gyorgy Lur, Ph.D., an assistant professor of neurobiology & behavior in the School of Biological Sciences. “There are trees, people, moving vehicles, traffic signals, signs and more. Your higher-level neocortex tells your sensory system which merit attention for deciding when to go across.”
The interaction between higher- and lower-level systems also allows you to remember what you saw when you glanced both ways to gather that information. “If you didn’t have that short-term memory, you would just keep looking back and forth and never move,” he said. “In fact, if our feedforward and feedback streams weren’t constantly working together, we would do very little except respond by reflexes.”
Until now, scientists were unsure how individual neurons participate in these complex processes. Lur and his colleagues discovered that feedforward and feedback signals converge onto single neurons in the parietal regions of the neocortex. The researchers also found that distinct types of cortical neurons merge the two information streams on markedly different time scales and identified the cellular and circuit architecture underpinning these differences.
“Scientists already knew that integrating multiple senses enhances neuronal responses,” Lur said. “If you only see something or just hear it, your reaction time is slower than when experiencing them with both senses simultaneously. We’ve identified the underlying mechanisms making this possible.”
He noted that the study data suggests the same principles apply if one information stream is sensory and the other is cognitive.
Understanding these processes is critical for developing future treatments for neuropsychiatric ailments like sensory-processing disorders, schizophrenia and ADHD, as well as for strokes and other injuries to the neocortex.
Lur is a fellow of the Center for the Neurobiology of Learning and Memory, the Center for Neural Circuit Mapping and the Center for Hearing Research at UC Irvine.
Ph.D. candidate Daniel Rindner, who performed all neuronal recordings and biological tissue work, served as the paper’s first author. Archana Proddutur, Ph.D., a postdoctoral scholar in the lab and second author on the paper, conducted computational modeling that led to the mechanistic understanding of the processes integrating sensory and cognitive information streams. Their research was supported by the Whitehall Foundation, National Institute of Mental Health, National Institute of Neurological Disorders and Stroke, and National Institute on Deafness and Other Communication Disorders.