
Neuroscientists Solve the Ancient Mystery of How the Brain Predicts Speech

Monkeys and humans have something in common with predictive text.


In a study published Wednesday in PLOS Biology, a team of neuroscientists reports that it has figured out which mechanisms in the brain are involved in processing speech.

Study co-author Dr. Yuki Kikuchi explains in a statement that the mechanisms for speech in the brain “work like predictive text on your mobile phone, anticipating what you are going to hear next.”

To discover exactly what goes on when the brain makes these predictions, he and his team studied three adult neurosurgical patients and two adult male rhesus macaques. Both groups listened to sequences of nonsense words from a made-up language that nevertheless followed certain regularities, so the ordering relationships between the words could be learned. Using a nonsense language put the humans and monkeys on a more level playing field: neither group had a head start, and the aim was for both to pick up the learned word contingencies.
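The paper describes this setup as an "artificial grammar," in which legal sequences follow learned ordering rules and violation sequences break them. A minimal sketch of the idea is below; the nonsense words and transition rules are invented here for illustration and are not the stimuli used in the study.

```python
import random

# Hypothetical transition rules for a toy artificial grammar: each nonsense
# word lists which words may legally follow it. These words and rules are
# invented for illustration; they are not the grammar used in the study.
GRAMMAR = {
    "START": ["tep", "rak"],
    "tep":   ["dup", "jic"],
    "rak":   ["dup"],
    "dup":   ["jic", "END"],
    "jic":   ["END"],
}

def generate_sequence(grammar, rng=random):
    """Walk the transition rules from START to END to build a legal sequence."""
    sequence, word = [], "START"
    while True:
        word = rng.choice(grammar[word])
        if word == "END":
            return sequence
        sequence.append(word)

def is_legal(sequence, grammar):
    """Check every adjacent word pair against the learned ordering rules."""
    words = ["START"] + list(sequence) + ["END"]
    return all(nxt in grammar.get(cur, []) for cur, nxt in zip(words, words[1:]))

if __name__ == "__main__":
    exposure = [generate_sequence(GRAMMAR) for _ in range(5)]
    print("exposure sequences:", exposure)
    print(is_legal(["tep", "dup", "jic"], GRAMMAR))   # True: follows the rules
    print(is_legal(["jic", "tep"], GRAMMAR))          # False: violates the ordering
```

After enough exposure to legal sequences, a listener (human, monkey, or this toy checker) can flag a violation the moment a word arrives that the learned rules did not predict, which is the contingency the experiment probes.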

At the same time, the scientists recorded the neural oscillatory signatures in the auditory cortex of the monkey and human brains. Neural oscillations are rhythmic patterns of activity generated by interacting populations of neurons, and they are believed to play an influential role in physiological processes and behavior. The recordings revealed that both the monkeys and the humans showed "strikingly similar hierarchically nested low-frequency phase and high-gamma amplitude coupling." In other words, the neurons in both species' brains tracked the incoming sounds and anticipated the next ones in nearly the same way.
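The coupling the teams report, with low-frequency phase organizing high-gamma amplitude, is commonly quantified with a Hilbert-transform analysis. The sketch below uses a mean-vector-length style measure and arbitrary frequency bands purely as an illustration of the concept; it is not the paper's exact analysis pipeline.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def bandpass(x, lo, hi, fs, order=4):
    """Zero-phase Butterworth bandpass filter."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def phase_amplitude_coupling(x, fs, phase_band=(1, 8), amp_band=(70, 150)):
    """Mean-vector-length estimate of how strongly low-frequency phase
    modulates high-gamma amplitude. Bands are illustrative assumptions."""
    phase = np.angle(hilbert(bandpass(x, *phase_band, fs)))    # low-frequency phase
    amplitude = np.abs(hilbert(bandpass(x, *amp_band, fs)))    # high-gamma envelope
    return np.abs(np.mean(amplitude * np.exp(1j * phase)))

if __name__ == "__main__":
    # Synthetic test signal: 100 Hz "gamma" whose amplitude rides the 4 Hz rhythm.
    fs = 1000
    t = np.arange(0, 10, 1 / fs)
    slow = np.sin(2 * np.pi * 4 * t)
    gamma = (1 + slow) * np.sin(2 * np.pi * 100 * t)
    signal = slow + 0.5 * gamma + 0.1 * np.random.randn(t.size)
    print(phase_amplitude_coupling(signal, fs))  # larger value = stronger coupling
```

A signal with no phase-amplitude relationship would yield a value near zero, while the synthetic signal above, where the gamma envelope is locked to the slow rhythm, yields a clearly larger one.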

This similarity suggests to the scientists that the human auditory cortex isn't uniquely specialized for human speech; rather, it relies on a neural mechanism that shares an evolutionary origin with the macaque brain. The same oscillatory coupling occurred in the brains of both species, equipping both animals to monitor how predictable the incoming sounds were.

“Being able to predict events is vital for so much of what we do every day,” co-author Professor Chris Petkov of Newcastle University said in a statement. “Now that we know humans and monkeys share the ability to predict speech we can apply this knowledge to take forward research to improve our understanding of the human brain.”

This ability of neurons to coordinate in groups and anticipate events is impaired in people with disorders such as dyslexia, schizophrenia, and attention deficit hyperactivity disorder. Why the predictive signals in these people's brains don't work together in the same way is still unknown, but Petkov and Kikuchi hope their research could eventually lead to new treatments and prognostic techniques for these conditions.

Abstract: Learning complex ordering relationships between sensory events in a sequence is fundamental for animal perception and human communication. While it is known that rhythmic sensory events can entrain brain oscillations at different frequencies, how learning and prior experience with sequencing relationships affect neocortical oscillations and neuronal responses is poorly understood. We used an implicit sequence learning paradigm (an “artificial grammar”) in which humans and monkeys were exposed to sequences of nonsense words with regularities in the ordering relationships between the words. We then recorded neural responses directly from the auditory cortex in both species in response to novel legal sequences or ones violating specific ordering relationships. Neural oscillations in both monkeys and humans in response to the nonsense word sequences show strikingly similar hierarchically nested low-frequency phase and high-gamma amplitude coupling, establishing this form of oscillatory coupling — previously associated with speech processing in the human auditory cortex — as an evolutionarily conserved biological process. Moreover, learned ordering relationships modulate the observed form of neural oscillatory coupling in both species, with temporally distinct neural oscillatory effects that appear to coordinate neuronal responses in the monkeys. This study identifies the conserved auditory cortical neural signatures involved in monitoring learned sequencing operations, evident as modulations of transient coupling and neuronal responses to temporally structured sensory input.