Study Sheds New Light on How the Brain Processes Speech

It could lead to better understanding of language-based disorders such as dyslexia.

When it comes to the study of human physiology, few organs are more complex and intriguing than the human brain. Even though many areas of the brain remain a mystery to scientists, new research from Northeastern University, the University of Oxford, and several Boston institutions may bring researchers a step closer to understanding one of its most crucial functions: speech.

In a study published in February in the Proceedings of the National Academy of Sciences, researchers concluded that human speech preferences are not determined by motor function, as was previously believed. Instead, these preferences are a result of what the study’s lead author, Iris Berent, calls the brain’s “abstract rules” for language. We prefer certain sound patterns over others not because they are easier to say, Berent says, but because those sound patterns conform to the brain’s linguistic guidelines. Conformity to these rules is what then triggers motor function, she says.

Berent, a psychology professor at Northeastern, uses the example of the sound patterns “blog” and “lbog” to illustrate her point. “Blog” is agreeable to us, she says, while “lbog” is not. Though this particular example comes from the English language, Berent says that the preference for such sound patterns can be observed in every known human language. Even infants, who have no prior experience with either sound pattern, prefer “blog,” she says.

Berent and her colleagues, a team that includes researchers from Beth Israel Deaconess Medical Center, Harvard Medical School, and Brigham and Women’s Hospital, came to these conclusions after exposing study participants to a procedure known as transcranial magnetic stimulation, or TMS. The procedure uses magnetic pulses to induce electric currents in the brain, disrupting activity in targeted areas. According to the study, the researchers wanted to assess whether disrupting activity in the area of the brain that controls lip movement would eliminate the preference for certain sound patterns. They found that although the administration of TMS affected the participants’ speech perception, their preference for syllables such as “blog” remained intact.

Berent says that the study is significant because it provides new insight into “what language is and how it is represented in the brain.”

“It tells us how language works, [which is] something that really defines us as human beings. The question is what allows us as humans to have this really unique capacity,” Berent says. “It might also eventually lead to practical implications when it comes to disorders that compromise the motor system, which is something we want to study directly.”

One of the disorders that Berent says she plans to focus on in future studies is dyslexia, which—according to the NIH—is a learning disability that affects a person’s reading skills. “The broad point is that the motor system is engaged in language,” she says. “We’re trying to figure out exactly where the system is engaged. If we can identify that, then we can go to disorders that we know are impacted by a particular component and ask what the motor system is doing in this disorder.”