A new study finds that the brain links incoming speech sounds to our knowledge of grammar, which is abstract in nature. But the big question remains: how does the brain process complex grammatical structures?
A group of researchers from the Max Planck Institute of Psycholinguistics and Radboud University in Nijmegen discovered that the brain encodes the structure of sentences ('the vase is red') and phrases ('the red vase') in distinct neural firing patterns.
The findings of the neuroimaging study were published in PLOS Biology.
How does the brain represent sentences? This is one of the fundamental questions in neuroscience, because sentences are an example of abstract structural knowledge that is not directly observable from speech. While all sentences are made up of smaller building blocks, such as words and phrases, not all combinations of words or phrases lead to sentences.
In fact, listeners need more than just knowledge of which words occur together: they need abstract knowledge of language structure to understand a sentence. So how does the brain encode the structural relationships that make up a sentence?
Lise Meitner Group Leader Andrea Martin already had a theory on how the brain computes linguistic structure, based on evidence from computer simulations. To further test this 'time-based' model of the structure of language, which was developed together with Leonidas Doumas from the University of Edinburgh, Martin and colleagues used EEG (electroencephalography) to measure neural responses through the scalp.
In a collaboration with first author and PhD candidate Fan Bai and MPI director Antje Meyer, she set out to investigate whether the brain responds differently to sentences and phrases, and if this could hint at how the brain encodes abstract structure.
The researchers created sets of spoken Dutch phrases (such as de rode vaas 'the red vase') and sentences (such as de vaas is rood 'the vase is red'), which were identical in duration and number of syllables, and highly similar in meaning. They also created pictures with objects (such as a vase) in five different colours. Fifteen adult native speakers of Dutch participated in the experiment.
For each spoken stimulus, they were asked to perform one of three tasks, presented in random order. The first task was structure-related: participants had to decide by button press whether they had heard a phrase or a sentence. The second and third tasks were meaning-related: participants had to decide whether the colour or the object of the spoken stimulus matched the picture that followed.
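To make the design concrete, the minimal sketch below lays out the trial structure described above. The stimulus texts and task labels are illustrative placeholders, not the study's actual materials or code.

```python
import random

# Hypothetical sketch of the trial structure described in the article;
# stimulus texts and task names are illustrative assumptions.
stimuli = [
    {"type": "phrase",   "dutch": "de rode vaas",   "english": "the red vase"},
    {"type": "sentence", "dutch": "de vaas is rood", "english": "the vase is red"},
]
tasks = ["structure_decision", "colour_match", "object_match"]

# Every stimulus is paired with every task, and trials run in random order.
trials = [(stim, task) for stim in stimuli for task in tasks]
random.shuffle(trials)

for stim, task in trials:
    print(f"{stim['type']:8s} | {stim['dutch']:15s} | task: {task}")
```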
As expected from computational simulations, the activation patterns of neurons in the brain were different for phrases and sentences, in terms of both timing and strength of neural connections. "Our findings show how the brain separates speech into linguistic structure by using the timing and connectivity of neural firing patterns. These signals from the brain provide a novel basis for future research on how our brains create language", says Martin.
"Additionally, the time-based mechanism could in principle be used for machine learning systems that interface with spoken language comprehension in order to represent abstract structure, something machine systems currently struggle with. We will conduct further studies on how knowledge of abstract structure and countable statistical information, like transitional probabilities between linguistic units, are used by the brain during spoken language comprehension."