Université de Genève

Listening or lip-reading? It’s down to brainwaves

UNIGE researchers have discovered that neural oscillations determine whether the brain chooses eyes or ears to interpret speech.


Listening or lip-reading? Brainwaves are involved in this process. UNIGE/Thézé

To decipher what a person is telling us, we rely on what we hear as well as on what we see by observing lip movements and facial expressions. Until now, we did not know how the brain chooses between auditory and visual stimuli. But a research group at the University of Geneva (UNIGE), funded by the Swiss National Science Foundation (SNSF), recently showed that neural oscillations (brainwaves) are involved in this process. More precisely, it is the phase of these waves – i.e. the point in the wave cycle just before a specific instant – that determines which sensory channel will contribute most to the perception of speech. The results of this study, led by neurologist Pierre Mégevand of the University of Geneva, have just been published in the journal Science Advances.

In conducting their study, Pierre Mégevand and his colleagues Raphaël Thézé and Anne-Lise Giraud used an innovative device based on audiovisual illusions. Subjects were placed in front of a screen on which a virtual character uttered phrases in French that could be misinterpreted, such as “Il n’y a rien à boire / Il n’y a rien à voir” (“There’s nothing to drink / There’s nothing to see” – an English example would be “The item was in the vase/base”). In some of the statements spoken by the virtual character, the researchers introduced a conflict between what the subjects saw and what they heard. For example, the character pronounced a “b”, but her lips formed a “v”. The subjects were asked to repeat the statement they had understood while electrodes recorded their brain’s electrical activity.

Audiovisual illusions

The researchers observed that when the auditory and visual information matched, the subjects repeated the correct statement most of the time. In the event of a conflict, however, subjects relied on either the auditory cue or the visual cue. For example, when they heard a “v” but saw a “b”, the auditory cue dominated perception in about two-thirds of cases. In the opposite situation, the visual cue guided perception.

The sensory channel is determined in advance

The researchers compared these results with the brain’s electrical activity. They observed that about 300 milliseconds before the agreement or conflict between the auditory and visual information, the phase of the brainwave in the posterior temporal and occipital cortex differed between subjects who had relied on the visual cue and those who had relied on the auditory cue.

“We have known since the 1970s that in certain situations, the brain seems to choose visual cues over auditory cues, and even more so when the auditory signal is impeded, for example when there is ambient noise. We can now show that brainwaves are involved in this process. However, their exact role is still a mystery,” says Mégevand.

4 Nov 2020



media(at)unige.ch

Université de Genève
24 rue Général-Dufour
CH-1211 Genève 4
T. +41 22 379 77 96