Giurgola, S., Lo Gerfo, E., Farnè, A., Roy, A., & Bolognini, N. (2024). Multisensory integration and motor resonance in the primary motor cortex. Cortex, 179, 235–246. https://doi.org/10.1016/j.cortex.2024.07.015
Multisensory integration and motor resonance in the primary motor cortex
Giurgola S. (co-first author); Bolognini N. (last author)
2024
Abstract
Humans are endowed with a motor system that resonates to speech sounds, but whether concurrent visual information from lip movements can improve speech perception at the motor level through multisensory integration mechanisms remains unknown. The aim of this study was therefore to explore the behavioral and neurophysiological correlates of multisensory influences on motor resonance in speech perception. Motor-evoked potentials (MEPs), elicited by single-pulse transcranial magnetic stimulation (TMS) applied over the left lip muscle (orbicularis oris) representation in the primary motor cortex, were recorded in healthy participants during the presentation of syllables in unimodal (visual or auditory) or multisensory (audio-visual) congruent or incongruent conditions. At the behavioral level, participants identified syllables more accurately in the congruent audio-visual condition than in the unimodal conditions, showing a multisensory enhancement effect. Accordingly, at the neurophysiological level, MEP amplitudes were larger in the congruent audio-visual condition than in the unimodal ones. Incongruent audio-visual syllables resulting in illusory percepts did not increase corticospinal excitability, which was instead comparable to that induced by the actual perception of the same syllable. In conclusion, seeing and hearing congruent bilabial syllables increases the excitability of the lip representation in the primary motor cortex, documenting that multisensory integration can facilitate speech processing by influencing motor resonance. These findings highlight the modulatory role of multisensory processing, showing that it can boost speech perception and that multisensory interactions occur not only within higher-order regions but also within primary motor areas, as indexed by changes in corticospinal excitability.
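As an illustrative aside, and not part of the article itself: the multisensory enhancement described above is commonly quantified as the relative gain of the audio-visual condition over the best unimodal condition. The minimal Python sketch below computes such an index on hypothetical accuracy and MEP values; the function name and all numbers are assumptions for demonstration only and do not reproduce the study's analysis.

```python
# Illustrative only: quantify multisensory enhancement as the percent gain of the
# audio-visual (AV) score over the best unimodal score. All values below are
# hypothetical placeholders, not data from Giurgola et al. (2024).

def multisensory_enhancement(av, audio, visual):
    """Percent gain of the AV condition relative to the best unimodal condition."""
    best_unimodal = max(audio, visual)
    return 100.0 * (av - best_unimodal) / best_unimodal

# Hypothetical syllable-identification accuracies (proportion correct)
print(multisensory_enhancement(av=0.95, audio=0.82, visual=0.70))  # ~15.9% gain

# The same index could be applied to normalized MEP amplitudes to express how much
# a congruent AV condition might boost corticospinal excitability (hypothetical values)
print(multisensory_enhancement(av=1.30, audio=1.10, visual=1.05))  # ~18.2% gain
```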