Event-related potentials reflect speech-relevant somatosensory-auditory interactions
An interaction between orofacial somatosensation and the perception of speech was demonstrated in recent psychophysical studies (Ito et al. 2009; Ito and Ostry 2009). To explore further the neural mechanisms of this speech-related somatosensory-auditory interaction, we assessed to what extent multisensory evoked potentials reflect multisensory interaction during speech perception. We also examined the dynamic modulation of multisensory integration resulting from relative timing differences between the onsets of the two sensory stimuli. We recorded event-related potentials from 64 scalp sites in response to somatosensory stimulation alone, auditory stimulation alone, and combined somatosensory and auditory stimulation. In the multisensory condition, the timing of the two stimuli was either simultaneous or offset by 90 ms (lead and lag). We found evidence of multisensory interaction: the amplitude of the multisensory evoked potential was reliably different from the sum of the two unisensory potentials around the first peak of the multisensory response (100–200 ms). The magnitude of the evoked potential difference varied as a function of the relative timing between the stimuli in the interval from 170 to 200 ms following somatosensory stimulation. The results demonstrate clear multisensory convergence and suggest a dynamic modulation of multisensory interaction during speech.
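The additive-model test described in the abstract (multisensory response versus the sum of the two unisensory responses) can be sketched as follows. This is a minimal illustration on synthetic data, not the authors' analysis pipeline: the sampling rate, epoch window, trial counts, and the Gaussian-shaped synthetic ERPs are all assumptions chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

fs = 500                             # sampling rate in Hz (assumed)
t = np.arange(-0.1, 0.4, 1 / fs)     # epoch from -100 to 400 ms (assumed)

def make_epochs(n_trials, peak_s, amp):
    """Synthetic single-trial epochs: a Gaussian-shaped ERP peak plus noise.

    Purely illustrative; real ERP data would come from recorded EEG epochs.
    """
    erp = amp * np.exp(-((t - peak_s) ** 2) / (2 * 0.02 ** 2))
    return erp + rng.normal(0.0, 0.5, size=(n_trials, t.size))

som = make_epochs(100, 0.15, 2.0)    # somatosensory-alone condition
aud = make_epochs(100, 0.15, 3.0)    # auditory-alone condition
multi = make_epochs(100, 0.15, 4.0)  # combined condition (subadditive by construction)

# Evoked potentials are trial averages; under the additive (no-interaction)
# model, the multisensory evoked potential equals the sum of the unisensory ones.
diff = multi.mean(axis=0) - (som.mean(axis=0) + aud.mean(axis=0))

# Inspect the 100-200 ms window where the abstract reports an interaction.
win = (t >= 0.10) & (t <= 0.20)
print(f"mean difference in 100-200 ms window: {diff[win].mean():.2f} uV")
```

A nonzero difference wave in the window of interest is the signature of multisensory interaction; in practice its reliability would be assessed statistically across participants rather than read off a single difference trace.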
ISSN: 2041-6695 (electronic only)
Copyright: Copyright is retained by the author(s) of this article. This open-access article is distributed under a Creative Commons Licence, which permits noncommercial use, distribution, and reproduction, provided the original author(s) and source are credited and no alterations are made.