Inverse Effectiveness and Multisensory Interactions in Visual Event-Related Potentials with Audiovisual Speech
Date
2012-02
Journal Title
Brain Topography
Abstract
In recent years, it has become evident that neural responses previously considered unisensory can be modulated by sensory input from other modalities. In this regard, visual neural activity elicited by viewing a face is strongly influenced by concurrent auditory information, particularly speech. Here, we applied an additive-factors paradigm aimed at quantifying the impact that auditory speech has on visual event-related potentials (ERPs) elicited by visual speech. These multisensory interactions were measured across parametrically varied stimulus salience, quantified in terms of signal-to-noise ratio, to provide novel insights into the neural mechanisms of audiovisual speech perception. First, we measured a monotonic increase in the amplitude of the visual P1-N1-P2 ERP complex during a spoken-word recognition task as stimulus salience increased. ERP component amplitudes varied directly with stimulus salience for visual, audiovisual, and summed unisensory recordings. Second, we measured changes in multisensory gain across salience levels. During audiovisual speech, the P1 and P1-N1 components exhibited less multisensory gain relative to the summed unisensory components as salience was reduced, whereas N1-P2 amplitude exhibited greater multisensory gain as salience was reduced, consistent with the principle of inverse effectiveness. These amplitude interactions were correlated with behavioral measures of multisensory gain across salience levels, as indexed by response times, suggesting that the change in multisensory gain associated with unisensory salience modulations reflects an increased efficiency of visual speech processing.
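The additive-model comparison described above contrasts the audiovisual (AV) ERP with the sum of the unisensory auditory (A) and visual (V) ERPs at each salience level. The following is a minimal, hypothetical sketch of that comparison; the array shapes, variable names, and the peak-to-peak gain measure are illustrative assumptions, not the authors' analysis code.

```python
# Hypothetical sketch: compare the AV ERP against the summed unisensory (A + V)
# ERP at each salience level, as in an additive-factors analysis.
import numpy as np

rng = np.random.default_rng(0)

n_trials, n_samples = 100, 600          # e.g. a 600-sample epoch per trial (assumed)
salience_levels = [0.2, 0.4, 0.6, 0.8]  # parametric signal-to-noise manipulation (assumed values)

def erp(trials):
    """Average across trials to obtain the event-related potential."""
    return trials.mean(axis=0)

multisensory_gain = {}
for snr in salience_levels:
    # Placeholder single-channel data; in practice these come from recorded EEG.
    a_trials  = rng.normal(0.0, 1.0, (n_trials, n_samples)) * snr
    v_trials  = rng.normal(0.0, 1.0, (n_trials, n_samples)) * snr
    av_trials = rng.normal(0.0, 1.0, (n_trials, n_samples)) * snr

    av_erp  = erp(av_trials)
    sum_erp = erp(a_trials) + erp(v_trials)   # summed unisensory prediction

    # Positive values indicate superadditivity (AV > A + V), negative values
    # subadditivity, here quantified as a simple peak-to-peak difference
    # (e.g. over a component window such as N1-P2).
    multisensory_gain[snr] = np.ptp(av_erp) - np.ptp(sum_erp)

print(multisensory_gain)
```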
Description
Postprint (author's accepted manuscript)
Keywords
Multisensory integration, Inverse effectiveness, ERPs, P1-N1-P2, N170, Speech perception, Face perception
Citation
Stevenson RA, Bushmakin M, Kim S, Puce A, James TW. (2012) Inverse effectiveness and multisensory interactions in visual event-related potentials with audiovisual speech. Brain Topography 25: 308-326.
Type
Article