To access the full text documents, please follow this link: http://hdl.handle.net/10230/27084

Top-down attention regulates the neural expression of audiovisual integration
Morís Fernández, Luis, 1982-; Visser, Maya; Ventura Campos, Noelia; Ávila, César; Soto-Faraco, Salvador, 1970-
The interplay between attention and multisensory integration has proven to be a difficult question to tackle. There are almost as many studies showing that multisensory integration occurs independently from the focus of attention as studies implying that attention has a profound effect on integration. Addressing the neural expression of multisensory integration for attended vs. unattended stimuli can help disentangle this apparent contradiction. In the present study, we examine whether selective attention to sound pitch influences the expression of audiovisual integration in both behavior and neural activity. Participants were asked to attend to one of two auditory speech streams while watching a pair of talking lips that could be congruent or incongruent with the attended speech stream. We measured behavioral and neural responses (fMRI) to multisensory stimuli under attended and unattended conditions while physical stimulation was kept constant. Our results indicate that participants recognized words more accurately from an auditory stream that was both attended and audiovisually (AV) congruent, thus reflecting a benefit due to AV integration. On the other hand, no enhancement was found for AV congruency when it was unattended. Furthermore, the fMRI results indicated that activity in the superior temporal sulcus (an area known to be related to multisensory integration) was contingent on attention as well as on audiovisual congruency. This attentional modulation extended beyond heteromodal areas to affect processing in areas classically recognized as unisensory, such as the superior temporal gyrus or the extrastriate cortex, and to non-sensory areas such as the motor cortex.
Interestingly, attention to audiovisual incongruence triggered responses in brain areas related to conflict processing (i.e., the anterior cingulate cortex and the anterior insula). Based on these results, we hypothesize that AV speech integration can take place automatically only when both modalities are sufficiently processed, and that if a mismatch is detected between the AV modalities, feedback from conflict areas minimizes the influence of this mismatch by reducing the processing of the least informative modality.
This research was supported by the Ministerio de Economía y Competitividad (PSI2013-42626-P), AGAUR Generalitat de Catalunya (2014SGR856), and the European Research Council (StG-2010 263145).
-Multisensory
-Audiovisual
-Attention
-Speech perception
-fMRI
-STS
© Elsevier http://dx.doi.org/10.1016/j.neuroimage.2015.06.052
Article - Accepted version
