An event-related FMRI study of exogenous orienting across vision and audition

Authors

  • Zhen Yang,

    1. The Mind Research Network/Lovelace Biomedical and Environmental Research Institute, Albuquerque, New Mexico 87106
  • Andrew R. Mayer

    Corresponding author
    1. The Mind Research Network/Lovelace Biomedical and Environmental Research Institute, Albuquerque, New Mexico 87106
    2. Department of Psychology, University of New Mexico, Albuquerque, New Mexico 87131
    3. Department of Neurology, University of New Mexico School of Medicine, Albuquerque, New Mexico 87131
    • The Mind Research Network, Pete & Nancy Domenici Hall, 1101 Yale Blvd. NE, Albuquerque, New Mexico 87106. E-mail: amayer@mrn.org


Abstract

The orienting of attention to the spatial location of sensory stimuli in one modality based on sensory stimuli presented in another modality (i.e., cross-modal orienting) is a common mechanism for controlling attentional shifts. The neuronal mechanisms of top-down cross-modal orienting have been studied extensively. However, the neuronal substrates of bottom-up audio-visual cross-modal spatial orienting remain to be elucidated. Therefore, behavioral and event-related functional magnetic resonance imaging (FMRI) data were collected while healthy volunteers (N = 26) performed a spatial cross-modal localization task modeled after the Posner cuing paradigm. Behavioral results indicated that although both visual and auditory cues were effective in producing bottom-up shifts of cross-modal spatial attention, reorienting effects were greater for the visual cues condition. No statistically significant evidence of inhibition of return was observed for either condition. Functional results indicated that visual cues with auditory targets resulted in greater activation within the ventral and dorsal frontoparietal attention networks, the visual and auditory "where" streams, primary auditory cortex, and thalamus during reorienting across both short and long stimulus onset asynchronies. In contrast, no areas of unique activation were associated with reorienting following auditory cues with visual targets. In summary, the current results question whether audio-visual cross-modal orienting is supramodal in nature, suggesting instead that the initial modality of cue presentation heavily influences both behavioral and functional results. In the context of localization tasks, reorienting effects accompanied by activation of the frontoparietal reorienting network are more robust for visual cues with auditory targets than for auditory cues with visual targets. Hum Brain Mapp 35:964–974, 2014. © 2013 Wiley Periodicals, Inc.