Audiovisual attention boosts letter-speech sound integration

Authors

  • Maria Mittag,

Corresponding author
    1. Cognitive Brain Research Unit, Cognitive Science, Institute of Behavioural Sciences, University of Helsinki, Helsinki, Finland
  • Kimmo Alho,

    1. Division of Cognitive Psychology and Neuropsychology, Institute of Behavioural Sciences, University of Helsinki, Helsinki, Finland
    2. Helsinki Collegium for Advanced Studies, University of Helsinki, Helsinki, Finland
  • Rika Takegata,

    1. Cognitive Brain Research Unit, Cognitive Science, Institute of Behavioural Sciences, University of Helsinki, Helsinki, Finland
  • Tommi Makkonen,

    1. Cognitive Brain Research Unit, Cognitive Science, Institute of Behavioural Sciences, University of Helsinki, Helsinki, Finland
    2. Finnish Centre of Excellence in Interdisciplinary Music Research, University of Jyväskylä, Jyväskylä, Finland
  • Teija Kujala

    1. Cognitive Brain Research Unit, Cognitive Science, Institute of Behavioural Sciences, University of Helsinki, Helsinki, Finland
    2. Cicero Learning Network, University of Helsinki, Helsinki, Finland

  • The present study was supported by the National Doctoral Programme of Psychology (University of Turku), the Academy of Finland (Grant 128840), and the Research Funds of the University of Helsinki. We wish to thank Jari Lipsanen, Saila Seppänen, and Valtteri Wikström for their assistance during the various stages of this project, as well as the two anonymous reviewers for their valuable contributions.

Address correspondence to: Maria Mittag, Cognitive Brain Research Unit, PO Box 9, (Siltavuorenpenger 1B), 00014 University of Helsinki, Finland. E-mail: maria.mittag@helsinki.fi

Abstract

We studied attention effects on the integration of written and spoken syllables in fluent adult readers using event-related brain potentials. Auditory consonant-vowel syllables, including consonant and frequency changes, were presented in synchrony with written syllables or their scrambled images. Participants responded to longer-duration auditory targets (auditory attention), longer-duration visual targets (visual attention), longer-duration auditory and visual targets (audiovisual attention), or counted backwards mentally. Spoken consonant changes elicited larger negative responses when accompanied by written syllables than when accompanied by their scrambled images. This effect occurred at an early latency (∼140 ms) during audiovisual attention and later (∼200 ms) during visual attention. Thus, audiovisual attention boosts the integration of speech sounds and letters.
