Giving speech a hand: Gesture modulates activity in auditory cortex during speech perception

Authors

  • Amy L. Hubbard (corresponding author)
    1. Ahmanson-Lovelace Brain Mapping Center, University of California, Los Angeles, California
    2. Department of Applied Linguistics, University of California, Los Angeles, California
    3. Computational Neuroscience Laboratories, ATR Kyoto, Japan
    Correspondence: UCLA Department of Applied Linguistics, Ahmanson-Lovelace Brain Mapping Center, 660 Charles E. Young Drive South, Los Angeles, CA 90095-7085
  • Stephen M. Wilson
    1. Ahmanson-Lovelace Brain Mapping Center, University of California, Los Angeles, California
  • Daniel E. Callan
    1. Computational Neuroscience Laboratories, ATR Kyoto, Japan
    2. National Institute of Information and Communications Technology, ATR Kyoto, Japan
  • Mirella Dapretto
    1. Ahmanson-Lovelace Brain Mapping Center, University of California, Los Angeles, California
    2. Department of Psychiatry and Biobehavioral Sciences, University of California, Los Angeles, California

Abstract

Viewing hand gestures during face-to-face communication affects speech perception and comprehension. Despite the visible role played by gesture in social interactions, relatively little is known about how the brain integrates hand gestures with co-occurring speech. Here we used functional magnetic resonance imaging (fMRI) and an ecologically valid paradigm to investigate how beat gesture—a fundamental type of hand gesture that marks speech prosody—might impact speech perception at the neural level. Subjects underwent fMRI while listening to spontaneously produced speech accompanied by beat gesture, nonsense hand movement, or a still body; as additional control conditions, subjects also viewed beat gesture, nonsense hand movement, or a still body, all presented without speech. Consistent with behavioral evidence that gesture affects speech perception, bilateral nonprimary auditory cortex showed greater activity when speech was accompanied by beat gesture than when speech was presented alone. Further, the left superior temporal gyrus/sulcus showed stronger activity when speech was accompanied by beat gesture than when speech was accompanied by nonsense hand movement. Finally, the right planum temporale was identified as a putative multisensory integration site for beat gesture and speech (i.e., activity in response to speech accompanied by beat gesture was greater than the summed responses to speech alone and beat gesture alone), indicating that this area may be pivotally involved in synthesizing the rhythmic aspects of both speech and gesture. Taken together, these findings suggest a common neural substrate for processing speech and gesture, likely reflecting their joint communicative role in social interactions. Hum Brain Mapp, 2009. © 2008 Wiley-Liss, Inc.