Keywords:

  • sign language;
  • syntax;
  • fMRI;
  • deaf;
  • anterior temporal lobe;
  • superior temporal gyrus

Abstract

Studies of spoken and written language suggest that the perception of sentences engages the left anterior and posterior temporal cortex and the left inferior frontal gyrus to a greater extent than nonsententially structured material, such as word lists. This study sought to determine whether the same is true when the language is gestural and perceived visually. Regional neural activity was measured using functional MRI while Deaf and hearing native signers of British Sign Language (BSL) detected semantic anomalies in well-formed BSL sentences and when they detected nonsense signs in lists of unconnected BSL signs. Processing BSL sentences, when contrasted with signed lists, was reliably associated with greater activation in the posterior portions of the left middle and superior temporal gyri and in the left inferior frontal cortex, but not in the anterior temporal cortex, which was activated to a similar extent whether lists or sentences were processed. Further support for the specificity of these areas for processing the linguistic—rather than visuospatial—features of signed sentences came from a contrast of hearing native signers and hearing sign-naïve participants. Hearing signers recruited the left posterior temporal and inferior frontal regions during BSL sentence processing to a greater extent than hearing nonsigners. These data suggest that these left perisylvian regions are differentially associated with sentence processing, whatever the modality of the linguistic input. Hum Brain Mapp, 2005. © 2005 Wiley-Liss, Inc.