The precise mechanisms by which speech may have developed remain largely unknown. Gestures have proven a powerful concept for explaining how the planning and analysing of motor acts could have evolved into verbal communication. According to this concept, the development of an action-perception network allowed for the coding and decoding of communicative gestures. These were initially manual or combined manual/articulatory and then became increasingly elaborate in the articulatory mode. The theory predicts that listening to the ‘gestures’ that compose spoken language should activate an extended articulatory and manual action-perception network. To test this hypothesis, we assessed the effects of language tasks on the cortical excitability of the hand muscle representation using transcranial magnetic stimulation. We found the hand motor system to be activated by linguistic tasks, most notably by pure linguistic perception, but not by auditory or visuospatial processing. The degree of motor system activation was comparable in the two hemispheres. Our data support the theory that language may have evolved within a general and bilateral action-perception network.