Keywords:

  • body posture;
  • body representation;
  • event-related potentials;
  • multisensory representation;
  • peripersonal space;
  • somatosensory evoked potentials

Abstract

We investigated the electrophysiological correlates of somatosensory processing under different arm postures by recording event-related potentials at frontal, central and centroparietal sites during tactile stimulation of the hands. Short series of 200 ms vibrotactile stimuli were presented to the palms of the participants' hands, one hand at a time, in either uncrossed- or crossed-hands postures. The manipulation of posture allowed us to investigate the electrophysiological processes underlying the spatial remapping of somatosensory stimuli from anatomical into external frames of reference. To examine somatosensory spatial remapping independently of its effects on attentional processes, the stimuli were presented unpredictably in terms of both location and temporal onset. We also examined how vision of the limbs affects the process of remapping. When participants had sight of their hands (Experiment 1), the effect of posture was observed over regions contralateral to the stimulated hand from 128 ms, whereas when their limbs were covered (Experiment 2), effects of posture influenced the ipsilateral regions from 150 ms. These findings add to an increasing body of evidence which indicates that sight of the hand modulates the way in which information in other modalities is processed. We argue that in this case, sight of the hand biases spatial encoding of touch towards an anatomical frame of reference.


Introduction

Localizing a touch on the body is a two-stage process, in which the stimulus is first localized on the body surface, and then mapped onto a corresponding location in external space by taking account of the layout of the limbs (Longo et al., 2010). Changes in body posture have an impact on this process as, when our limbs move, the relationship between tactile and external space changes. To locate a tactile stimulus in external space, a remapping of somatosensory space according to current posture is required. Behavioural evidence has shown that, in a crossed-hands posture, a touch is mapped to a location in external space by, at the latest, 180 ms after stimulus application (Azañón & Soto-Faraco, 2008). However, despite considerable neuroscientific research on the processes underlying somatosensory spatial representation (e.g. Maravita et al., 2003; Graziano et al., 2004; Làdavas & Farnè, 2004; Spence et al., 2004), and evidence from transcranial magnetic stimulation studies for the causal role of posterior parietal cortex in remapping (Azañón et al., 2010), no research has yet examined the electrophysiological time course of remapping in the human brain.

Several researchers have used somatosensory evoked potentials (SEPs) to investigate how posture affects the processes involved in voluntarily attending to stimuli arising in peripersonal space (e.g. Eimer et al., 2003; Heed & Röder, 2010). These studies show that posture modulates effects of attention early in processing, around 100–140 ms after somatosensory stimulation. However, the extent to which these studies tell us about how representations of somatosensory space per se are remapped (as opposed to voluntary attention to somatosensory locations) is uncertain. It is possible that the processing of touch occurs according to different neural spatial representational formats and time courses, depending on whether the touch is to be the target of an overt or covert orienting response. The current study investigates the neural processing of tactile stimuli with a specific goal of tracking the time course over which somatosensory processing is modulated by postural remapping. To exclude effects of expectation, and thus voluntary attention, we present tactile stimuli in a task-irrelevant and unpredictable fashion.

Both proprioceptive and visual signals concerning the limbs, alone or in combination, play important roles in postural remapping (see Medina & Coslett, 2010). Studies of multisensory neurons in primate premotor cortex have shown that cells remap their visual receptive fields according to the position of the arm given by proprioception alone, and also when posture is indicated by sight of a fake arm which conflicts with proprioception (Graziano, 1999). Imaging studies and behavioural data from intact and brain-damaged individuals have also indicated that human adults use both visual and proprioceptive cues to hand position in remapping tactile space (e.g. Làdavas, 2002; Lloyd et al., 2003; Azañón & Soto-Faraco, 2008). Nonetheless, it appears that visual cues to hand position exert a greater weight on remapping somatosensory space than does proprioceptive information (Graziano, 1999; Làdavas & Farnè, 2004).

Here, we report two event-related potential (ERP) experiments which investigate the time course of postural remapping of somatosensory space. Based on Azañón & Soto-Faraco (2008), remapping of touch to a location in external space was anticipated to occur after early processing stages (i.e. after primary somatosensory cortex) and therefore possibly affecting the N140 time-window. In both experiments, vibrotactile stimuli were presented individually and unpredictably to the palms of the participants' hands in quick succession, in uncrossed- and crossed-hands postures. To determine whether the onset of postural remapping differs according to the perceptual information about posture which is available, Experiment 1 provided participants with both visual and proprioceptive cues to posture, whereas Experiment 2 provided only proprioceptive cues to posture (the participants' arms and hands were obscured from view by a black cloth and a second table top) (see Fig. 1).

Figure 1. Diagram of the experimental set-up. Participants placed their hands on a table in front of them and were asked to uncross and cross their arms. In Experiment 2, visual information about hand and arm posture was eliminated by covering the limbs and placing the hands under a second table top.

Experiment 1

Method

Participants

Twelve adults (five males), aged between 20 and 40 years (mean 28 years), volunteered in Experiment 1. All the participants were right-handed, and had normal or corrected-to-normal vision by self-report. Informed consent was obtained from the participants. Ethical approval for both experiments was gained from the Research Ethics Committee of Goldsmiths, University of London, and the Research Ethics Committee of the Department of Psychological Sciences, Birkbeck, University of London. The studies conform to The Code of Ethics of the World Medical Association (Declaration of Helsinki; British Medical Journal, 18 July 1964).

Stimuli

Participants sat at a table within a dimly lit, acoustically and electrically shielded room. ERPs were recorded while vibrotactile stimuli were presented to the palms of their hands in quick succession. Vibrotactile stimulation was presented via bone-conducting hearing aids (‘Tactaids’; Audiological Engineering, Somerville, MA, USA), driven at 220 Hz by a sine wave generator and amplifier. Participants held these devices fully enclosed within their closed palms, which rendered the minimal sound they produced inaudible. Each trial consisted of six vibrotactile stimuli presented to one hand at a time in random order. Each stimulus was delivered to the palm of one hand for 200 ms, with interstimulus intervals varying randomly between 800 and 1400 ms. There were 40 trials per posture condition (uncrossed-hands and crossed-hands), i.e. 480 stimuli in total.
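
The trial structure described above can be sketched as follows; this is a minimal illustration only, and the function name and parameter names are ours, not part of the original experimental software:

```python
import random

def make_schedule(n_trials_per_posture=40, stimuli_per_trial=6,
                  isi_range_ms=(800, 1400), duration_ms=200, seed=1):
    """Generate the trial list: postures alternate after each trial,
    and each trial is a train of six 200-ms vibrations delivered to
    one hand at a time in random order, with interstimulus intervals
    drawn uniformly from 800-1400 ms."""
    rng = random.Random(seed)
    trials = []
    for i in range(2 * n_trials_per_posture):
        posture = 'uncrossed' if i % 2 == 0 else 'crossed'
        train = [{'hand': rng.choice(['left', 'right']),
                  'duration_ms': duration_ms,
                  'isi_ms': rng.randint(*isi_range_ms)}
                 for _ in range(stimuli_per_trial)]
        trials.append({'posture': posture, 'train': train})
    return trials
```

With the default parameters this yields 80 trials and 480 stimuli in total, matching the counts reported above.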

Procedure

The participants were asked to hold the tactile stimulators in their palms and to keep their hands closed, palms down, throughout the experimental session. They were also asked to gaze at a fixation cross straight ahead, avoiding eye movements, and to blink as little as possible. Their hands were placed on a table in front of them, with the distance between the ring fingers of the two hands kept constant at 30 cm (Fig. 1). Throughout the experiment, participants alternately crossed or uncrossed their arms after each trial (each trial consisted of a train of six stimuli; see above). Half of the participants crossed the midline by moving the right hand over the left, and the other half by moving the left hand over the right. Two video cameras were mounted in the experimental room to monitor the participants' eye-blinks and eye movements, and to ensure that they adopted and maintained the instructed hand positions.

EEG recording and analysis

Brain electrical activity was recorded continuously by using a Hydrocel Geodesic Sensor Net, consisting of 128 silver–silver chloride electrodes evenly distributed across the scalp (Fig. 2). The vertex served as the reference. The electrical potential was amplified with 0.1–100 Hz band-pass, digitized at a 500 Hz sampling rate, and stored on a computer disk for offline analysis. The data were analysed using NetStation 4.2 analysis software (Electrical Geodesics Inc., Eugene, OR, USA). Continuous EEG data were low-pass filtered at 30 Hz using digital elliptical filtering, and segmented in epochs from 100 ms before until 700 ms after stimulus onset. Segments with eye-movements and blinks were detected visually and rejected from further analysis. Artefact-free data were then baseline-corrected to the average amplitude of the 100 ms interval preceding stimulus onset, and re-referenced to the average potential over the scalp. Finally, individual and grand averages were calculated.
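
The offline pipeline described above (30 Hz elliptic low-pass, epoching from 100 ms before to 700 ms after stimulus onset, baseline correction, average reference) can be sketched in a few lines. This is a simplified illustration using SciPy rather than the NetStation software actually used, and the filter order and ripple settings are our own assumptions:

```python
import numpy as np
from scipy.signal import ellip, filtfilt

def preprocess(eeg, events, fs=500):
    """Sketch of the offline pipeline. `eeg` is a (channels, samples)
    array; `events` are stimulus-onset sample indices."""
    # 30 Hz digital elliptic low-pass (order/ripple are illustrative)
    b, a = ellip(4, 0.1, 40, 30 / (fs / 2))
    eeg = filtfilt(b, a, eeg, axis=1)
    # Epoch from -100 ms to +700 ms around stimulus onset
    pre, post = int(0.1 * fs), int(0.7 * fs)
    epochs = np.stack([eeg[:, e - pre:e + post] for e in events])
    # Baseline-correct to the mean of the 100 ms pre-stimulus interval
    epochs -= epochs[:, :, :pre].mean(axis=2, keepdims=True)
    # Re-reference to the average potential over the scalp
    epochs -= epochs.mean(axis=1, keepdims=True)
    return epochs  # (trials, channels, samples)
```

Artefact rejection (visual detection of blinks and eye movements) would be applied between epoching and averaging; it is omitted here because it was performed by hand.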

Figure 2. Hydrocel Geodesic Sensor Net. The frontal (F3/F4), central (C3/C4) and centroparietal (CP5/CP6) electrodes included in the analyses are highlighted. Solid lines indicate the frontal electrodes; dotted lines indicate the central electrodes; dashed lines indicate the centroparietal electrodes.

Statistical analyses of the ERP data focused on sites close to somatosensory areas (Frontal sites, F3 and F4: 20, 24, 28, 117, 118, 124; Central sites, C3 and C4: 35, 36, 41, 103, 104, 110; Centroparietal sites, CP5 and CP6: 47, 52, 53, 86, 92, 98; see Fig. 2; see, for example, Eimer & Forster, 2003). SEPs at these sites were observed to be the largest across both of the experiments and showed the typical pattern of somatosensory components in response to tactile stimuli (P45, N80, P100 and N140).

For each participant, we calculated the difference waveform between posture conditions for ERPs contralateral and ipsilateral to the stimulated hand. To establish the precise onset of the effects of remapping on somatosensory processing, a sample-point by sample-point analysis was carried out to determine whether the difference waveform deviated reliably from zero. Based on previous evidence suggesting that postural remapping is apparent in behaviour within 180 ms (Azañón & Soto-Faraco, 2008) we sampled across the first 200 ms following stimulus onset. This analysis corrected for the autocorrelation of consecutive sample-points by using a Monte Carlo simulation method based on Guthrie & Buchwald (1991). This method began by estimating the average first-order autocorrelation present in the real difference waveforms across the temporal window noted above. Next, 1000 datasets of randomly generated waveforms were simulated, each waveform having zero mean and unit variance at each time point, but having the same level of autocorrelation as seen on average in the observed data. Each simulated dataset also had the same number of participants and time-samples as in the real data. Two-tailed one-sample t-tests (vs. zero; α = 0.05, uncorrected) were applied to the simulated data at each simulated timepoint, recording significant vs. non-significant outcomes. In each of the 1000 simulations the longest sequence of consecutive significant t-test outcomes was computed. The 95th percentile of that simulated distribution of ‘longest sequence lengths’ was then used to determine a significant difference waveform in the real data; specifically, we noted any sequences of significant t-tests in our real data which exceeded this 95th percentile value. This method thus avoids the difficulties associated with multiple comparisons and preserves the type 1 error rate at 0.05 for each difference waveform analysed.
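
The run-length criterion described above can be sketched as follows. This is a simplified illustration of a Guthrie & Buchwald (1991)-style procedure, with the function names and the AR(1) noise generator as our own assumptions:

```python
import numpy as np
from scipy.stats import ttest_1samp

def longest_sig_run(waves, alpha=0.05):
    """Longest run of consecutive timepoints at which a two-tailed
    one-sample t-test (vs. zero) across participants is significant.
    `waves` is a (participants, timepoints) array of difference waves."""
    sig = ttest_1samp(waves, 0.0, axis=0).pvalue < alpha
    best = run = 0
    for s in sig:
        run = run + 1 if s else 0
        best = max(best, run)
    return best

def run_length_threshold(n_subj, n_time, phi, n_sim=1000, alpha=0.05, seed=0):
    """Simulate null difference waveforms with first-order
    autocorrelation `phi` (zero mean, unit variance per timepoint) and
    return the 95th percentile of the longest-significant-run
    distribution: runs in the real data longer than this are deemed
    reliable."""
    rng = np.random.default_rng(seed)
    runs = np.empty(n_sim, dtype=int)
    for i in range(n_sim):
        w = np.empty((n_subj, n_time))
        w[:, 0] = rng.standard_normal(n_subj)
        for t in range(1, n_time):
            # AR(1) process: stationary with unit variance
            w[:, t] = phi * w[:, t - 1] + np.sqrt(1 - phi**2) * rng.standard_normal(n_subj)
        runs[i] = longest_sig_run(w, alpha)
    return int(np.percentile(runs, 95))
```

The key design choice is that autocorrelation is matched to the observed data: the higher `phi` is, the longer chance runs of "significant" timepoints become, so the required run length grows accordingly.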

In addition to this sample-point analysis, ERP mean amplitudes were computed within time-windows around early somatosensory ERP components. The latencies of peak amplitudes were determined for each individual participant by visual inspection, and time windows were then chosen to include the temporal spread of peaks across participants. This resulted in the following windows for analysis: P45 (45–65 ms), N80 (65–105 ms), P100 (105–130 ms) and N140 (130–180 ms). Mean amplitudes were also computed for the time-window between 180 and 400 ms to investigate longer-latency effects. The mean amplitudes were explored with a 3 × 2 × 2 repeated-measures anova for the factors: (i) Electrode Site (C3/C4 vs. F3/F4 vs. CP5/CP6), (ii) Hemisphere (ipsilateral vs. contralateral hemisphere to the stimulated hand) and (iii) Posture (uncrossed vs. crossed). In our analyses, we focused on the comparison between crossed and uncrossed postures and the hemispheric distribution of this effect, as expressed by a Hemisphere by Posture interaction. Planned comparisons (with a Bonferroni correction) between uncrossed- and crossed-hands were performed separately for the contralateral and ipsilateral hemispheres to explore the effects of Posture.
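
The component windows and the mean-amplitude measure can be expressed compactly. This is a minimal sketch assuming a 500 Hz sampling rate and epochs beginning 100 ms before stimulus onset; the helper name is ours:

```python
import numpy as np

# Component time-windows (ms post-stimulus) as defined in the text
WINDOWS = {"P45": (45, 65), "N80": (65, 105), "P100": (105, 130),
           "N140": (130, 180), "late": (180, 400)}

def mean_amplitude(erp, window_ms, fs=500, t0_ms=-100):
    """Mean amplitude of a 1-D ERP waveform within a post-stimulus
    window; t0_ms is the epoch start relative to stimulus onset."""
    lo = int((window_ms[0] - t0_ms) * fs / 1000)
    hi = int((window_ms[1] - t0_ms) * fs / 1000)
    return float(erp[lo:hi].mean())
```

These per-condition mean amplitudes are the dependent measures that enter the Electrode Site × Hemisphere × Posture repeated-measures ANOVA described above.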

Results

Figure 3 shows the grand average of the SEPs obtained in Experiment 1 (in which participants had sight of their hands) for frontal, central and centroparietal sites (contralateral and ipsilateral to the stimulated hand). Figure 4 presents the grand average collapsed across frontal, central and centroparietal sites (contralateral and ipsilateral to the stimulated hand) together with a difference waveform obtained by subtracting the SEP waveform in the uncrossed-hands posture from that in the crossed-hands posture. Sample-point by sample-point analysis was carried out on the data for the first 200 ms following stimulus onset. The vertical dashed line in Fig. 4 indicates the onset of the intervals during which the difference waves deviate significantly from zero, and thus reveals the onset of statistically reliable effects of posture on somatosensory processing. At contralateral sites, significant effects of Posture (all p < 0.05, uncorrected) were observed from 128 to 166 ms (a sequence of consecutive significant t-tests over 36 ms in length was deemed significant by our Monte Carlo simulation). At ipsilateral sites, Posture effects were not found within the time-window selected. Thus, the earliest influence of postural remapping on somatosensory processing appears to occur at 128 ms over the contralateral hemisphere. The mean first-order autocorrelation at lag 1 (estimated from our data, and used for our Monte Carlo simulations) was 0.98 for the contralateral and 0.98 for the ipsilateral dataset.

Figure 3. Grand average SEPs at frontal, central and centroparietal sites for Experiment 1 (left panel) in which participants had sight of their hands, and Experiment 2 (right panel) in which participants' hands were covered. Solid lines indicate the condition in which the hands were in a crossed-hands posture, and dashed lines indicate the condition in which the hands were uncrossed.

Figure 4. Grand average SEPs pooled across the frontal, central and centroparietal clusters of electrodes included in the analyses from contralateral and ipsilateral sites to the stimulated hand obtained in both Experiment 1 (in which participants had sight of their hands) and Experiment 2 (in which participants had no sight of their hands). SEPs are compared across uncrossed-hands and crossed-hands posture conditions. Alongside the SEPs, a difference waveform, obtained by subtracting the SEP waveform in uncrossed-hands posture from that in crossed-hands posture is shown (UnX-X). The vertical dashed lines indicate the onset of statistically reliable effects of posture on somatosensory processing, i.e. at 128 and 150 ms for Experiments 1 and 2, respectively.

Statistical analyses of the mean amplitudes are compatible with these observations. In the P45 time-window, the overall analyses including Electrode Site, Hemisphere and Posture showed main effects of Electrode Site (F(2,22) = 33.964, p < 0.01) and Hemisphere (F(1,11) = 30.047, p < 0.01). An interaction of Electrode Site × Hemisphere was also found (F(2,22) = 50.254, p < 0.01).

In the N80 time-window, a main effect of Electrode Site was obtained (F(2,22) = 50.352, p < 0.01), together with an interaction of Electrode Site × Hemisphere (F(2,22) = 18.902, p < 0.01).

Main effects of Electrode Site (F(2,22) = 32.807, p < 0.01) and Hemisphere (F(1,11) = 25.231, p < 0.01), and an interaction of Electrode Site × Hemisphere (F(2,22) = 4.689, p = 0.02), were also found in the P100 time-window.

In the N140 time-window, main effects of Electrode Site (F(2,22) = 31.764, p < 0.01) and Hemisphere (F(1,11) = 43.445, p < 0.01) were obtained. The first effect of Posture also emerged at the N140 (F(1,11) = 8.682, p = 0.013), whereby crossing the arms enhanced the N140 amplitude (uncrossed: M = −0.64 μV; crossed: M = −0.79 μV). An interaction of Electrode Site × Hemisphere (F(2,22) = 6.809, p < 0.01), and a marginal interaction of Posture × Hemisphere (F(1,11) = 4.263, p = 0.06), were also observed at the N140. Planned comparisons (Bonferroni-corrected, α = 0.025) showed that the contralateral N140 was enhanced in the crossed-hands posture compared with the uncrossed-hands posture (t(11) = 2.791, p = 0.018; crossed: M = −1.1 μV; uncrossed: M = −0.85 μV). This effect was not found for the ipsilateral N140 (t(11) = 0.596, n.s.). The more contralateral distribution of the crossing effect can also be seen in Fig. 5, which shows the topographical maps of the voltage distribution over the scalp.

Figure 5. Topographical representations of the voltage distribution over the scalp averaged for the sites contralateral and ipsilateral to the stimulated hand in both Experiment 1 and Experiment 2. The difference wave maps represent the posture effect at the 140–160 ms time-window. Positive voltage values are plotted in red, and negative voltage values are plotted in blue.

In the time-window between 180 and 400 ms post-stimulus, the anova computed to investigate longer-latency effects showed main effects of Hemisphere (F(1,11) = 7.585, p = 0.019; contralateral: M = 0.12 μV; ipsilateral: M = −0.09 μV) and Posture (F(1,11) = 9.462, p = 0.011; uncrossed: M = 0.09 μV; crossed: M = −0.06 μV). An interaction of Electrode Site × Hemisphere was also obtained (F(2,22) = 6.809, p < 0.01).

Discussion

The participants in Experiment 1 were presented with tactile stimuli to their hands across blocks in which they were asked to adopt either crossed-hands or uncrossed-hands postures. Analyses of SEPs recorded from central, centroparietal and frontal sites indicated that posture affected somatosensory processing from 128 ms over the contralateral hemisphere. Posture effects were not observed over the ipsilateral hemisphere. Effects of posture on specifically contralateral somatosensory activity were also identified in Lloyd et al.'s (2003) functional magnetic resonance imaging (fMRI) investigation of limb position representations. Interestingly, Lloyd and her colleagues found that posture-related somatosensory activity shifted to ipsilateral regions when participants had their eyes closed. They interpreted this hemispheric shift as suggesting that whereas proprioceptive cues to hand position are sufficient to permit remapping of tactile stimuli to external coordinates (i.e. coordinates in a frame of reference which is not fixed with respect to anatomical or somatotopic locations), visual cues about the hand bias participants to encode tactile stimuli with respect to an anatomical frame of reference. In Experiment 2, we covered participants' hands during tactile stimulation and examined whether a similar hemispheric shift in posture effects on somatosensory processing from contralateral to ipsilateral sites can also be observed in SEPs.

Experiment 2

Method

Participants

Twelve adults (five males), aged between 21 and 31 years (mean 26 years), volunteered in Experiment 2 (in which participants had no sight of their hands). None had participated in Experiment 1. All of the participants were right-handed, and had normal or corrected-to-normal vision by self-report. Informed consent was obtained from the participants.

Procedure

The stimuli and procedure were the same as in Experiment 1. The only difference was that, in this experiment, visual information about the hands, the arms and their postures was eliminated by placing a second table-top over the participants' hands. In addition, the upper arms were covered by a black cloth that was attached to the second table-top (see Fig. 1).

EEG recording and analysis

The same electrode sites were used as in Experiment 1. As in Experiment 1, we calculated a difference waveform between posture conditions for ERPs contralateral and ipsilateral to the stimulated hand, and employed a Monte Carlo simulation method to establish the precise onset (across successive sample points) of the effects of remapping on somatosensory processing. ERP mean amplitudes were again computed within successive time-windows. As in Experiment 1, the latencies of individual participants' peak amplitudes were determined and used to define the appropriate component time windows. These were 45–65 ms for the P45 and 65–105 ms for the N80. In this experiment, no separate component peaks could be distinguished for the P100 and N140. Therefore, a time-window between 105 and 180 ms was chosen to capture this ‘P100–N140 complex’. Again, mean amplitudes were also computed for the time-window between 180 and 400 ms to investigate longer-latency effects.

In our analyses of the ERP mean amplitudes, we again focused on the comparison between crossed and uncrossed postures and on the hemispheric distribution of this effect, as expressed by a Hemisphere × Posture interaction. The analytical plan used in Experiment 1 could not be applied directly in Experiment 2, because of an unpredicted three-way interaction between Hemisphere, Posture and Electrode Site on the P100–N140 complex. This interaction was therefore explored with three additional post hoc Bonferroni-corrected simple interaction analyses, one at each level of Electrode Site, to clarify how the distribution of the Hemisphere × Posture effect varied over the scalp. At electrode sites showing a significant simple Hemisphere × Posture interaction, further simple Posture effects analyses were performed (i.e. for each hemisphere separately at that electrode site).

Results

Figure 3 shows the grand average of the SEPs obtained in Experiment 2 (in which participants did not have sight of their hands) for frontal, central and centroparietal sites (contralateral and ipsilateral to the stimulated hand). Figure 4 presents the grand average collapsed across frontal, central and centroparietal sites (contralateral and ipsilateral to the stimulated hand) together with a difference waveform obtained by subtracting the SEP waveform in the uncrossed-hands posture from that in the crossed-hands posture. We again conducted a sample-point by sample-point analysis for the first 200 ms after stimulus onset. The vertical dashed line in Figure 4 indicates the onset of the intervals during which the difference waves deviate significantly from zero, and thus reveals the onset of statistically reliable effects of posture on somatosensory processing (p < 0.05). At ipsilateral sites this effect started at 150 ms and was observed until the end of the interval tested, i.e. 200 ms (a sequence of consecutive significant t-tests over 34 ms in length was deemed significant by our Monte Carlo simulation). No effects were observed for the contralateral difference waveform. The mean first-order autocorrelation at lag 1 (estimated in our data, and used for our Monte Carlo simulations) was 0.99 for the contralateral dataset and 0.98 for the ipsilateral dataset.

Again, these findings are compatible with the results of an analysis of mean amplitudes which were entered into a 3 × 2 × 2 repeated-measures anova for the factors: (i) Electrode Site (C3/C4 vs. F3/F4 vs. CP5/CP6), (ii) Hemisphere (ipsilateral vs. contralateral hemisphere to the stimulated hand) and (iii) Posture (uncrossed vs. crossed).

For the P45 time-window, main effects of Electrode Site (F(2,22) = 100.042, p < 0.01) and Hemisphere (F(1,11) = 31.582, p < 0.01) were obtained. An interaction of Electrode Site × Hemisphere was also found (F(2,22) = 72.794, p < 0.01).

The N80 time-window showed a main effect of Electrode Site (F(2,22) = 18.874, p < 0.01) and an interaction of Electrode Site × Hemisphere (F(2,22) = 21.264, p < 0.01).

For the P100–N140 complex, a main effect of Electrode Site (F(2,22) = 38.613, p < 0.01) and an interaction of Electrode Site × Hemisphere (F(2,22) = 5.649, p = 0.030) were obtained. The P100–N140 complex was also modulated by a three-way interaction of Electrode Site × Hemisphere × Posture (F(2,22) = 8.263, p < 0.01). To explore the distribution of the Posture effects over the scalp, this unpredicted three-way interaction was unpacked post hoc by testing the simple Hemisphere × Posture interactions within each of the Electrode Site levels (tested against a Bonferroni-corrected alpha level of 0.017). These revealed a significant interaction of Hemisphere × Posture at the frontal sites only (F(1,11) = 11.230, p < 0.01). Further simple Posture effects analyses (on the data from frontal sites only; between uncrossed- and crossed-hands posture conditions) were performed separately for the contralateral and ipsilateral hemispheres. These showed that the frontal P100–N140 complex at ipsilateral sites was enhanced for the crossed compared with the uncrossed posture (t(11) = 2.859, p = 0.016 uncorrected; crossed: M = −1.5 μV; uncrossed: M = −1.3 μV) (Fig. 5). There was a weaker effect in the opposite direction for the contralateral P100–N140 complex (t(11) = −1.894, p = 0.085 uncorrected; crossed: M = −1.6 μV; uncrossed: M = −1.9 μV). Given that this component analysis indicated that the effect of posture in this experiment was only evident at frontal sites, we re-ran the sample-point by sample-point analysis at frontal sites alone to obtain a better estimate of the onset of posture effects. This analysis confirmed that the effect of posture in the ipsilateral hemisphere started at 156 ms and was observed until the end of the interval tested (a sequence of consecutive significant t-tests over 36 ms in length was deemed significant by our Monte Carlo simulation), while no effects were observed for the contralateral difference waveform.
The mean first-order autocorrelation at lag 1 (estimated in our data, and used for our Monte Carlo simulations) was 0.98 for the ipsilateral dataset and 0.99 for the contralateral dataset.

In the time-window between 180 and 400 ms post-stimulus, a main effect of Posture was obtained (F(1,11) = 11.243, p = 0.006), indicating that the deflection was more positive for the uncrossed (M = 0.44 μV) than for the crossed (M = 0.28 μV) posture. An interaction of Electrode Site × Hemisphere (F(2,22) = 5.280, p = 0.013) was also found.

Comparing the lateralization of posture effects across experiments

The analyses reported in Experiments 1 and 2 indicate that posture effects occurred in different hemispheres according to whether participants had sight of their hands. When the participants' hands were hidden, posture effects shifted from the contralateral hemisphere (Exp. 1; sight of hands) to the ipsilateral hemisphere (Exp. 2; no sight of hands). Differences in the waveforms observed across the two experiments make it difficult to investigate this interaction via component-based comparisons (in Experiment 1, P100 and N140 components were separate, whereas in Experiment 2 they were fused). Therefore, we continued to use a sample-point-based approach to examining the interaction of Posture × Hemisphere × Experiment. To do this, we calculated the contrast waveforms representing the Posture × Hemisphere × Experiment interaction for each sample-point and participant. Again, following Guthrie & Buchwald (1991), we carried out further Monte Carlo simulations to test this interaction. The only change to the method which we used for examining the posture effects within each separate experiment was that the analysis was now based on independent-samples t-tests which compared the Posture (Uncrossed-hands ‘UnX’ and Crossed-hands ‘X’) × Hemisphere (Contralateral ‘Con’ and Ipsilateral ‘Ipsi’) contrast waveforms observed in Experiment 1 vs. Experiment 2; such t-tests equate to the three-way interaction between Experiment, Posture and Hemisphere. Figure 6 shows the time course of this three-way interaction according to the specific subtractive contrast:

  • ([Exp1[Con[UnX − X]]] − [Exp1[Ipsi[UnX − X]]]) − ([Exp2[Con[UnX − X]]] − [Exp2[Ipsi[UnX − X]]])

Figure 6. The time course of the three-way interaction of Posture, Hemisphere and Experiment, which was calculated in three steps. First, the difference between posture conditions was calculated separately for each hemisphere and experiment (Uncrossed-hands − Crossed-hands; [UnX − X]). These data were then used to calculate the difference waveforms between contralateral and ipsilateral hemispheres for each experiment ([Con[UnX − X]] − [Ipsi[UnX − X]]). The two resulting waveforms were then subtracted to produce the waveform for the three-way interaction: ([Exp1[Con[UnX − X]]] − [Exp1[Ipsi[UnX − X]]]) − ([Exp2[Con[UnX − X]]] − [Exp2[Ipsi[UnX − X]]]). The vertical dashed line indicates the onset of the interval during which the difference waves deviate significantly from zero, i.e. at 152 ms.

Positive values of this contrast occur when posture effects are relatively more contralaterally distributed in Experiment 1 and relatively more ipsilaterally distributed in Experiment 2. The vertical dashed line in Fig. 6 shows the onset of the significant interval. Thus, a significant effect of sight of the limbs (the variable manipulated between the two experiments) on the laterality of postural remapping started at 152 ms and was observed until the end of the interval tested, i.e. 200 ms (a sequence of consecutive significant t-tests, all p < 0.05, over 38 ms in length was deemed significant by our Monte Carlo simulation). The mean first-order autocorrelation at lag 1 (estimated in our data, and used for our Monte Carlo simulations) was 0.97 for this analysis.

This interaction reflects the different hemispheric distribution of postural effects observed in the two experiments reported above, and confirms that when participants have sight of the hands the first effect of posture on the SEP is observed over contralateral sites (Exp. 1), whereas when participants do not have sight of their hands the first effect of posture is observed over ipsilateral sites (Exp. 2).
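
The per-participant contrast waveforms entering the independent-samples t-tests can be computed as follows; this is a minimal sketch with our own data layout (dictionaries of participant × sample arrays), not the authors' actual code:

```python
import numpy as np

def interaction_contrast(exp1, exp2):
    """Posture x Hemisphere x Experiment contrast. `exp1`/`exp2` are
    dicts keyed by ('contra'|'ipsi', 'uncrossed'|'crossed'), each
    holding a (participants, samples) array of SEPs. Returns, for each
    experiment, the per-participant [Con(UnX - X)] - [Ipsi(UnX - X)]
    waveforms; comparing the two sets with independent-samples t-tests
    at each sample point tests the three-way interaction."""
    def hemi_diff(exp):
        unx_x_con = exp[('contra', 'uncrossed')] - exp[('contra', 'crossed')]
        unx_x_ipsi = exp[('ipsi', 'uncrossed')] - exp[('ipsi', 'crossed')]
        return unx_x_con - unx_x_ipsi
    return hemi_diff(exp1), hemi_diff(exp2)
```

A purely contralateral posture effect in Experiment 1 together with a purely ipsilateral effect in Experiment 2 makes the first set of waveforms more positive than the second, which is exactly the pattern the reported interaction captured.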

General discussion

Keeping track of the layout of one's body and limbs is of central importance, not just to guide action, but also in making sense of the multisensory environment (see Holmes & Spence, 2004; Bremner et al., 2008). Without processes of remapping across changes in body posture (i.e. processes which take account of movements of the limbs, the head or even the eyes in their sockets; see Pöppel, 1973), we would be hard-pressed to comprehend the spatial correspondences between stimuli which arise from the same objects, but which arrive at the brain through different sensory channels. Given the central importance of processes of postural remapping in sensory spatial representation, it is crucial to determine how and when these processes occur in the brain. To address these questions, the current study investigated how changes in body posture modulate the electrophysiological time course of somatosensory spatial processing. We also examined how different forms of sensory information about body posture are used in the remapping process, comparing the effects on tactile spatial processing of proprioceptive cues to hand posture alone with those of proprioceptive and visual cues in combination.

In two experiments, which differed only in the availability to participants of visual information about their hands and their current posture, we recorded SEPs elicited by vibrotactile stimuli to the palms in uncrossed-hands and crossed-hands postures. Across both of these experiments, crossing the hands over the midline produced statistically reliable effects from 128 and 150 ms in Experiments 1 and 2, respectively, thus influencing primarily the SEPs in the N140 time window. The excellent temporal resolution of ERPs allows us to determine, with more certainty than behavioural paradigms offer (Azañón & Soto-Faraco, 2008; Overvliet et al., 2011), exactly when remapping processes begin.

Previous ERP investigations of somatosensory representation across changes in body posture have focused on the effects of posture on the modulation of ERPs by voluntary attention. In these studies participants are instructed to attend to one stimulus location and actively ignore somatosensory stimuli presented at other locations (e.g. Eimer et al., 2001, 2003; Heed & Röder, 2010; Eardley & Van Velzen, 2011). These studies have shown that modulations of SEP components by voluntary attention occur later and are reduced when the hands are crossed (Eimer et al., 2003; Heed & Röder, 2010; Eardley & Van Velzen, 2011), and this has typically been interpreted as reflecting a disturbance of processes of voluntary attention to a location on the body caused by conflicts between anatomical and external reference frames for locating tactile stimuli (see, for example, Eimer et al., 2003). Crucially, in our study, no instruction to focus attention on a particular hand was given, and the locations of the tactile stimuli were unpredictable. This enables us to demonstrate the electrophysiological onset of somatosensory remapping as it occurs independently of processes of voluntary spatial attention.

One previous study, by Heed & Röder (2010), has explored the effects of posture on processing of tactile stimuli which are not being attended to. In one part of this larger study Heed and Röder examined effects of posture and attention on ERPs elicited by stimuli to the hands. Examining trials in which participants were explicitly instructed to focus attention on one hand and to ignore stimuli presented on the unattended hand, Heed and Röder observed a reduction of early ERP amplitudes in response to stimuli presented to the unattended hand when the hands were crossed. However, voluntary attention is still very much at play in these effects; the participants were asked to direct their attention to the hand on which the stimulus was not being presented. Indeed, the authors interpreted the effect of posture in this particular condition as being due to voluntary attention being directed (in the crossed-hands posture) towards the location in which the attended tactile stimulus would have occurred had the hands been in the more familiar uncrossed posture.

It is interesting that, despite the differences between the instructions given in previous studies (to attend to or away from the stimulated hand) and the studies reported here (no instruction to attend given), the observed time courses of postural effects are quite comparable (onsets at about 130 ms). This might suggest that manipulations of voluntary attention do little to speed the process of remapping somatosensory stimuli from anatomical to external spatial coordinates. This possibility is certainly consistent with accounts of somatosensory processing which have characterized the early anatomically based stages of processing as automatic and unconscious (Kitazawa, 2002; Azañón & Soto-Faraco, 2008).

In the study reported here we compared somatosensory processing under conditions in which information about arm posture was provided either by both visual and proprioceptive cues in combination (Exp. 1) or by proprioceptive cues only (Exp. 2). Despite one morphological difference of note (the P100 and N140 components, which were clearly dissociable in Experiment 1, could not be separately distinguished in Experiment 2), the SEPs which we observed were largely similar between the two conditions. The effects of posture were observed within 25 ms of one another across the two experiments (128 ms in Exp. 1 and 150 ms in Exp. 2). The fact that postural effects can be observed under both of these conditions is consistent with the finding that neurons in primate premotor cortex will remap multisensory correspondences between touch and vision on the basis of visual and proprioceptive cues to posture both together and in isolation (e.g. Graziano, 1999).

However, the hemispheric distribution of the modulation of the SEPs by posture varied between experiments. When participants had sight of their hands as well as signals from proprioception (Exp. 1), an enhancement of the amplitude of the N140 when the hands were across the midline was observed over the contralateral but not the ipsilateral hemisphere. This effect reversed when the participants' limbs were covered (Exp. 2), with crossed-hands leading to an enhanced N140 recorded over the ipsilateral sites. Because of the differences between the time-windows which we used to compare the N140 across experiments (see above), we examined the Posture × Hemisphere × Experiment interaction with a sample-point by sample-point analysis using a Monte Carlo simulation method (based on Guthrie & Buchwald, 1991). This confirmed that hemispheric variation in posture effects according to the availability of vision of the hand occurred around the N140 component (from 152 ms). This hemispheric variation in posture effects coincides with some prior findings from an fMRI study by Lloyd et al. (2003).

Lloyd et al. (2003) demonstrated that a network of areas involved in the encoding of the hand across the midline (including parietal and premotor regions) switches hemispheres according to whether visual information about arm posture is available; brain activity associated with postural remapping was observed in the contralateral (i.e. left hemisphere) parietal and premotor areas when participants kept their eyes open, but ipsilateral (right) parietal areas when the eyes were closed. Our findings converge with these in suggesting that the neural activity associated with the location of the hand in a crossed-hands posture (i.e. the activity associated with an effect of posture) may switch hemispheres according to the sensory information available about the hand.

Why might visual information about hand posture lead to effects of posture being represented differently across hemispheres? Lloyd et al. (2003), on the basis of their fMRI findings, provide one explanation. They interpret posture effects in the BOLD (blood oxygen level-dependent) response to tactile stimuli as the neural representation of hand position, and argue that with only proprioceptive information about posture, the brain favours coding the hand with respect to an external spatial frame of reference. They suggest that when visual cues are made available in addition this strengthens the brain's use of an anatomical frame of reference.

On the surface, this interpretation may seem at odds with the findings of Röder et al. (2004), who report a study showing that use of an external frame of reference for localizing touch is dependent on visual experience in early life. They showed that sighted and late blind individuals are more affected by crossing their hands than congenitally blind individuals. However, it is important to draw a distinction between effects of current visual information on spatial coding, and effects of prolonged visual experience on spatial coding. Here we manipulate current visual information, and would argue that there is no conflict between: (i) current visual information leading to a greater weighting of an anatomical code in representations of hand position, and (ii) prolonged visual experience leading to an ability to locate a tactile stimulus in external spatial coordinates. It is also important to note that we are not arguing that in our study participants did not invoke an external reference frame for locating tactile stimuli when they had vision of their hands; indeed, they showed effects of posture both when they could (Exp. 1) and could not see their hands (Exp. 2). Rather, we interpret our results as showing that, irrespective of the spatial code for locating touch, the representation of hand position which mediated tactile localization was weighted more towards an anatomical than an external reference frame. In that sense our findings are consistent with arguments that visual cues to the hand enhance an external code for tactile localization (Röder et al., 2004; Azañón & Soto-Faraco, 2007).

But why should the brain vary in the frames of reference it uses to encode the hand's location? Furthermore, why should it prefer to represent the location of the hand in anatomical coordinates when additional visual information is provided? We argue that the answers to these questions are to be found in a consideration of the role of haptic perception. The hand and its tactile receptors can function to locate objects and stimuli with respect both to the bodily location on which the stimulus impinges and to locations in external space (see Martin, 1995). It is possible that varying the kinds of information available concerning the body and external space might bias the brain towards or away from encoding touch with respect to one or another of these frames of reference. The richer and more reliable cues to the body which we receive when we look at it might bias processing of, or attract attention towards, the intrinsic spatial reference frames which play a role in representing location on the body surface. Thus, when the hands are visible, as well as felt through proprioception, their location, and the locations of the tactile stimuli upon them, may be more likely to be encoded with respect to anatomical coordinates. In line with this suggestion, recent research shows that vision of the hand modulates somatosensory processing (Forster & Eimer, 2005; Sambo et al., 2009; Longo et al., 2011) and also improves tactile acuity with respect to the body surface (Kennett et al., 2001; Fiorio & Haggard, 2005; Cardini et al., 2011).

Thus, we suggest that in our study, hand position (posture) effects were observed ipsilaterally in Experiment 2 (no sight of hands), because there were fewer cues to the anatomical location of the hands and to the tactile stimuli applied to them in this condition (i.e. just proprioceptive cues). When visual and proprioceptive cues were provided, this may have given more weight to an anatomical frame of reference, leading to hand position being encoded anatomically (i.e. via contralateral pathways).

Summary and conclusion

The current experiments are the first to demonstrate the electrophysiological time course of somatosensory spatial remapping in the absence of manipulations of voluntary attention. The data reported here suggest that the process of remapping tactile locations according to the current posture of the limbs occurs from around 128 to 150 ms after stimulus onset (affecting primarily the somatosensory N140 component). Vision of the limbs plays an important role in the way that the brain processes posture: sight of the limbs modulated the hemispheric distribution of activity associated with processing changes in limb posture. When there was no vision of the limbs, somatosensory remapping processes (postural effects on the N140) were observed over ipsilateral sites, but when participants could see their hands these processes appeared over contralateral sites. This provides a striking demonstration of how vision of the hand modulates not only tactile processing and acuity, but also the spatial frame of reference within which the positions of the hands and of tactile stimuli are encoded (i.e. the extent to which they are encoded with respect to the external environment or the anatomical frame of reference provided by the body).

Acknowledgements

This research was supported by an award from the European Research Council under the European Community's Seventh Framework Programme (FP7/2007-2013) (ERC Grant agreement no. 241242) to A.J.B. We acknowledge the kind assistance of the Centre for Brain and Cognitive Development, Birkbeck College, and Leslie Tucker in facilitating this research. We also extend our thanks to Elisa Carrus for her assistance in preparing Fig. 5.

Abbreviations

ERPs, event-related potentials; fMRI, functional magnetic resonance imaging; SEPs, somatosensory evoked potentials.

References
