Are surface properties integrated into visuohaptic object representations?

Authors

  • Simon Lacey,

    1. Department of Neurology, Emory University, WMB-6000, 101 Woodruff Circle, Atlanta, GA 30322, USA
  • Jenelle Hall,

    1. Department of Neurology, Emory University, WMB-6000, 101 Woodruff Circle, Atlanta, GA 30322, USA
  • K. Sathian

    1. Department of Neurology, Emory University, WMB-6000, 101 Woodruff Circle, Atlanta, GA 30322, USA
    2. Department of Rehabilitation Medicine, Emory University, WMB-6000, 101 Woodruff Circle, Atlanta, GA 30322, USA
    3. Department of Psychology, Emory University, WMB-6000, 101 Woodruff Circle, Atlanta, GA 30322, USA
    4. Rehabilitation R&D Center of Excellence, Atlanta VAMC, Decatur, GA, USA

Correspondence: Dr S. Lacey, Department of Neurology, as above.
E-mail: slacey@emory.edu

Abstract

Object recognition studies have almost exclusively involved vision, focusing on shape rather than surface properties such as color. Visual object representations are thought to integrate shape and color information because changing the color of studied objects impairs their subsequent recognition. However, little is known about integration of surface properties into visuohaptic multisensory representations. Here, participants studied objects with distinct patterns of surface properties (color in Experiment 1, texture in Experiments 2 and 3) and had to discriminate between object shapes when color or texture schemes were altered in within-modal (visual and haptic) and cross-modal (visual study followed by haptic test and vice versa) conditions. In Experiment 1, color changes impaired within-modal visual recognition but had no effect on cross-modal recognition, suggesting that the multisensory representation was not influenced by modality-specific surface properties. In Experiment 2, texture changes impaired recognition in all conditions, suggesting that both unisensory and multisensory representations integrated modality-independent surface properties. However, the cross-modal impairment might have reflected either the texture change or a failure to form the multisensory representation. Experiment 3 attempted to distinguish between these possibilities by combining changes in texture with changes in orientation, taking advantage of the known view-independence of the multisensory representation, but the results were not conclusive owing to the overwhelming effect of texture change. The simplest account is that the multisensory representation integrates shape and modality-independent surface properties. However, more work is required to investigate this and the conditions under which multisensory integration of structural and surface properties occurs.
