An evaluation of the automaticity of sensory processing using event-related potentials and brain-stem reflexes



    1. Department of Psychology, University of Missouri–Columbia
    • Address reprint requests to: Steve Hackley, Ph.D., Department of Psychology, 210 McAlester, University of Missouri–Columbia, Columbia, MO 65211, USA.

  • This report is based on an address given upon receipt of the Distinguished Early Career Contribution to Psychophysiology Award at the 32nd annual meeting of the Society for Psychophysiological Research, San Diego, California, October 1992.

  • I am very grateful to Frances Graham and Steven Hillyard for their support and guidance throughout my training. Their collaborative effort on the studies reviewed in this paper is gratefully acknowledged, as is that of Marty Woldorff. Thanks are also extended to Bruno Anthony, Jon Hansen, and George Mangun for helpful discussions, and to Kimmo Alho, Michael Coles, Nelson Cowan, Art Kramer, Marie-Hélène Giard, and Walter Ritter for constructive comments on the manuscript. This research was supported in part by grants to the author from the United States Public Health Service (F32-MH09281 and R29-MH47746) and the National Science Foundation (a graduate fellowship).


Selective attention effects on reflexes and evoked potentials are reviewed with the aim of evaluating three theories regarding sensory automaticity. (a) The peripheral-gating theory, which assumes that ignored stimuli can be filtered out soon after transduction, was tentatively rejected because neither auditory-nerve nor retinal potentials are reliably affected by attention. (b) At the other extreme, the assumption that sensory analyses are obligatory and cannot benefit from attentional resources (i.e., strong-automaticity theory) was also rejected, because longer-latency components were found to be modifiable by attention. (c) An intermediate theory provides the best fit to the present electrophysiological data. The earliest sensory analyses are assumed to be strongly automatic; then, at forebrain levels, there is a transition from strong to weak automaticity (i.e., analyses are obligatory but modifiable by attention). This transition can begin as early as about 15 ms for audition and about 80 ms for vision.