THE SEMANTICS OF LOCATIVE INFORMATION IN PICTURES AND MENTAL IMAGES

Abstract

Three experiments examined how people compare sentences about spatial location to pictures and images. Previous investigations have found that people are faster at judging relative location when the description contains the word "above" or "right" than when it contains the word "below" or "left". Expt. I showed that this asymmetry persisted when the words were replaced by arrows, indicating that the effect is not specific to particular lexical items. Expt. II showed that the asymmetry persisted even when the response latency did not include the time to encode the description, indicating that the asymmetry does not lie in the description-encoding stage. Finally, Expt. III investigated how people compare sentences to information from a previously memorized picture. In this situation, the usual asymmetry was not present. The three studies suggest that the asymmetry arises from the way descriptions influence the encoding of perceptual events. The results also showed that the information encoded in a mental representation of a picture is ordered such that certain features can be accessed more quickly than others. However, the same features are accessed equally quickly in a picture that is physically present.