Testing the precision of spatial memory representations using a change-detection task: Effects of viewpoint change
There are at least two distinct ways in which the brain encodes spatial information: in egocentric representations, locations are encoded relative to the observer, whereas in allocentric representations, locations are encoded relative to the environment. Both inform spatial memory, but the extent to which each influences behaviour depends on the task. In the present study, two preregistered experiments used a psychophysical approach to measure the precision of spatial memory while varying egocentric and allocentric task demands. Participants were asked to detect the changed location of one of four objects when the array was seen from a new viewpoint (rotated by 0°, 5°, 15°, 45° or 135°). Experiment 1 used a Same/Different task and Experiment 2 used a two-alternative forced-choice (2AFC) task. Psychophysical thresholds were calculated, revealing that in both experiments spatial change-detection thresholds increased monotonically but non-linearly as viewpoint change increased. This pattern was consistent with a preregistered model comprising distinct parameters corresponding to egocentric and allocentric contributions that change lawfully as a function of viewpoint shift. Our results provide a clearer understanding of how underlying memory representations interact to inform our spatial knowledge of the environment.