At the Core of Feature Integration Theory

Author(s):  
William Prinzmetal
2018
Author(s):  
Edwin S. Dalmaijer
Sanjay G. Manohar
Masud Husain

Abstract
Humans can temporarily retain information in their highly limited short-term memory. Traditionally, objects are thought to be attentionally selected and committed to short-term memory one by one. However, few studies directly test this serial encoding assumption. Here, we demonstrate that information from separate objects can be encoded into short-term memory in parallel. We developed models of serial and parallel encoding that describe the probability of each item being present in short-term memory throughout the encoding process, and tested them in a whole-report design. Empirical data from four experiments in healthy individuals were fitted best by the parallel encoding model, even when items were presented unilaterally (processed within one hemisphere). Our results demonstrate that information from several items can be attentionally selected, and consequently encoded into short-term memory, simultaneously. This suggests that the popular feature integration theory needs to be reformulated to account for parallel encoding, and it provides important boundaries for computational models of short-term memory.
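The contrast between serial and parallel encoding can be illustrated with a minimal sketch. This is an illustrative toy model, not the authors' actual fitted model: it assumes each item's encoding completes as an exponential process with rate `r`. Under parallel encoding, all items race independently, so every item has the same probability of being in memory by time `t`; under serial encoding, the item in queue position `k` must wait for the `k − 1` items before it, so its completion time follows an Erlang(k, r) distribution.

```python
import math

def parallel_p(t, rate):
    """P(a given item is in memory by time t) when all items are
    encoded simultaneously as independent exponential processes."""
    return 1.0 - math.exp(-rate * t)

def serial_p(t, rate, position):
    """P(the item in queue position `position` (1-indexed) is in memory
    by time t) when items are encoded strictly one at a time.
    Its completion time is the sum of `position` exponential stages,
    i.e. Erlang(position, rate); this is the closed-form Erlang CDF."""
    return 1.0 - math.exp(-rate * t) * sum(
        (rate * t) ** n / math.factorial(n) for n in range(position)
    )

rate, t = 2.0, 1.0
print(round(parallel_p(t, rate), 3))                       # 0.865 for every item
print([round(serial_p(t, rate, k), 3) for k in (1, 2, 3)])  # [0.865, 0.594, 0.323]
```

The qualitative signature the two models predict differs: serial encoding implies a steep drop in memory probability across queue positions at short exposures, whereas parallel encoding predicts equal probabilities across items, which is the kind of pattern a whole-report design can discriminate.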


2019
Vol 82 (2)
pp. 533-549
Author(s):  
Josephine Reuther
Ramakrishna Chakravarthi
Amelia R. Hunt

Abstract
Feature integration theory proposes that visual features, such as shape and color, can only be combined into a unified object when spatial attention is directed to their location in retinotopic maps. Eye movements cause dramatic changes on our retinae and are associated with obligatory shifts in spatial attention. In two experiments, we measured the prevalence of conjunction errors (that is, reporting an object as having an attribute that belonged to another object) for brief stimulus presentations before, during, and after a saccade. Planning and executing a saccade did not itself disrupt feature integration. Motion did disrupt feature integration, leading to an increase in conjunction errors. However, retinal motion of an equal extent, when caused by saccadic eye movements, was spared this disruption and showed conjunction-error rates similar to those for static stimuli presented to a static eye. The results suggest that extra-retinal signals are able to compensate for the motion caused by saccadic eye movements, thereby preserving the integrity of objects across saccades and preventing their features from mixing or mis-binding.

