Visual search for multiple targets gives no evidence of amnesic covert attention

2001 ◽  
Author(s):  
Jason S. McCarley ◽  
Matthew S. Peterson ◽  
Arthur F. Kramer ◽  
Ranxiao Frances Wang ◽  
David E. Irwin


Author(s):  
Athanasios Drigas ◽  
Maria Karyotaki

Motivation, affect, and cognition are interrelated. However, modeling the control of attentional deployment, and more specifically providing a more complete account of the interactions between the dorsal and ventral processing streams, remains a challenge. The interaction between overt and covert attention is particularly important for models of visual search. Further modeling of such interactions can help scrutinize many mechanisms, such as saccadic suppression, dynamic remapping of the saliency map, inhibition of return, covert pre-selection of targets for overt saccades, and online understanding of complex visual scenes.
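As a minimal illustration of one mechanism named above, the sketch below runs a toy saliency-map scan with inhibition of return in Python/NumPy: the most salient location is selected, then suppressed so the next selection moves elsewhere. The map values, suppression radius, and number of fixations are arbitrary illustrative choices, not parameters from any model discussed in these papers.

```python
import numpy as np

def scan_with_ior(saliency, n_fixations=5, ior_radius=3):
    """Select successive peaks of a saliency map, suppressing each
    selected region (inhibition of return) before the next selection."""
    sal = saliency.astype(float).copy()
    ys, xs = np.indices(sal.shape)
    selected = []
    for _ in range(n_fixations):
        y, x = np.unravel_index(np.argmax(sal), sal.shape)
        selected.append((y, x))
        # Inhibition of return: zero out a disc around the attended location.
        mask = (ys - y) ** 2 + (xs - x) ** 2 <= ior_radius ** 2
        sal[mask] = 0.0
    return selected

# Toy saliency map with two hot spots; the scan visits the stronger one first.
rng = np.random.default_rng(0)
sal_map = rng.random((20, 20)) * 0.1
sal_map[5, 5] = 1.0
sal_map[14, 12] = 0.8
print(scan_with_ior(sal_map, n_fixations=3))
```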


2010 ◽  
Vol 104 (5) ◽  
pp. 2433-2441 ◽  
Author(s):  
Richard P. Heitz ◽  
Jeremiah Y. Cohen ◽  
Geoffrey F. Woodman ◽  
Jeffrey D. Schall

The goal of this study was to obtain a better understanding of the physiological basis of errors of visual search. Previous research has shown that search errors occur when visual neurons in the frontal eye field (FEF) treat distractors as if they were targets. We replicated this finding during an inefficient form search and extended it by simultaneously measuring the m-N2pc, a macaque homologue of an event-related potential that indexes the allocation of covert attention. Based on recent work, we expected errors of selection in FEF to propagate to the areas of extrastriate cortex that are responsible for allocating attention and implicated in the generation of the m-N2pc. Consistent with this prediction, we found that when FEF neurons selected a distractor instead of the search target, the m-N2pc shifted in the same, incorrect direction prior to the erroneous saccade. This suggests that such errors arise from a systematic misorienting of attention beginning at the initial stages of visual processing. Our analyses also revealed distinct neural correlates of false alarms and guesses. These results demonstrate that errant gaze shifts during visual search arise from errant attentional processing.
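The N2pc (and, by analogy, the m-N2pc used here) is conventionally quantified as the difference between activity contralateral and ipsilateral to the attended item at posterior sites roughly 200-300 ms after array onset. The sketch below computes such a difference wave from simulated trial data; the sampling rate, time window, and injected effect size are illustrative assumptions, not details of this study's recordings.

```python
import numpy as np

def n2pc_difference_wave(left_site, right_site, target_side):
    """Contralateral-minus-ipsilateral difference wave, averaged over trials.

    left_site, right_site : (n_trials, n_samples) arrays from homologous
        left/right posterior recording sites.
    target_side : (n_trials,) array of 'L'/'R', hemifield of the target.
    """
    is_left = (target_side == 'L')[:, None]
    contra = np.where(is_left, right_site, left_site)  # site opposite the target
    ipsi = np.where(is_left, left_site, right_site)    # site on the target's side
    return (contra - ipsi).mean(axis=0)

# Simulated example: 100 trials, 500 samples at 1 kHz (0-499 ms).
rng = np.random.default_rng(1)
n_trials, n_samples = 100, 500
side = rng.choice(np.array(['L', 'R']), size=n_trials)
left = rng.normal(0.0, 1.0, (n_trials, n_samples))
right = rng.normal(0.0, 1.0, (n_trials, n_samples))
window = (np.arange(n_samples) >= 200) & (np.arange(n_samples) < 300)
# Inject a contralateral negativity 200-300 ms after array onset.
left[np.ix_(side == 'R', window)] -= 1.0
right[np.ix_(side == 'L', window)] -= 1.0
diff = n2pc_difference_wave(left, right, side)
print(diff[window].mean())  # approximately -1: the simulated attention effect
```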


2020 ◽  
Author(s):  
Joseph MacInnes ◽  
Ómar I. Jóhannesson ◽  
Andrey Chetverikov ◽  
Arni Kristjansson

We move our eyes roughly three times every second while searching complex scenes, but covert attention helps to guide where we allocate those overt fixations. Covert attention may be allocated reflexively or voluntarily, and it speeds the rate of information processing at the attended location. Reducing access to covert attention hinders performance, but it is not known to what degree the locus of covert attention is tied to the current gaze position. We compared visual search performance in a traditional gaze-contingent display with a second task in which a similarly sized contingent window was controlled with a mouse, allowing the covert aperture to be moved independently of overt gaze. Larger apertures improved performance in both mouse- and gaze-contingent trials, suggesting that covert attention was beneficial regardless of control type. We also found evidence that participants used the mouse-controlled aperture independently of gaze position, suggesting that they attempted to untether their covert and overt attention when possible. This untethering manipulation, however, imposed an overall cost on search performance, a result at odds with previous findings from a change blindness paradigm. Untethering covert and overt attention may therefore carry costs or benefits depending on the demands of the task.
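A mouse-contingent moving-window display of the kind described above can be sketched with PsychoPy's Aperture, which clips all drawing to a region whose position is updated each frame; in the gaze-contingent condition the aperture position would instead be driven by eye-tracker samples. The window size, stimuli, and timing below are placeholder choices, not the parameters used in the study.

```python
from psychopy import visual, event, core

# Aperture requires a stencil buffer on the window.
win = visual.Window(size=(1024, 768), units='pix', allowStencil=True, color='gray')
aperture = visual.Aperture(win, size=200)  # diameter of the visible region, in pixels
aperture.enabled = True

mouse = event.Mouse(win=win, visible=False)
search_items = [visual.Circle(win, radius=10, pos=p, fillColor='black')
                for p in [(-300, 100), (0, -150), (250, 200), (150, -50)]]

clock = core.Clock()
while clock.getTime() < 10 and not event.getKeys(['escape', 'space']):
    # The visible window follows the mouse rather than gaze ("untethered" condition).
    aperture.pos = mouse.getPos()
    for item in search_items:
        item.draw()
    win.flip()

win.close()
```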


2017 ◽  
Vol 10 (1) ◽  
pp. 38-52 ◽  
Author(s):  
E.S. Gorbunova

The article investigates the role of spatial working memory in visual search for multiple targets, in particular in the subsequent search misses effect. This phenomenon is the omission of a second target after the first target has been found in a visual search task. One theoretical interpretation of subsequent search misses is a lack of resources (attention and/or working memory) once the first target has been found. The experiment investigated dual-target visual search efficiency under standard conditions and with an additional spatial working memory load. The additional working memory load had no significant impact on multiple-target visual search efficiency. This result may reflect a role for object, rather than spatial, working memory in this task. An alternative explanation is that participants relied on special tools and strategies.
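A minimal sketch of how a subsequent-search-misses (SSM) effect is typically quantified: second-target detection accuracy on dual-target trials (after the first target was found) is compared with detection accuracy for the same target presented alone. The function name and toy counts below are illustrative only, not the measure or data from this study.

```python
def ssm_effect(single_target_hits, single_target_trials,
               second_target_hits, dual_first_found_trials):
    """Return (baseline accuracy, second-target accuracy, SSM magnitude).

    SSM magnitude is the drop in accuracy for the second target relative
    to the single-target baseline.
    """
    baseline = single_target_hits / single_target_trials
    second = second_target_hits / dual_first_found_trials
    return baseline, second, baseline - second

# Toy numbers: 90% accuracy alone vs. 72% for the second target -> SSM of 0.18.
baseline, second, ssm = ssm_effect(180, 200, 108, 150)
print(f"baseline={baseline:.2f}, second target={second:.2f}, SSM={ssm:.2f}")
```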


2017 ◽  
Vol 40 ◽  
Author(s):  
Johan Hulleman ◽  
Christian N. L. Olivers

We proposed to abandon the item as the conceptual unit in visual search and to adopt a fixation-based framework instead. We address various themes raised by our commentators, including the nature of the Functional Visual Field and existing similar ideas, alongside the importance of items, covert attention, and top-down/contextual influences. We reflect on the current state of, and future directions for, visual search.


Author(s):  
Ulrich Engelke ◽  
Andreas Duenser ◽  
Anthony Zeater

Selective attention is an important cognitive resource to account for when designing effective human-machine interaction and cognitive computing systems. Much of our knowledge about attention processing stems from search tasks that are usually framed around Treisman's feature integration theory and Wolfe's Guided Search. However, search performance in these tasks has mainly been investigated using an overt attention paradigm. Covert attention, on the other hand, has hardly been investigated in this context. To gain a more thorough understanding of human attentional processing and especially covert search performance, the authors experimentally investigated the relationship between overt and covert visual search for targets under a variety of target/distractor combinations. The overt search results presented in this work agree well with the Guided Search studies by Wolfe et al. The authors show that response times are considerably more influenced by the target/distractor combination than by the attentional search paradigm deployed. While response times are similar between the overt and covert search conditions, they found that error rates are considerably higher in covert search. They further show that response times between participants become more strongly correlated as search task complexity increases. The authors discuss their findings and put them into the context of earlier research on visual search.


2017 ◽  
Vol 40 ◽  
Author(s):  
Kyle R. Cave

Some previous accounts of visual search have emphasized covert attention at the expense of eye movements, and others have focused on eye movements while ignoring covert attention. Both selection mechanisms are likely to contribute to many searches, and a full account of search will probably need to explain how the two interact to find visual targets.


2021 ◽  
Vol 21 (9) ◽  
pp. 2984
Author(s):  
Anastasia Ahufrieva ◽  
Elena Gorbunova

2021 ◽  
pp. 153-190
Author(s):  
Richard E. Passingham

The caudal prefrontal (PF) cortex supports the visual search for objects such as foods both through eye movements and covert attention, and its connections explain how it can do this. The caudal PF cortex, which includes the frontal eye field, has connections with both the dorsal and ventral visual streams. The direction of eye movements depends on its connections with the superior colliculus and oculomotor nuclei. Covert attention depends on enhanced sensory responses that are mediated through top-down interactions with posterior sensory areas. Along with the granular parts of the orbital PF cortex, the caudal PF cortex evolved in early primates. Together, these two new areas led to improvements in searching for and evaluating objects that are hidden in a cluttered environment.

