Guided Search 6.0: An updated model of visual search

Author(s):  
Jeremy M. Wolfe
2021 ◽  
pp. 1-13


Author(s):  
Christine Salahub ◽  
Stephen M. Emrich

When searching for a target, it is possible to suppress the features of a known distractor. This suppression may prevent distractor processing altogether, or only after the distractor initially captures attention (i.e., "search and destroy"). However, suppression may be impaired in individuals with attentional control deficits, such as those with high anxiety. In this study (n = 48), we used ERPs to examine the time course of attentional enhancement and suppression when participants were given pretrial information about target or distractor features. Consistent with our hypothesis, we found that individuals with higher levels of anxiety showed weaker neural markers of suppression of the template-matching distractor, instead showing enhanced processing of it. These findings indicate that individuals with anxiety are more likely to use a search-and-destroy mechanism when given negative templates, highlighting the importance of attentional control abilities in distractor-guided search.


Vision ◽  
2019 ◽  
Vol 3 (3) ◽  
pp. 46
Author(s):  
Alasdair D. F. Clarke ◽  
Anna Nowakowska ◽  
Amelia R. Hunt

Visual search is a popular tool for studying a range of questions about perception and attention, thanks to the ease with which the basic paradigm can be controlled and manipulated. While often thought of as a sub-field of vision science, search tasks are significantly more complex than most other perceptual tasks, with strategy and decision playing an essential, but neglected, role. In this review, we briefly describe some of the important theoretical advances about perception and attention that have been gained from studying visual search within the signal detection and guided search frameworks. Under most circumstances, search also involves executing a series of eye movements. We argue that understanding the contribution of biases, routines and strategies to visual search performance over multiple fixations will lead to new insights about these decision-related processes and how they interact with perception and attention. We also highlight the neglected potential for variability, both within and between searchers, to contribute to our understanding of visual search. The exciting challenge will be to account for variations in search performance caused by these numerous factors and their interactions. We conclude the review with some recommendations for ways future research can tackle these challenges to move the field forward.


Author(s):  
Ulrich Engelke ◽  
Andreas Duenser ◽  
Anthony Zeater

Selective attention is an important cognitive resource to account for when designing effective human-machine interaction and cognitive computing systems. Much of our knowledge about attention processing stems from search tasks that are usually framed around Treisman's feature integration theory and Wolfe's Guided Search. However, search performance in these tasks has mainly been investigated using an overt attention paradigm. Covert attention, on the other hand, has hardly been investigated in this context. To gain a more thorough understanding of human attentional processing, and especially of covert search performance, the authors experimentally investigated the relationship between overt and covert visual search for targets under a variety of target/distractor combinations. The overt search results presented in this work agree well with the Guided Search studies by Wolfe et al. The authors show that response times are considerably more influenced by the target/distractor combination than by the attentional search paradigm employed. While response times are similar between the overt and covert search conditions, they found that error rates are considerably higher in covert search. They further show that response times between participants are more strongly correlated as search task complexity increases. The authors discuss their findings and put them into the context of earlier research on visual search.


Perception ◽  
10.1068/p2933 ◽  
2000 ◽  
Vol 29 (2) ◽  
pp. 241-250 ◽  
Author(s):  
Jiye Shen ◽  
Eyal M Reingold ◽  
Marc Pomplun

We examined the flexibility of guidance in a conjunctive search task by manipulating the ratios between different types of distractors. Participants were asked to decide whether a target was present or absent among distractors sharing either colour or shape with the target. Results indicated a strong effect of distractor ratio on search performance. Shorter latency to initiate eye movements, faster manual responses, and fewer fixations per trial were observed at extreme distractor ratios. The distribution of saccadic endpoints also varied flexibly as a function of distractor ratio. When there were very few same-colour distractors, saccadic selectivity was biased towards the colour dimension. In contrast, when most of the distractors shared colour with the target, saccadic selectivity was biased towards the shape dimension. Results are discussed within the framework of the guided search model.
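The distractor-ratio effect described in the abstract can be illustrated with a small, hypothetical sketch (the function name, feature values, and counts below are invented for illustration, not taken from the paper): guidance is directed along whichever feature dimension is shared with the fewest distractors, leaving the smallest candidate set to inspect.

```python
# Hypothetical illustration of distractor-ratio guidance; names and
# numbers are invented for this sketch, not parameters from the study.
def choose_guidance_dimension(distractors, target):
    # Count distractors sharing each target feature; guiding by the
    # rarer shared feature leaves fewer candidates to fixate.
    counts = {
        dim: sum(1 for d in distractors if d[dim] == target[dim])
        for dim in target
    }
    return min(counts, key=counts.get)

target = {"colour": "red", "shape": "O"}
distractors = (
    [{"colour": "red", "shape": "X"}] * 2       # few same-colour distractors
    + [{"colour": "green", "shape": "O"}] * 10  # many same-shape distractors
)
best = choose_guidance_dimension(distractors, target)  # → "colour"
```

This mirrors the reported bias: with few same-colour distractors, saccadic selectivity shifts towards colour, and with few same-shape distractors it shifts towards shape.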


1994 ◽  
Vol 1 (2) ◽  
pp. 202-238 ◽  
Author(s):  
Jeremy M. Wolfe


Author(s):  
Jeremy M Wolfe

This paper provides a brief review of the Guided Search model of human visual search behavior. In the model, parallel processes analyze a limited number of basic visual features across large portions of the visual field. The output of these processes can be used to guide attention in the deployment of the limited-capacity processes that are capable of identifying more complex visual targets.
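The two-stage architecture summarised above can be sketched in a few lines of toy code (a rough illustration only; the weights, noise level, and function names are invented and are not parameters of the published model): parallel feature channels contribute to an activation map, and the limited-capacity identification stage inspects items in descending order of activation.

```python
# Toy sketch of the Guided Search idea, not the published model:
# parallel feature channels yield an activation per item, and the
# serial stage visits items in order of activation. All numbers
# (feature weight of 1.0, noise SD of 0.3) are invented.
import random

def guided_search(items, target_features):
    # Top-down guidance: one unit of activation per feature an item
    # shares with the target; Gaussian noise makes guidance imperfect.
    def activation(item):
        top_down = sum(1.0 for f, v in target_features.items() if item.get(f) == v)
        return top_down + random.gauss(0, 0.3)

    # Serial stage: inspect candidates in descending activation until
    # the target is identified or the display is exhausted.
    ranked = sorted(items, key=activation, reverse=True)
    for n, item in enumerate(ranked, start=1):
        if item["is_target"]:
            return n  # number of items inspected before finding the target
    return None  # target absent

display = [
    {"colour": "red", "shape": "X", "is_target": True},
    {"colour": "red", "shape": "O", "is_target": False},
    {"colour": "green", "shape": "X", "is_target": False},
    {"colour": "green", "shape": "O", "is_target": False},
]
inspections = guided_search(display, {"colour": "red", "shape": "X"})
```

Because the target matches both guiding features, it usually tops the activation ranking and is found after few inspections; with stronger noise, guidance degrades towards random serial search.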


Author(s):  
David R. Perrott ◽  
John Cisneros ◽  
Richard L. Mckinley ◽  
William R. D'Angelo

We examined the minimum latency required to locate and identify a visual target (visual search) in a two-alternative forced-choice paradigm in which the visual target could appear from any azimuth (0° to 360°) and from a broad range of elevations (from 90° above to 70° below the horizon) relative to a person's initial line of gaze. Seven people were tested in six conditions: unaided search, three aurally aided search conditions, and two visually aided search conditions. Aurally aided search with both actual and virtual sound localization cues proved to be superior to unaided and visually guided search. Applications of synthesized three-dimensional and two-dimensional sound cues in workstations are discussed.

