Guided search: An alternative to the feature integration model for visual search.

Author(s):  
Jeremy M. Wolfe ◽  
Kyle R. Cave ◽  
Susan L. Franzel

2021 ◽  
pp. 1-13
Author(s):  
Christine Salahub ◽  
Stephen M. Emrich

Abstract When searching for a target, it is possible to suppress the features of a known distractor. This suppression may prevent distractor processing altogether, or it may act only after the distractor initially captures attention (i.e., search and destroy). However, suppression may be impaired in individuals with attentional control deficits, such as those with high anxiety. In this study (n = 48), we used ERPs to examine the time course of attentional enhancement and suppression when participants were given pretrial information about target or distractor features. Consistent with our hypothesis, individuals with higher levels of anxiety showed weaker neural markers of suppression of the template-matching distractor and instead showed enhanced processing of it. These findings indicate that individuals with anxiety are more likely to apply a search-and-destroy mechanism when using negative templates, highlighting the importance of attentional control abilities in distractor-guided search.
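
Purely as an illustration of the distinction this abstract draws, and not the authors' model or analysis, the following toy Python sketch contrasts proactive suppression with search-and-destroy as different weightings of a cued distractor in a priority map; all weights and salience values are invented for the example.

# Toy contrast between the two distractor-handling strategies described
# in the abstract, expressed as priority weights on display items.
# Baseline salience values and weights are illustrative assumptions.
def first_attended_item(items, distractor_weight):
    """items: dict of label -> baseline salience. The negative-template
    (cued) distractor is multiplied by distractor_weight before selection."""
    priority = dict(items)
    priority["cued_distractor"] *= distractor_weight
    return max(priority, key=priority.get)

items = {"target": 1.0, "cued_distractor": 1.1, "neutral": 0.8}

# Proactive suppression: the cued colour is down-weighted before the
# display appears, so attention goes straight to the target.
print(first_attended_item(items, distractor_weight=0.5))   # -> target

# Search-and-destroy: the cued colour is initially enhanced, captures
# the first shift of attention, and is only rejected afterwards.
print(first_attended_item(items, distractor_weight=1.3))   # -> cued_distractor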


2009 ◽  
Vol 9 (5) ◽  
pp. 15-15 ◽  
Author(s):  
B. T. Vincent ◽  
R. J. Baddeley ◽  
T. Troscianko ◽  
I. D. Gilchrist

Vision ◽  
2019 ◽  
Vol 3 (3) ◽  
pp. 46
Author(s):  
Alasdair D. F. Clarke ◽  
Anna Nowakowska ◽  
Amelia R. Hunt

Visual search is a popular tool for studying a range of questions about perception and attention, thanks to the ease with which the basic paradigm can be controlled and manipulated. While often thought of as a sub-field of vision science, search tasks are significantly more complex than most other perceptual tasks, with strategy and decision playing an essential, but neglected, role. In this review, we briefly describe some of the important theoretical advances about perception and attention that have been gained from studying visual search within the signal detection and guided search frameworks. Under most circumstances, search also involves executing a series of eye movements. We argue that understanding the contribution of biases, routines and strategies to visual search performance over multiple fixations will lead to new insights about these decision-related processes and how they interact with perception and attention. We also highlight the neglected potential for variability, both within and between searchers, to contribute to our understanding of visual search. The exciting challenge will be to account for variations in search performance caused by these numerous factors and their interactions. We conclude the review with some recommendations for ways future research can tackle these challenges to move the field forward.
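
To make the signal-detection framing mentioned in this review concrete, here is a minimal Python sketch of the standard max-rule model of yes/no search; it is not drawn from the review itself, and the d_prime and criterion values are illustrative assumptions.

# Minimal signal-detection sketch of visual search (max rule).
# Each of n_items yields a noisy familiarity value; the observer reports
# "target present" if the maximum exceeds a criterion.
import numpy as np

rng = np.random.default_rng(0)

def max_rule_trial(n_items, d_prime, criterion, target_present):
    # Distractors ~ N(0, 1); the target (if present) ~ N(d_prime, 1).
    responses = rng.normal(0.0, 1.0, n_items)
    if target_present:
        responses[0] += d_prime
    return responses.max() > criterion  # "present" decision

def hit_and_fa_rates(n_items, d_prime=2.0, criterion=1.5, n_trials=10_000):
    hits = np.mean([max_rule_trial(n_items, d_prime, criterion, True)
                    for _ in range(n_trials)])
    fas = np.mean([max_rule_trial(n_items, d_prime, criterion, False)
                   for _ in range(n_trials)])
    return hits, fas

# Accuracy declines with set size even without any serial bottleneck,
# simply because more distractors give more chances for a false alarm.
for n in (4, 8, 16):
    h, f = hit_and_fa_rates(n)
    print(f"set size {n:2d}: hit rate {h:.2f}, false-alarm rate {f:.2f}")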


Author(s):  
Ulrich Engelke ◽  
Andreas Duenser ◽  
Anthony Zeater

Selective attention is an important cognitive resource to account for when designing effective human-machine interaction and cognitive computing systems. Much of our knowledge about attention processing stems from search tasks that are usually framed around Treisman's feature integration theory and Wolfe's Guided Search. However, search performance in these tasks has mainly been investigated using an overt attention paradigm. Covert attention, on the other hand, has hardly been investigated in this context. To gain a more thorough understanding of human attentional processing and especially covert search performance, the authors experimentally investigated the relationship between overt and covert visual search for targets under a variety of target/distractor combinations. The overt search results presented in this work agree well with the Guided Search studies by Wolfe et al. The authors show that response times are considerably more influenced by the target/distractor combination than by the attentional search paradigm deployed. While response times are similar between the overt and covert search conditions, they found that error rates are considerably higher in covert search. They further show that response times between participants are more strongly correlated as search task complexity increases. The authors discuss their findings and put them into the context of earlier research on visual search.
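
As a hedged illustration of the kind of between-participant response-time correlation described in this abstract, the short Python sketch below computes the mean pairwise correlation of RT profiles on synthetic data; it is not the authors' dataset or analysis code, and the noise parameters are invented.

# Between-participant RT correlation on synthetic data.
# rt is a (participants x trials) matrix for one search condition;
# we average the pairwise Pearson correlations between participants.
import numpy as np
from itertools import combinations

def mean_pairwise_correlation(rt):
    corr = np.corrcoef(rt)                      # participants x participants
    pairs = combinations(range(rt.shape[0]), 2)
    return np.mean([corr[i, j] for i, j in pairs])

rng = np.random.default_rng(1)
n_participants, n_trials = 12, 100
# In this toy example, a shared item-difficulty signal grows with task
# complexity, so harder conditions yield more correlated RT profiles.
for label, shared_sd in [("simple search", 0.05), ("complex search", 0.40)]:
    shared = rng.normal(0.0, shared_sd, n_trials)        # common difficulty
    noise = rng.normal(0.0, 0.15, (n_participants, n_trials))
    rt = 0.6 + shared + noise                            # seconds
    print(f"{label}: mean pairwise r = {mean_pairwise_correlation(rt):.2f}")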


Perception ◽  
10.1068/p2933 ◽  
2000 ◽  
Vol 29 (2) ◽  
pp. 241-250 ◽  
Author(s):  
Jiye Shen ◽  
Eyal M Reingold ◽  
Marc Pomplun

We examined the flexibility of guidance in a conjunctive search task by manipulating the ratios between different types of distractors. Participants were asked to decide whether a target was present or absent among distractors sharing either colour or shape. Results indicated a strong effect of distractor ratio on search performance. Shorter latency to move, faster manual response, and fewer fixations per trial were observed at extreme distractor ratios. The distribution of saccadic endpoints also varied flexibly as a function of distractor ratio. When there were very few same-colour distractors, the saccadic selectivity was biased towards the colour dimension. In contrast, when most of the distractors shared colour with the target, the saccadic selectivity was biased towards the shape dimension. Results are discussed within the framework of the guided search model.
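
A small, purely illustrative Python sketch of the guidance intuition behind this distractor-ratio effect follows; it is not the guided search model itself, and the subset counts are invented.

# Guide eye movements toward whichever target feature defines the
# smaller distractor subset, since that subset is cheaper to scan.
def preferred_guidance_dimension(n_same_colour, n_same_shape):
    """Return the feature dimension expected to dominate saccadic selectivity."""
    if n_same_colour < n_same_shape:
        return "colour"   # few colour-sharing items: restrict search to them
    if n_same_shape < n_same_colour:
        return "shape"    # few shape-sharing items: restrict search to them
    return "either"       # balanced ratio: no strong saccadic bias expected

for colour_count, shape_count in [(2, 22), (12, 12), (22, 2)]:
    print(colour_count, shape_count, "->",
          preferred_guidance_dimension(colour_count, shape_count))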


1994 ◽  
Vol 1 (2) ◽  
pp. 202-238 ◽  
Author(s):  
Jeremy M. Wolfe

1993 ◽  
Vol 5 (4) ◽  
pp. 436-452 ◽  
Author(s):  
Martin Arguin ◽  
Yves Joanette ◽  
Patrick Cavanagh

Brain-damaged subjects who had previously been identified as suffering from a visual attention deficit for contralesional stimulation were tested on a series of visual search tasks. The experiments examined the hypothesis that the processing of single features is preattentive but that feature integration, necessary for the correct perception of conjunctions of features, requires attention (Treisman & Gelade, 1980; Treisman & Sato, 1990). Subjects searched for a feature target (orientation or color) or for a conjunction target (orientation and color) in unilateral displays in which the number of items presented was varied. Ocular fixation was controlled so that trials on which eye movements occurred were cancelled. While brain-damaged subjects with a visual attention disorder (VAD subjects) performed similarly to normal controls in feature search tasks, they showed a marked deficit in conjunction search. Specifically, VAD subjects exhibited a substantial reduction of their serial search rates for a conjunction target with contralesional displays. In support of Treisman's feature integration theory, a visual attention deficit leads to a marked impairment in feature integration, whereas it does not appear to affect feature encoding.
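
For readers unfamiliar with the feature-integration-theory prediction tested here, a minimal illustrative Python sketch contrasts flat feature-search slopes with serial, self-terminating conjunction search; the timing parameters are invented and are not taken from the study.

# Feature search: response time roughly independent of display size (pop-out).
# Conjunction search: a fixed cost per item inspected; on average half the
# items are inspected when the target is present, all when it is absent.
def predicted_rt_ms(set_size, target_present, search_type,
                    base_ms=450.0, per_item_ms=30.0):
    if search_type == "feature":
        return base_ms                      # flat search slope
    items_inspected = set_size / 2 if target_present else set_size
    return base_ms + per_item_ms * items_inspected

for size in (4, 8, 16):
    print(size,
          round(predicted_rt_ms(size, True, "feature")),
          round(predicted_rt_ms(size, True, "conjunction")),
          round(predicted_rt_ms(size, False, "conjunction")))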

