What Can Patterns of Eye Movements Tell Us About the Role of Memory in Visual Search?

2004 ◽  
Author(s):  
Jiye Shen ◽  
Eyal M. Reingold

2018 ◽  
Vol 18 (13) ◽  
pp. 11 ◽  
Author(s):  
Sage E. P. Boettcher ◽  
Dejan Draschkow ◽  
Eric Dienhart ◽  
Melissa L.-H. Võ

Vision ◽  
2019 ◽  
Vol 3 (3) ◽  
pp. 46
Author(s):  
Alasdair D. F. Clarke ◽  
Anna Nowakowska ◽  
Amelia R. Hunt

Visual search is a popular tool for studying a range of questions about perception and attention, thanks to the ease with which the basic paradigm can be controlled and manipulated. Although visual search is often thought of as a sub-field of vision science, search tasks are considerably more complex than most other perceptual tasks, with strategy and decision making playing an essential but neglected role. In this review, we briefly describe some of the important theoretical advances in our understanding of perception and attention that have been gained by studying visual search within the signal detection and guided search frameworks. Under most circumstances, search also involves executing a series of eye movements. We argue that understanding the contribution of biases, routines and strategies to visual search performance over multiple fixations will lead to new insights about these decision-related processes and how they interact with perception and attention. We also highlight the neglected potential for variability, both within and between searchers, to contribute to our understanding of visual search. The exciting challenge will be to account for variations in search performance caused by these numerous factors and their interactions. We conclude the review with some recommendations for how future research can tackle these challenges and move the field forward.
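To make the signal-detection framing concrete, here is a minimal simulation sketch, not drawn from any of the papers listed here, of a "max-rule" signal-detection observer performing yes/no search. All parameters (d', the decision criterion, set sizes, and trial counts) are illustrative assumptions.

```python
# A minimal sketch of a signal-detection "max-rule" observer for yes/no search.
# Each display item produces a noisy familiarity value; the target adds a
# signal of strength d_prime. The observer says "present" if the maximum value
# exceeds a criterion. Parameter values are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)

def max_rule_search(set_size, d_prime=1.5, criterion=2.0, n_trials=20000):
    """Return hit and false-alarm rates for one set size under the max rule."""
    # Target-present trials: every item yields unit-variance noise,
    # and item 0 additionally carries the signal of strength d_prime.
    present = rng.normal(0.0, 1.0, size=(n_trials, set_size))
    present[:, 0] += d_prime
    hit_rate = np.mean(present.max(axis=1) > criterion)

    # Target-absent trials: all items are pure noise.
    absent = rng.normal(0.0, 1.0, size=(n_trials, set_size))
    false_alarm_rate = np.mean(absent.max(axis=1) > criterion)
    return hit_rate, false_alarm_rate

for n in (4, 8, 16, 32):
    hits, fas = max_rule_search(n)
    print(f"set size {n:2d}: hits {hits:.2f}, false alarms {fas:.2f}")
```

In this kind of model, accuracy falls with set size purely because more noise samples compete at the decision stage, without any serial deployment of attention; that is one sense in which decision processes, and not only perception, shape search performance.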


2007 ◽  
Vol 60 (7) ◽  
pp. 924-935 ◽  
Author(s):  
Thomas Geyer ◽  
Adrian Von Mühlenen ◽  
Hermann J. Müller

Horowitz and Wolfe (1998, 2003) have challenged the view that serial visual search involves memory processes that keep track of already inspected locations. The present study used a search paradigm similar to Horowitz and Wolfe's (1998), comparing a standard static search condition with a dynamic condition in which display elements changed locations randomly every 111 ms. In addition to measuring search reaction times, observers’ eye movements were recorded. For target-present trials, the search rates were near-identical in the two search conditions, replicating Horowitz and Wolfe's findings. However, the number of fixations and saccade amplitude were larger in the static than in the dynamic condition, whereas fixation duration and the latency of the first saccade were longer in the dynamic condition. These results indicate that an active, memory-guided search strategy was adopted in the static condition, and a passive “sit-and-wait” strategy in the dynamic condition.
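The contrast drawn here between memory-guided and memoryless search maps onto a simple sampling distinction: remembering inspected locations amounts to sampling without replacement, while forgetting them amounts to sampling with replacement. The sketch below, a toy simulation rather than the authors' analysis, shows how strongly that distinction affects the expected number of fixations needed to reach the target; the display size and trial count are arbitrary assumptions.

```python
# A toy simulation comparing how many fixations a random searcher needs to land
# on the target when inspected locations are remembered (sampling without
# replacement) versus forgotten (sampling with replacement).
import random

def fixations_to_target(n_items, with_memory, rng):
    """Count random fixations until the target (item 0) is inspected."""
    remaining = list(range(n_items))
    count = 0
    while True:
        count += 1
        if with_memory:
            item = remaining.pop(rng.randrange(len(remaining)))  # never revisit
        else:
            item = rng.randrange(n_items)                        # may revisit
        if item == 0:
            return count

rng = random.Random(1)
n_items, n_trials = 12, 50000
for with_memory in (True, False):
    mean = sum(fixations_to_target(n_items, with_memory, rng)
               for _ in range(n_trials)) / n_trials
    label = "with memory   " if with_memory else "without memory"
    print(f"{label}: {mean:.1f} fixations on average")
```

The roughly two-to-one difference in expected fixations (about (n+1)/2 with memory versus n without, for n items) is why near-identical search rates in static and dynamic displays were read as evidence against memory, and why the fixation and saccade measures reported in this study are informative about which strategy observers actually adopt.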


2010 ◽  
Vol 104 (4) ◽  
pp. 2187-2193 ◽  
Author(s):  
Angela L. Gee ◽  
Anna E. Ipata ◽  
Michael E. Goldberg

We constantly make eye movements to bring objects of interest onto the fovea for more detailed processing. Activity in area V4, a prestriate visual area, is enhanced at the location corresponding to the target of an eye movement. However, the precise role of activity in V4 in relation to these saccades and to the modulation of other cortical areas in the oculomotor system remains unknown. V4 could be a source of visual feature information used to select the eye movement, or alternatively, it could reflect the locus of spatial attention. To test these hypotheses, we trained monkeys on a visual search task in which they were free to move their eyes. We found that activity in area V4 reflected the direction of the upcoming saccade but did not predict the latency of the saccade, in contrast to activity in the lateral intraparietal area (LIP). We suggest that the signals in V4, unlike those in LIP, are not directly involved in the generation of the saccade itself but rather are more closely linked to visual perception and attention. Although V4 and LIP have different roles in spatial attention and preparing eye movements, they likely perform complementary processes during visual search.


2007 ◽  
Vol 69 (7) ◽  
pp. 1204-1217 ◽  
Author(s):  
Wieske Van Zoest ◽  
Alejandro Lleras ◽  
Alan Kingstone ◽  
James T. Enns

10.1167/8.1.7 ◽  
2008 ◽  
Vol 8 (1) ◽  
pp. 7 ◽  
Author(s):  
Gregor Hardiess ◽  
Sabine Gillner ◽  
Hanspeter A. Mallot