Human Frontal Eye Fields and Visual Search

2003 ◽  
Vol 89 (6) ◽  
pp. 3340-3343 ◽  
Author(s):  
Neil G. Muggleton ◽  
Chi-Hung Juan ◽  
Alan Cowey ◽  
Vincent Walsh

Recent physiological recording studies in monkeys have suggested that the frontal eye fields (FEFs) are involved in visual scene analysis even when eye movement commands are not required. We examined this proposed function of the human frontal eye fields during performance of visual search tasks in which difficulty was matched and eye movements were not required. Magnetic stimulation over FEF modulated performance on a conjunction search task and a simple feature search task in which the target was unpredictable from trial to trial, primarily by increasing false alarm responses. Simple feature search with a predictable target was not affected. The results establish that human FEFs are critical to visual selection, regardless of the need to generate a saccade command.

Author(s):  
Nathan Messmer ◽  
Nathan Leggett ◽  
Melissa Prince ◽  
Jason S. McCarley

Gaze linking allows team members in a collaborative visual task to scan separate computer monitors simultaneously while their eye movements are tracked and projected onto each other's displays. The present study explored the benefits of gaze linking to performance in unguided and guided visual search tasks. Participants completed either an unguided or guided serial search task as both independent and gaze-linked searchers. Although it produced shorter mean response times than independent search, gaze-linked search was highly inefficient, and gaze linking did not differentially affect performance in guided and unguided groups. Results suggest that gaze linking is likely to be of little value in improving applied visual search.


2017 ◽  
Author(s):  
David Hoppe ◽  
Constantin A. Rothkopf

The capability of directing gaze to relevant parts of the environment is crucial for our survival. Computational models based on ideal-observer theory have provided quantitative accounts of human gaze selection in a range of visual search tasks. According to these models, gaze is directed to the position in a visual scene at which uncertainty about task-relevant properties will be reduced maximally with the next look. However, in tasks going beyond a single action, delayed rewards can play a crucial role, thereby necessitating planning. Here we investigate whether humans are capable of planning more than the next single eye movement. We found evidence that our subjects' behavior was better explained by an ideal planner than by the ideal observer. In particular, the location of the first fixation differed depending on the stimulus and the time available for the search. Overall, our results are the first evidence that our visual system is capable of planning.


2014 ◽  
Vol 111 (4) ◽  
pp. 705-714 ◽  
Author(s):  
Indra T. Mahayana ◽  
Chia-Lun Liu ◽  
Chi Fu Chang ◽  
Daisy L. Hung ◽  
Ovid J. L. Tzeng ◽  
...  

Near- and far-space coding in the human brain is a dynamic process. Areas in dorsal, as well as ventral, visual association cortex, including right posterior parietal cortex (rPPC), right frontal eye field (rFEF), and right ventral occipital cortex (rVO), have been shown to be important in visuospatial processing, but the involvement of these areas when the information is in near or far space remains unclear. There is a need for investigations of these representations to help explain the pathophysiology of hemispatial neglect, and the role of near and far space is crucial to this. We used a conjunction visual search task with an elliptical array to investigate the effects of transcranial magnetic stimulation delivered over rFEF, rPPC, and rVO on the processing of targets in near and far space and at a range of horizontal eccentricities. As in previous studies, we found that rVO was involved in far-space search, and rFEF was involved regardless of the distance to the array. rPPC was involved in search only in far space, with a neglect-like effect when the target was located in the most eccentric locations. No effects were seen for any site in a feature search task. As the search arrays had higher predictability with respect to target location than is often the case, these data may form a basis for clarifying both the role of PPC in visual search and its contribution to neglect, as well as the importance of near and far space in these processes.


1985 ◽  
Vol 60 (1) ◽  
pp. 191-200
Author(s):  
Hideoki Tada ◽  
Shoichi Iwasaki

Two experiments were carried out to examine the relationship between eyeblinks and eye movements during a visual search task. Exp. I showed that vertical eye movements brought about slightly more eyeblinks than horizontal ones. In Exp. II, vertical eye movements were accompanied by significantly more frequent eyeblinks than horizontal ones; upward saccadic eye movements in particular were associated with more frequent eyeblinks than downward ones. These results suggest a possible relationship between eyeblinks and Bell's phenomenon. However, a comparison of eyeblink rates between the eye-movement and no-eye-movement conditions in Exp. II indicated that eyeblinks were significantly more frequent in the latter condition. Some psychological factors were suggested as likely important determinants of eyeblink frequency.


2021 ◽  
Author(s):  
Heida Maria Sigurdardottir ◽  
Hilma Ros Omarsdóttir ◽  
Anna Sigridur Valgeirsdottir

Attention has been hypothesized to act as a sequential gating mechanism for the orderly processing of letters in words. These same visuo-attentional processes are assumed to partake in some but not all visual search tasks. In the current study, 60 adults with varying degrees of reading ability, ranging from expert readers to severely impaired dyslexic readers, completed an attentionally demanding visual conjunction search task thought to rely heavily on the dorsal visual stream. A visual feature search task served as an internal control. According to the dorsal view of dyslexia, reading problems should go hand in hand with specific problems in visual conjunction search, particularly elevated conjunction search slopes (time per search item), which would be interpreted as a problem with visual attention. Results showed that reading problems were associated with slower visual search, especially conjunction search. However, problems with reading were not associated with increased conjunction search slopes but instead with increased conjunction search intercepts, which are traditionally not interpreted as reflecting attentional processes. Our data are hard to reconcile with hypothesized problems in dyslexia with the serial movement of an attentional spotlight across a visual scene or a page of text.
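The slope/intercept distinction above comes from the standard linear model of search performance, in which response time is regressed on the number of items in the display: RT = intercept + slope × set size. The slope (ms per item) is conventionally read as the per-item cost of attentional scanning, while the intercept captures set-size-independent stages such as encoding and response execution. A minimal sketch of how these two quantities are estimated, using hypothetical mean RTs and an ordinary least-squares fit:

```python
import numpy as np

# Hypothetical mean response times (ms) at each display set size
# for a conjunction search task (illustrative values only).
set_sizes = np.array([4, 8, 16, 32])
mean_rts = np.array([620.0, 780.0, 1100.0, 1740.0])

# Fit RT = intercept + slope * set_size by least squares.
slope, intercept = np.polyfit(set_sizes, mean_rts, 1)

print(f"search slope: {slope:.1f} ms/item")     # per-item scanning cost
print(f"search intercept: {intercept:.1f} ms")  # set-size-independent stages
# → search slope: 40.0 ms/item
# → search intercept: 460.0 ms
```

On this reading, the dyslexia result above corresponds to group differences in the fitted intercept rather than the fitted slope.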


2019 ◽  
Author(s):  
Michelle Ramey ◽  
Andrew P. Yonelinas ◽  
John M. Henderson

A hotly debated question is whether memory influences attention through conscious or unconscious processes. To address this controversy, we measured eye movements while participants searched repeated real-world scenes for embedded targets, and we assessed memory for each scene using confidence-based methods to isolate different states of subjective memory awareness. We found that memory-informed eye movements during visual search were predicted both by conscious recollection, which led to a highly precise first eye movement toward the remembered location, and by unconscious memory, which increased search efficiency by gradually directing the eyes toward the target throughout the search trial. In contrast, these eye movement measures were not influenced by familiarity-based memory (i.e., changes in subjective reports of memory strength). The results indicate that conscious recollection and unconscious memory can each play distinct and complementary roles in guiding attention to facilitate efficient extraction of visual information.


2018 ◽  
Author(s):  
Alasdair D F Clarke ◽  
Jessica Irons ◽  
Warren James ◽  
Andrew B. Leber ◽  
Amelia R. Hunt

A striking range of individual differences has recently been reported in three different visual search tasks. These differences in performance can be attributed to strategy, that is, the efficiency with which participants control their search to complete the task quickly and accurately. Here we ask whether an individual's strategy and performance in one search task are correlated with how they perform in the other two. We tested 64 observers on these three tasks over two sessions. Even though the test-retest reliability of the tasks is high, an observer's performance and strategy in one task did not reliably predict their behaviour in the other two. These results suggest search strategies are stable over time, but context-specific. To understand visual search we therefore need to account not only for differences between individuals, but also for how individuals interact with the search task and context. These context-specific but stable individual differences in strategy can account for a substantial proportion of variability in search performance.


2012 ◽  
Vol 25 (0) ◽  
pp. 158
Author(s):  
Pawel J. Matusz ◽  
Martin Eimer

We investigated whether top-down attentional control settings can specify task-relevant features in different sensory modalities (vision and audition). Two audiovisual search tasks were used in which a spatially uninformative visual singleton cue preceded a target search array. In different blocks, participants searched for a visual target (defined by colour or shape in Experiments 1 and 2, respectively), or for a target defined by a combination of visual and auditory features (e.g., a red target accompanied by a high-pitch tone). Spatial cueing effects indicative of attentional capture by target-matching visual singleton cues in the unimodal visual search task were reduced or completely eliminated when targets were audiovisually defined. The N2pc component (an index of attentional target selection in vision) triggered by these cues was reduced and delayed during search for audiovisual as compared to unimodal visual targets. These results provide novel evidence that the top-down control settings which guide attentional selectivity can include perceptual features from different sensory modalities.


i-Perception ◽  
10.1068/ii44 ◽  
2014 ◽  
Vol 5 (5) ◽  
pp. 475-475
Author(s):  
K.M.A Mitchell ◽  
B.W Tatler
