Visual Attention in the Fovea and the Periphery during Visual Search

2021 ◽  
Author(s):  
Jie Zhang ◽  
Xiaocang Zhu ◽  
Shanshan Wang ◽  
Hossein Esteky ◽  
Yonghong Tian ◽  
...  

Visual search depends on both the foveal and peripheral visual systems, yet foveal attention mechanisms remain poorly understood. We simultaneously recorded foveal and peripheral activity in V4, IT, and LPFC while monkeys performed a category-based visual search task. Feature attention enhanced the responses of face-selective, house-selective, and non-selective foveal cells in visual cortex. Whereas foveal attention effects appeared regardless of peripheral attention effects, attending to the foveal stimulus eliminated peripheral feature attention effects and delayed peripheral spatial attention effects. When target features appeared both at the fovea and in the periphery, feature attention effects occurred predominantly at the fovea rather than being distributed across the visual field, contrary to the common view of distributed feature attention. Thus, a parallel attentive process appeared to operate during distractor fixations, while a serial process predominated during target fixations in visual search.

2017 ◽  
Vol 117 (1) ◽  
pp. 388-402 ◽  
Author(s):  
Michael A. Cohen ◽  
George A. Alvarez ◽  
Ken Nakayama ◽  
Talia Konkle

Visual search is a ubiquitous visual behavior, and efficient search is essential for survival. Different cognitive models have explained the speed and accuracy of search based either on the dynamics of attention or on similarity of item representations. Here, we examined the extent to which performance on a visual search task can be predicted from the stable representational architecture of the visual system, independent of attentional dynamics. Participants performed a visual search task with 28 conditions reflecting different pairs of categories (e.g., searching for a face among cars, body among hammers, etc.). The time it took participants to find the target item varied as a function of category combination. In a separate group of participants, we measured the neural responses to these object categories when items were presented in isolation. Using representational similarity analysis, we then examined whether the similarity of neural responses across different subdivisions of the visual system had the requisite structure needed to predict visual search performance. Overall, we found strong brain/behavior correlations across most of the higher-level visual system, including both the ventral and dorsal pathways when considering both macroscale sectors as well as smaller mesoscale regions. These results suggest that visual search for real-world object categories is well predicted by the stable, task-independent architecture of the visual system. NEW & NOTEWORTHY Here, we ask which neural regions have neural response patterns that correlate with behavioral performance in a visual processing task. We found that the representational structure across all of high-level visual cortex has the requisite structure to predict behavior. Furthermore, when directly comparing different neural regions, we found that they all had highly similar category-level representational structures. These results point to a ubiquitous and uniform representational structure in high-level visual cortex underlying visual object processing.
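A minimal sketch of the representational similarity logic described above: correlate pairwise neural dissimilarity between object categories with search times for the corresponding category pairs. The arrays, category names, and response times below are illustrative placeholders, not the authors' data or pipeline.

```python
# Sketch: correlate neural representational similarity with visual search times.
# Assumes `neural_patterns` is a (n_categories, n_voxels) array of mean responses
# per object category in one region, and `search_rt` maps category pairs to the
# mean time to find a target of one category among distractors of the other.
import itertools
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
categories = ["face", "body", "car", "hammer", "chair", "cat", "phone", "tree"]
neural_patterns = rng.normal(size=(len(categories), 500))   # placeholder patterns
search_rt = {pair: rng.uniform(0.4, 1.2)                    # placeholder RTs (s)
             for pair in itertools.combinations(range(len(categories)), 2)}

# Neural dissimilarity for each category pair (1 - Pearson correlation of patterns).
pairs = list(search_rt.keys())
neural_dissim = [1 - np.corrcoef(neural_patterns[i], neural_patterns[j])[0, 1]
                 for i, j in pairs]
rts = [search_rt[p] for p in pairs]

# Prediction: more dissimilar categories should yield faster search (lower RT),
# so the rank correlation between dissimilarity and RT should be negative.
rho, p = spearmanr(neural_dissim, rts)
print(f"brain/behavior correlation: rho = {rho:.2f}, p = {p:.3f}")
```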


2009 ◽  
Vol 21 (2) ◽  
pp. 246-258 ◽  
Author(s):  
Jonathan S. A. Carriere ◽  
Daniel Eaton ◽  
Michael G. Reynolds ◽  
Mike J. Dixon ◽  
Daniel Smilek

For individuals with grapheme–color synesthesia, achromatic letters and digits elicit vivid perceptual experiences of color. We report two experiments that evaluate whether synesthesia influences overt visual attention. In these experiments, two grapheme–color synesthetes viewed colored letters while their eye movements were monitored. Letters were presented in colors that were either congruent or incongruent with the synesthetes' colors. Eye tracking analysis showed that synesthetes exhibited a color congruity bias—a propensity to fixate congruently colored letters more often and for longer durations than incongruently colored letters—in a naturalistic free-viewing task. In a more structured visual search task, this congruity bias caused synesthetes to rapidly fixate and identify congruently colored target letters, but led to problems in identifying incongruently colored target letters. The results are discussed in terms of their implications for perception in synesthesia.


For web page designers, it is important to consider how the visual components of a page affect its usability. Visual salience and clutter are two bottom-up stimulus factors that have been shown to affect attentional guidance. Visual salience is a measure of how much a given item or region in the visual field stands out relative to its surroundings, and clutter is a measure of how much visual information is present and how well it is organized. In this study, we examined the effects of visual salience and clutter in a visual search task on e-commerce pages. Clutter was manipulated by adding grids of varying densities to the background of the stimuli. On each trial, participants searched for an item that was either the most or the least salient item on the page, as determined by a computational model of visual salience (Itti, Koch, & Niebur, 1998). The results showed that highly salient targets were found faster than less salient targets, and search times also increased as clutter increased, but the two factors did not interact. We conclude that designers should consider both factors when possible.
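A small sketch of how the most and least salient items on a page might be identified from a precomputed saliency map. The study used the Itti, Koch, & Niebur (1998) model; the map and item bounding boxes below are stand-in assumptions, not the study's stimuli.

```python
# Sketch: choose the most and least salient items on a page from a saliency map.
# `saliency` stands in for the output of a model such as Itti, Koch, & Niebur (1998);
# item bounding boxes are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(1)
saliency = rng.random((600, 800))            # placeholder saliency map (H x W)
items = {                                    # item name -> (top, left, height, width)
    "shoe":   (50, 100, 80, 80),
    "watch":  (200, 400, 80, 80),
    "jacket": (450, 650, 80, 80),
}

def mean_salience(box):
    top, left, h, w = box
    return saliency[top:top + h, left:left + w].mean()

ranked = sorted(items, key=lambda name: mean_salience(items[name]))
print("least salient item:", ranked[0])
print("most salient item:", ranked[-1])
```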


2016 ◽  
Vol 22 (7) ◽  
pp. 695-704 ◽  
Author(s):  
Krista Schendel ◽  
Nina F. Dronkers ◽  
And U. Turken

Objectives: Imbalances in spatial attention are most often associated with right hemisphere brain injury. This report assessed 25 chronic left hemisphere stroke patients for attentional bias. Methods: Participants were evaluated with a computerized visual search task and a standardized neuropsychological assessment known as the Behavioral Inattention Test (BITC). Twenty age-matched controls were also tested. Results: Although little to no attentional impairment was observed on the BITC, the computerized visual search task revealed statistically significant contralesional attentional impairment in the left hemisphere stroke group. Specifically, these participants required 208 ms more viewing time, on average, to reliably detect visual targets on the right side of the display compared to detection on the left side, while controls showed a difference of only 8 ms between the two sides. Conclusions: The observation of significant leftward visuospatial bias in this chronic stroke group provides further evidence that the left hemisphere also plays a role in the balance of visual attention across space. These results have implications for left hemisphere patients who are often not screened for visuospatial problems, as well as for theories of visual attention which have primarily emphasized the role of the right hemisphere. (JINS, 2016, 22, 695–704)
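A minimal sketch of the side-difference measure reported above: mean detection time for right-side (contralesional) targets minus mean detection time for left-side targets. The trial data are made-up placeholders chosen to roughly echo the reported 208 ms cost.

```python
# Sketch: contralesional bias as the difference in mean detection time between
# right-side and left-side targets. Trial-level data here are placeholders.
import numpy as np

rng = np.random.default_rng(2)
rt_left = rng.normal(650, 60, size=40)    # ms, targets on the left side
rt_right = rng.normal(858, 60, size=40)   # ms, targets on the right (contralesional) side

bias_ms = rt_right.mean() - rt_left.mean()
print(f"rightward detection cost: {bias_ms:.0f} ms")
```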


Perception ◽  
1998 ◽  
Vol 27 (1) ◽  
pp. 21-33 ◽  
Author(s):  
Elizabeth Conlon ◽  
William Lovegrove ◽  
Trevor Hine ◽  
Eugene Chekaluk ◽  
Kerry Piatek ◽  
...  

Unpleasant somatic and perceptual side effects can be induced by viewing striped repetitive patterns, such as a square wave or a page of text. This sensitivity is greater in participants with higher scores on a scale of visual discomfort. In three experiments, we investigated the effect of this sensitivity on performance efficiency in a reading-like visual search task. In experiments 1 and 2, the global structure of the patterns was manipulated to produce a square-wave, a checkerboard, and a plaid pattern. The group suffering severe visual discomfort took significantly longer than the other groups to perform the task, with interference greatest for the square-wave-like pattern. This supports the prediction that visual attention is most strongly distracted from the local target elements by the pattern structure inducing the greatest visual discomfort. In experiment 3, the internal pattern components were manipulated and task difficulty was reduced. A no-interference pattern and two interference patterns were used: one with only a global characteristic, and one made up of distracting line elements containing both global and local components. The global pattern structure produced interference effects on the visual search task. All groups performed with the same speed and accuracy on the no-interference pattern, a finding attributed to the reduced task difficulty. McConkie and Zola's model of visual attention was used to explain these results.


Author(s):  
Kirsten C.S. Adam ◽  
John T. Serences

To find important objects, we must focus on our goals, ignore distractions, and take our changing environment into account. This is formalized in models of visual search whereby goal-driven, stimulus-driven and history-driven factors are integrated into a priority map that guides attention. History is invoked to explain behavioral effects that are neither wholly goal-driven nor stimulus-driven, but whether history likewise alters goal-driven and/or stimulus-driven signatures of neural priority is unknown. We measured fMRI responses in human visual cortex during a visual search task where trial history was manipulated (colors switched unpredictably or repeated). History had a near-constant impact on responses to singleton distractors, but not targets, from V1 through parietal cortex. In contrast, history-independent target enhancement was absent in V1 but increased across regions. Our data suggest that history does not alter goal-driven search templates, but rather modulates canonically stimulus-driven sensory responses to create a temporally-integrated representation of priority.
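A toy illustration of the priority-map idea the abstract formalizes: goal-driven, stimulus-driven, and history-driven maps combined (here as a simple weighted sum) to select the next attended location. The maps and weights are illustrative assumptions, not the authors' model or fitted parameters.

```python
# Sketch: a priority map as a weighted combination of goal-driven, stimulus-driven,
# and history-driven maps. Maps and weights are illustrative, not a fitted model.
import numpy as np

rng = np.random.default_rng(3)
shape = (20, 20)                              # coarse map of the visual field
goal_map = rng.random(shape)                  # match to the current search template
stimulus_map = rng.random(shape)              # bottom-up salience (e.g., color singleton)
history_map = rng.random(shape)               # e.g., boosted where recent features repeat

w_goal, w_stim, w_hist = 0.5, 0.3, 0.2        # arbitrary example weights
priority = w_goal * goal_map + w_stim * stimulus_map + w_hist * history_map

attended_location = np.unravel_index(priority.argmax(), shape)
print("next attended location:", attended_location)
```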


2021 ◽  
Vol 12 ◽  
Author(s):  
Leah R. Enders ◽  
Robert J. Smith ◽  
Stephen M. Gordon ◽  
Anthony J. Ries ◽  
Jonathan Touryan

Eye tracking has been an essential tool within the vision science community for many years. However, the majority of studies involving eye-tracking technology employ a relatively passive approach through the use of static imagery, prescribed motion, or video stimuli. This is in contrast to our everyday interaction with the natural world, where we navigate our environment while actively seeking and using task-relevant visual information. For this reason, an increasing number of vision researchers are employing virtual environment platforms, which offer interactive, realistic visual environments while maintaining a substantial level of experimental control. Here, we recorded eye movement behavior while subjects freely navigated through a rich, open-world virtual environment. Within this environment, subjects completed a visual search task in which they were asked to find and count occurrences of specific targets among numerous distractor items. We assigned each participant to one of four target conditions: Humvees, motorcycles, aircraft, or furniture. Our results show a statistically significant relationship between gaze behavior and target objects across target conditions, with increased visual attention toward assigned targets. Specifically, we see an increase in the number of fixations and an increase in dwell time on target relative to distractor objects. In addition, we included a divided attention task to investigate how search changed with the addition of a secondary task. With increased cognitive load, subjects slowed their movement speed, decreased gaze on objects, and increased the number of objects scanned in the environment. Overall, our results confirm previous findings and support the use of complex virtual environments for active visual search experimentation while maintaining a high level of precision in the quantification of gaze information and visual attention. This study contributes to our understanding of how individuals search for information in a naturalistic (open-world) virtual environment. Likewise, our paradigm provides an intriguing look into the heterogeneity of individual behaviors when completing an untimed visual search task while actively navigating.
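A minimal sketch of the gaze measures described above (fixation counts and dwell time on target versus distractor objects), computed from a fixation log. The object labels, durations, and log format are placeholders, not the study's recording pipeline.

```python
# Sketch: fixation count and dwell time on targets vs. distractors from a fixation log.
# Each record is (fixated_object_label, fixation_duration_ms); data are placeholders.
from collections import defaultdict

fixations = [
    ("humvee", 310), ("tree", 120), ("humvee", 450),
    ("motorcycle", 200), ("building", 95), ("humvee", 280),
]
target_label = "humvee"   # assigned target condition for this participant

counts = defaultdict(int)
dwell = defaultdict(int)
for label, duration in fixations:
    key = "target" if label == target_label else "distractor"
    counts[key] += 1
    dwell[key] += duration

print("fixation counts:", dict(counts))
print("dwell time (ms):", dict(dwell))
```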

