Grapheme–Color Synesthesia Influences Overt Visual Attention

2009 ◽  
Vol 21 (2) ◽  
pp. 246-258 ◽  
Author(s):  
Jonathan S. A. Carriere ◽  
Daniel Eaton ◽  
Michael G. Reynolds ◽  
Mike J. Dixon ◽  
Daniel Smilek

For individuals with grapheme–color synesthesia, achromatic letters and digits elicit vivid perceptual experiences of color. We report two experiments that evaluate whether synesthesia influences overt visual attention. In these experiments, two grapheme–color synesthetes viewed colored letters while their eye movements were monitored. Letters were presented in colors that were either congruent or incongruent with the synesthetes' colors. Eye tracking analysis showed that synesthetes exhibited a color congruity bias—a propensity to fixate congruently colored letters more often and for longer durations than incongruently colored letters—in a naturalistic free-viewing task. In a more structured visual search task, this congruity bias caused synesthetes to rapidly fixate and identify congruently colored target letters, but led to problems in identifying incongruently colored target letters. The results are discussed in terms of their implications for perception in synesthesia.
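
As a rough illustration of how such a color congruity bias could be quantified from eye-tracking output, the sketch below (not the authors' analysis code; the `Fixation` record and its fields are hypothetical) tallies fixation counts and mean fixation durations separately for congruently and incongruently colored letters.

```python
# Illustrative sketch: summarizing a color congruity bias from a hypothetical
# fixation log, where each fixation on a letter is tagged with whether the
# display color matched the synesthete's color for that letter.
from dataclasses import dataclass

@dataclass
class Fixation:
    letter: str
    congruent: bool     # display color matches the synesthetic color?
    duration_ms: float

def congruity_bias(fixations: list[Fixation]) -> dict:
    """Return fixation counts and mean durations by color congruency."""
    stats = {True: {"count": 0, "total_ms": 0.0},
             False: {"count": 0, "total_ms": 0.0}}
    for fix in fixations:
        stats[fix.congruent]["count"] += 1
        stats[fix.congruent]["total_ms"] += fix.duration_ms
    return {
        ("congruent" if key else "incongruent"): {
            "n_fixations": s["count"],
            "mean_duration_ms": s["total_ms"] / s["count"] if s["count"] else 0.0,
        }
        for key, s in stats.items()
    }

# A congruity bias would show up as more and longer fixations on congruent letters.
log = [Fixation("A", True, 310), Fixation("B", False, 180), Fixation("A", True, 295)]
print(congruity_bias(log))
```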

2021 ◽  
Vol 12 ◽  
Author(s):  
Leah R. Enders ◽  
Robert J. Smith ◽  
Stephen M. Gordon ◽  
Anthony J. Ries ◽  
Jonathan Touryan

Eye tracking has been an essential tool within the vision science community for many years. However, the majority of studies involving eye-tracking technology employ a relatively passive approach through the use of static imagery, prescribed motion, or video stimuli. This is in contrast to our everyday interaction with the natural world, where we navigate our environment while actively seeking and using task-relevant visual information. For this reason, an increasing number of vision researchers are employing virtual environment platforms, which offer interactive, realistic visual environments while maintaining a substantial level of experimental control. Here, we recorded eye movement behavior while subjects freely navigated through a rich, open-world virtual environment. Within this environment, subjects completed a visual search task in which they were asked to find and count occurrences of specific targets among numerous distractor items. We assigned each participant to one of four target conditions: Humvees, motorcycles, aircraft, or furniture. Our results show a statistically significant relationship between gaze behavior and target objects across target conditions, with increased visual attention toward the assigned targets. Specifically, we observed an increase in the number of fixations and in dwell time on target objects relative to distractor objects. In addition, we included a divided attention task to investigate how search changed with the addition of a secondary task. With increased cognitive load, subjects slowed their movement speed, decreased gaze time on objects, and increased the number of objects scanned in the environment. Overall, our results confirm previous findings and support the use of complex virtual environments for active visual search experimentation, maintaining a high level of precision in the quantification of gaze information and visual attention. This study contributes to our understanding of how individuals search for information in a naturalistic (open-world) virtual environment. Likewise, our paradigm provides an intriguing look into the heterogeneity of individual behaviors when completing an untimed visual search task while actively navigating.
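
The gaze metrics reported here (fixation counts and dwell time on target versus distractor objects) could be summarized along the lines of the following sketch; the `GazeEvent` format and object labels are assumptions for illustration, not the study's actual processing pipeline.

```python
# Illustrative sketch (assumed data format): summarizing gaze behavior during
# free navigation as the number of fixations and total dwell time on target
# versus distractor objects.
from collections import defaultdict

# Each event: (object_label, is_target, dwell_ms) for one fixation on an object.
GazeEvent = tuple[str, bool, float]

def summarize_gaze(events: list[GazeEvent]) -> dict:
    summary = defaultdict(lambda: {"n_fixations": 0, "dwell_ms": 0.0})
    for _label, is_target, dwell_ms in events:
        key = "target" if is_target else "distractor"
        summary[key]["n_fixations"] += 1
        summary[key]["dwell_ms"] += dwell_ms
    return dict(summary)

# Example for a participant assigned to the 'motorcycles' target condition.
events = [("motorcycle_03", True, 420.0), ("chair_11", False, 150.0),
          ("motorcycle_07", True, 610.0)]
print(summarize_gaze(events))
```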


i-Perception ◽  
10.1068/ii44 ◽  
2014 ◽  
Vol 5 (5) ◽  
pp. 475-475 ◽  
Author(s):  
K. M. A. Mitchell ◽  
B. W. Tatler

2008 ◽  
Vol 69 (3) ◽  
pp. 140 ◽  
Author(s):  
A.V. Latanov ◽  
N.S. Konovalova ◽  
A.A. Yermachenko

2016 ◽  
Vol 22 (7) ◽  
pp. 695-704 ◽  
Author(s):  
Krista Schendel ◽  
Nina F. Dronkers ◽  
And U. Turken

Objectives: Imbalances in spatial attention are most often associated with right hemisphere brain injury. This report assessed 25 chronic left hemisphere stroke patients for attentional bias. Methods: Participants were evaluated with a computerized visual search task and a standardized neuropsychological assessment known as the Behavioral Inattention Test (BITC). Twenty age-matched controls were also tested. Results: Although little to no attentional impairment was observed on the BITC, the computerized visual search task revealed statistically significant contralesional attentional impairment in the left hemisphere stroke group. Specifically, these participants required 208 ms more viewing time, on average, to reliably detect visual targets on the right side of the display compared to detection on the left side, while controls showed a difference of only 8 ms between the two sides. Conclusions: The observation of significant leftward visuospatial bias in this chronic stroke group provides further evidence that the left hemisphere also plays a role in the balance of visual attention across space. These results have implications for left hemisphere patients who are often not screened for visuospatial problems, as well as for theories of visual attention which have primarily emphasized the role of the right hemisphere. (JINS, 2016, 22, 695–704)
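
The lateralized cost reported above (208 ms for patients versus 8 ms for controls) is simply the difference between mean detection times for right-side and left-side targets. A minimal sketch, assuming a hypothetical per-trial record of target side and detection time, is shown below.

```python
# Minimal sketch (hypothetical trial format): the lateralized attention cost is
# the difference between mean detection times for right-side and left-side
# targets, computed separately for each group.
from statistics import mean

def lateral_cost(trials: list[tuple[str, float]]) -> float:
    """trials: (target_side 'left'|'right', detection_time_ms). Returns right - left."""
    right = [t for side, t in trials if side == "right"]
    left = [t for side, t in trials if side == "left"]
    return mean(right) - mean(left)

# A left-hemisphere patient might show a cost near 208 ms; controls, near 8 ms.
patient_trials = [("left", 900.0), ("right", 1110.0), ("left", 880.0), ("right", 1090.0)]
print(lateral_cost(patient_trials))  # positive values = slower detection on the right
```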


2014 ◽  
Author(s):  
Daniel Buttaccio ◽  
Nicholas D. Lange ◽  
Rick P. Thomas ◽  
Michael Dougherty
