0679 Impaired Visual Processing in RBD Patients: A Visual Search Task Study

SLEEP ◽  
2018 ◽  
Vol 41 (suppl_1) ◽  
pp. A252-A252
Author(s):  
E Giora ◽  
A Galbiati ◽  
M Zucconi ◽  
L Ferini-Strambi

Visual Search for Object Categories Is Predicted by the Representational Architecture of High-Level Visual Cortex

2017 ◽  
Vol 117 (1) ◽  
pp. 388-402 ◽  
Author(s):  
Michael A. Cohen ◽  
George A. Alvarez ◽  
Ken Nakayama ◽  
Talia Konkle

Visual search is a ubiquitous visual behavior, and efficient search is essential for survival. Different cognitive models have explained the speed and accuracy of search based either on the dynamics of attention or on the similarity of item representations. Here, we examined the extent to which performance on a visual search task can be predicted from the stable representational architecture of the visual system, independent of attentional dynamics. Participants performed a visual search task with 28 conditions reflecting different pairs of categories (e.g., searching for a face among cars, a body among hammers, etc.). The time it took participants to find the target item varied as a function of category combination. In a separate group of participants, we measured the neural responses to these object categories when items were presented in isolation. Using representational similarity analysis, we then examined whether the similarity of neural responses across different subdivisions of the visual system had the requisite structure to predict visual search performance. Overall, we found strong brain/behavior correlations across most of the higher-level visual system, including both the ventral and dorsal pathways, when considering both macroscale sectors and smaller mesoscale regions. These results suggest that visual search for real-world object categories is well predicted by the stable, task-independent architecture of the visual system.

NEW & NOTEWORTHY Here, we ask which neural regions have neural response patterns that correlate with behavioral performance in a visual processing task. We found that the representational structure across all of high-level visual cortex has the requisite structure to predict behavior. Furthermore, when directly comparing different neural regions, we found that they all had highly similar category-level representational structures. These results point to a ubiquitous and uniform representational structure in high-level visual cortex underlying visual object processing.
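The core analysis, relating a neural representational dissimilarity matrix (RDM) to pairwise search times, can be sketched in a few lines. The snippet below is a minimal illustration on simulated data, not the authors' code; the array names, shapes, and the choices of correlation distance and Spearman correlation are assumptions made for the example (8 categories yield the 28 pairs mentioned above).

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

# Simulated stand-ins (illustrative only, not data from the study):
# neural_patterns: one mean response pattern per object category (categories x voxels),
#                  measured while items were viewed in isolation.
# search_rts:      mean search time for each pair of categories (target among distractors).
rng = np.random.default_rng(0)
n_categories = 8                                   # 8 categories -> 28 unique pairs
neural_patterns = rng.normal(size=(n_categories, 200))
search_rts = rng.normal(loc=800.0, scale=100.0,
                        size=n_categories * (n_categories - 1) // 2)

# 1. Neural RDM: one dissimilarity (1 - Pearson r) per category pair,
#    in the same condensed pair order as the behavioral measurements.
neural_rdm = pdist(neural_patterns, metric="correlation")

# 2. Brain/behavior correlation: if more dissimilar neural representations
#    support faster search, dissimilarity should correlate negatively with RT.
rho, p_value = spearmanr(neural_rdm, search_rts)
print(f"RDM vs. search RT: Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```

Repeating this computation separately for each region of interest and comparing the resulting correlations is, in spirit, how the regional comparison described in the abstract can be framed.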


2012 ◽  
Vol 65 (6) ◽  
pp. 1068-1085 ◽  
Author(s):  
Gary Lupyan ◽  
Daniel Swingley

People often talk to themselves, yet very little is known about the functions of this self-directed speech. We explore the effects of self-directed speech on visual processing using a visual search task. According to the label feedback hypothesis (Lupyan, 2007a), verbal labels can change ongoing perceptual processing: for example, actually hearing "chair", compared with simply thinking about a chair, can temporarily make the visual system a better "chair detector". Participants searched for common objects while sometimes being asked to speak the target's name aloud. Speaking facilitated search, particularly when there was a strong association between the name and the visual target. As the discrepancy between the name and the target increased, speaking began to impair performance. Together, these results speak to the power of words to modulate ongoing visual processing.


2006 ◽  
Vol 44 (8) ◽  
pp. 1137-1145 ◽  
Author(s):  
Oren Kaplan ◽  
Reuven Dar ◽  
Lirona Rosenthal ◽  
Haggai Hermesh ◽  
Mendel Fux ◽  
...  

2003 ◽  
Vol 41 (10) ◽  
pp. 1365-1386 ◽  
Author(s):  
Steven S. Shimozaki ◽  
Mary M. Hayhoe ◽  
Gregory J. Zelinsky ◽  
Amy Weinstein ◽  
William H. Merigan ◽  
...  
