Value-driven interference in visual search: Attention to reward-associated distractors.

2021 ◽  
Author(s):  
Stephan Koenig ◽  
David Torrents-Rodas ◽  
Metin Üngör ◽  
Harald Lachnit

We used an implicit learning paradigm to examine the acquisition of color-reward associations when colors were task-irrelevant and attention to color was detrimental to performance. Our task required a manual classification response to a shape target, and a correct response was rewarded with either 1 or 10 cents. The amount of reward was contingent on the color of a simultaneous color distractor, and different colors were associated with low reward (always 1 cent), partial reward (randomly either 1 or 10 cents), and high reward (always 10 cents). Attention to color was nonstrategic for maximizing reward because it interfered with the response to the target. We examined the potential of reward-associated colors to capture and hold overt attention automatically. Reward expectancy increased with the average amount of associated reward (low < partial < high). Reward uncertainty was highest for the partially rewarded distractor color (low < partial > high). Results revealed that capture frequency was linked to reward expectancy, while capture duration additionally seemed to be influenced by uncertainty, complementing previous findings of such a dissociation in appetitive and aversive learning (Koenig, Kadel, Uengoer, Schubö, & Lachnit, 2017; Koenig, Uengoer, & Lachnit, 2017).
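The expectancy and uncertainty orderings described above follow directly from the mean and variance of each color's reward schedule. A minimal sketch (not the authors' code; payoff values in cents, with the partial schedule assumed equiprobable):

```python
# Illustrative sketch of the three reward schedules: expectancy is the mean
# payoff, uncertainty the variance. All values are in cents.
from statistics import mean, pvariance

schedules = {
    "low":     [1, 1],    # always 1 cent
    "partial": [1, 10],   # randomly 1 or 10 cents (assumed equiprobable)
    "high":    [10, 10],  # always 10 cents
}

expectancy = {color: mean(v) for color, v in schedules.items()}
uncertainty = {color: pvariance(v) for color, v in schedules.items()}

# Expectancy increases monotonically: low < partial < high
assert expectancy["low"] < expectancy["partial"] < expectancy["high"]
# Uncertainty peaks for the partial schedule: low < partial > high
assert uncertainty["low"] < uncertainty["partial"] > uncertainty["high"]
```

This makes the dissociation testable: capture frequency should track the first ordering, capture duration the second.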

2017 ◽  
Vol 17 (10) ◽  
pp. 83
Author(s):  
Yoko Higuchi ◽  
Terumasa Endo ◽  
Satoshi Inoue ◽  
Takatsune Kumada

2009 ◽  
Vol 102 (6) ◽  
pp. 3481-3491 ◽  
Author(s):  
Koorosh Mirpour ◽  
Fabrice Arcizet ◽  
Wei Song Ong ◽  
James W. Bisley

In everyday life, we efficiently find objects in the world by moving our gaze from one location to another. The efficiency of this process is brought about by ignoring items that are dissimilar to the target and remembering which target-like items have already been examined. We trained two animals on a visual foraging task in which they had to find a reward-loaded target among five task-irrelevant distractors and five potential targets. We found that both animals performed the task efficiently, ignoring the distractors and rarely examining a particular target twice. We recorded the single-unit activity of 54 neurons in the lateral intraparietal area (LIP) while the animals performed the task. The responses of the neurons differentiated between targets and distractors throughout the trial. Further, the responses marked off targets that had been fixated by a reduction in activity. This reduction acted like inhibition of return in saliency map models; items that had been fixated would no longer be represented by high enough activity to draw an eye movement. This reduction could also be seen as a correlate of reward expectancy; after a target had been identified as not containing the reward, its activity was reduced. Within a trial, responses to the remaining targets did not increase as they became more likely to yield a reward, suggesting that only activity related to an event is updated on a moment-by-moment basis. Together, our data show that all the neural activity required to guide efficient search is present in LIP. Because LIP activity is known to correlate with saccade goal selection, we propose that LIP plays a significant role in the guidance of efficient visual search.
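The inhibition-of-return mechanism described above can be sketched as a toy saliency-map loop (hypothetical code in the spirit of such models, not the recorded LIP data): each item carries an activity level, the next saccade goes to the peak of the map, and a fixated item's activity is then reduced so it no longer draws an eye movement. Item names and activity values are illustrative assumptions.

```python
# Toy saliency-map model with inhibition of return. Potential targets start
# with high activity, task-irrelevant distractors with low activity; units
# are arbitrary.
activity = {"T1": 1.0, "T2": 1.0, "T3": 1.0,   # potential targets
            "D1": 0.2, "D2": 0.2}              # distractors
IOR_FACTOR = 0.1  # assumed post-fixation reduction in activity

scanpath = []
for _ in range(3):  # simulate three fixations
    item = max(activity, key=activity.get)  # saccade to the map's peak
    scanpath.append(item)
    activity[item] *= IOR_FACTOR  # reduced response marks the item examined

# Each target is fixated once; distractors never win the competition,
# and examined targets are not revisited.
assert sorted(scanpath) == ["T1", "T2", "T3"]
```

The key design point matches the abstract: nothing about the remaining targets is boosted as the trial progresses; only the activity of the just-fixated item is updated, event by event.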


2018 ◽  
Author(s):  
Niklas Johannes ◽  
Jonas Dora ◽  
Dorottya Rusz

Smartphones have been shown to distract people from their main tasks (e.g., studying, working), but the psychological mechanisms underlying these distractions are not yet clear. In the current study, we tested whether the distracting nature of smartphones stems from their high associated (social) reward value. Participants (N = 117) performed a visual search task while they were distracted by (a) high social reward cues (e.g., Facebook app icon + notification sign), (b) low social reward cues (e.g., Facebook app icon), and (c) no social reward cues (e.g., Weather app icon). We further expected that the distraction effect would be more pronounced for participants who had been deprived of using their phone. Contrary to our hypothesis, we found that smartphone cues that were presumably associated with high (vs. low or no) social rewards did not impair visual search speed. Surprisingly, deprived participants were faster than non-deprived participants. These results indicate that mere smartphone app icons are not necessarily associated with rewards. However, the absence of a smartphone may increase motivation, which in turn may boost performance.


2019 ◽  
Vol 82 (3) ◽  
p. 1537
Author(s):  
Francesco Cimminella ◽  
Sergio Della Sala ◽  
Moreno I. Coco

2011 ◽  
Vol 23 (9) ◽  
pp. 2231-2239 ◽  
Author(s):  
Carsten N. Boehler ◽  
Mircea A. Schoenfeld ◽  
Hans-Jochen Heinze ◽  
Jens-Max Hopf

Attention to one feature of an object can bias the processing of unattended features of that object. Here we demonstrate with ERPs in visual search that this object-based bias for an irrelevant feature also appears in an unattended object when it shares that feature with the target object. Specifically, we show that the ERP response elicited by a distractor object in one visual field is modulated as a function of whether a task-irrelevant color of that distractor is also present in the target object that is presented in the opposite visual field. Importantly, we find this modulation to arise with a delay of approximately 80 msec relative to the N2pc—a component of the ERP response that reflects the focusing of attention onto the target. In a second experiment, we demonstrate that this modulation reflects enhanced neural processing in the unattended object. Together, these observations support the surprising conclusion that the object-based selection of irrelevant features is spatially global even after attention has selected the target object.


2018 ◽  
Vol 44 (5) ◽  
pp. 707-721 ◽  
Author(s):  
Tom Beesley ◽  
Gunadi Hanafi ◽  
Miguel A. Vadillo ◽  
David. R. Shanks ◽  
Evan J. Livesey

2020 ◽  
Author(s):  
Joseph MacInnes ◽  
Ómar I. Jóhannesson ◽  
Andrey Chetverikov ◽  
Arni Kristjansson

We move our eyes roughly three times every second while searching complex scenes, but covert attention helps to guide where we allocate those overt fixations. Covert attention may be allocated reflexively or voluntarily, and it speeds the rate of information processing at the attended location. Reducing access to covert attention hinders performance, but it is not known to what degree the locus of covert attention is tied to the current gaze position. We compared visual search performance in a traditional gaze-contingent display with a second task in which a similarly sized contingent window was controlled with a mouse, allowing the covert aperture to be controlled independently of overt gaze. Larger apertures improved performance for both mouse-contingent and gaze-contingent trials, suggesting that covert attention was beneficial regardless of control type. We also found evidence that participants used the mouse-controlled aperture independently of gaze position, suggesting that participants attempted to untether their covert and overt attention when possible. This untethering manipulation, however, resulted in an overall cost to search performance, a result at odds with previous findings in a change blindness paradigm. Untethering covert and overt attention may therefore have costs or benefits depending on the task demands in each case.

