free viewing
Recently Published Documents

Total documents: 277 (five years: 89)
H-index: 33 (five years: 5)

Infancy · 2022
Authors: Magdalena Krieber‐Tomantschger, Florian B. Pokorny, Iris Krieber‐Tomantschger, Laura Langmann, Luise Poustka, et al.

2021 · pp. 204946372110570
Authors: Fleur Baert, Dimitri Van Ryckeghem, Alvaro Sanchez-Lopez, Megan M Miller, Adam T Hirsh, et al.

Objectives: The current study investigated the role of maternal child- and self-oriented injustice appraisals about child pain in understanding maternal attention to child pain and adult anger cues, and maternal pain-attending behavior.
Methods: Forty-four children underwent a painful cold pressor task (CPT) while their mother observed. Eye tracking was used to measure maternal attention to child pain and adult anger cues. Initial attention allocation and attentional maintenance were indexed by the probability of first fixation and by gaze duration, respectively. Maternal pain-attending behaviors toward the child were videotaped and coded after CPT completion. Mothers also rated the intensity of the pain and anger cues used in the free-viewing tasks. All analyses controlled for maternal catastrophizing about child pain.
Results: Neither child-oriented nor self-oriented injustice was associated with maternal attentional bias toward child pain. Regarding attention toward self-relevant anger cues, differential associations were observed for self- and child-oriented injustice appraisals: maternal self-oriented injustice was associated with a greater probability of first fixating on anger and with higher anger ratings, whereas maternal child-oriented injustice was associated with enhanced attentional maintenance toward anger. Neither type of maternal injustice appraisal was associated with maternal pain-attending behavior, which was associated only with maternal catastrophizing.
Conclusions: The current study sheds light on potentially distinct mechanisms through which maternal self- vs. child-oriented injustice appraisals may affect parent and child pain-related outcomes. Theoretical implications and future directions are discussed.
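The two attention indices named in the abstract, probability of first fixation (initial allocation) and gaze duration (maintenance), can be sketched as follows. The data format, a list of (AOI label, duration in ms) fixations per trial, is a hypothetical simplification for illustration, not the authors' pipeline.

```python
def first_fixation_probability(trials, target_aoi):
    """Share of trials whose first fixation lands on the target AOI
    (an index of initial attention allocation)."""
    firsts = [trial[0][0] for trial in trials if trial]
    return sum(aoi == target_aoi for aoi in firsts) / len(firsts)

def gaze_duration(trials, target_aoi):
    """Mean total fixation time (ms) on the target AOI per trial
    (an index of attentional maintenance)."""
    totals = [sum(d for aoi, d in trial if aoi == target_aoi) for trial in trials]
    return sum(totals) / len(totals)

# Illustrative fixation logs: (AOI, duration_ms) per fixation, per trial.
trials = [
    [("pain_face", 220), ("anger_face", 180), ("pain_face", 300)],
    [("anger_face", 250), ("pain_face", 150)],
]
print(first_fixation_probability(trials, "pain_face"))  # 0.5
print(gaze_duration(trials, "pain_face"))               # 335.0
```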


2021 · Vol 15
Authors: Samy Rima, Michael C. Schmid

Small fixational eye movements are a fundamental aspect of vision and are thought to reflect fine shifts in covert attention during active viewing. While the perceptual benefits of these small eye movements have been demonstrated during a wide range of experimental tasks, including free viewing, their function during reading remains surprisingly unclear. Previous research demonstrated that readers with increased microsaccade rates displayed longer reading times. To what extent increased fixational eye movements are specific to reading, however, and whether they might be indicative of reading-skill deficits, remains unknown. To address this topic, we compared the eye-movement scan paths of 13 neurotypical individuals and 13 subjects diagnosed with developmental dyslexia during short-story reading and free viewing of natural scenes. We found that during reading only, dyslexics tended to display small eye movements more frequently than neurotypicals, though this effect was not significant at the population level, as it could also occur in slow readers not diagnosed as dyslexic. In line with previous research, neurotypical readers made twice as many regressive as progressive microsaccades during reading, a pattern that did not occur during free viewing. In contrast, dyslexics showed similar amounts of regressive and progressive small fixational eye movements during both reading and free viewing. We also found that participants with smaller fixational saccades, from both the neurotypical and dyslexic samples, displayed reduced reading speeds and lower scores on independent tests of reading skill. Slower readers also displayed greater variability in the landing points and temporal occurrence of their fixational saccades. Both the rate and the spatio-temporal variability of fixational saccades were associated with lower phonemic awareness scores.
As none of the observed differences between dyslexics and neurotypical readers occurred during control experiments with free viewing, the reported effects appear to be directly related to reading. In summary, our results highlight the predictive value of small saccades for reading skill, but not necessarily for developmental dyslexia.
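The regressive-versus-progressive comparison above can be sketched as a direction split over small saccades. Representing saccades as (dx, dy) displacements in degrees, with rightward dx as the reading direction and a 1° amplitude cutoff, is an assumed simplification of a real microsaccade-detection pipeline (e.g. velocity-threshold based), not the authors' method.

```python
def regression_ratio(saccades, max_amplitude=1.0):
    """Ratio of regressive (leftward) to progressive (rightward) saccades,
    keeping only those below a microsaccade amplitude cutoff (degrees)."""
    small = [(dx, dy) for dx, dy in saccades
             if (dx ** 2 + dy ** 2) ** 0.5 <= max_amplitude]
    regressive = sum(dx < 0 for dx, _ in small)
    progressive = sum(dx > 0 for dx, _ in small)
    return regressive / progressive if progressive else float("inf")

# Illustrative displacements; the 2.0-degree saccade is excluded as too large.
saccades = [(-0.3, 0.1), (-0.2, 0.0), (0.4, -0.1), (-0.5, 0.2), (2.0, 0.0)]
print(regression_ratio(saccades))  # 3.0 (three regressive, one progressive)
```

A ratio near 2.0 during reading would match the neurotypical pattern reported above; a ratio near 1.0 would match the dyslexic pattern.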


2021
Authors: Daria Kvasova, Travis Stewart, Salvador Soto-Faraco

In real-world scenes, the different objects and events available to our senses are interconnected within a rich web of semantic associations. These semantic links help parse information and make sense of the environment. For example, during goal-directed attention, characteristic everyday object sounds help speed up visual search for those objects in natural and dynamic environments. However, it is not known whether semantic correspondences also play a role under spontaneous observation. Here, we investigated this question by addressing whether crossmodal semantic congruence can drive spontaneous, overt visual attention in free-viewing conditions. We used eye tracking whilst participants (N=45) viewed video clips of realistic complex scenes presented alongside sounds of varying semantic congruency with objects within the videos. We found that characteristic sounds increased the probability of looking at, the number of fixations on, and the total dwell time on the semantically corresponding visual objects, compared to when the same scenes were presented with semantically neutral sounds or with background noise only. Our results suggest that crossmodal semantic congruence has an impact on spontaneous gaze and eye movements, and therefore on how attention samples information in a free-viewing paradigm. Our findings extend beyond known effects of object-based crossmodal interactions with simple stimuli and shed new light on how audio-visual semantically congruent relationships play out in everyday life scenarios.
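The three gaze measures compared across sound conditions can be sketched together. Each trial is again a hypothetical list of (AOI label, duration in ms) fixations; the labels and numbers are illustrative only, not the study's data.

```python
def gaze_measures(trials, target_aoi):
    """Probability of looking, mean fixation count, and mean total dwell
    time (ms) on the target object, across trials of one condition."""
    on_target = [[d for aoi, d in trial if aoi == target_aoi] for trial in trials]
    n = len(trials)
    return {
        "p_looking": sum(bool(f) for f in on_target) / n,       # any fixation?
        "fixation_count": sum(len(f) for f in on_target) / n,   # mean count
        "dwell_ms": sum(sum(f) for f in on_target) / n,         # mean dwell
    }

# Hypothetical "congruent sound" condition: a barking sound with a dog on screen.
congruent = [[("dog", 400), ("dog", 250)], [("dog", 300)], [("background", 500)]]
print(gaze_measures(congruent, "dog"))
```

Running the same function on a neutral-sound condition and comparing the three values is the shape of the contrast the abstract reports.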


2021 · Vol 12
Authors: Bastian I. Hougaard, Hendrik Knoche, Jim Jensen, Lars Evald

Purpose: Virtual reality (VR) and eye tracking may provide detailed insights into spatial cognition. We hypothesized that VR and eye tracking can be used to assess sub-types of spatial neglect in stroke patients that are not readily available from conventional assessments.
Method: Eighteen stroke patients with spatial neglect and 16 age- and gender-matched healthy subjects wearing VR headsets were asked to look around freely in a symmetric 3D museum scene with three pictures. Asymmetry of performance was analyzed to reveal group-level differences and possible neglect sub-types on an individual level.
Results: Four out of six VR and eye tracking measures revealed significant differences between patients and controls in this free-viewing task. Gaze asymmetry between pictures (including fixation time and count) and head orientation were most sensitive to spatial neglect behavior in the group-level analysis. Gaze asymmetry and head orientation each identified 10 out of 18 patients (56%), compared to 12 out of 18 (67%) for the best conventional test. Two neglect patients without deviant performance on conventional measures were captured by the VR and eye tracking measures. On the individual level, five stroke patients revealed deviant gaze asymmetry within pictures, and six patients revealed deviant eye orientation in either direction that was not captured by the group-level analysis.
Conclusion: This study is a first step in using VR in combination with eye tracking measures for individual differential neglect sub-type diagnostics. This may pave the way for more sensitive and elaborate sub-type diagnostics of spatial neglect that may respond differently to various treatment approaches.
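A lateral asymmetry score of the kind used to contrast neglect patients with controls can be sketched as a normalized left-right contrast. This particular formula (difference over sum of dwell times) is a common convention assumed here for illustration, not necessarily the exact measure computed in the study.

```python
def asymmetry_index(left_dwell_ms, right_dwell_ms):
    """Signed asymmetry in [-1, 1]: negative values mean less viewing time
    on the left (contralesional) side; 0 means symmetric viewing."""
    total = left_dwell_ms + right_dwell_ms
    return (left_dwell_ms - right_dwell_ms) / total if total else 0.0

print(asymmetry_index(1500, 1500))  # 0.0  symmetric, control-like viewing
print(asymmetry_index(300, 2700))   # -0.8 strong left-neglect pattern
```

The same contrast can be applied per picture (between-picture asymmetry) or within a single picture's left and right halves (within-picture asymmetry), mirroring the two gaze-asymmetry measures the study distinguishes.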


2021 · Vol 14 (1)
Authors: Hyoung F. Kim

Our behavior is often carried out automatically. Automatic behavior can be guided by past experiences, such as learned values associated with objects. Passive-viewing and free-viewing tasks with no immediate outcomes provide a testable condition in which monkeys and humans automatically retrieve value memories and perform habitual searching. Interestingly, in these tasks, caudal regions of the basal ganglia structures are involved in automatic retrieval of learned object values and habitual gaze. In contrast, rostral regions do not participate in these activities but instead monitor the changes in outcomes. These findings indicate that automatic behaviors based on the value memories are processed selectively by the caudal regions of the primate basal ganglia system. Understanding the distinct roles of the caudal basal ganglia may provide insight into finding selective causes of behavioral disorders in basal ganglia disease.


2021
Authors: Soo Hyun Park, Kenji W Koyano, Brian E Russ, Elena N Waidmann, David B.T. McMahon, et al.

During normal vision, our eyes provide the brain with a continuous stream of useful information about the world. How visually specialized areas of the cortex, such as face-selective patches, operate under natural modes of behavior is poorly understood. Here we report that, during the free viewing of videos, cohorts of face-selective neurons in the macaque cortex fractionate into distributed and parallel subnetworks that carry distinct information. We classified neurons into functional groups based on their video-driven coupling with fMRI time courses across the brain. Neurons from each group were distributed across multiple face patches but intermixed locally with other groups at each recording site. These findings challenge prevailing views about functional segregation in the cortex and underscore the importance of naturalistic paradigms for cognitive neuroscience.
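The grouping step described above, classifying neurons by their video-driven coupling with fMRI time courses, can be sketched as correlating each neuron's response with candidate region time courses and assigning it to the best match. This argmax-over-correlations scheme and the region names are assumed simplifications for illustration, not the authors' actual analysis.

```python
def correlate(xs, ys):
    """Pearson correlation between two equal-length time courses."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (sx * sy)

def functional_group(neuron_response, roi_timecourses):
    """Assign a neuron to the region whose video-driven fMRI time course
    it couples with most strongly."""
    profile = {roi: correlate(neuron_response, tc)
               for roi, tc in roi_timecourses.items()}
    return max(profile, key=profile.get)

# Hypothetical region time courses and one neuron's video-driven response.
rois = {"face_patch_AF": [1, 2, 3, 4], "area_V4": [4, 3, 2, 1]}
print(functional_group([0.9, 2.1, 3.0, 4.2], rois))  # face_patch_AF
```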


2021 · Vol 15
Authors: Antonella Pomè, Camilla Caponi, David C. Burr

Perceptual grouping and visual attention are two mechanisms that help segregate visual input into meaningful objects. Here we report how perceptual grouping, which affects perceived numerosity, is reduced when visual attention is engaged in a concurrent visual task. We asked participants to judge the numerosity of clouds of dot-pairs connected by thin lines, known to cause underestimation of numerosity, while simultaneously performing a color conjunction task. Diverting attention to the concomitant visual distractor significantly reduced the grouping-induced numerosity biases. Moreover, while the magnitude of the illusion under free viewing covaried strongly with autistic traits (as measured by Autism-Spectrum Quotient, AQ, scores), under conditions of divided attention the relationship was much reduced. These results suggest that divided attention modulates the perceptual grouping of elements by connectedness, and that this modulation is independent of the perceptual style of participants.


Authors: Alexander L. Anwyl-Irvine, Thomas Armstrong, Edwin S. Dalmaijer

Psychological research is increasingly moving online, where web-based studies allow for data collection at scale. Behavioural researchers are well supported by existing tools for participant recruitment, and for building and running experiments with decent timing. However, not all techniques are portable to the Internet: while eye tracking works in tightly controlled lab conditions, webcam-based eye tracking suffers from high attrition and poorer quality due to basic limitations like webcam availability, poor image quality, and reflections on glasses and the cornea. Here we present MouseView.js, an alternative to eye tracking that can be employed in web-based research. Inspired by the visual system, MouseView.js blurs the display to mimic peripheral vision, but allows participants to move a sharp aperture that is roughly the size of the fovea. Like eye gaze, the aperture can be directed to fixate on stimuli of interest. We validated MouseView.js in an online replication (N = 165) of an established free-viewing task (against N = 83 existing eye-tracking datasets), and in an in-lab direct comparison with eye tracking in the same participants (N = 50). MouseView.js proved as reliable as gaze, and produced the same pattern of dwell-time results. In addition, dwell-time differences from MouseView.js and from eye tracking correlated highly, and related to self-report measures in similar ways. The tool is open-source, implemented in JavaScript, and usable as a standalone library or within Gorilla, jsPsych, and PsychoJS. In sum, MouseView.js is a freely available instrument for attention tracking that is both reliable and valid, and that can replace eye tracking in certain web-based psychological experiments.

