Disgust Proneness and the Perception of Disgust-Evoking Pictures

2016 ◽  
Vol 30 (3) ◽  
pp. 124-129 ◽  
Author(s):  
Anne Schienle ◽  
Sonja Übel ◽  
Andreas Gremsl ◽  
Florian Schöngassner ◽  
Christof Körner

Abstract. Disgust has been conceptualized as an emotion that promotes disease-avoidance behavior. The present eye-tracking experiment investigated whether disgust-evoking stimuli provoke specific eye movements and pupillary responses. Forty-three women viewed images depicting disgusting, fear-eliciting, and neutral items, as well as fractals, while their eye movements (fixation duration and frequency, blinking rate, saccade amplitude) and pupil size were recorded. Disgust and fear ratings for the pictures, as well as trait disgust and trait anxiety, were assessed. The disgust pictures specifically evoked the target emotion and prompted characteristic scanning patterns: participants made more and shorter fixations when looking at the repulsive pictures than at all other categories. Moreover, participants' state and trait disgust correlated negatively with their pupil size during disgust elicitation. Our data point to disgust-specific visual exploration behavior, which may support the fast identification of health-threatening aspects of a stimulus.
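The analysis described above rests on two simple computations: per-category fixation statistics and a correlation between disgust scores and pupil size. A minimal sketch of both is below; the data, function names, and category labels are illustrative, not taken from the paper.

```python
from collections import defaultdict
from math import sqrt

def fixation_stats(fixations):
    """Mean fixation duration (ms) and fixation count per picture category.

    `fixations` is a list of (category, duration_ms) tuples pooled over trials.
    """
    durs = defaultdict(list)
    for cat, d in fixations:
        durs[cat].append(d)
    return {cat: (sum(ds) / len(ds), len(ds)) for cat, ds in durs.items()}

def pearson_r(xs, ys):
    """Plain Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Toy data: shorter, more frequent fixations for disgust than neutral pictures
fix = [("disgust", 180), ("disgust", 150), ("neutral", 300), ("neutral", 320)]
stats = fixation_stats(fix)
print(stats["disgust"][0] < stats["neutral"][0])  # prints True
```

A negative `pearson_r` between trait-disgust scores and mean pupil size would correspond to the reported pattern.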

2012 ◽  
Vol 5 (4) ◽  
Author(s):  
Antoine Coutrot ◽  
Nathalie Guyader ◽  
Gelu Ionescu ◽  
Alice Caplier

Models of visual attention rely on visual features such as orientation, intensity, or motion to predict which regions of complex scenes attract observers' gaze. So far, sound has not been considered as a feature that might influence eye movements. Here, we evaluate the impact of non-spatial sound on the eye movements of observers watching videos. We recorded the eye movements of 40 participants watching assorted videos with and without their soundtracks. We found that sound affects eye position, fixation duration, and saccade amplitude. The effect of sound is not constant over time but becomes significant around one second after the beginning of each video shot.


2021 ◽  
Author(s):  
Franziska Regnath ◽  
Sebastiaan Mathôt

Abstract. The adaptive gain theory (AGT) posits that activity in the locus coeruleus (LC) is linked to two behavioral modes: exploitation, characterized by focused attention on a single task; and exploration, characterized by a lack of focused attention and frequent switching between tasks. Furthermore, pupil size correlates with LC activity, such that large pupils indicate increased LC firing and, by extension, exploration behavior. Most evidence for this correlation in humans comes from complex behavior in game-like tasks. However, predictions of the AGT naturally extend to a very basic form of behavior: eye movements. To test this, we used a visual-search task. Participants searched for a target among many distractors while we measured their pupil diameter and eye movements. The display was divided into four randomly generated regions of different colors. Although these regions were irrelevant to the task, participants were sensitive to their boundaries and dwelled within regions for longer than expected by chance. Crucially, pupil size increased before eye movements that carried gaze from one region to another. We propose that eye movements that stay within regions (or objects) correspond to exploitation behavior, whereas eye movements that switch between regions (or objects) correspond to exploration behavior.
Public Significance Statement. When people experience increased arousal, their pupils dilate. The adaptive-gain theory proposes that pupil size reflects neural activity in the locus coeruleus (LC), which in turn is associated with two behavioral modes: a vigilant, distractible mode (“exploration”) and a calm, focused mode (“exploitation”). During exploration, pupils are larger and LC activity is higher than during exploitation. Here we show that the predictions of this theory generalize to eye movements: smaller pupils coincide with eye movements indicative of exploitation, while pupils dilate slightly just before eye movements indicative of exploration.


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Jinxia Wang ◽  
Xiaoying Sun ◽  
Jiachen Lu ◽  
HaoRan Dou ◽  
Yi Lei

Abstract. Previous research indicates that excessive fear is a critical feature of anxiety disorders; however, recent studies suggest that disgust may also contribute to the etiology and maintenance of some anxiety disorders. It remains unclear whether these two threat-related emotions differ in conditioning and generalization. Evaluating the distinct patterns of fear and disgust learning would facilitate a deeper understanding of how anxiety disorders develop. In this study, 32 college students completed threat conditioning tasks in which conditioned stimuli were paired with frightening or disgusting images. Fear and disgust were assigned to two randomly ordered blocks to examine differences by recording subjective US expectancy ratings and eye movements during conditioning and generalization. During conditioning, US expectancy ratings differed between fear and disgust only for the CS-, which may indicate that fear is associated with inferior discrimination learning. During the generalization test, participants showed greater US expectancy ratings for fear-related GS1 (generalized stimulus) and GS2 than for the corresponding disgust stimuli. Fear led to longer reaction times than disgust in both phases, and pupil size and fixation duration were larger for fear stimuli than for disgust stimuli, suggesting that disgust generalization has a steeper gradient than fear generalization. These findings provide preliminary evidence for differences between fear- and disgust-related stimuli in conditioning and generalization, and suggest insights into treatment for anxiety and other fear- or disgust-related disorders.


2021 ◽  
Vol 13 (13) ◽  
pp. 7463
Author(s):  
Amin Azimian ◽  
Carlos Alberto Catalina Ortega ◽  
Juan Maria Espinosa ◽  
Miguel Ángel Mariscal ◽  
Susana García-Herrero

Roundabouts are considered one of the most efficient forms of intersection, substantially reducing the types of crashes that result in injury or loss of life. Nevertheless, they do not eliminate collision risk, especially given the large role human error plays in traffic crashes. In this study, we used a driving simulator and an eye tracker to investigate drivers' eye movements under cell-phone-induced distraction. A total of 45 drivers participated in two experiments conducted under distracted and non-distracted conditions. The results indicated that, under distraction, drivers' fixation duration on roundabouts decreased significantly and pupil size increased significantly.


2020 ◽  
Vol 10 (5) ◽  
pp. 92
Author(s):  
Ramtin Zargari Marandi ◽  
Camilla Ann Fjelsted ◽  
Iris Hrustanovic ◽  
Rikke Dan Olesen ◽  
Parisa Gazerani

The affective dimension of pain contributes to pain perception. Cognitive load may influence pain-related feelings. Eye tracking has proven useful for objectively detecting cognitive-load effects via relevant eye movement characteristics. In this study, we investigated whether eye movement characteristics differ in response to pain-related feelings under low and high cognitive loads. A set of validated control and pain-related sounds was used to provoke pain-related feelings. Twelve healthy young participants (six females) performed a cognitive task at two load levels, once with the control sounds and once with the pain-related sounds, in randomized order. During the tasks, eye movements and task performance were recorded. Afterwards, participants filled out questionnaires on their pain perception in response to the applied cognitive loads. Our findings indicate that increased cognitive load was associated with decreased saccade peak velocity, saccade frequency, and fixation frequency, as well as increased fixation duration and pupil dilation range. Among the oculometrics, pain-related feelings were reflected only in pupillary responses under low cognitive load. Performance decreased and perceived cognitive load increased with the task load level; neither was influenced by the pain-related sounds. Pain-related feelings were lower when performing the task than when no task was performed, as shown in an independent group of participants, which might be due to cognitive engagement during the task. This study demonstrates that cognitive processing can moderate the feelings associated with pain perception.
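Oculometrics such as saccade peak velocity and saccade frequency are typically derived from raw gaze samples with a velocity-threshold (I-VT) detector. The sketch below is a generic illustration of that technique, not the pipeline used in the study; the 30 deg/s threshold and the sample format are assumptions.

```python
from math import hypot

def detect_saccades(samples, vel_thresh=30.0):
    """Velocity-threshold (I-VT) saccade detection on gaze samples.

    `samples` is a list of (t_sec, x_deg, y_deg) with strictly increasing
    timestamps. Consecutive sample pairs whose point-to-point velocity
    exceeds `vel_thresh` (deg/s) are grouped into saccades; returns one
    (amplitude_deg, peak_velocity_deg_s) tuple per detected saccade.
    """
    saccades, current = [], None
    for (t0, x0, y0), (t1, x1, y1) in zip(samples, samples[1:]):
        v = hypot(x1 - x0, y1 - y0) / (t1 - t0)
        if v > vel_thresh:
            if current is None:
                current = {"start": (x0, y0), "peak": v}
            current["peak"] = max(current["peak"], v)
            current["end"] = (x1, y1)
        elif current is not None:
            amp = hypot(current["end"][0] - current["start"][0],
                        current["end"][1] - current["start"][1])
            saccades.append((amp, current["peak"]))
            current = None
    if current is not None:  # trace ended mid-saccade
        amp = hypot(current["end"][0] - current["start"][0],
                    current["end"][1] - current["start"][1])
        saccades.append((amp, current["peak"]))
    return saccades
```

Saccade frequency is then the number of detected saccades divided by recording duration, and peak velocity can be averaged across saccades per condition.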


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Isabell Hubert Lyall ◽  
Juhani Järvikivi

Abstract. Research suggests that listeners' comprehension of spoken language is concurrently affected by linguistic and non-linguistic factors, including individual differences. However, there is no systematic research on whether general personality traits affect language processing. We correlated 88 native English-speaking participants' Big-5 traits with their pupillary responses to spoken sentences that included grammatical errors, "He frequently have burgers for dinner"; semantic anomalies, "Dogs sometimes chase teas"; and statements incongruent with gender-stereotyped expectations, such as "I sometimes buy my bras at Hudson's Bay" spoken by a male speaker. Generalized additive mixed models showed that listeners' Openness, Extraversion, Agreeableness, and Neuroticism traits modulated resource allocation to the three types of unexpected stimuli. No personality trait affected changes in pupil size across the board: less open participants showed greater pupil dilation when processing sentences with grammatical errors, and more introverted listeners showed greater pupil dilation in response to both semantic anomalies and socio-cultural clashes. Our study is the first to demonstrate that personality traits systematically modulate listeners' online language processing. Our results suggest that individuals with different personality profiles allocate cognitive resources differently during real-time language comprehension.


1991 ◽  
Vol 6 (1) ◽  
pp. 3-13 ◽  
Author(s):  
James T. McIlwain

Abstract. This paper reviews evidence that the superior colliculus (SC) of the midbrain represents visual direction and certain aspects of saccadic eye movements in the distribution of activity across a population of cells. Accurate and precise eye movements appear to be mediated, in part at least, by cells of the SC that have large sensory receptive fields and/or discharge in association with a range of saccades. This implies that visual points or saccade targets are represented by patches rather than points of activity in the SC. Perturbation of the pattern of collicular discharge by focal inactivation modifies saccade amplitude and direction in a way consistent with distributed coding. Several models have been advanced to explain how such a code might be implemented in the colliculus. Evidence related to these hypotheses is examined and continuing uncertainties are identified.


2021 ◽  
Vol 12 ◽  
Author(s):  
Jorge Oliveira ◽  
Marta Fernandes ◽  
Pedro J. Rosa ◽  
Pedro Gamito

Research on pupillometry provides increasing evidence for associations between pupil activity and memory processing. The most consistent finding is an increase in pupil size for old items compared with novel items, suggesting that pupil activity is associated with the strength of the memory signal. However, the time course of these changes is not completely known, particularly when items are presented in a running recognition task that maximizes interference by requiring recognition of the most recent items in a sequence of old/new items. The sample comprised 42 healthy participants who performed a visual word recognition task under varying retention intervals. Recognition responses were evaluated using behavioral measures of discrimination accuracy, reaction time, and confidence in recognition decisions. Pupil activity was recorded continuously throughout the experiment. The results suggest a decrease in recognition performance with increasing study-test retention interval. Pupil size decreased across retention intervals, while pupil old/new effects were found only for words recognized at the shortest retention interval. Pupillary responses showed a pronounced early constriction at retrieval under longer study-test lags, corresponding to weaker memory signals. However, pupil size was also sensitive to the subjective feeling of familiarity, as shown by pupil dilation to false alarms (new items judged as old). These results suggest that pupil size reflects not only the strength of the memory signal but also subjective familiarity decisions in a continuous recognition memory paradigm.
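A pupil old/new effect is conventionally computed on baseline-corrected traces: each trial's pre-stimulus mean is subtracted before averaging dilation within condition. The sketch below illustrates that standard preprocessing step in a generic form; the function names, window lengths, and data are illustrative assumptions, not the study's actual pipeline.

```python
def baseline_corrected(trace, baseline_n=10):
    """Subtractive baseline correction of one pupil trace.

    Subtracts the mean of the first `baseline_n` (pre-stimulus) samples
    from every sample, so traces are comparable across trials.
    """
    base = sum(trace[:baseline_n]) / baseline_n
    return [p - base for p in trace]

def old_new_effect(old_traces, new_traces, baseline_n=10):
    """Mean post-baseline pupil dilation for old items minus new items.

    A positive value corresponds to the classic pupil old/new effect
    (larger pupils for old than for new items).
    """
    def mean_dilation(traces):
        vals = [sum(baseline_corrected(t, baseline_n)[baseline_n:]) /
                (len(t) - baseline_n) for t in traces]
        return sum(vals) / len(vals)
    return mean_dilation(old_traces) - mean_dilation(new_traces)
```

The same per-trial dilation values could then be binned by retention interval to examine how the effect shrinks at longer study-test lags.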


PLoS ONE ◽  
2021 ◽  
Vol 16 (1) ◽  
pp. e0245777
Author(s):  
Fanny Poncet ◽  
Robert Soussignan ◽  
Margaux Jaffiol ◽  
Baptiste Gaudelus ◽  
Arnaud Leleu ◽  
...  

Recognizing facial expressions of emotions is a fundamental ability for adaptation to the social environment. To date, it remains unclear whether the spatial distribution of eye movements predicts accurate recognition or, on the contrary, confusion in the recognition of facial emotions. In the present study, we asked participants to recognize facial emotions while monitoring their gaze behavior using eye-tracking technology. In Experiment 1a, 40 participants (20 women) performed a classic facial emotion recognition task with a 5-choice procedure (anger, disgust, fear, happiness, sadness). In Experiment 1b, a second group of 40 participants (20 women) was exposed to the same materials and procedure, except that they were instructed to say whether (i.e., Yes/No response) the face expressed a specific emotion (e.g., anger), with the five emotion categories tested in distinct blocks. In Experiment 2, two groups of 32 participants performed the same task as in Experiment 1a while exposed to partial facial expressions composed of action units (AUs) present or absent in some parts of the face (top, middle, or bottom). The coding of the AUs produced by the models showed complex facial configurations for most emotional expressions, with several AUs in common. Eye-tracking data indicated that relevant facial actions were actively gazed at by the decoders during both accurate recognition and errors. False recognition was mainly associated with the additional visual exploration of less relevant facial actions in regions containing ambiguous AUs or AUs relevant to other emotional expressions. Finally, the recognition of facial emotions from partial expressions showed that no single facial action was necessary to effectively communicate an emotional state. Rather, the recognition of facial emotions relied on the integration of a complex set of facial cues.


2019 ◽  
Vol 19 (10) ◽  
pp. 252c
Author(s):  
Sebastiaan Mathôt ◽  
Adina Wagner ◽  
Michael Hanke
