The face of fear: Effects of eye gaze and emotion on visual attention

2003 ◽  
Vol 10 (7) ◽  
pp. 823-835 ◽  
Author(s):  
Andrew Mathews ◽  
Elaine Fox ◽  
Jenny Yiend ◽  
Andy Calder


2014 ◽  
Vol 23 (3) ◽  
pp. 132-139 ◽  
Author(s):  
Lauren Zubow ◽  
Richard Hurtig

Children with Rett Syndrome (RS) are reported to use multiple modalities to communicate although their intentionality is often questioned (Bartolotta, Zipp, Simpkins, & Glazewski, 2011; Hetzroni & Rubin, 2006; Sigafoos et al., 2000; Sigafoos, Woodyatt, Tucker, Roberts-Pennell, & Pittendreigh, 2000). This paper will present results of a study analyzing the unconventional vocalizations of a child with RS. The primary research question addresses the ability of familiar and unfamiliar listeners to interpret unconventional vocalizations as “yes” or “no” responses. This paper will also address the acoustic analysis and perceptual judgments of these vocalizations. Pre-recorded isolated vocalizations of “yes” and “no” were presented to 5 listeners (mother, father, 1 unfamiliar, and 2 familiar clinicians), and the listeners were asked to rate the vocalizations as either “yes” or “no.” The ratings were compared to the original identification made by the child's mother during the face-to-face interaction from which the samples were drawn. Findings of this study suggest that, in this case, the child's vocalizations were intentional and could be interpreted by familiar and unfamiliar listeners as either “yes” or “no” without contextual or visual cues. The results suggest that communication partners should be trained to attend to eye gaze and vocalizations to ensure the child's intended choice is accurately understood.
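A minimal sketch, with invented labels rather than the study's data, of how listener ratings of this kind could be compared against the mother's original identifications, using percent agreement and a chance-corrected measure such as Cohen's kappa:

```python
from collections import Counter

def percent_agreement(reference, ratings):
    """Proportion of vocalizations rated the same as the reference labels."""
    matches = sum(r == s for r, s in zip(reference, ratings))
    return matches / len(reference)

def cohens_kappa(reference, ratings):
    """Chance-corrected agreement between one listener and the reference."""
    n = len(reference)
    observed = percent_agreement(reference, ratings)
    ref_counts = Counter(reference)
    rat_counts = Counter(ratings)
    # Expected agreement by chance, summed over the two response categories.
    expected = sum(ref_counts[c] * rat_counts[c] for c in ("yes", "no")) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical example: mother's original labels vs. one listener's ratings.
reference = ["yes", "no", "yes", "yes", "no", "no", "yes", "no"]
listener  = ["yes", "no", "yes", "no",  "no", "no", "yes", "yes"]
print(percent_agreement(reference, listener))  # 0.75
print(cohens_kappa(reference, listener))       # 0.5
```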


Author(s):  
Marishetti Niharika

Eye gaze is a fundamental nonverbal interaction that emerging technology is increasingly able to exploit. The eye-blink device described here facilitates communication for people with disabilities: the user selects keys on a virtual keyboard simply by blinking at them. Such a system can synthesize speech, control the user's environment, and substantially boost the individual's confidence. Our study focuses on the virtual keyboard, which includes not only alphabetic keys but also emergency phrases that can summon help in a variety of scenarios, and it provides voice notification and speech assistance to people who are speech-impaired. To achieve this, we use the computer's integrated digital camera to detect the face and its features, which keeps the face-detection stage comparatively simple. A blink of the eye then acts as a mouse click on the digital interface. Our goal is to support nonverbal communication so that physically impaired people can communicate with the help of a voice assistant. This kind of innovation is a boon for those who have lost their voice or suffer from paralytic ailments.
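As a rough illustration of the blink-as-click idea (not the authors' implementation), the sketch below computes the eye aspect ratio (EAR) from eye landmarks that a webcam face-landmark detector such as dlib or MediaPipe would supply, and treats a sustained drop in EAR as a deliberate blink that selects the currently highlighted virtual key. The landmark ordering, threshold, and frame count are assumptions.

```python
import numpy as np

EAR_THRESHOLD = 0.21      # assumed: below this value the eye is considered closed
BLINK_FRAMES = 3          # assumed: closed for this many frames counts as a deliberate blink

def eye_aspect_ratio(eye):
    """EAR from six (x, y) eye landmarks ordered as in the common 68-point scheme."""
    eye = np.asarray(eye, dtype=float)
    vertical = np.linalg.norm(eye[1] - eye[5]) + np.linalg.norm(eye[2] - eye[4])
    horizontal = np.linalg.norm(eye[0] - eye[3])
    return vertical / (2.0 * horizontal)

class BlinkClicker:
    """Turns a stream of per-frame EAR values into 'click' events on a virtual key."""
    def __init__(self):
        self.closed_frames = 0

    def update(self, ear, highlighted_key):
        if ear < EAR_THRESHOLD:
            self.closed_frames += 1
            return None
        # Eye re-opened: a long enough closure counts as a click on the highlighted key.
        clicked = highlighted_key if self.closed_frames >= BLINK_FRAMES else None
        self.closed_frames = 0
        return clicked

# Hypothetical usage with a scanning keyboard that highlights one key per frame.
clicker = BlinkClicker()
for ear, key in [(0.30, "A"), (0.15, "B"), (0.14, "B"), (0.13, "B"), (0.29, "B")]:
    selection = clicker.update(ear, key)
    if selection:
        print("Selected key:", selection)   # -> Selected key: B
```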


2015 ◽  
Vol 9 (4) ◽  
Author(s):  
Songpo Li ◽  
Xiaoli Zhang ◽  
Fernando J. Kim ◽  
Rodrigo Donalisio da Silva ◽  
Diedra Gustafson ◽  
...  

Laparoscopic robots have been widely adopted in modern medical practice. However, explicitly interacting with these robots may increase the physical and cognitive load on the surgeon. An attention-aware robotic laparoscope system has been developed to free the surgeon from the technical limitations of visualization through the laparoscope. This system can implicitly recognize the surgeon's visual attention by interpreting the surgeon's natural eye movements using fuzzy logic and then automatically steer the laparoscope to focus on that viewing target. Experimental results show that this system makes surgeon–robot interaction more effective and intuitive, and it has the potential to make the execution of the surgery smoother and faster.
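The paper's fuzzy-logic details are not reproduced here, so the following is only a toy sketch of the general idea: triangular membership functions map fixation duration and gaze dispersion to an "attention" score, and the laparoscope is recentred on the gaze point when that score is high enough. The breakpoints, rule, and threshold are invented.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def attention_level(fixation_ms, dispersion_px):
    """Fuzzy estimate (0..1) that the surgeon is attending to the fixated point."""
    long_fixation = tri(fixation_ms, 150, 600, 2000)   # invented breakpoints
    tight_cluster = tri(dispersion_px, -1, 0, 60)
    # Single Mamdani-style rule: long fixation AND tight cluster -> attending.
    return min(long_fixation, tight_cluster)

def steer_if_attending(gaze_xy, fixation_ms, dispersion_px, threshold=0.5):
    """Return a new laparoscope target when fuzzy attention exceeds the threshold."""
    if attention_level(fixation_ms, dispersion_px) >= threshold:
        return gaze_xy        # recentre the view on the attended point
    return None               # otherwise keep the current view

print(steer_if_attending((320, 240), fixation_ms=700, dispersion_px=15))  # (320, 240)
```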


2019 ◽  
Vol 30 (6) ◽  
pp. 893-906 ◽  
Author(s):  
Zachary Witkower ◽  
Jessica L. Tracy

Research on face perception tends to focus on facial morphology and the activation of facial muscles while ignoring any impact of head position. We raise questions about this approach by demonstrating that head movements can dramatically shift the appearance of the face to shape social judgments without engaging facial musculature. In five studies (total N = 1,517), we found that when eye gaze was directed forward, tilting one’s head downward (compared with a neutral angle) increased perceptions of dominance, and this effect was due to the illusory appearance of lowered and V-shaped eyebrows caused by a downward head tilt. Tilting one’s head downward therefore functions as an action-unit imposter, creating the artificial appearance of a facial action unit that has a strong effect on social perception. Social judgments about faces are therefore driven not only by facial shape and musculature but also by movements in the face’s physical foundation: the head.


2019 ◽  
Vol 29 (10) ◽  
pp. 1441-1451 ◽  
Author(s):  
Melina Nicole Kyranides ◽  
Kostas A. Fanti ◽  
Maria Petridou ◽  
Eva R. Kimonis

Individuals with callous-unemotional (CU) traits show deficits in facial emotion recognition. According to preliminary research, this impairment may be due to attentional neglect of people's eyes when evaluating emotionally expressive faces. However, it is unknown whether this atypical processing pattern is unique to established variants of CU traits or modifiable with intervention. This study examined facial affect recognition and gaze patterns among individuals (N = 80; M age = 19.95, SD = 1.01 years; 50% female) with primary vs. secondary CU variants. These groups were identified based on repeated measurements of conduct problems, CU traits, and anxiety assessed in adolescence and adulthood. Accuracy and number of fixations on areas of interest (forehead, eyes, and mouth) while viewing six dynamic emotions were assessed. A visual probe was used to direct attention to various parts of the face. Individuals with primary and secondary CU traits were less accurate than controls in recognizing facial expressions across all emotions. Those identified in the low-anxious primary-CU group showed reduced overall fixations to fearful and painful facial expressions compared to those in the high-anxious secondary-CU group. This difference was not specific to a region of the face (i.e., eyes or mouth). Findings point to the importance of investigating both accuracy and eye-gaze fixations, since individuals in the primary and secondary groups were differentiated only in the way they attended to specific facial expressions. These findings have implications for differentiated interventions that aim to improve facial emotion recognition by targeting both attention to and correct identification of emotions.
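A minimal sketch, with made-up coordinates, of how fixations might be counted per area of interest (forehead, eyes, mouth) from a list of fixation points; the AOI boxes and trial data are illustrative only, not the study's stimuli.

```python
# Hypothetical AOI bounding boxes in stimulus pixel coordinates: (x_min, y_min, x_max, y_max).
AOIS = {
    "forehead": (100, 40, 300, 120),
    "eyes":     (100, 120, 300, 200),
    "mouth":    (150, 260, 250, 330),
}

def count_fixations(fixations, aois=AOIS):
    """Count fixations falling inside each area of interest."""
    counts = {name: 0 for name in aois}
    for x, y in fixations:
        for name, (x0, y0, x1, y1) in aois.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                counts[name] += 1
                break
    return counts

# Made-up fixation sequence for one trial.
trial = [(180, 150), (210, 160), (200, 300), (205, 70), (400, 400)]
print(count_fixations(trial))  # {'forehead': 1, 'eyes': 2, 'mouth': 1}
```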


2019 ◽  
Vol 14 (9) ◽  
pp. 967-976 ◽  
Author(s):  
Jellina Prinsen ◽  
Kaat Alaerts

Previous research has shown a link between eye contact and interpersonal motor resonance, indicating that the mirroring of observed movements is enhanced when accompanied by mutual eye contact between actor and observer. Here, we further explored the role of eye contact within a naturalistic two-person action context. Twenty-two participants observed simple hand movements combined with direct or averted gaze presented via a live model in a two-person setting or via video recordings, while transcranial magnetic stimulation was applied over the primary motor cortex (M1) to measure changes in M1 excitability. Skin conductance responses and gaze behavior were also measured to investigate the role of arousal and visual attention in these effects. Eye contact significantly enhanced excitability of the observer's M1 during movement observation within a two-person setting. Notably, participants with higher social responsiveness (Social Communication subscale of the Social Responsiveness Scale) displayed a more pronounced modulation of M1 excitability by eye gaze. Gaze-related modulations in M1 excitability were, however, not associated with differences in visual attention or autonomic arousal. In summary, the current study highlights the effectiveness and feasibility of adopting paradigms with high ecological validity for studying the modulation of mirror system processes by subtle social cues, such as eye gaze.


2019 ◽  
Vol 3 (Fall 2019) ◽  
pp. 105-119 ◽  
Author(s):  
Britt Erni ◽  
Roland Maurer ◽  
Dirk Kerzel ◽  
Nicolas Burra

The ability to perceive the direction of eye gaze is critical in social settings. Brain lesions in the superior temporal sulcus (STS) impair this ability. We investigated the perception of gaze direction of PS, a patient suffering from acquired prosopagnosia (Rossion et al., 2003). Despite lesions in the face network, the STS was spared in PS. We assessed perception of gaze direction in PS with upright, inverted, and contrast-reversed faces. Compared to the performance of 11 healthy women matched for age and education, PS demonstrated abnormal discrimination of gaze direction with upright and contrast-reversed faces, but not with inverted faces. Our findings suggest that the inability of the patient to process faces holistically weakened her perception of gaze direction, especially in demanding tasks.


Author(s):  
Hideyoshi Yanagisawa ◽  
Kyosuke Tagashira ◽  
Tamotsu Murakami

In the design of kansei (emotional) quality, one important issue is to extract causal relations between physical design attributes and the customer's emotional responses. Without such relations, a designer has to rely on his or her own sense, which may differ from the customer's. In this paper, we propose a new method for extracting logical rules, consisting of combinations of design attributes, that explain a customer's emotional judgment of product appearance. In the method, we apply a reduct calculation from rough set theory to derive candidate causal rules between design attributes and emotional judgments, and we use the customer's eye-gaze features to refine those rules. We extract two types of visual attention (VA), single visual attention (SVA) and combinational visual attention (CVA), using the proposed gaze features. To demonstrate the effectiveness of the method, we conducted a sensory evaluation experiment using a car-interior design as a case study. In the experiment, multiple participants evaluated their impressions of multiple design samples by selecting from a set of words. During the experiment, we recorded the participants' eye-gaze movements as coordinates on a screen and asked them to vocalize what they were thinking. After the evaluation of each design sample, we conducted a retrospective interview. From the results, we confirmed that the estimated SVA and CVA significantly covered the vocalized thoughts and the statements made in the retrospective interview. The estimated VA eliminated 53% of the erroneous causal rules and improved the quality of the rules. We also found a case where two participants making the same emotional judgment had implicitly different points of view when evaluating the same design sample. Most conventional causality analyses fail to capture such diversity of viewpoints.
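The authors' reduct computation is not reproduced here; the sketch below only illustrates the underlying rough-set idea on toy data: find minimal subsets of design attributes that still discern every pair of samples receiving different emotional judgments, and read candidate causal rules from those subsets. Attribute names and values are invented.

```python
from itertools import combinations

# Toy decision table: design attributes of samples and the emotional judgment ("decision").
samples = [
    ({"color": "dark",  "texture": "leather", "shape": "round"},  "luxurious"),
    ({"color": "dark",  "texture": "plastic", "shape": "round"},  "plain"),
    ({"color": "light", "texture": "leather", "shape": "square"}, "luxurious"),
    ({"color": "light", "texture": "plastic", "shape": "square"}, "plain"),
]
attributes = ["color", "texture", "shape"]

def is_consistent(attrs):
    """True if no two samples share values on `attrs` but differ in judgment."""
    seen = {}
    for conditions, decision in samples:
        key = tuple(conditions[a] for a in attrs)
        if key in seen and seen[key] != decision:
            return False
        seen[key] = decision
    return True

def reducts():
    """All minimal attribute subsets that preserve the judgment-discerning power."""
    found = []
    for size in range(1, len(attributes) + 1):
        for subset in combinations(attributes, size):
            if is_consistent(subset) and not any(set(r) <= set(subset) for r in found):
                found.append(subset)
    return found

print(reducts())  # [('texture',)] -- texture alone explains the toy judgments
```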


1998 ◽  
Vol 9 (2) ◽  
pp. 131-134 ◽  
Author(s):  
Bruce M. Hood ◽  
J. Douglas Willen ◽  
Jon Driver

Two experiments examined whether infants shift their visual attention in the direction toward which an adult's eyes turn. A computerized modification of previous joint-attention paradigms revealed that infants as young as 3 months attend in the same direction as the eyes of a digitized adult face. This attention shift was indicated by the latency and direction of their orienting to peripheral probes presented after the face was extinguished. A second experiment found a similar influence of direction of perceived gaze, but also that less peripheral orienting occurred if the central face remained visible during presentation of the probe. This may explain why attention shifts triggered by gaze perception have been difficult to observe in infants using previous naturalistic procedures. Our new method reveals both that direction of perceived gaze can be discriminated by young infants and that this perception triggers corresponding shifts of their own attention.
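As an illustration of how such an attention shift is typically quantified (not the authors' analysis code), the sketch below computes mean orienting latencies to probes appearing on the gazed-at (congruent) versus opposite (incongruent) side; the trial data are invented.

```python
# Hypothetical trials: (probe side relative to perceived gaze, orienting latency in ms).
trials = [
    ("congruent", 310), ("incongruent", 365), ("congruent", 295),
    ("incongruent", 380), ("congruent", 330), ("incongruent", 350),
]

def mean_latency(trials, condition):
    values = [latency for cond, latency in trials if cond == condition]
    return sum(values) / len(values)

congruent = mean_latency(trials, "congruent")      # ~311.7 ms
incongruent = mean_latency(trials, "incongruent")  # 365.0 ms
# A positive cueing effect means faster orienting toward the gazed-at side.
print(f"Gaze-cueing effect: {incongruent - congruent:.1f} ms")  # ~53.3 ms
```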

