Perception of eye gaze direction in a case of acquired prosopagnosia

2019 ◽  
Vol 3 (Fall 2019) ◽  
pp. 105-119
Author(s):  
Britt Erni ◽  
Roland Maurer ◽  
Dirk Kerzel ◽  
Nicolas Burra

The ability to perceive the direction of eye gaze is critical in social settings. Brain lesions in the superior temporal sulcus (STS) impair this ability. We investigated the perception of gaze direction of PS, a patient suffering from acquired prosopagnosia (Rossion et al., 2003). Despite lesions in the face network, the STS was spared in PS. We assessed perception of gaze direction in PS with upright, inverted, and contrast-reversed faces. Compared to the performance of 11 healthy women matched for age and education, PS demonstrated abnormal discrimination of gaze direction with upright and contrast-reversed faces, but not with inverted faces. Our findings suggest that the inability of the patient to process faces holistically weakened her perception of gaze direction, especially in demanding tasks.

2005 ◽  
Vol 94 (2) ◽  
pp. 1252-1266 ◽  
Author(s):  
Wania C. De Souza ◽  
Satoshi Eifuku ◽  
Ryoi Tamura ◽  
Hisao Nishijo ◽  
Taketoshi Ono

The anterior superior temporal sulcus (STS) of macaque monkeys is thought to be involved in the analysis of incoming perceptual information for face recognition or identification; face neurons in the anterior STS show tuning to facial views and/or gaze direction in the faces of others. Although it is well known that both the anatomical architecture and the connectivity differ between the rostral and caudal regions of the anterior STS, the functional heterogeneity of these regions is not well understood. We recorded the activity of face neurons in the anterior STS of macaque monkeys during the performance of a face identification task, and we compared the characteristics of face neuron responses in the caudal and rostral regions of the anterior STS. In the caudal region, facial views that elicited optimal responses were distributed among all views tested; the majority of face neurons responded symmetrically to right and left views. In contrast, the face neurons in the rostral region responded optimally to a single oblique view; right-left symmetry among the responses of these neurons was less evident. Modulation of the face neuron responses according to gaze direction was more evident in the rostral region. Some of the face neuron responses were specific to a certain combination of a particular facial view and a particular gaze direction, whereas others were associated with the relative spatial relationship between facial view and gaze direction. Taken together, these results indicated the existence of a functional heterogeneity within the anterior STS and suggested a plausible hierarchical organization of facial information processing.


The direction of eye gaze and the orientation of the face towards or away from another are important social signals for man and for macaque monkey. We have studied the effects of these signals in a region of the macaque temporal cortex where cells have been found to be responsive to the sight of faces. Of the cells selectively responsive to the sight of the face or head but not to other objects (182 cells), 63% were sensitive to the orientation of the head. Different views of the head (full face, profile, back or top of the head, face rotated by 45° up to the ceiling or down to the floor) maximally activated different classes of cell. All classes of cell, however, remained active as the preferred view was rotated isomorphically or was changed in size or distance. Isomorphic rotation by 90–180° increased cell response latencies by 10–60 ms. Sensitivity to gaze direction was found for 64% of the cells tested that were tuned to head orientation. Eighteen cells most responsive to the full face preferred eye contact, while 18 cells tuned to the profile face preferred averted gaze. Sensitivity to gaze was thus compatible with, but could be independent of, sensitivity to head orientation. Results suggest that the recognition of one type of object may proceed via the independent high-level analysis of several restricted views of the object (viewer-centred descriptions).


Perception ◽  
2020 ◽  
Vol 49 (3) ◽  
pp. 330-356
Author(s):  
Pik Ki Ho ◽  
Fiona N. Newell

We investigated whether the perceived attractiveness of expressive faces was influenced by head turn and eye gaze towards or away from the observer. In all experiments, happy faces were consistently rated as more attractive than angry faces. A head turn towards the observer, whereby a full-face view was shown, was associated with relatively higher attractiveness ratings when gaze direction was aligned with face view (Experiment 1). However, the preference for full-face views of happy faces was not affected by gaze shifts towards or away from the observer (Experiment 2a). In Experiment 3, the relative duration of each face view (front-facing or averted at 15°) during a head turn away from or towards the observer was manipulated. There was a benefit in attractiveness ratings for happy faces shown for a longer duration in the front view, regardless of the direction of head turn. Our findings support previous studies indicating a preference for positive expressions in attractiveness judgements, which is further enhanced by front views of faces, whether presented during a head turn or shown statically. In sum, our findings imply a complex interaction between cues of social attention, indicated by the view of the face shown, and reward in attractiveness judgements of unfamiliar faces.


1999 ◽  
Vol 42 (3) ◽  
pp. 526-539 ◽  
Author(s):  
Charissa R. Lansing ◽  
George W. McConkie

Two experiments were conducted to test the hypothesis that visual information related to segmental versus prosodic aspects of speech is distributed differently on the face of the talker. In the first experiment, eye gaze was monitored for 12 observers with normal hearing. Participants made decisions about segmental and prosodic categories for utterances presented without sound. The first experiment found that observers spend more time looking at and direct more gazes toward the upper part of the talker's face in making decisions about intonation patterns than about the words being spoken. The second experiment tested the Gaze Direction Assumption underlying Experiment 1—that is, that people direct their gaze to the stimulus region containing information required for their task. In this experiment, 18 observers with normal hearing made decisions about segmental and prosodic categories under conditions in which face motion was restricted to selected areas of the face. The results indicate that information in the upper part of the talker's face is more critical for intonation pattern decisions than for decisions about word segments or primary sentence stress, thus supporting the Gaze Direction Assumption. Visual speech perception proficiency requires learning where to direct visual attention for cues related to different aspects of speech.


2021 ◽  
Author(s):  
Guillaume Lio ◽  
Martina Corazzol ◽  
Roberta Fadda ◽  
Giuseppe Doneddu ◽  
Caroline Demily ◽  
...  

Attention to faces and eye contact are key behaviors for establishing social bonds in humans. Autism Spectrum Disorder (ASD) is characterized by poor communication skills, impaired face processing, and gaze avoidance. The biological alterations underlying these impairments are still unclear. Using electroencephalography, multivariate pattern classification, and blind source separation methods, we searched for neural signals related to faces and face components that could best discriminate visual processing in neurotypical and ASD individuals. We isolated a face-specific neural signal in the superior temporal sulcus peaking at 240 ms after stimulus onset. A machine learning algorithm applied to the extracted neural component reached 74% decoding accuracy at the same latencies, dissociating neurotypical subjects from ASD subjects, in whom this signal was weak. By manipulating attention to face parts, we found that the signal's evoked power in neurotypicals varied as a function of the distance of the eyes in the face stimulus from the viewer's fovea. Such selective neural modulations by faces and face components were not found in ASD individuals, although they showed early face-related P100 and N170 signals. These findings show that dedicated cortical mechanisms related to face perception set neural priority for attention to eyes, and that these mechanisms are altered in individuals with ASD.
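The decoding step described above can be illustrated with a minimal sketch. This is not the authors' pipeline (which used multivariate pattern classification on EEG source components); it only shows the general idea of estimating decoding accuracy for a single scalar feature, here a hypothetical "evoked power" of a neural component, using leave-one-out cross-validation and a nearest-class-mean classifier. All values and group differences below are simulated assumptions.

```python
import random
import statistics

def loo_nearest_mean_accuracy(values, labels):
    """Leave-one-out accuracy of a nearest-class-mean classifier on one
    scalar feature (e.g. evoked power of an EEG component). For each held-out
    sample, class means are computed from the remaining samples only."""
    correct = 0
    for i in range(len(values)):
        train_v = values[:i] + values[i + 1:]
        train_l = labels[:i] + labels[i + 1:]
        mean_0 = statistics.mean(v for v, l in zip(train_v, train_l) if l == 0)
        mean_1 = statistics.mean(v for v, l in zip(train_v, train_l) if l == 1)
        pred = 0 if abs(values[i] - mean_0) < abs(values[i] - mean_1) else 1
        correct += (pred == labels[i])
    return correct / len(values)

random.seed(0)
# Simulated evoked power: stronger in the neurotypical group (label 0)
# than in the ASD group (label 1), with overlapping distributions.
neurotypical = [random.gauss(1.0, 0.5) for _ in range(20)]
asd_group = [random.gauss(0.4, 0.5) for _ in range(20)]
accuracy = loo_nearest_mean_accuracy(neurotypical + asd_group,
                                     [0] * 20 + [1] * 20)
print(round(accuracy, 2))
```

With overlapping group distributions, the cross-validated accuracy lands well above chance (0.5) but below perfect separation, which is the pattern a figure like the reported 74% reflects.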


2014 ◽  
Vol 23 (3) ◽  
pp. 132-139 ◽  
Author(s):  
Lauren Zubow ◽  
Richard Hurtig

Children with Rett Syndrome (RS) are reported to use multiple modalities to communicate, although their intentionality is often questioned (Bartolotta, Zipp, Simpkins, & Glazewski, 2011; Hetzroni & Rubin, 2006; Sigafoos et al., 2000; Sigafoos, Woodyatt, Tucker, Roberts-Pennell, & Pittendreigh, 2000). This paper presents the results of a study analyzing the unconventional vocalizations of a child with RS. The primary research question addresses the ability of familiar and unfamiliar listeners to interpret unconventional vocalizations as “yes” or “no” responses. This paper also addresses the acoustic analysis and perceptual judgments of these vocalizations. Pre-recorded isolated vocalizations of “yes” and “no” were presented to 5 listeners (mother, father, 1 unfamiliar clinician, and 2 familiar clinicians), who were asked to rate each vocalization as either “yes” or “no.” The ratings were compared to the original identification made by the child's mother during the face-to-face interaction from which the samples were drawn. The findings suggest that, in this case, the child's vocalizations were intentional and could be interpreted by familiar and unfamiliar listeners as either “yes” or “no” without contextual or visual cues. The results suggest that communication partners should be trained to attend to eye gaze and vocalizations to ensure the child's intended choice is accurately understood.


2014 ◽  
Vol 2014 ◽  
pp. 1-11 ◽  
Author(s):  
Dina Tell ◽  
Denise Davidson ◽  
Linda A. Camras

Eye gaze direction and expression intensity effects on emotion recognition in children with autism disorder and typically developing children were investigated. Children with autism disorder and typically developing children identified happy and angry expressions equally well. Children with autism disorder, however, were less accurate in identifying fear expressions across intensities and eye gaze directions. Children with autism disorder rated expressions with direct eye gaze, and expressions at 50% intensity, as more intense than typically developing children did. A trend was also found for sad expressions, as children with autism disorder were less accurate in recognizing sadness at 100% intensity with direct eye gaze than typically developing children. Although the present research showed that children with autism disorder are sensitive to eye gaze direction, impairments in the recognition of fear, and possibly sadness, exist. Furthermore, children with autism disorder and typically developing children perceive the intensity of emotional expressions differently.


Author(s):  
Marishetti Niharika

Eye gaze is a fundamental nonverbal interaction that emerging technology is increasingly building upon. This eye-blink device facilitates communication for people with disabilities. The process is simple: the user blinks at specific keys on a virtual keyboard. Such a system can synthesize speech, control the user's environment, and provide a significant boost in self-confidence. Our study emphasises the virtual keyboard, which includes not only alphabetic keys but also emergency phrases for requesting help in a variety of scenarios. It can also provide voice notifications and speech assistance to those who are speech-impaired. To achieve this, we used the computer's integrated digital camera, which detects the face and its features; as a result, face detection is the least complicated part of the pipeline. An eye blink acts as a mouse click on the digital interface. Our goal is to enable nonverbal communication, so that physically impaired people can communicate with the help of a voice assistant. This type of innovation is a blessing for those who have lost their voice or who suffer from paralytic ailments.
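The blink-to-click mechanism described above can be sketched in a few lines. This is an illustrative reconstruction, not the authors' implementation: it assumes eye landmarks are already available from some face-landmark detector, and the eye-aspect-ratio (EAR) formulation, the threshold, and the frame count are all example assumptions. The idea is that a deliberate blink (eye closed for several consecutive frames, then reopened) registers as a click, while a brief flicker does not.

```python
import math

def ear(eye):
    """Eye aspect ratio from six (x, y) landmarks p1..p6 around one eye:
    ratio of the two vertical distances to the horizontal distance.
    Drops toward 0 as the eye closes."""
    p1, p2, p3, p4, p5, p6 = eye
    dist = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))

class BlinkClicker:
    """Registers a 'click' when the EAR stays below a threshold for at least
    min_frames consecutive frames and the eye then reopens."""
    def __init__(self, threshold=0.21, min_frames=3):
        self.threshold = threshold
        self.min_frames = min_frames
        self.closed_frames = 0

    def update(self, ear_value):
        if ear_value < self.threshold:
            self.closed_frames += 1
            return False
        clicked = self.closed_frames >= self.min_frames
        self.closed_frames = 0
        return clicked

# Hypothetical landmark sets for an open and a closed eye.
open_eye = [(0, 0), (1, 0.5), (2, 0.5), (3, 0), (2, -0.5), (1, -0.5)]
closed_eye = [(0, 0), (1, 0.05), (2, 0.05), (3, 0), (2, -0.05), (1, -0.05)]

clicker = BlinkClicker()
frames = [open_eye] * 2 + [closed_eye] * 4 + [open_eye]
clicks = [clicker.update(ear(e)) for e in frames]
print(clicks)  # click fires only when the eye reopens after 4 closed frames
```

In a real system the landmarks would come from the camera feed each frame, and the click would be dispatched to whichever virtual-keyboard key the gaze or a scanning cursor currently highlights.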


2021 ◽  
Author(s):  
Fumihiro Kano ◽  
Takeshi Furuichi ◽  
Chie Hashimoto ◽  
Christopher Krupenye ◽  
Jesse G Leinwand ◽  
...  

The gaze-signaling hypothesis and the related cooperative-eye hypothesis posit that humans have evolved special external eye morphology, including exposed white sclera (the white of the eye), to enhance the visibility of eye-gaze direction and thereby facilitate conspecific communication through joint-attentional interaction and ostensive communication. However, recent quantitative studies questioned these hypotheses based on new findings that humans are not necessarily unique in certain eye features compared to other great ape species. There is therefore currently a heated debate on whether the external eye features of humans are distinct from those of other apes, and how such distinctive features contribute to the visibility of eye-gaze direction. This study leveraged updated image analysis techniques to test the uniqueness of human eye features in facial images of great apes. Although many eye features were similar between humans and other species, a key difference was that humans have uniformly white sclera, which creates clear visibility of both the eye outline and the iris: the two essential features contributing to the visibility of eye-gaze direction. We then tested the robustness of the visibility of these features against visual noise such as darkening and distancing, and found that both eye features remain detectable in the human eye, while the eye outline becomes barely detectable in other species under these visually challenging conditions. Overall, we identified that humans have distinctive external eye morphology among the great apes, which ensures the robustness of the eye-gaze signal across varied visual conditions. Our results support, and also critically update, the central premises of the gaze-signaling hypothesis.

