Predicting tooth color from facial features and gender: Results from a white elderly cohort

2008 ◽  
Vol 99 (2) ◽  
pp. 101-106 ◽  
Author(s):  
Alexander J. Hassel ◽  
Ina Nitschke ◽  
Jens Dreyhaupt ◽  
Ina Wegener ◽  
Peter Rammelsberg ◽  
...  
2021 ◽  
Author(s):  
Nicole X Han ◽  
Puneeth N. Chakravarthula ◽  
Miguel P. Eckstein

Face processing is fast and efficient due to its evolutionary and social importance. A majority of people direct their first eye movement to a featureless point just below the eyes that maximizes accuracy in recognizing a person's identity and gender. Yet the exact properties or features of the face that guide the first eye movements and reduce fixational variability are unknown. Here, we manipulated the presence of facial features and the spatial configuration of features to investigate their effect on the location and variability of first and second fixations to peripherally presented faces. Results showed that observers can utilize the face outline, individual facial features, and feature spatial configuration to guide the first eye movements to their preferred point of fixation. The eyes have a preferential role in guiding the first eye movements and reducing fixation variability. Eliminating the eyes or altering their position had the greatest influence on the location and variability of fixations and resulted in the largest detriment to face identification performance. The other internal features (nose and mouth) also contribute to reducing fixation variability. A subsequent experiment measuring detection of single features showed that the eyes have the highest detectability (relative to the other features) in the visual periphery, providing a strong sensory signal to guide the oculomotor system. Together, the results suggest a flexible multiple-cue strategy that might be a robust solution to how varying eccentricities in the real world affect the ability to resolve individual feature properties, and to the preferential role of the eyes.


2018 ◽  
Vol 28 (1) ◽  
pp. e96-e102
Author(s):  
Tahir Karaman ◽  
Eyyup Altintas ◽  
Bekir Eser ◽  
Tuba Talo Yildirim ◽  
Faruk Oztekin ◽  
...  

2019 ◽  
Vol 8 (4) ◽  
pp. 6670-6674

Face recognition is the most important part of identifying people in a biometric system, and it is the most widely used biometric modality. This paper focuses on human face recognition by computing the facial features present in an image and recognizing the person from those features. Every face recognition system follows preprocessing and face detection steps. This paper focuses mainly on face detection and gender classification, which are performed in two stages: the first stage is face detection using an enhanced Viola-Jones algorithm, and the second stage is gender classification. The input video or surveillance footage is converted into frames, and a few of the best frames are selected for face detection, each image first being preprocessed using PSNR. After preprocessing, face detection is performed, and a comparative analysis of gender classification is carried out using a neural network classifier and an LBP-based classifier.
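The PSNR-based frame selection mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes float images scaled to [0, 1], a hypothetical reference image for quality comparison, and omits the enhanced Viola-Jones detector and the gender classifiers entirely.

```python
import numpy as np


def psnr(reference: np.ndarray, frame: np.ndarray, peak: float = 1.0) -> float:
    """Peak signal-to-noise ratio between a reference image and a frame.

    Higher PSNR means the frame is closer to the reference. Images are
    assumed to be float arrays scaled to [0, peak].
    """
    mse = np.mean((reference.astype(np.float64) - frame.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(peak ** 2 / mse)


def select_best_frames(frames, reference, k=3):
    """Return the k frames with the highest PSNR relative to the reference."""
    return sorted(frames, key=lambda f: psnr(reference, f), reverse=True)[:k]
```

The selected frames would then be passed to a face detector (e.g., a Viola-Jones cascade) before classification.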


2021 ◽  
Vol 7 ◽  
pp. e804
Author(s):  
Marcos Fernández Carbonell ◽  
Magnus Boman ◽  
Petri Laukka

We investigated emotion classification from brief video recordings from the GEMEP database wherein actors portrayed 18 emotions. Vocal features consisted of acoustic parameters related to frequency, intensity, spectral distribution, and durations. Facial features consisted of facial action units. We first performed a series of person-independent supervised classification experiments. Best performance (AUC = 0.88) was obtained by merging the output from the best unimodal vocal (Elastic Net, AUC = 0.82) and facial (Random Forest, AUC = 0.80) classifiers using a late fusion approach and the product rule method. All 18 emotions were recognized with above-chance recall, although recognition rates varied widely across emotions (e.g., high for amusement, anger, and disgust; and low for shame). Multimodal feature patterns for each emotion are described in terms of the vocal and facial features that contributed most to classifier performance. Next, a series of exploratory unsupervised classification experiments were performed to gain more insight into how emotion expressions are organized. Solutions from traditional clustering techniques were interpreted using decision trees to explore which features underlie clustering. Another approach utilized various dimensionality reduction techniques paired with inspection of data visualizations. Unsupervised methods did not cluster stimuli in terms of emotion categories, but several explanatory patterns were observed. Some could be interpreted in terms of valence and arousal, but actor- and gender-specific aspects also contributed to clustering. Identifying explanatory patterns holds great potential as a meta-heuristic when unsupervised methods are used in complex classification tasks.
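The product-rule late fusion described above can be sketched as follows (a minimal illustration under assumed inputs, not the authors' code): per-class probabilities from the two unimodal classifiers are multiplied element-wise and renormalized, and the fused class is the argmax.

```python
import numpy as np


def product_rule_fusion(vocal_probs: np.ndarray, facial_probs: np.ndarray) -> np.ndarray:
    """Late fusion via the product rule.

    Each input is an (n_samples, n_classes) array of class probabilities
    from one unimodal classifier. Probabilities are multiplied per class
    and renormalized so each fused row sums to 1.
    """
    fused = vocal_probs * facial_probs
    return fused / fused.sum(axis=1, keepdims=True)


def predict(vocal_probs: np.ndarray, facial_probs: np.ndarray) -> np.ndarray:
    """Predicted class index per sample after product-rule fusion."""
    return product_rule_fusion(vocal_probs, facial_probs).argmax(axis=1)
```

The product rule tends to favor classes on which both modalities agree, since a low probability from either classifier suppresses the fused score.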


SAGE Open ◽  
2020 ◽  
Vol 10 (2) ◽  
pp. 215824402092335
Author(s):  
Rong Shi

Previous research has focused on documenting the perceptual mechanisms of facial expressions of so-called basic emotions; however, little is known about eye movements in recognizing crying expressions. The present study aimed to clarify the visual pattern and the role of face gender in recognizing smiling and crying expressions. Behavioral reactions and fixation durations were recorded, and proportions of fixation counts and viewing time directed at facial features (eye, nose, and mouth areas) were calculated. Results indicated that crying expressions were processed and recognized faster than smiling expressions. Across expressions, the eye and nose areas received more attention than the mouth area, but for smiling facial expressions, participants fixated longer on the mouth area. It seems that proportional gaze allocation at facial features was quantitatively modulated by different expressions, but overall gaze distribution was qualitatively similar across crying and smiling facial expressions. Moreover, eye movements showed that visual attention was modulated by the gender of faces: participants looked longer at female faces with smiling expressions than at male faces. Findings are discussed in relation to the perceptual mechanisms underlying facial expression recognition and the interaction between gender and expression processing.


2015 ◽  
Vol 26 (2) ◽  
pp. 107-114 ◽  
Author(s):  
Cristina Gómez-Polo ◽  
Javier Montero ◽  
Miguel Gómez-Polo ◽  
Juan Antonio Martínez Vázquez de Parga ◽  
Alicia Celemin-Viñuela

2021 ◽  
Vol 12 ◽  
Author(s):  
Teresa Luther ◽  
Carolin A. Lewis ◽  
Melina Grahlow ◽  
Philippa Hüpen ◽  
Ute Habel ◽  
...  

The categorization of dominant facial features, such as sex, is a highly relevant function for social interaction. It has been found that attributes of the perceiver, such as their biological sex, influence the perception of sexually dimorphic facial features, with women showing higher recognition performance for female faces than men. However, evidence on how aspects closely related to biological sex influence face sex categorization is scarce. Using a previously validated set of sex-morphed facial images (morphed from male to female and vice versa), we aimed to investigate the influence of the participant's gender role identification and sexual orientation on face sex categorization, in addition to their biological sex. Image ratings and questionnaire data on gender role identification and sexual orientation were collected from 67 adults (34 female). Contrary to previous literature, biological sex per se was not significantly associated with image ratings. However, an influence of participants' sexual attraction and gender role identity became apparent: participants identifying with male gender attributes and reporting attraction toward females perceived masculinized female faces as more male and feminized male faces as more female compared with participants identifying with female gender attributes and reporting attraction toward males. Considering that we found these effects in a predominantly cisgender and heterosexual sample, investigating face sex perception in individuals identifying with a gender different from their assigned sex (i.e., transgender people) might provide further insight into how assigned sex and gender identity are related.

