Short Article: Are Attractive Facial Characteristics Peculiar to the Sex of a Face?

2009 ◽  
Vol 62 (5) ◽  
pp. 833-843 ◽  
Author(s):  
Sarah J. Casey ◽  
Marion Mernagh ◽  
Fiona N. Newell

Preferences for faces are thought to result either from adaptations for mate selection, and thus to be influenced by sexual dimorphism, or from general information-processing mechanisms that are not specific to faces. If mate choice determines face preference, then the sex of a face should affect attractiveness judgements. To test this idea we used image morphing to generate three versions of face images: original, opposite-sex, and antiface. First we established that the sex of the face was identifiable in our images. We then collected attractiveness ratings for the three face types. Ratings for the original faces were correlated with, and did not differ significantly from, ratings for the opposite-sex faces. Ratings for the original and opposite-sex face types, however, were uncorrelated with, and significantly lower than, ratings for the antifaces. Our findings fail to support the idea that attractiveness is related to sexual dimorphism in faces alone and suggest instead that other, more generic factors influence preferences for all faces.

E-psychologie ◽  
2021 ◽  
Vol 15 (4) ◽  
pp. 84-90
Author(s):  
Lucie Kuncová ◽  
Zuzana Štěrbová ◽  
Jan Havlíček

The aim of this report is to present the research project "Effect of parental characteristics on mate choice", supported by the Czech Science Foundation (GA18–15168S). It is a multidisciplinary project combining psychological, biological, and chemical methods, contributing to a more comprehensive understanding of the studied phenomenon. Its main aim is to investigate whether people choose mates who resemble their opposite-sex parents in facial appearance, body odor, voice, temperament, and personality.


2021 ◽  
pp. 1-11
Author(s):  
Suphawimon Phawinee ◽  
Jing-Fang Cai ◽  
Zhe-Yu Guo ◽  
Hao-Ze Zheng ◽  
Guan-Chen Chen

The Internet of Things is considerably increasing the level of convenience at home, and the smart door lock is an entry product for smart homes. This work used a Raspberry Pi, chosen for its low cost, as the main control board to apply face recognition technology to a door lock. Installing a control sensing module on the GPIO expansion header of the Raspberry Pi also improved the door lock's antitheft mechanism. For ease of use, a mobile application (hereafter, app) was developed for users to upload their face images for processing. The app sends the images to Firebase; the program then downloads the images and crops the faces as a training set. The face detection system was built on machine learning, using the Haar cascade classifier built into OpenCV. The recognition system was trained with four methods: a convolutional neural network, VGG-16, VGG-19, and ResNet50. After training, the program could recognize the user's face and open the door lock. A prototype was constructed that could control the door lock and the antitheft system and stream real-time images from the camera to the app.
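The Haar-cascade detector mentioned in this abstract owes its speed to the integral-image trick, which makes any rectangular pixel sum cost four table lookups. The sketch below (plain Python, all function names illustrative; a real system would simply call OpenCV's `cv2.CascadeClassifier(...).detectMultiScale(...)`) shows that core operation and one two-rectangle Haar-like feature.

```python
# Minimal sketch of the integral-image computation behind Haar-cascade
# face detection. Illustrative only; not the paper's implementation.

def integral_image(img):
    """Cumulative sum table: ii[y][x] = sum of img[0..y][0..x]."""
    h, w = len(img), len(img[0])
    ii = [[0] * w for _ in range(h)]
    for y in range(h):
        row_sum = 0
        for x in range(w):
            row_sum += img[y][x]
            ii[y][x] = row_sum + (ii[y - 1][x] if y > 0 else 0)
    return ii

def rect_sum(ii, x0, y0, x1, y1):
    """Sum over the inclusive rectangle (x0,y0)-(x1,y1): four lookups."""
    total = ii[y1][x1]
    if x0 > 0:
        total -= ii[y1][x0 - 1]
    if y0 > 0:
        total -= ii[y0 - 1][x1]
    if x0 > 0 and y0 > 0:
        total += ii[y0 - 1][x0 - 1]
    return total

def haar_two_rect(ii, x, y, w, h):
    """Two-rectangle Haar-like feature: left half minus right half."""
    half = w // 2
    left = rect_sum(ii, x, y, x + half - 1, y + h - 1)
    right = rect_sum(ii, x + half, y, x + w - 1, y + h - 1)
    return left - right

if __name__ == "__main__":
    # Toy 4x4 "image": bright left half, dark right half.
    img = [[9, 9, 1, 1],
           [9, 9, 1, 1],
           [9, 9, 1, 1],
           [9, 9, 1, 1]]
    ii = integral_image(img)
    print(rect_sum(ii, 0, 0, 3, 3))       # 80: whole-image sum
    print(haar_two_rect(ii, 0, 0, 4, 4))  # 64: strong vertical-edge response
```

A cascade applies thousands of such features at every window position and scale, which is only tractable because each one is constant-time once the integral image exists.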


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Takao Fukui ◽  
Mrinmoy Chakrabarty ◽  
Misako Sano ◽  
Ari Tanaka ◽  
Mayuko Suzuki ◽  
...  

Eye movements toward sequentially presented face images with or without gaze cues were recorded to investigate whether individuals with ASD, in comparison with their typically developing (TD) peers, could prospectively perform the task according to gaze cues. Line-drawn face images were presented sequentially for one second each on a laptop display, shifting side-to-side and up-and-down. In the gaze-cue condition, the gaze of the face image was directed to the position where the next face would appear. Although the participants with ASD looked less at the eye area of the face image than their TD peers, they performed comparably smooth gaze shifts toward the gaze cue in the gaze-cue condition. This appropriate gaze shifting in the ASD group was more evident in the second half of trials than in the first half, as revealed by the mean proportion of fixation time on the eye area relative to valid gaze data in the early phase (during face image presentation) and by the time to first fixation on the eye area. These results suggest that individuals with ASD may benefit from a short period of trial experience by increasing their use of gaze cues.


Sensors ◽  
2021 ◽  
Vol 21 (6) ◽  
pp. 2003 ◽  
Author(s):  
Xiaoliang Zhu ◽  
Shihao Ye ◽  
Liang Zhao ◽  
Zhicheng Dai

Improving performance on the AFEW (Acted Facial Expressions in the Wild) dataset, a sub-challenge of EmotiW (the Emotion Recognition in the Wild challenge), is a popular benchmark for emotion recognition under real-world constraints, including uneven illumination, head deflection, and facial posture. In this paper, we propose a convenient facial expression recognition cascade network comprising spatial feature extraction, hybrid attention, and temporal feature extraction. First, in a video sequence, the face in each frame is detected and the corresponding face ROI (region of interest) is extracted to obtain the face images; the face images are then aligned based on the positions of the facial feature points. Second, the aligned face images are input to a residual neural network to extract the spatial features of the corresponding facial expressions, and the spatial features are passed to the hybrid attention module to obtain fused expression features. Finally, the fused features are fed to a gated recurrent unit to extract the temporal features of the facial expressions, and the temporal features are input to a fully connected layer to classify and recognize the expressions. Experiments on the CK+ (extended Cohn-Kanade), Oulu-CASIA (Institute of Automation, Chinese Academy of Sciences), and AFEW datasets obtained recognition accuracy rates of 98.46%, 87.31%, and 53.44%, respectively. This shows that the proposed method not only achieves performance comparable to state-of-the-art methods but also delivers more than a 2% improvement on the AFEW dataset, a significant gain for facial expression recognition in natural environments.
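The cascade described above flows per-frame spatial features through an attention module before temporal modeling. A shape-level caricature of the attention-fusion step, in plain Python, is given below; the ResNet backbone and the gated recurrent unit are mocked out, and every name and scoring rule here is an assumption for illustration, not the authors' code.

```python
# Shape-level sketch of attention fusion over per-frame spatial features.
# The scoring rule (L2 norm) is a stand-in for the learned hybrid
# attention module; the backbone and GRU are omitted.
import math

def softmax(scores):
    """Numerically stable softmax over a list of scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention_fuse(frame_feats):
    """Fuse per-frame feature vectors into one weighted-average vector.

    Each frame is scored (here: by its L2 norm, a placeholder for a
    learned attention score), the scores are softmax-normalized, and
    the frames are combined as a convex mixture."""
    scores = [math.sqrt(sum(v * v for v in f)) for f in frame_feats]
    weights = softmax(scores)
    dim = len(frame_feats[0])
    fused = [sum(w * f[i] for w, f in zip(weights, frame_feats))
             for i in range(dim)]
    return fused, weights

if __name__ == "__main__":
    # Three mock per-frame spatial feature vectors (e.g. from a ResNet).
    feats = [[0.2, 0.1], [1.0, 0.8], [0.3, 0.2]]
    fused, weights = attention_fuse(feats)
    print(fused)
    print(weights)  # convex weights; the strongest frame dominates
```

In the full pipeline the fused per-window features would then be passed as a sequence to the GRU and on to the fully connected classifier; the point of the fusion is that the weights sum to one, so uninformative frames are down-weighted rather than discarded.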


2011 ◽  
Vol 2011 ◽  
pp. 1-5 ◽  
Author(s):  
Atsushi Hirao

In avian mating systems, male domestic fowl are polygamous and mate with a number of selected members of the opposite sex. The factors that influence mating preference are generally considered to be visual cues. However, several studies have indicated that chemosensory cues also affect socio-sexual behavior, including mate choice and individual recognition. The female uropygial gland appears to provide odor cues for mate choice, as uropygial gland secretions are specific to an individual's body odor. Chicken olfactory bulbs send efferent projections to the nucleus taeniae, which is involved in copulatory behavior. Taken together, these reports suggest that the uropygial gland has the potential to act as the source of the social odor cues that dictate mate choice. In this review, evidence for the possible role of the uropygial gland in mate choice in domestic chickens is presented. It remains unclear, however, whether a relationship exists between the uropygial gland and major histocompatibility complex-dependent mate choice.


2000 ◽  
Vol 78 (4) ◽  
pp. 613-623 ◽  
Author(s):  
William MR Scully ◽  
M B Fenton ◽  
A SM Saleuddin

Using histological techniques at the light-microscope level, we examined and compared the structure and sexual dimorphism of the wing sacs and integumentary glandular scent organs of 11 species of microchiropteran bats. The antebrachial wing sacs of the Neotropical emballonurids Peropteryx macrotis, Saccopteryx bilineata, and Saccopteryx leptura differed in size and location but lacked sudoriferous and sebaceous glands, confirming that they are holding sacs rather than glandular scent organs. Glandular scent organs from the 11 species consisted of sebaceous and (or) sudoriferous glands in emballonurids (P. macrotis, S. bilineata, S. leptura, Taphozous melanopogon, Taphozous nudiventris), hipposiderids (Hipposideros fulvus, Hipposideros ater), the phyllostomid Sturnira lilium, the vespertilionid Rhogeessa aeneus, and molossids (Molossus ater and Molossus sinaloae). Glandular scent organs were located on the face (H. fulvus, H. ater), gular region (S. bilineata, P. macrotis, T. melanopogon, M. ater, M. sinaloae), chest (T. nudiventris), shoulder (S. lilium), or ears (R. aeneus). Glandular scent organs showed greater similarities within than between families, and were typically rudimentary or lacking in females. Scanning electron microscope examination revealed that the hairs associated with glandular areas of male T. melanopogon were larger than body hairs and had a different cuticular-scale pattern. These were osmetrichia: hairs specialized for holding and dispersing glandular products. In S. lilium, hairs associated with the shoulder scent-gland area were larger than body hairs but similar in cuticular-scale pattern.


2021 ◽  
Vol 5 (3) ◽  
pp. 42-46
Author(s):  
Luiz Eduardo Toledo Avelar

The mandible is the most important bone structure of the face. Its morphology varies with genetic factors, sexual dimorphism, and age. Among its particular characteristics is the ability to adapt to its counterpart, the base of the skull, a constant remodeling mechanism that gives this bone its dynamic quality. To understand the involvement of the mandible in the evaluation of the lower third of the face, a fractional analysis of its parts is necessary, considering the morphological parameters of the mandibular angle. The purpose of this study is to demonstrate the importance of the mandible as an instrument in the analysis of the lower third of the face, allowing aesthetic treatment that respects individual characteristics.


2016 ◽  
Vol 12 (1) ◽  
pp. 20150883 ◽  
Author(s):  
Natalia Albuquerque ◽  
Kun Guo ◽  
Anna Wilkinson ◽  
Carine Savalli ◽  
Emma Otta ◽  
...  

The perception of emotional expressions allows animals to evaluate one another's social intentions and motivations. This usually takes place within species; in the case of domestic dogs, however, it might be advantageous to recognize the emotions of humans as well as of other dogs. In this sense, combining visual and auditory cues to categorize others' emotions facilitates information processing and indicates high-level cognitive representations. Using a cross-modal preferential looking paradigm, we presented dogs with either human or dog faces with different emotional valences (happy/playful versus angry/aggressive), paired with a single vocalization from the same individual with either a positive or negative valence, or with Brownian noise. Dogs looked significantly longer at the face whose expression was congruent with the valence of the vocalization, for both conspecifics and heterospecifics, an ability previously known only in humans. These results demonstrate that dogs can extract and integrate bimodal sensory emotional information and discriminate between positive and negative emotions in both humans and dogs.
