Effect of transportation and social isolation on facial expressions of healthy horses

PLoS ONE ◽  
2021 ◽  
Vol 16 (6) ◽  
pp. e0241532
Author(s):  
Johan Lundblad ◽  
Maheen Rashid ◽  
Marie Rhodin ◽  
Pia Haubro Andersen

Horses have the ability to generate a remarkable repertoire of facial expressions, some of which have been linked to the affective component of pain. This study describes the facial expressions in healthy horses free of pain before and during transportation and social isolation, which are putatively stressful but ordinary management procedures. Transportation was performed in 28 horses by subjecting them to short-term road transport in a horse trailer. A subgroup (n = 10) of these horses was also subjected to short-term social isolation. During all procedures, a body-mounted, remote-controlled heart rate monitor provided continuous heart rate measurements. The horses’ heads were video-recorded during the interventions. An exhaustive dataset was generated from the selected video clips of all possible facial action units and action descriptors, including time of emergence, duration, and frequency, according to the Equine Facial Action Coding System (EquiFACS). Heart rate increased during both interventions (p<0.01), confirming that they caused a disruption in sympatho-vagal balance. Using the current method for ascribing certain action units (AUs) to specific emotional states in humans and a novel data-driven co-occurrence method, the following facial traits were observed during both interventions: eye white increase (p<0.001), nostril dilator (p<0.001), upper eyelid raiser (p<0.001), inner brow raiser (p = 0.042), and tongue show (p<0.001). Increases in ‘ear flicker’ (p<0.001) and blink frequency (p<0.001) were also seen. These facial actions were used to train a machine-learning classifier to discriminate between the high-arousal interventions and calm horses, which achieved at most 79% accuracy. Most facial features identified correspond well with previous findings on the behavior of stressed horses, for example flared nostrils, repetitive mouth behaviors, increased eye white, tongue show, and ear movements. Several features identified in this study of pain-free horses, such as dilated nostrils, eye white increase, and inner brow raiser, are used as indicators of pain in some face-based pain assessment tools. In order to increase performance parameters in pain assessment tools, the relations between facial expressions of stress and pain should be studied further.
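The classification step described above can be illustrated with a minimal sketch: a classifier trained on per-clip frequencies of selected EquiFACS action units to separate intervention clips from calm baselines. The abstract does not name the model; the random-forest choice, the feature set, and the synthetic data below are assumptions for illustration only.

```python
# Minimal sketch (assumed model and data): discriminate high-arousal
# intervention clips from calm baselines using action-unit frequencies.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Rows = video clips, columns = per-clip frequencies of illustrative features
# (e.g. eye white increase, nostril dilator, upper eyelid raiser,
# inner brow raiser, tongue show, ear flicker, blink rate).
n_clips, n_features = 60, 7
X = rng.poisson(lam=3.0, size=(n_clips, n_features)).astype(float)
y = rng.integers(0, 2, size=n_clips)  # 1 = intervention, 0 = calm baseline

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"Cross-validated accuracy: {scores.mean():.2f}")
```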

2020 ◽  
Author(s):  
Johan Lundblad ◽  
Maheen Rashid ◽  
Marie Rhodin ◽  
Pia Haubro Andersen

Horses have the ability to generate a remarkable repertoire of facial expressions, some of which have been linked to certain emotional states, for example pain. Studies suggest that facial expressions may be a more ‘honest’ expression of emotional state in horses than behavioral or physiological parameters. This study sought to describe the facial expressions of healthy, pain-free horses during stress, using a standardized method for recording facial expressions on video. Stress was induced in 28 horses by subjecting them to road transport, and 10 of these horses were also subjected to social isolation. The horses served as their own controls. A body-mounted, remote-controlled heart rate monitor provided continuous heart rate measurements during the interventions. The horses’ facial expressions were video-recorded during the interventions. The frequency and duration of each facial expression were then determined according to the Equine Facial Action Coding System. Heart rate increased during the stressful interventions (p=0.01), confirming that the interventions were stressful. Using both the human-based method of ascribing action units to emotional states and the co-occurrence method, the following facial traits could be observed during stress: eye white increase (p<0.001), nostril dilator (p<0.001), upper eyelid raiser (p<0.001), inner brow raiser (p=0.042), and tongue show (p<0.001), along with an increase in ‘ear flicker’ (p<0.001) and blink frequency (p<0.001). The facial actions were successfully used to train a machine-learning classifier to discriminate between stressed and calm horses, with an accuracy of 74.2%. Most of the facial features identified correspond well with previous research on the subject, for example flared nostrils, repetitive mouth behaviors, increased eye white, tongue show, and ear movements. Some features identified as indicative of stress in pain-free horses, such as dilated nostrils, eye white increase, and inner brow raiser, are also used in face-based pain assessment tools. The relation between facial expressions of stress and pain should therefore be studied further.
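The co-occurrence method referred to above is not detailed in this abstract; the sketch below illustrates one plausible reading, counting how often pairs of annotated action units fall within the same short time window. The window length and the toy annotations are assumptions for illustration only.

```python
# Assumed reading of a data-driven co-occurrence analysis: count pairs of
# action units annotated within the same time window.
from collections import Counter
from itertools import combinations

# Toy annotations: (action unit label, onset time in seconds).
annotations = [
    ("upper_eyelid_raiser", 1.2), ("eye_white_increase", 1.4),
    ("nostril_dilator", 1.5), ("inner_brow_raiser", 6.0),
    ("tongue_show", 6.3), ("nostril_dilator", 6.4),
]
WINDOW = 1.0  # assumed co-occurrence window length in seconds

pair_counts = Counter()
for (au_a, t_a), (au_b, t_b) in combinations(annotations, 2):
    if abs(t_a - t_b) <= WINDOW:
        pair_counts[tuple(sorted((au_a, au_b)))] += 1

for pair, n in pair_counts.most_common():
    print(pair, n)
```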


2021 ◽  
Author(s):  
Wenqiang Guo ◽  
Ziwei Xu ◽  
Zhigao Guo ◽  
Lingling Mao ◽  
Yongyan Hou ◽  
...  

2010 ◽  
Vol 35 (1) ◽  
pp. 1-16 ◽  
Author(s):  
Etienne B. Roesch ◽  
Lucas Tamarit ◽  
Lionel Reveret ◽  
Didier Grandjean ◽  
David Sander ◽  
...  

2018 ◽  
Vol 7 (3.20) ◽  
pp. 284
Author(s):  
Hamimah Ujir ◽  
Irwandi Hipiny ◽  
D N.F. Awang Iskandar

Most works on quantifying facial deformation are based on the action units (AUs) provided by the Facial Action Coding System (FACS), which describes facial expressions in terms of forty-six component movements. An AU corresponds to the movement of individual facial muscles. This paper presents a rule-based approach to classifying AUs based on certain facial features. This work only covers deformation of facial features for posed Happy and Sad expressions obtained from the BU-4DFE database. Different studies refer to different combinations of AUs that form the Happy and Sad expressions. According to the FACS rules outlined in this work, an AU has more than one facial property that needs to be observed. An intensity comparison and analysis of the AUs involved in the Sad and Happy expressions are presented. Additionally, a dynamic analysis of the AUs is carried out to determine the temporal segments of an expression, i.e. the duration of the onset, apex, and offset phases. Our findings show that AU15 (for the Sad expression) and AU12 (for the Happy expression) exhibit consistent facial feature deformation across all properties during the expression period. For AU1 and AU4, however, the intensity of their properties differs during the expression period.
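The temporal-segment analysis mentioned above can be sketched as splitting an AU intensity trace around its peak plateau into onset, apex, and offset phases. The plateau threshold and the toy intensity values below are illustrative assumptions, not values from the paper.

```python
# Sketch of temporal segmentation of an AU intensity trace (assumed threshold).
import numpy as np

intensity = np.array([0.0, 0.1, 0.4, 0.8, 1.0, 1.0, 0.9, 0.5, 0.2, 0.0])
apex_level = 0.9 * intensity.max()  # assumed definition of the apex plateau

apex_idx = np.where(intensity >= apex_level)[0]
onset = slice(0, apex_idx[0])                      # rising phase before the apex
apex = slice(apex_idx[0], apex_idx[-1] + 1)        # plateau at/near peak intensity
offset = slice(apex_idx[-1] + 1, len(intensity))   # decay back to neutral

print("onset frames:", onset, "apex frames:", apex, "offset frames:", offset)
```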


2019 ◽  
Vol 3 (2) ◽  
pp. 32 ◽  
Author(s):  
Troy McDaniel ◽  
Diep Tran ◽  
Abhik Chowdhury ◽  
Bijan Fakhri ◽  
Sethuraman Panchanathan

Given that most cues exchanged during a social interaction are nonverbal (e.g., facial expressions, hand gestures, body language), individuals who are blind are at a social disadvantage compared to their sighted peers. Very little work has explored sensory augmentation in the context of social assistive aids for individuals who are blind. The purpose of this study is to explore the following questions related to visual-to-vibrotactile mapping of facial action units (the building blocks of facial expressions): (1) How well can individuals who are blind recognize tactile facial action units compared to those who are sighted? (2) How well can individuals who are blind recognize emotions from tactile facial action units compared to those who are sighted? These questions are explored in a preliminary pilot test using absolute identification tasks in which participants learn and recognize vibrotactile stimulations presented through the Haptic Chair, a custom vibrotactile display embedded on the back of a chair. Study results show that individuals who are blind are able to recognize tactile facial action units as well as those who are sighted. These results hint at the potential for tactile facial action units to augment and expand access to social interactions for individuals who are blind.
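A visual-to-vibrotactile mapping of this kind can be sketched as assigning each facial action unit to a set of vibration motors on a back-mounted grid. The grid layout, the AU-to-motor assignments, and the helper function below are hypothetical illustrations and are not taken from the published Haptic Chair system.

```python
# Hypothetical AU-to-vibrotactile mapping on an assumed 3x3 grid of motors.
from typing import Dict, List, Tuple

AU_TO_MOTORS: Dict[str, List[Tuple[int, int]]] = {
    "AU1_inner_brow_raiser": [(0, 0), (0, 2)],   # upper corners of the grid
    "AU4_brow_lowerer": [(0, 1)],                # top centre
    "AU12_lip_corner_puller": [(2, 0), (2, 2)],  # lower corners
}

def activation_pattern(active_aus: List[str], grid_size: int = 3) -> List[List[int]]:
    """Return a binary grid of motors to drive for the given action units."""
    grid = [[0] * grid_size for _ in range(grid_size)]
    for au in active_aus:
        for r, c in AU_TO_MOTORS.get(au, []):
            grid[r][c] = 1
    return grid

print(activation_pattern(["AU1_inner_brow_raiser", "AU12_lip_corner_puller"]))
```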


Animals ◽  
2019 ◽  
Vol 9 (11) ◽  
pp. 862 ◽  
Author(s):  
Trösch ◽  
Cuzol ◽  
Parias ◽  
Calandreau ◽  
Nowak ◽  
...  

Over the last few years, an increasing number of studies have aimed to gain more insight into the field of animal emotions. In particular, it is of interest to determine whether animals can cross-modally categorize the emotions of others. For domestic animals that share a close relationship with humans, we might wonder whether this cross-modal recognition of emotions extends to humans as well. In this study, we tested whether horses could recognize human emotions and attribute the emotional valence of visual (facial expression) and vocal (non-verbal vocalization) stimuli to the same perceptual category. Two animated pictures of different facial expressions (anger and joy) were simultaneously presented to the horses, while a speaker played an emotional human non-verbal vocalization matching one of the two facial expressions. Horses looked more at the picture that was incongruent with the vocalization, probably because they were intrigued by the paradoxical combination. Moreover, horses reacted in accordance with the valence of the vocalization, both behaviorally and physiologically (heart rate). These results show that horses can cross-modally recognize human emotions and react emotionally to the emotional states of humans conveyed by non-verbal vocalizations.
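The looking-time result above amounts to a per-horse paired comparison between the incongruent and congruent pictures; a minimal sketch of such an analysis with toy data (not the study's data) is shown below.

```python
# Minimal sketch of a paired looking-time comparison (toy data only).
import numpy as np
from scipy.stats import wilcoxon

# Looking time in seconds per horse for each picture type (illustrative values).
incongruent = np.array([12.1, 9.4, 15.0, 8.2, 11.7, 10.3])
congruent = np.array([7.8, 8.9, 10.2, 6.5, 9.1, 7.4])

stat, p = wilcoxon(incongruent, congruent)
print(f"Wilcoxon W = {stat:.1f}, p = {p:.3f}")
```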

