Pain detection from facial expressions using domain adaptation technique

Author(s):  
Neeru Rathee ◽  
Sudesh Pahal ◽  
Poonam Sheoran
2020 ◽  
Vol 10 (18) ◽  
pp. 6531


Author(s):  
Mizuho Sumitani ◽  
Michihiro Osumi ◽  
Hiroaki Abe ◽  
Kenji Azuma ◽  
Rikuhei Tsuchida ◽  
...  

People perceive the mind in two dimensions: intellectual and affective. Advances in artificial intelligence enable people to perceive the intellectual mind of a robot through semantic interactions. It remains controversial, however, whether a robot can be perceived as having an affective mind of its own in the absence of any intellectual actions or semantic interactions. We investigated pain experiences while participants observed three different facial expressions of a virtual agent modeling affective minds (painful, unhappy, and neutral). The cold pain detection threshold of 19 healthy subjects was first measured while they watched a black screen; changes in this threshold were then evaluated while they watched the facial expressions. Subjects also rated the pain intensity conveyed by each expression. Changes in the cold pain detection threshold were compared across conditions after adjusting for the respective rated pain intensities. Only when subjects watched the painful expression of the virtual agent did the cold pain detection threshold increase significantly. By directly evaluating intuitive pain responses to the facial expressions of a virtual agent, we found that people ‘share’ empathic neural responses, which can emerge intuitively according to the observed pain intensity, with a robot (a virtual agent).
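The analysis described above, comparing changes in the cold pain detection threshold across expression conditions after adjusting for rated pain intensity, can be illustrated with a short sketch. This is not the authors' code: the subject count matches the abstract, but all data values, the adjustment method (regressing out rated intensity), and the choice of test are illustrative assumptions.

```python
# A minimal sketch (hypothetical data) of the per-condition analysis described
# above: change in cold pain detection threshold vs. baseline, adjusted for
# each subject's rated pain intensity.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 19  # subjects, as in the abstract

# Hypothetical thresholds (degrees C) at baseline (black screen) and while
# watching each expression, plus rated pain intensity per expression.
baseline = rng.normal(18.0, 2.0, n)
conditions = {
    "painful": baseline + rng.normal(1.2, 1.0, n),
    "unhappy": baseline + rng.normal(0.2, 1.0, n),
    "neutral": baseline + rng.normal(0.0, 1.0, n),
}
rated_intensity = {c: rng.uniform(0, 10, n) for c in conditions}

for cond, thresh in conditions.items():
    change = thresh - baseline
    # Adjust the threshold change for rated pain intensity by regressing
    # it out, then test whether the adjusted change differs from zero.
    slope, intercept, *_ = stats.linregress(rated_intensity[cond], change)
    adjusted = change - slope * (rated_intensity[cond] - rated_intensity[cond].mean())
    t, p = stats.ttest_1samp(adjusted, 0.0)
    print(f"{cond}: mean adjusted change = {adjusted.mean():.2f} C, t = {t:.2f}, p = {p:.3f}")
```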


Animals ◽  
2020 ◽  
Vol 10 (11) ◽  
pp. 2155
Author(s):  
Katrina Ask ◽  
Marie Rhodin ◽  
Lena-Mari Tamminen ◽  
Elin Hernlund ◽  
Pia Haubro Andersen

Equine orthopedic pain scales are targeted towards horses with moderate to severe orthopedic pain. Improved assessment of pain behavior and pain-related facial expressions at rest may refine orthopedic pain detection for mild lameness grades. Therefore, this study explored pain-related behaviors and facial expressions and sought to identify frequently occurring combinations. Orthopedic pain was induced by intra-articular lipopolysaccharide (LPS) injection in eight horses, and objective movement asymmetry analyses were performed before and after induction, together with pain assessments at rest. Three observers independently assessed the horses in their box stalls, using four equine pain scales simultaneously. The increase in movement asymmetry after induction was used as a proxy for pain. Behaviors and facial expressions commonly co-occurred and were strongly associated with movement asymmetry. Posture-related scale items were the strongest predictors of movement asymmetry. The display of facial expressions at rest varied between horses but, when present, was strongly associated with movement asymmetry. Reliability of the facial expression items was lower than that of the behavioral items. These findings suggest that five body behaviors (posture, head position, location in the box stall, focus, and interactive behavior) should be included in a scale for live assessment of mild orthopedic pain. We also recommend the inclusion of facial expressions in pain assessment.
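As a rough illustration of how scale items can be tested as predictors of movement asymmetry, the sketch below fits an ordinary least-squares model on hypothetical item scores. The item names follow the abstract, but the data, the 0–3 scoring range, and the model choice are assumptions, not the study's actual analysis.

```python
# A minimal sketch (hypothetical data and scoring) of relating pain-scale item
# scores at rest to the increase in objectively measured movement asymmetry.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n_obs = 24  # e.g., 8 horses x 3 observers

# Hypothetical item scores (0-3) and a simulated asymmetry increase in which
# posture carries the strongest effect, echoing the abstract's finding.
items = {
    "posture": rng.integers(0, 4, n_obs),
    "head_position": rng.integers(0, 4, n_obs),
    "location_in_stall": rng.integers(0, 4, n_obs),
    "interactive_behavior": rng.integers(0, 4, n_obs),
}
X = sm.add_constant(np.column_stack(list(items.values())).astype(float))
asymmetry_increase = X @ np.array([1.0, 3.0, 1.5, 0.5, 0.8]) + rng.normal(0, 2, n_obs)

model = sm.OLS(asymmetry_increase, X).fit()
for name, coef, p in zip(["const"] + list(items.keys()), model.params, model.pvalues):
    print(f"{name:22s} coef = {coef:5.2f}  p = {p:.3f}")
```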


2016 ◽  
Vol 2016 ◽  
pp. 1-9 ◽  
Author(s):  
Mathieu Grégoire ◽  
Rosée Bruneau-Bhérer ◽  
Karine Morasse ◽  
Fanny Eugène ◽  
Philip L. Jackson

Accurate interpretation of pain expressed by others is important for socialization; however, the development of this skill in children is still poorly understood. Models of empathy for pain propose two main components (affective and cognitive), which develop at different stages of life. The study's objective was to investigate the ability of children between 3 and 12 years of age to detect and assess pain intensity in others, using visual stimuli depicting either facial expressions of pain or hands in painful contexts. Forty preschool children and 62 school-aged children were recruited. The children observed series of stimuli and evaluated the pain intensity depicted. Results demonstrated that children as young as three years old were able to detect and assess pain in both types of stimuli, and this ability continued to improve until the age of 12. Participants demonstrated better detection performance with hands than with faces. The results are consistent with the idea that the two types of stimuli recruit different processes: pain detection in hands appears to rely mostly on affective sharing processes that are effective early in life, while older children's greater ability to perceive pain in facial expressions suggests that this skill is associated with the gradual development of cognitive processes.


2020 ◽  
Vol 18 (1) ◽  
pp. 125-132

Facial expressions can indicate the presence and degree of pain in humans, a vital topic in the e-healthcare domain, especially for elderly people or patients with special needs. This paper presents a framework for pain detection, pain classification, and face recognition using feature extraction, feature selection, and classification techniques. Pain intensity is measured with the Prkachin and Solomon Pain Intensity (PSPI) scale. Experimental results showed that the proposed framework is promising compared with previous works: it achieves 91% accuracy in pain detection, 99.89% accuracy in face recognition, and 78%, 92%, and 88% accuracy, respectively, for the levels of pain classification.
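The PSPI score referenced above is defined from FACS action-unit (AU) intensities as AU4 + max(AU6, AU7) + max(AU9, AU10) + AU43. A minimal implementation:

```python
# Prkachin and Solomon Pain Intensity (PSPI) from FACS action-unit intensities.
# AU4 = brow lowering, AU6/AU7 = orbit tightening, AU9/AU10 = levator
# contraction, AU43 = eye closure.
def pspi(au4: float, au6: float, au7: float, au9: float, au10: float, au43: float) -> float:
    """PSPI = AU4 + max(AU6, AU7) + max(AU9, AU10) + AU43."""
    return au4 + max(au6, au7) + max(au9, au10) + au43

# Example: moderate brow lowering and cheek raising, eyes open.
print(pspi(au4=2, au6=3, au7=1, au9=0, au10=1, au43=0))  # -> 6
```

With AU intensities coded 0–5 and AU43 coded 0 or 1, the score ranges from 0 (no pain expression) to 16.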


Author(s):  
Reneiro Andal Virrey ◽  
Chandratilak De Silva Liyanage ◽  
Mohammad Iskandar bin Pg Hj Petra ◽  
Pg Emeroylariffion Abas

2020 ◽  
Vol 21 (2) ◽  
pp. 415-422 ◽  
Author(s):  
Minsong Ki ◽  
Yeongwoo Choi

2020 ◽  
Author(s):  
P. A. S. O. Silva

Pain analysis in newborns has become a relevant research subject over the last few decades, given the difficulty of objectively identifying the source and intensity of pain in newborn babies. In recent years, several methods for pain detection and evaluation have been able to classify pain levels from the facial expressions of newborn babies using statistical models, machine learning, and deep learning. In this context, health professionals are increasingly interested in having computerized tools at their disposal that can not only accurately rank a newborn's potential pain level but also identify the facial regions of greatest relevance for a particular pain phenomenon. This dissertation's main objective is to develop a computational framework capable of recognizing and interpreting patterns in facial expressions for the automated evaluation of pain levels in term babies. Specifically, the dissertation focuses on the investigation, implementation, and integration of a series of techniques, including image detection and segmentation, spatial normalization, and, ultimately, the classification of facial expressions based on information obtained through statistical data mining. The framework developed here, evaluated with an accuracy (upper limit) of approximately 96% on the COPE database and 77% on the UNIFESP database, shows that it is possible not only to rank pain levels statistically from images of facial expressions, but also to identify key facial regions for certain pain phenomena, thereby assisting in the creation of more general and accurate pediatric pain scales.
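The detect, normalize, and classify pipeline outlined above can be sketched as follows. This is not the dissertation's framework: the OpenCV Haar-cascade detector, the 64x64 crop size, histogram equalization, and the SVM classifier are illustrative assumptions, and `images`/`labels` stand in for a dataset such as COPE or UNIFESP.

```python
# A minimal sketch of a detect -> normalize -> classify pipeline for facial
# pain images, using OpenCV and scikit-learn.
import cv2
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def face_features(img: np.ndarray, size=(64, 64)):
    """Detect the face, crop it, and spatially normalize to a fixed size."""
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = faces[0]
    crop = cv2.resize(gray[y:y + h, x:x + w], size)
    crop = cv2.equalizeHist(crop)  # simple illumination normalization
    return crop.flatten().astype(np.float32) / 255.0

# `images` is a list of BGR frames, `labels` the corresponding pain levels:
# X = np.array([f for f in (face_features(i) for i in images) if f is not None])
# scores = cross_val_score(SVC(kernel="rbf"), X, labels, cv=5)
# print(scores.mean())
```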


2022 ◽  
Vol 2022 ◽  
pp. 1-8
Author(s):  
Stefan Lautenbacher ◽  
Teena Hassan ◽  
Dominik Seuss ◽  
Frederik W. Loy ◽  
Jens-Uwe Garbas ◽  
...  

Introduction. The experience of pain is regularly accompanied by facial expressions. The gold standard for analyzing these facial expressions is the Facial Action Coding System (FACS), which provides so-called action units (AUs) as parametric indicators of facial muscular activity. Particular combinations of AUs have been shown to be pain-indicative. Manual coding of AUs is, however, too time- and labor-intensive for clinical practice. New developments in automatic facial expression analysis promise automatic detection of AUs, which might be used for pain detection. Objective. Our aim was to compare manual with automatic AU coding of facial expressions of pain. Methods. FaceReader7 was used for automatic AU detection. Its performance was compared against manually coded AUs as the gold-standard labels, using videos of 40 participants (20 younger, mean age 25.7 years, and 20 older, mean age 52.1 years) undergoing experimentally induced heat pain. Percentages of correctly and falsely classified AUs were calculated, and we computed sensitivity/recall, precision, and overall agreement (F1) as indicators of congruency. Results. The automatic coding of AUs showed only poor to moderate outcomes in terms of sensitivity/recall, precision, and F1. Congruency was better for younger than for older faces, and better for pain-indicative AUs than for other AUs. Conclusion. At present, automatic analyses of genuine facial expressions of pain may qualify at best as semiautomatic systems, which require further validation by human observers before they can be used to validly assess facial expressions of pain.
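The congruency indicators named in the Methods (sensitivity/recall, precision, and overall agreement F1) can be computed per AU as sketched below. The frame-level binary codings here are simulated, not data from the study.

```python
# A minimal sketch of per-AU agreement between manual FACS coding (gold
# standard) and automatic detection, using hypothetical frame-level labels.
import numpy as np
from sklearn.metrics import precision_score, recall_score, f1_score

rng = np.random.default_rng(2)
n_frames = 1000
for au in ["AU4", "AU6", "AU7", "AU9", "AU10", "AU43"]:
    manual = rng.integers(0, 2, n_frames)  # gold-standard coding (0/1)
    # Simulated automatic coder that agrees with the manual coder 80% of the time.
    flip = rng.random(n_frames) < 0.2
    automatic = np.where(flip, 1 - manual, manual)
    print(f"{au}: recall = {recall_score(manual, automatic):.2f}, "
          f"precision = {precision_score(manual, automatic):.2f}, "
          f"F1 = {f1_score(manual, automatic):.2f}")
```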

