Real-time pain detection in facial expressions for health robotics

Author(s):  
Laduona Dai ◽  
Joost Broekens ◽  
Khiet P. Truong

Sensors ◽  
2021 ◽  
Vol 21 (12) ◽  
pp. 3956
Author(s):  
Youngsun Kong ◽  
Hugo F. Posada-Quintero ◽  
Ki H. Chon

The subjectivity of pain can lead to inaccurate prescribing of pain medication, which can exacerbate drug addiction and overdose. Given that pain is often experienced in patients’ homes, there is an urgent need for ambulatory devices that can quantify pain in real time. We implemented three time- and frequency-domain electrodermal activity (EDA) indices in our smartphone application, which collects EDA signals using a wrist-worn device. We then evaluated our computational algorithms using thermal grill data from ten subjects. The thermal grill delivered a level of pain that was calibrated for each subject to be 8 out of 10 on a visual analog scale (VAS). Furthermore, we simulated the real-time processing of the smartphone application using a dataset pre-collected from another group of fifteen subjects who underwent pain stimulation with electrical pulses, which elicited a VAS pain score of 7 out of 10. All EDA features showed significant differences between painless and pain segments, defined as the 5-s segments before and after each pain stimulus, respectively. A random forest classifier achieved the highest pain-detection accuracy, 81.5%, with 78.9% sensitivity and 84.2% specificity under a leave-one-subject-out cross-validation approach. Our results show the potential of a smartphone application to provide near real-time objective pain detection.
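To make the evaluation procedure concrete, here is a minimal sketch of leave-one-subject-out cross-validation with a random forest. The feature columns, labels, and subject grouping are placeholders, not the authors' actual EDA indices or code.

```python
# Hypothetical sketch of the leave-one-subject-out random-forest evaluation
# described above. The three time- and frequency-domain EDA indices are
# stand-ins; all data below is simulated.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneGroupOut

# X: one row per 5-s EDA segment, columns = assumed EDA indices;
# y: 0 = painless, 1 = pain; groups: subject ID per segment, so no
# subject appears in both training and test folds.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 3))          # placeholder features
y = rng.integers(0, 2, size=300)       # placeholder labels
groups = np.repeat(np.arange(15), 20)  # 15 subjects x 20 segments

logo = LeaveOneGroupOut()
tp = tn = fp = fn = 0
for train_idx, test_idx in logo.split(X, y, groups):
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X[train_idx], y[train_idx])
    pred = clf.predict(X[test_idx])
    tp += np.sum((pred == 1) & (y[test_idx] == 1))
    tn += np.sum((pred == 0) & (y[test_idx] == 0))
    fp += np.sum((pred == 1) & (y[test_idx] == 0))
    fn += np.sum((pred == 0) & (y[test_idx] == 1))

print(f"accuracy    {(tp + tn) / (tp + tn + fp + fn):.3f}")
print(f"sensitivity {tp / (tp + fn):.3f}")
print(f"specificity {tn / (tn + fp):.3f}")
```

Pooling the per-fold confusion counts before computing the metrics mirrors how subject-independent sensitivity and specificity are typically reported.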


2020 ◽  
Vol 10 (18) ◽  
pp. 6531
Author(s):  
Mizuho Sumitani ◽  
Michihiro Osumi ◽  
Hiroaki Abe ◽  
Kenji Azuma ◽  
Rikuhei Tsuchida ◽  
...  

People perceive the mind in two dimensions: intellectual and affective. Advances in artificial intelligence enable people to perceive the intellectual mind of a robot through their semantic interactions. Conversely, it remains controversial whether a robot has an affective mind of its own, absent any intellectual actions or semantic interactions. We investigated pain experiences when observing three different facial expressions of a virtual agent modeling an affective mind (i.e., painful, unhappy, and neutral). The cold pain detection threshold of 19 healthy subjects was measured as they watched a black screen; changes in their cold pain detection thresholds were then evaluated as they watched the facial expressions. Subjects were asked to rate the pain intensity conveyed by each facial expression. Changes in cold pain detection thresholds were compared across expressions and adjusted for the respective rated pain intensities. Only when subjects watched the painful expression of the virtual agent did the cold pain detection threshold increase significantly. By directly evaluating intuitive pain responses to the facial expressions of a virtual agent, we found that we ‘share’ empathic neural responses, which can emerge intuitively according to the observed pain intensity, with a robot (a virtual agent).
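As an illustration of the threshold comparison described above, here is a minimal sketch on simulated data. The effect sizes and the per-expression paired t-test are assumptions; the abstract does not specify the exact statistical procedure or the pain-intensity adjustment.

```python
# Hypothetical sketch: for each facial expression, test whether the change
# in cold pain detection threshold relative to the black-screen baseline
# differs from zero. All values are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 19                                  # subjects, as in the study
baseline = rng.normal(15.0, 2.0, n)     # threshold (°C), black screen
conditions = {
    "painful": baseline + rng.normal(1.0, 0.8, n),  # simulated increase
    "unhappy": baseline + rng.normal(0.1, 0.8, n),
    "neutral": baseline + rng.normal(0.0, 0.8, n),
}

for name, thresholds in conditions.items():
    t, p = stats.ttest_rel(thresholds, baseline)
    print(f"{name:8s} mean change {np.mean(thresholds - baseline):+.2f} °C, "
          f"p = {p:.3f}")
```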


Animals ◽  
2020 ◽  
Vol 10 (11) ◽  
pp. 2155
Author(s):  
Katrina Ask ◽  
Marie Rhodin ◽  
Lena-Mari Tamminen ◽  
Elin Hernlund ◽  
Pia Haubro Andersen

Equine orthopedic pain scales are targeted towards horses with moderate to severe orthopedic pain. Improved assessment of pain behavior and pain-related facial expressions at rest may refine orthopedic pain detection for mild lameness grades. Therefore, this study explored pain-related behaviors and facial expressions and sought to identify frequently occurring combinations. Orthopedic pain was induced by intra-articular LPS in eight horses, and objective movement asymmetry analyses were performed before and after induction together with pain assessments at rest. Three observers independently assessed horses in their box stalls, using four equine pain scales simultaneously. Increase in movement asymmetry after induction was used as a proxy for pain. Behaviors and facial expressions commonly co-occurred and were strongly associated with movement asymmetry. Posture-related scale items were the strongest predictors of movement asymmetry. Display of facial expressions at rest varied between horses but, when present, were strongly associated with movement asymmetry. Reliability of facial expression items was lower than reliability of behavioral items. These findings suggest that five body behaviors (posture, head position, location in the box stall, focus, and interactive behavior) should be included in a scale for live assessment of mild orthopedic pain. We also recommend inclusion of facial expressions in pain assessment.
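The abstract does not name the reliability statistic used for the three observers; purely as an assumed stand-in, the sketch below computes Fleiss' kappa on simulated ordinal scores for one scale item.

```python
# Hypothetical sketch of a multi-rater reliability calculation for one
# pain-scale item scored by three observers. Fleiss' kappa is an assumed
# choice, and the scores are simulated.
import numpy as np
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

rng = np.random.default_rng(2)
# rows = horse observations, columns = the three observers;
# ordinal item scores 0-3 (e.g., a posture-related item)
scores = rng.integers(0, 4, size=(40, 3))

table, _ = aggregate_raters(scores)     # observations x categories counts
print(f"Fleiss' kappa: {fleiss_kappa(table):.2f}")
```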


2008 ◽  
Vol 381-382 ◽  
pp. 375-378
Author(s):  
K.T. Song ◽  
M.J. Han ◽  
F.Y. Chang ◽  
S.H. Chang

The capability to recognize human facial expressions plays an important role in the development of advanced human-robot interaction. By recognizing facial expressions, a robot can interact with a user in a more natural and friendly manner. In this paper, we propose a facial expression recognition system based on an embedded image processing platform that classifies different facial expressions online in real time. A low-cost embedded vision system was designed and realized for robotic applications using a CMOS image sensor and a digital signal processor (DSP). The current design acquires 640×480 image frames at 30 frames per second (fps). The proposed emotion recognition algorithm has been successfully implemented on this real-time vision system. Experimental results on a pet robot show that the robot can interact with a person in a responsive manner. The developed image processing platform accelerates recognition to 25 recognitions per second, with an average online recognition rate of 74.4% for five facial expressions.
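The paper's DSP implementation and recognition algorithm are not reproduced here; as a rough structural analogue only, this sketch shows the same capture-detect-classify loop on a PC with OpenCV. The Haar face detector, the five-expression label set, and the classify_expression stub are all assumptions.

```python
# Hypothetical PC-side sketch of the real-time loop described above: grab
# frames, detect a face, classify the expression, and measure throughput.
import time
import cv2

EXPRESSIONS = ["neutral", "happy", "sad", "angry", "surprised"]  # assumed set

def classify_expression(face_img) -> str:
    """Placeholder for the paper's (unspecified) recognition algorithm."""
    return EXPRESSIONS[0]

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cap = cv2.VideoCapture(0)
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 640)   # match the 640x480 capture size
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)

last_label = "none"
frames, t0 = 0, time.time()
while frames < 100:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in detector.detectMultiScale(gray, 1.3, 5):
        last_label = classify_expression(gray[y:y + h, x:x + w])
    frames += 1

print(f"last expression: {last_label}, "
      f"{frames / (time.time() - t0):.1f} recognitions/s")
cap.release()
```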

