A multimodal affective computing approach for children companion robots

Author(s): Jin Chen, Yingying She, Meimei Zheng, Yang Shu, Yadong Wang, …
2017, Vol. 10 (2), pp. 174-183
Author(s): Sidney D’Mello, Arvid Kappas, Jonathan Gratch

Affective computing (AC) adopts a computational approach to the study of affect. We highlight the AC approach to automated affect measures that jointly model machine-readable physiological/behavioral signals with affect estimates that are reported by humans or experimentally elicited. We describe the conceptual and computational foundations of the approach, followed by two case studies: one on discriminating between genuine and faked expressions of pain in the lab, and the other on measuring nonbasic affect in the wild. We discuss applications of the measures, analyze measurement accuracy and generalizability, and highlight advances afforded by computational tipping points such as big data, wearable sensing, crowdsourcing, and deep learning. We conclude by advocating for greater synergy between AC and affective science and offer suggestions toward that end.
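
As a minimal sketch of the measurement approach this abstract describes, the snippet below trains a supervised model that maps machine-readable signal features to human-reported affect labels and checks generalization to unseen participants. The data, feature set, and label scheme are invented for illustration and do not come from the cited work.

```python
# Sketch only: supervised affect measurement from signals + human-reported labels.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GroupKFold, cross_val_score

rng = np.random.default_rng(0)

# Placeholder data: rows = observation windows, columns = hypothetical features
# such as heart rate, skin conductance, or facial action-unit intensities.
n_windows, n_features = 600, 12
X = rng.normal(size=(n_windows, n_features))          # signal-derived features
y = rng.integers(0, 3, size=n_windows)                # affect labels from human reports
participants = rng.integers(0, 30, size=n_windows)    # who produced each window

# Group-wise cross-validation so the model is always evaluated on people it has
# never seen, which is how generalizability of such measures is typically assessed.
model = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(model, X, y, groups=participants, cv=GroupKFold(n_splits=5))
print(f"mean held-out accuracy: {scores.mean():.2f}")
```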


2017, Vol. 42 (3), pp. 771-819
Author(s): Takashi Yamauchi, Kunchen Xiao

2011, Vol. 48 (7), pp. 908-922
Author(s): Vitaliy Kolodyazhniy, Sylvia D. Kreibig, James J. Gross, Walton T. Roth, Frank H. Wilhelm

2021, pp. 205-211
Author(s): Laurence Devillers

The field of social robotics is developing rapidly and will have wide implications, especially within health care, where much progress has been made toward the development of “companion robots.” Such robots provide therapeutic or monitoring assistance to patients with a range of disabilities over a long timeframe. Preliminary results show that such robots may be particularly beneficial for individuals who suffer from neurodegenerative pathologies. Treatment can be provided around the clock and with a level of patience rarely found among human healthcare workers. Several elements are requisite for the effective deployment of companion robots. They must be able to detect human emotions and in turn mimic human emotional reactions, and they must have an outward appearance that corresponds to human expectations about their caregiving role. This chapter presents laboratory findings on AI systems that enable robots to recognize specific emotions and to adapt their behavior accordingly. Emotional perception by humans (how we interpret language and gestures to grasp the emotional states of others) is being studied as a guide to programming robots so that they can simulate emotions in their interactions with humans.
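
A rough illustration of the detect-and-adapt loop the chapter describes, under assumptions of my own: late fusion of per-modality emotion probabilities and a hand-written behavior table. None of the emotion labels, weights, or responses below come from the cited chapter.

```python
# Sketch only: fuse per-modality emotion estimates, then pick a robot behavior.
from dataclasses import dataclass

EMOTIONS = ("joy", "sadness", "anger", "neutral")

@dataclass
class ModalityEstimate:
    probs: tuple   # probability over EMOTIONS from one recognizer (speech, face, ...)
    weight: float  # trust placed in this modality

def fuse(estimates):
    """Weighted late fusion of per-modality emotion probabilities."""
    fused = [0.0] * len(EMOTIONS)
    total = sum(e.weight for e in estimates)
    for e in estimates:
        for i, p in enumerate(e.probs):
            fused[i] += e.weight * p / total
    return EMOTIONS[max(range(len(EMOTIONS)), key=lambda i: fused[i])]

# Hypothetical behavior policy; a real mapping would be informed by the kind of
# human emotion-perception studies the chapter mentions.
RESPONSES = {
    "joy": "mirror positive affect, continue current activity",
    "sadness": "lower voice, offer comfort, notify caregiver if persistent",
    "anger": "de-escalate, increase personal distance",
    "neutral": "maintain routine interaction",
}

speech = ModalityEstimate(probs=(0.1, 0.6, 0.1, 0.2), weight=0.6)
face = ModalityEstimate(probs=(0.2, 0.5, 0.1, 0.2), weight=0.4)
print(RESPONSES[fuse([speech, face])])  # -> response for "sadness"
```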


2018, Vol. 12 (2), pp. 6
Author(s): Sekhar Puhan Pratap, Behera Sudarsan
