Affective State Estimation for Human–Robot Interaction

2007 ◽  
Vol 23 (5) ◽  
pp. 991-1000 ◽  
Author(s):  
Dana Kulic ◽  
Elizabeth A. Croft

Author(s):  
Anja Lanz ◽  
Elizabeth Croft

The monitoring of human affective state is a key part of developing responsive and naturally behaving human-robot interaction systems. However, evaluation and calibration of physiologically monitored affective state data are typically done using offline questionnaires and user reports. This paper investigates the potential of an on-line device for collecting user self-reports that can then be used to calibrate physiologically generated affective state data. The collection of on-line calibration data is particularly germane to human-robot interaction, where the physiological responses of interest include both higher-frequency affective state events related to arousal (surprise, fear, alarm) and lower-frequency events (contentment, boredom, pleasure). In this context, this paper describes the development of an experimental device, and a preliminary study, to answer the question: can people report, on-line, two-degree-of-freedom continuous affective states using a hand-held device suitable for calibrating physiologically obtained signals? We report on both the device design and user trials. Further work, using the device to calibrate existing models of the user's affective state during human-robot interaction, is ongoing and will be reported at the time of the conference.
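The calibration scheme described above requires pairing timestamped two-degree-of-freedom self-reports with physiological samples recorded on a separate clock. The paper does not specify a data format, but a minimal sketch of such a logger might look as follows; the class and field names are hypothetical, and the valence/arousal axes are one common choice for the two degrees of freedom, assumed here for illustration:

```python
import time
from dataclasses import dataclass

@dataclass
class AffectReport:
    """One hand-held-device sample: a timestamp plus two continuous axes."""
    timestamp: float
    valence: float   # assumed range [-1, 1]
    arousal: float   # assumed range [-1, 1]

class AffectLogger:
    """Collects timestamped 2-DoF self-reports for later alignment
    with physiologically obtained signals (hypothetical helper)."""

    def __init__(self):
        self.reports = []

    def record(self, valence, arousal, timestamp=None):
        # Clamp device readings to the assumed axis range before storing.
        v = max(-1.0, min(1.0, valence))
        a = max(-1.0, min(1.0, arousal))
        t = timestamp if timestamp is not None else time.time()
        self.reports.append(AffectReport(t, v, a))

    def nearest(self, t):
        """Return the self-report closest in time to a physiological sample."""
        return min(self.reports, key=lambda r: abs(r.timestamp - t))
```

Nearest-neighbour lookup is only one possible alignment strategy; interpolation between reports would be a natural refinement for continuous calibration.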



Sensors ◽  
2021 ◽  
Vol 22 (1) ◽  
pp. 222
Author(s):  
Remko Proesmans ◽  
Andreas Verleysen ◽  
Robbe Vleugels ◽  
Paula Veske ◽  
Victor-Louis De Gusseme ◽  
...  

Smart textiles have found numerous applications ranging from health monitoring to smart homes. Their main allure is their flexibility, which allows sensing to be integrated seamlessly into everyday objects like clothing. The application domain also includes robotics; smart textiles have been used to improve human-robot interaction, to solve the problem of state estimation of soft robots, and for state estimation to enable learning of robotic manipulation of textiles. The latter application provides an alternative to computationally expensive vision-based pipelines, and we believe it is the key to accelerating robotic learning of textile manipulation. Current smart textiles, however, maintain wired connections to external units, which impedes robotic manipulation, and they lack the modularity to facilitate state estimation of large cloths. In this work, we propose an open-source, fully wireless, highly flexible, light, and modular version of a piezoresistive smart textile. Its output stability was experimentally quantified and determined to be sufficient for classification tasks. Its functionality as a state sensor for larger cloths was also verified in a classification task in which two of the smart textiles were sewn onto a piece of clothing for which three states were defined. The modular smart textile system was able to recognize these states with average per-class F1-scores ranging from 85.7% to 94.6% using a basic linear classifier.
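The evaluation above reports per-class F1-scores for a three-state cloth classification task. As a small self-contained sketch of that metric (the abstract does not name the three states, so the labels below are hypothetical):

```python
def per_class_f1(y_true, y_pred):
    """Compute the F1-score separately for each class label.

    F1 = 2 * precision * recall / (precision + recall), where precision
    and recall are computed one-vs-rest for each class.
    """
    classes = sorted(set(y_true) | set(y_pred))
    scores = {}
    for c in classes:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        scores[c] = (2 * precision * recall / (precision + recall)
                     if precision + recall else 0.0)
    return scores

# Hypothetical cloth states; not the labels used in the paper.
truth = ["folded", "flat", "crumpled", "flat"]
preds = ["folded", "flat", "flat", "flat"]
f1 = per_class_f1(truth, preds)
```

Averaging the per-class scores (rather than pooling all predictions) prevents a frequent class from masking poor performance on a rare one, which is presumably why the paper reports the metric per class.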



Author(s):  
Zhe Zhang ◽  
Goldie Nejat

A novel breed of robots known as socially assistive robots is emerging. These robots are capable of providing assistance to individuals through social and cognitive interaction. The development of socially assistive robots for health care applications can provide measurable improvements in patient safety, quality of care, and operational efficiency by playing an increasingly important role in patient care in fast-paced, crowded clinics, hospitals, and nursing/veterans' homes. However, a number of research issues need to be addressed in order to design such robots. In this paper, we address one main challenge in the development of intelligent socially assistive robots: the robot's ability to identify, understand, and react to human intent and human affective states during assistive interaction. In particular, we present a unique non-contact and non-restricting sensory-based approach to identifying and categorizing human body language in order to determine the affective state of a person during natural real-time human-robot interaction. This classification allows the robot to effectively determine its task-driven behavior during assistive interaction. Preliminary experiments show the potential of integrating the proposed gesture recognition and classification technique into intelligent socially assistive robotic systems for autonomous interaction with people.



2014 ◽  
Author(s):  
Mitchell S. Dunfee ◽  
Tracy Sanders ◽  
Peter A. Hancock


Author(s):  
Rosemarie Yagoda ◽  
Michael D. Coovert


2009 ◽  
Author(s):  
Matthew S. Prewett ◽  
Kristin N. Saboe ◽  
Ryan C. Johnson ◽  
Michael D. Coovert ◽  
Linda R. Elliott


2010 ◽  
Author(s):  
Eleanore Edson ◽  
Judith Lytle ◽  
Thomas McKenna


2020 ◽  
Author(s):  
Agnieszka Wykowska ◽  
Jairo Pérez-Osorio ◽  
Stefan Kopp

This booklet is a collection of the position statements accepted for the HRI'20 conference workshop "Social Cognition for HRI: Exploring the relationship between mindreading and social attunement in human-robot interaction" (Wykowska, Pérez-Osorio & Kopp, 2020). Unfortunately, due to the rapid unfolding of the novel coronavirus at the beginning of the year, the conference, and consequently our workshop, were canceled. In light of these events, we decided to put together the position statements accepted for the workshop. The contributions collected in these pages highlight the role of the attribution of mental states to artificial agents in human-robot interaction, and more precisely the quality and presence of the social attunement mechanisms that are known to make human interaction smooth, efficient, and robust. These papers also accentuate the importance of a multidisciplinary approach to advancing the understanding of the factors and consequences of social interaction with artificial agents.


