Affective State Recognition and Adaptation in Human-Robot Interaction: A Design Approach

Author(s):  
Changchun Liu ◽  
Pramila Rani ◽  
Nilanjan Sarkar


Author(s):  
Anja Lanz ◽  
Elizabeth Croft

The monitoring of human affective state is a key part of developing responsive and naturally behaving human-robot interaction systems. However, evaluation and calibration of physiologically monitored affective-state data are typically done using offline questionnaires and user reports. This paper investigates the potential of an on-line device to collect user self-reports that can then be used to calibrate physiologically generated affective-state data. The collection of on-line calibration data is particularly germane to human-robot interaction, where the physiological responses of interest include higher-frequency affective-state events related to arousal (surprise, fear, alarm) as well as lower-frequency events (contentment, boredom, pleasure). In this context, this paper describes the development of an experimental device, and a preliminary study, to answer the question: can people report, on-line, two-degree-of-freedom continuous affective states using a hand-held device suitable for calibrating physiologically obtained signals? We report on both the device design and the user trials. Further work, using the device to calibrate existing models of the user’s affective state during human-robot interaction, is ongoing and will be reported at the time of the conference.


2021 ◽  
Vol 7 ◽  
Author(s):  
Petra Gemeinboeck

This article lays out the framework for relational-performative aesthetics in human-robot interaction, comprising a theoretical lens and design approach for critical practice-based inquiries into embodied meaning-making in human-robot interaction. I explore the centrality of aesthetics as a practice of embodied meaning-making by drawing on my arts-led, performance-based approach to human-robot encounters, as well as other artistic practices. Understanding social agency and meaning as being enacted through the situated dynamics of the interaction, I bring into focus a process of bodying-thinging: entangling and transforming subjects and objects in the encounter and rendering elastic the boundaries in-between. Rather than serving to make the strange look more familiar, aesthetics here is about rendering the differences between humans and robots more relational. My notion of a relational-performative design approach—designing with bodying-thinging—proposes that we engage with human-robot encounters from the earliest stages of the robot design. This is where we begin to manifest boundaries that shape meaning-making and the potential for emergence, transformation, and connections arising from intra-bodily resonances (bodying-thinging). I argue that this relational-performative approach opens up new possibilities for how we design robots and how they socially participate in the encounter.


Author(s):  
Zhe Zhang ◽  
Goldie Nejat

A novel breed of robots known as socially assistive robots is emerging. These robots are capable of providing assistance to individuals through social and cognitive interaction. The development of socially assistive robots for health care applications can provide measurable improvements in patient safety, quality of care, and operational efficiency by playing an increasingly important role in patient care in fast-paced, crowded clinics, hospitals, and nursing/veterans’ homes. However, a number of research issues need to be addressed in order to design such robots. In this paper, we address one main challenge in the development of intelligent socially assistive robots: the robot’s ability to identify, understand, and react to human intent and human affective states during assistive interaction. In particular, we present a unique non-contact and non-restricting sensory-based approach for the identification and categorization of human body language in determining the affective state of a person during natural real-time human-robot interaction. This classification allows the robot to effectively determine its task-driven behavior during assistive interaction. Preliminary experiments show the potential of integrating the proposed gesture recognition and classification technique into intelligent socially assistive robotic systems for autonomous interaction with people.


Author(s):  
Petra Gemeinboeck ◽  
Rob Saunders

Current research in human–robot interaction often focuses on rendering communication between humans and robots more ‘natural’ by designing machines that appear and behave humanlike. Communication, in this human-centric approach, is often understood as a process of successfully transmitting information in the form of predefined messages and gestures. This article introduces an alternative arts-led, movement-centric approach, which embraces the differences of machinelike robotic artefacts and, instead, investigates how meaning is dynamically enacted in the encounter of humans and machines. Our design approach revolves around a novel embodied mapping methodology, which serves to bridge human–machine asymmetries and socioculturally situate abstract robotic artefacts. Building on concepts from performativity, material agency, enactive sense-making and kinaesthetic empathy, our Machine Movement Lab project opens up a performative-relational model of human–machine communication, where meaning is generated through relational dynamics in the interaction itself.


2009 ◽  
Author(s):  
Matthew S. Prewett ◽  
Kristin N. Saboe ◽  
Ryan C. Johnson ◽  
Michael D. Coovert ◽  
Linda R. Elliott

2010 ◽  
Author(s):  
Eleanore Edson ◽  
Judith Lytle ◽  
Thomas McKenna
