The Design of an Intelligent Socially Assistive Robot for Person-Centered Cognitive Interventions

Author(s):  
Jeanie Chan ◽  
Goldie Nejat

Recently, there has been a growing body of research supporting the effectiveness of non-pharmacological cognitive and social training interventions in slowing the decline of, or improving, brain functioning in individuals with cognitive impairments. However, implementing and sustaining such interventions on a long-term basis is difficult, as they require considerable resources and personnel and can be very time-consuming for healthcare staff. The objectives of our research are to validate the effectiveness of these training interventions and to make them more accessible to healthcare professionals through the aid of robotic assistants. Our work focuses on designing a human-like socially assistive robot, Brian 2.0, with the ability to recognize and identify human affective intent and to determine its own appropriate emotion-based behavior while engaging in natural and believable social interactions with people. In this paper, we present the design of a novel human-robot interaction (HRI) control architecture for Brian 2.0 that allows the robot to provide social and cognitive stimulation in person-centered cognitive interventions. Namely, the control architecture is designed to allow the robot to act as a social motivator by encouraging, congratulating, and assisting a person during the course of a cognitively stimulating activity. Preliminary experiments validate the robot’s ability to provide assistive interactions during an HRI-based person-directed activity.

2011 ◽  
Vol 08 (01) ◽  
pp. 103-126 ◽  
Author(s):  
JEANIE CHAN ◽  
GOLDIE NEJAT ◽  
JINGCONG CHEN

Recently, there has been a growing body of research supporting the effectiveness of non-pharmacological cognitive and social training interventions in slowing the decline of, or improving, brain functioning in individuals with cognitive impairments. However, implementing and sustaining such interventions on a long-term basis is difficult, as they require considerable resources and personnel and can be very time-consuming for healthcare staff. Our research focuses on making these interventions more accessible to healthcare professionals through the aid of robotic assistants. The objective of our work is to develop an intelligent socially assistive robot with the ability to recognize and identify human affective intent and to determine its own appropriate emotion-based behavior while engaging in assistive interactions with people. In this paper, we present the design of a novel human-robot interaction (HRI) control architecture that allows the robot to provide social and cognitive stimulation in person-centered cognitive interventions. Namely, the control architecture is designed to allow the robot to act as a social motivator by encouraging, congratulating, and assisting a person during the course of a cognitively stimulating activity. Preliminary experiments validate the effectiveness of the control architecture in providing assistive interactions during an HRI-based person-directed activity.


Author(s):  
Goldie Nejat ◽  
Maurizio Ficocelli

The objective of a socially assistive robot is to create a close and effective interaction with a human user for the purpose of giving assistance. In particular, the social interaction, guidance, and support that a socially assistive robot can provide a person can be very beneficial to patient-centered care. However, a number of challenges must be addressed in designing such a robot. This work addresses one of the main limitations in the development of intelligent task-driven socially assistive robots: robotic control architecture design and implementation with explicit social and assistive task functionalities. In particular, in this paper, a unique emotional behavior module is presented and implemented in a learning-based control architecture for human-robot interaction (HRI). The module is used to determine the appropriate emotions of the robot, as motivated by the well-being of the person, during assistive task-driven interactions. A novel online updating technique allows the emotional model to adapt to new people and scenarios. Preliminary experiments show the effectiveness of utilizing robotic emotional assistive behavior during HRI in assistive scenarios.
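The abstract above does not detail the emotional behavior module or its online updating technique. Purely as a loose illustration of the general idea, and not the authors' actual method, the sketch below keeps per-situation estimates of how each emotional display affects the person's well-being and updates them online from observed feedback. All state names, emotion labels, and parameters here are hypothetical.

```python
# Illustrative sketch only: a toy online-updating emotion selector.
# The user states, emotion labels, and learning parameters are invented
# for this example, not taken from the paper.
import random
from collections import defaultdict

USER_STATES = ["engaged", "frustrated", "idle"]
EMOTIONS = ["happy", "encouraging", "neutral"]

class EmotionSelector:
    def __init__(self, alpha=0.2, epsilon=0.1):
        self.alpha = alpha        # learning rate for online updates
        self.epsilon = epsilon    # exploration rate
        # estimated benefit to the person's well-being of showing a given
        # emotion in a given user state
        self.value = defaultdict(float)

    def choose(self, user_state):
        """Pick the emotion with the best estimate (with some exploration)."""
        if random.random() < self.epsilon:
            return random.choice(EMOTIONS)
        return max(EMOTIONS, key=lambda e: self.value[(user_state, e)])

    def update(self, user_state, emotion, wellbeing_feedback):
        """Move the estimate toward observed feedback (e.g. in [-1, 1])."""
        key = (user_state, emotion)
        self.value[key] += self.alpha * (wellbeing_feedback - self.value[key])

sel = EmotionSelector(epsilon=0.0)
sel.update("frustrated", "encouraging", 1.0)
print(sel.choose("frustrated"))  # prints "encouraging"
```

Because the estimates keep updating during interaction, the mapping from user state to robot emotion can drift toward whatever a particular person responds to best, which is the spirit of the adaptation described in the abstract.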


2020 ◽  
Vol 12 (1) ◽  
pp. 115-135
Author(s):  
Rachael Bevill Burns ◽  
Hasti Seifi ◽  
Hyosang Lee ◽  
Katherine J. Kuchenbecker

Children with autism need innovative solutions that help them learn to master everyday experiences and cope with stressful situations. We propose that socially assistive robot companions could better understand and react to a child’s needs if they utilized tactile sensing. We examined the existing relevant literature to create an initial set of six tactile-perception requirements, and we then evaluated these requirements through interviews with 11 experienced autism specialists from a variety of backgrounds. Thematic analysis of the comments shared by the specialists revealed three overarching themes: the touch-seeking and touch-avoiding behavior of autistic children, their individual differences and customization needs, and the roles that a touch-perceiving robot could play in such interactions. Using the interview study feedback, we refined our initial list into seven qualitative requirements that describe robustness and maintainability, sensing range, feel, gesture identification, spatial, temporal, and adaptation attributes for the touch-perception system of a robot companion for children with autism. Finally, by utilizing the literature and current best practices in tactile sensor development and signal processing, we transformed these qualitative requirements into quantitative specifications. We discuss the implications of these requirements for future human–robot interaction research in the sensing, computing, and user research communities.
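The abstract does not reproduce the paper's quantitative specifications, and the gesture-identification requirement is stated only qualitatively. Purely as a hedged illustration of what rule-based touch-gesture identification can look like, the sketch below labels a toy pressure trace by contact duration, area, and pressure. Every threshold and gesture name here is invented for illustration and is not from the paper.

```python
# Illustrative sketch only: classifying a touch gesture from a toy
# pressure trace. Thresholds, sample rate, and gesture names are
# hypothetical, not the paper's specifications.

def classify_touch(samples, rate_hz=100, pressure_on=0.5):
    """samples: list of (pressure, contact_area_cm2) readings over time."""
    # keep only readings where the sensor registers actual contact
    active = [(p, a) for p, a in samples if p >= pressure_on]
    if not active:
        return "no-contact"
    duration_s = len(active) / rate_hz
    mean_area = sum(a for _, a in active) / len(active)
    mean_pressure = sum(p for p, _ in active) / len(active)
    # brief, small-area contact reads as a poke
    if duration_s < 0.3 and mean_area < 2.0:
        return "poke"
    # sustained high pressure reads as squeezing or pinching by area
    if mean_pressure > 2.0:
        return "squeeze" if mean_area >= 2.0 else "pinch"
    # sustained light, broad contact reads as a stroke
    return "stroke"

poke = [(1.0, 1.0)] * 10     # 0.1 s of small, light contact
stroke = [(1.0, 4.0)] * 80   # 0.8 s of broad, light contact
print(classify_touch(poke))    # prints "poke"
print(classify_touch(stroke))  # prints "stroke"
```

In practice, distinguishing touch-seeking strokes from distressed grabbing, as the specialists' themes suggest, would need learned classifiers and per-child calibration rather than fixed thresholds like these.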


2018 ◽  
Vol 49 (1) ◽  
pp. 48-56 ◽  
Author(s):  
Molly K. Crossman ◽  
Alan E. Kazdin ◽  
Elizabeth R. Kitt

2020 ◽  
Author(s):  
Agnieszka Wykowska ◽  
Jairo Pérez-Osorio ◽  
Stefan Kopp

This booklet is a collection of the position statements accepted for the HRI’20 conference workshop “Social Cognition for HRI: Exploring the relationship between mindreading and social attunement in human-robot interaction” (Wykowska, Perez-Osorio & Kopp, 2020). Unfortunately, due to the rapid unfolding of the novel coronavirus at the beginning of the present year, the conference, and consequently our workshop, were canceled. In light of these events, we decided to put together the position statements accepted for the workshop. The contributions collected in these pages highlight the role of the attribution of mental states to artificial agents in human-robot interaction, and specifically the quality and presence of the social attunement mechanisms that are known to make human interaction smooth, efficient, and robust. These papers also accentuate the importance of a multidisciplinary approach to advancing the understanding of the factors and consequences of social interactions with artificial agents.


2021 ◽  
Vol 8 ◽  
pp. 205566832110018
Author(s):  
Michael J Sobrepera ◽  
Vera G Lee ◽  
Michelle J Johnson

Introduction: We present Lil’Flo, a socially assistive robotic telerehabilitation system for deployment in the community. As shortages of rehabilitation professionals grow, especially in rural areas, there is an increasing need to deliver care in the communities where patients live, work, learn, and play. Traditional telepresence, while useful, fails to deliver the rich interactions and data needed for motor rehabilitation and assessment. Methods: We designed Lil’Flo for pediatric patients with cerebral palsy and brachial plexus injuries, drawing on results from prior usability studies. The system combines traditional telepresence and computer vision with a humanoid robot that can play games with patients and guide them in a present and engaging way under the supervision of a remote clinician. We surveyed 13 rehabilitation clinicians in a virtual usability test to evaluate the system. Results: The system is more portable, extensible, and cheaper than our prior iteration, with an expressive humanoid. The virtual usability testing shows that clinicians believe Lil’Flo could be deployed in rural and elder-care facilities and is more capable of supporting remote stretching, strength building, and motor assessments than traditional video-only telepresence. Conclusions: Lil’Flo represents a novel approach to delivering rehabilitation care in the community while maintaining the clinician-patient connection.
