Imitating Human Emotions with a NAO Robot as Interviewer Playing the Role of Vocational Tutor

Electronics ◽  
2020 ◽  
Vol 9 (6) ◽  
pp. 971
Author(s):  
Selene Goenaga ◽  
Loraine Navarro ◽  
Christian G. Quintero M. ◽  
Mauricio Pardo

This paper proposes an intelligent system that can hold an interview, using a NAO robot as interviewer playing the role of vocational tutor. To that end, twenty behaviors within five personality profiles are classified and loaded into NAO. Five basic emotions are considered: anger, boredom, interest, surprise, and joy. Selected behaviors are grouped according to these five emotions. Common behaviors (e.g., movements or body postures) used by the robot during vocational guidance sessions are based on a theory of personality traits called the “Five-Factor Model”. In this context, the robot asks a predefined set of questions—following a theoretical model called the “Orientation Model”—about the person’s vocational preferences. NAO can thus react as appropriately as possible during the interview, according to both the score of the person’s answer to each question and the person’s personality type. Additionally, based on the answers to these questions, a vocational profile is established, and the robot can provide a recommendation about the person’s vocation. The results show how the intelligent selection of behaviors can be successfully achieved through the proposed approach, making the human–robot interaction friendlier.
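The abstract describes a mapping from answer score and personality profile to a robot behavior, but gives no implementation details. A minimal sketch of that kind of lookup might look like the following; the thresholds, profile names, and behavior names are illustrative assumptions, not the authors' system:

```python
# Hypothetical score-to-emotion-to-behavior selection for a NAO-style agent.
# Thresholds partition the 0..1 answer-score range into the five emotions
# named in the abstract (anger, boredom, interest, surprise, joy).
EMOTION_BY_SCORE = [
    (0.2, "anger"),     # very low answer score
    (0.4, "boredom"),
    (0.6, "interest"),
    (0.8, "surprise"),
    (1.0, "joy"),       # very high answer score
]

# Illustrative (profile, emotion) -> behavior table; real systems would
# hold one entry per personality profile and emotion.
BEHAVIORS = {
    ("extraversion", "joy"): "open_arms_wave",
    ("extraversion", "anger"): "cross_arms",
    ("neuroticism", "boredom"): "look_away_slowly",
}

def select_behavior(profile: str, answer_score: float) -> str:
    """Map an answer score in [0, 1] to an emotion, then to a behavior."""
    emotion = EMOTION_BY_SCORE[-1][1]
    for threshold, candidate in EMOTION_BY_SCORE:
        if answer_score <= threshold:
            emotion = candidate
            break
    # Fall back to a neutral gesture when no behavior is defined.
    return BEHAVIORS.get((profile, emotion), "neutral_nod")
```

A table-driven design like this keeps the emotion model and the behavior repertoire independently editable, which matches the abstract's description of grouping selected behaviors under the five emotions.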

Author(s):  
Tricia Santamaria ◽  
Dan Nathan-Roberts

This work aims to summarize research methods for measuring personality in human-robot interaction. A systematic review was performed, resulting in 35 studies that were categorized by whether they assessed human personality, robot personality, or both. It was found that the five-factor model (Big Five) of personality (extraversion, agreeableness, conscientiousness, openness, and neuroticism) was a common theme, as it was used to assess personality in 31% of the studies, and the extraversion dimension alone was used in an additional 26% of the studies. The strengths and limitations of the Big Five, human factors influences on user expectations, and recommendations for its use are discussed.


2020 ◽  
Author(s):  
Agnieszka Wykowska ◽  
Jairo Pérez-Osorio ◽  
Stefan Kopp

This booklet is a collection of the position statements accepted for the HRI’20 conference workshop “Social Cognition for HRI: Exploring the relationship between mindreading and social attunement in human-robot interaction” (Wykowska, Perez-Osorio & Kopp, 2020). Unfortunately, due to the rapid unfolding of the novel coronavirus at the beginning of the year, the conference, and consequently our workshop, were canceled. In light of these events, we decided to put together the position statements accepted for the workshop. The contributions collected in these pages highlight the role of the attribution of mental states to artificial agents in human-robot interaction, and specifically the quality and presence of the social attunement mechanisms that are known to make human interaction smooth, efficient, and robust. These papers also accentuate the importance of a multidisciplinary approach to advancing the understanding of the factors and consequences of social interactions with artificial agents.


Author(s):  
Ruth Stock-Homburg

Knowledge production within the interdisciplinary field of human–robot interaction (HRI) with social robots has accelerated, despite the continued fragmentation of the research domain. Together, these features make it hard to remain at the forefront of research or to assess the collective evidence pertaining to specific areas, such as the role of emotions in HRI. This systematic review of state-of-the-art research into humans’ recognition of and responses to artificial emotions of social robots during HRI encompasses the years 2000–2020. In accordance with a stimulus–organism–response framework, the review advances robotic psychology by revealing current knowledge about (1) the generation of artificial robotic emotions (stimulus), (2) human recognition of robotic artificial emotions (organism), and (3) human responses to robotic emotions (response), as well as (4) other contingencies that affect emotions as moderators.


2019 ◽  
Vol 33 (2) ◽  
pp. 249-261 ◽  
Author(s):  
Katharina Kolbeck ◽  
Steffen Moritz ◽  
Julia Bierbrodt ◽  
Christina Andreou

Ongoing research is shifting towards a dimensional understanding of borderline personality disorder (BPD). The aim of this study was to identify personality profiles in BPD that are predictive of self-destructive behaviors. Personality traits were assessed (n = 130) according to the five-factor model of personality (i.e., Neuroticism, Extraversion, Openness to Experience, Agreeableness, Conscientiousness) and an additional factor called Risk Preference. Self-destructive behavior parameters such as non-suicidal self-injury (NSSI) and other borderline-typical dyscontrolled behaviors (e.g., drug abuse) were assessed by self-report measures. Canonical correlation analyses demonstrated that Neuroticism, Extraversion, and Conscientiousness are predictors of NSSI. Further, Neuroticism, Agreeableness, and Risk Preference were associated with dyscontrolled behaviors. Our results add further support for the role of personality in self-destructive behaviors in BPD. A combined diagnostic assessment could offer clinically meaningful insights into the causes of self-destruction in BPD and expand current therapeutic repertoires.
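The canonical correlation analysis mentioned above finds the linear combinations of two variable sets (here, personality traits and self-destructive behavior measures) that correlate maximally. As a self-contained illustration of the statistic itself, not the authors' analysis or data, the first canonical correlation can be computed from covariance matrices with plain numpy:

```python
import numpy as np

def first_canonical_correlation(X, Y, eps=1e-8):
    """First canonical correlation between column sets X and Y (numpy only)."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    n = X.shape[0]
    Sxx = Xc.T @ Xc / (n - 1) + eps * np.eye(X.shape[1])
    Syy = Yc.T @ Yc / (n - 1) + eps * np.eye(Y.shape[1])
    Sxy = Xc.T @ Yc / (n - 1)

    def inv_sqrt(S):
        # Symmetric inverse square root via eigendecomposition.
        vals, vecs = np.linalg.eigh(S)
        return vecs @ np.diag(vals ** -0.5) @ vecs.T

    # Singular values of the whitened cross-covariance are the
    # canonical correlations; the largest is the first one.
    M = inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy)
    return float(np.clip(np.linalg.svd(M, compute_uv=False)[0], 0.0, 1.0))

# Synthetic demo: two variable sets sharing one latent factor z.
rng = np.random.default_rng(42)
z = rng.standard_normal(300)
X = np.column_stack([z + 0.1 * rng.standard_normal(300),
                     rng.standard_normal(300)])
Y = np.column_stack([z + 0.1 * rng.standard_normal(300),
                     rng.standard_normal(300)])
rho = first_canonical_correlation(X, Y)
```

Because one column of each set is driven by the same latent factor, the first canonical correlation comes out close to 1, while unrelated sets would yield values near 0.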


2014 ◽  
Vol 5 (1) ◽  
pp. 1-11 ◽  
Author(s):  
Mohammad Rabiei ◽  
Alessandro Gasparetto

A system for recognition of emotions based on speech analysis can have interesting applications in human-robot interaction. In this paper, we carry out an exploratory study on the possibility of using the proposed methodology to recognize basic emotions (sadness, surprise, happiness, anger, fear, and disgust) based on the phonetic and acoustic properties of emotive speech, with minimal use of signal processing algorithms. We set up an experimental test with three groups of speakers: (i) five adult European speakers, (ii) five adult Asian (Middle East) speakers, and (iii) five adult American speakers. The speakers had to repeat six sentences in English (with durations typically between 1 s and 3 s) in order to emphasize rising-falling intonation and pitch movement. Intensity, peak and range of pitch, and speech rate were evaluated. The proposed methodology consists of generating and analyzing a graph of formant, pitch, and intensity, using the open-source PRAAT program. From the experimental results, it was possible to recognize the basic emotions in most cases.
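The study extracts its acoustic features (pitch, intensity, formants) with PRAAT. As a self-contained illustration of one such feature, a minimal autocorrelation-based fundamental-frequency estimator can be sketched in a few lines of numpy; this is a toy stand-in, not the PRAAT algorithm:

```python
import numpy as np

def estimate_pitch(signal, sr, fmin=50.0, fmax=400.0):
    """Estimate fundamental frequency (Hz) via the autocorrelation peak."""
    sig = signal - np.mean(signal)
    # Keep non-negative lags of the autocorrelation.
    corr = np.correlate(sig, sig, mode="full")[len(sig) - 1:]
    lag_min = int(sr / fmax)          # shortest plausible pitch period
    lag_max = int(sr / fmin)          # longest plausible pitch period
    lag = lag_min + int(np.argmax(corr[lag_min:lag_max]))
    return sr / lag

# Demo on a synthetic 220 Hz tone (1 s at 16 kHz).
sr = 16000
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 220.0 * t)
f0 = estimate_pitch(tone, sr)
```

On clean periodic signals this recovers the pitch to within a sample-quantized lag; real emotive speech needs the more robust voicing and octave-error handling that PRAAT provides.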


Sensors ◽  
2021 ◽  
Vol 21 (19) ◽  
pp. 6438
Author(s):  
Chiara Filippini ◽  
David Perpetuini ◽  
Daniela Cardone ◽  
Arcangelo Merla

An intriguing challenge in the human–robot interaction field is the prospect of endowing robots with emotional intelligence to make the interaction more genuine, intuitive, and natural. A crucial aspect of achieving this goal is the robot’s capability to infer and interpret human emotions. Thanks to its design and open programming platform, the NAO humanoid robot is one of the most widely used agents for human interaction. As in person-to-person communication, facial expressions are the privileged channel for recognizing the interlocutor’s emotional state. Although NAO is equipped with a facial expression recognition module, specific use cases may require additional features and affective computing capabilities that are not currently available. This study proposes a highly accurate convolutional-neural-network-based facial expression recognition model that further enhances the NAO robot’s awareness of human facial expressions and provides the robot with the capability to detect an interlocutor’s arousal level. Indeed, the model tested during human–robot interactions was 91% and 90% accurate in recognizing happy and sad facial expressions, respectively; 75% accurate in recognizing surprised and scared expressions; and less accurate in recognizing neutral and angry expressions. Finally, the model was successfully integrated into the NAO SDK, allowing for high-performing facial expression classification with an inference time of 0.34 ± 0.04 s.
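The abstract reports accuracies but no architecture details. Schematically, such a classifier performs a convolution, nonlinearity, pooling, and softmax over the six expression classes it names; the following numpy-only forward pass with random weights is a structural sketch under that assumption, not the authors' trained model:

```python
import numpy as np

def conv2d_valid(img, kernel):
    """Single-channel 2-D cross-correlation with valid padding."""
    kh, kw = kernel.shape
    h, w = img.shape[0] - kh + 1, img.shape[1] - kw + 1
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

def classify(face, kernels, weights, biases):
    """Conv -> ReLU -> global average pool -> linear -> softmax."""
    feats = np.array([np.maximum(conv2d_valid(face, k), 0).mean()
                      for k in kernels])
    return softmax(weights @ feats + biases)

# Toy inputs: a 48x48 grayscale face crop and random filter/linear weights.
rng = np.random.default_rng(0)
EMOTIONS = ["happy", "sad", "surprised", "scared", "neutral", "angry"]
face = rng.random((48, 48))
kernels = rng.standard_normal((8, 3, 3))
weights = rng.standard_normal((len(EMOTIONS), 8))
biases = np.zeros(len(EMOTIONS))
probs = classify(face, kernels, weights, biases)
```

The output is a probability distribution over the six expressions; a real deployment would use trained filters, several stacked layers, and an optimized inference backend to hit the reported 0.34 s latency.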


Philosophies ◽  
2019 ◽  
Vol 4 (1) ◽  
pp. 11 ◽  
Author(s):  
Frank Förster

In this article, I assess an existing language acquisition architecture, which was deployed in linguistically unconstrained human–robot interaction, together with our experimental design decisions, with regard to their enactivist credentials. Despite initial scepticism about enactivism’s applicability to the social domain, the introduction of the notion of participatory sense-making in the more recent enactive literature extends the framework’s reach to encompass this domain. With some exceptions, both our architecture and our form of experimentation appear to be largely compatible with enactivist tenets. I analyse the architecture and design decisions along the five enactivist core themes of autonomy, embodiment, emergence, sense-making, and experience, and discuss the role of affect due to its central place within our acquisition experiments. In conclusion, I join some enactivists in demanding that interaction be taken seriously as an irreducible and independent subject of scientific investigation, and go further by hypothesising its potential value to machine learning.


2001 ◽  
Vol 29 (4) ◽  
pp. 391-398 ◽  
Author(s):  
Tracy L. Tuten ◽  
Michael Bosnjak

Using the Five-factor model of personality and Need for Cognition, the authors investigated the relationship between personality and Web usage. Of the five factors, Openness to Experience and Neuroticism showed the strongest associations with Web usage. Openness to Experience was positively related to using the Web for entertainment and product information, while Neuroticism was negatively related to Web usage. Need for Cognition was significantly and positively correlated with all Web activities involving cognitive thought.
