Expressing Personality Through Non-verbal Behaviour in Real-Time Interaction

2021, Vol 12
Author(s): Maryam Saberi, Steve DiPaola, Ulysses Bernardet

The attribution of traits plays an important role as a heuristic for how we interact with others. Many psychological models of personality are analytical in that they derive a classification from reported or hypothesised behaviour. In the work presented here, we follow the opposite approach: our personality model generates behaviour that leads an observer to attribute personality characteristics to the actor. Concretely, the model controls all relevant aspects of non-verbal behaviour, such as gaze, facial expression, gesture, and posture. Embodied in a virtual human, the model supports realistic real-time interaction with participants. Conceptually, our model focuses on the two dimensions of extraversion/introversion and stability/neuroticism. In the model, personality parameters influence both the internal affective state and the characteristics of behaviour execution. Importantly, the parameters of the model are based on empirical findings in the behavioural sciences. To evaluate our model, we conducted two types of studies: first, passive experiments in which participants rated videos showing variants of behaviour driven by different personality parameter configurations; second, presential experiments in which participants interacted with the virtual human, playing rounds of the Rock-Paper-Scissors game. Our results show that the model is effective in conveying the impression of a virtual character's personality to users. Embodying the model in an artificial social agent capable of real-time interactive behaviour is the only way to move from an analytical to a generative approach to understanding personality, and we believe that this methodology raises a host of novel research questions in the field of personality theory.
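The core idea of the generative approach, that trait parameters modulate how behaviour is executed, can be sketched in a few lines. The mapping below is a hypothetical illustration with made-up linear coefficients, not the published model's parameters.

```python
from dataclasses import dataclass

@dataclass
class Personality:
    """Two trait dimensions, each normalised to [0, 1]."""
    extraversion: float  # 0 = introverted, 1 = extraverted
    neuroticism: float   # 0 = stable,      1 = neurotic

def behaviour_params(p: Personality) -> dict:
    """Map traits to execution parameters of non-verbal behaviour.

    The rules are illustrative (e.g. extraverts gesture with larger
    amplitude; neurotic characters avert gaze more often), chosen only
    to show the parameter-to-behaviour flow of a generative model.
    """
    return {
        "gesture_amplitude": 0.4 + 0.6 * p.extraversion,  # fraction of reach
        "gesture_speed":     0.5 + 0.5 * p.extraversion,  # relative tempo
        "gaze_aversion_p":   0.1 + 0.5 * p.neuroticism,   # prob. per glance
        "posture_sway":      0.2 + 0.4 * p.neuroticism,   # idle instability
    }

extravert = behaviour_params(Personality(extraversion=0.9, neuroticism=0.1))
introvert = behaviour_params(Personality(extraversion=0.1, neuroticism=0.1))
assert extravert["gesture_amplitude"] > introvert["gesture_amplitude"]
```

An observer rating the resulting animation closes the loop: the attribution study asks whether these execution differences are read back as the intended traits.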

2021, Vol 11 (11), pp. 5067
Author(s): Paulo Veloso Gomes, António Marques, João Donga, Catarina Sá, António Correia, ...

The interactivity of an immersive environment arises from the relationship established between the user and the system, a relationship that results in a set of data exchanges between human and technological actors. Real-time biofeedback devices collect, in real time, the biodata generated by the user during the exhibition. Analysing, processing, and converting these biodata into multimodal data makes it possible to relate the stimuli to the emotions they trigger. This work describes an adaptive model for managing biofeedback data flows, used in the design of interactive immersive systems. An affective algorithm identifies the types of emotions felt by the user and their respective intensities. The mapping between stimuli and emotions creates a set of biodata that can be used as elements of interaction that readjust the stimuli generated by the system. The real-time interaction produced by the evolution of the user's emotional state and the stimuli generated by the system allows users to adapt their attitudes and behaviours to the situations they face.
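The biofeedback loop described above, biosignals in, affect estimate out, stimulus readjusted, can be sketched as follows. The signal ranges, weights, and the simple proportional adjustment are placeholder assumptions for illustration, not the article's affective algorithm.

```python
def estimate_arousal(heart_rate_bpm: float, eda_microsiemens: float) -> float:
    """Collapse two common biosignals into a single arousal score in [0, 1].
    The normalisation windows (60-120 bpm, 0-20 uS) and equal weights are
    illustrative placeholders, not a validated affective model."""
    hr = min(max((heart_rate_bpm - 60) / 60, 0.0), 1.0)
    eda = min(max(eda_microsiemens / 20, 0.0), 1.0)
    return 0.5 * hr + 0.5 * eda

def adapt_stimulus(intensity: float, arousal: float,
                   target: float = 0.5, gain: float = 0.3) -> float:
    """Close the loop: nudge the stimulus intensity so that the user's
    arousal drifts toward the target level on the next exposure."""
    return min(max(intensity + gain * (target - arousal), 0.0), 1.0)

intensity = 0.8
arousal = estimate_arousal(heart_rate_bpm=110, eda_microsiemens=15)
assert arousal > 0.5                                   # user is over-aroused,
assert adapt_stimulus(intensity, arousal) < intensity  # so the stimulus is eased
```

In a full system this update would run per frame or per scene, with the stimulus mapping and emotion classifier replaced by the system's actual components.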


2021, Vol 11 (1)
Author(s): Thomas Treal, Philip L. Jackson, Jean Jeuvrey, Nicolas Vignais, Aurore Meugnot

Virtual reality platforms producing interactive and highly realistic characters are being used more and more as a research tool in social and affective neuroscience to better capture both the dynamics of emotion communication and the unintentional and automatic nature of emotional processes. While idle motion (i.e., non-communicative movements) is commonly used to create behavioural realism, its use to enhance the perception of emotion expressed by a virtual character is critically lacking. This study examined the influence of naturalistic (i.e., based on human motion capture) idle motion on two aspects (the perception of the other's pain and the affective reaction) of an empathic response towards pain expressed by a virtual character. In two experiments, 32 and 34 healthy young adults were presented with video clips of a virtual character displaying a facial expression of pain while its body was either static (still condition) or animated with natural postural oscillations (idle condition). The participants in Experiment 1 rated the facial pain expression of the virtual human as more intense, and those in Experiment 2 reported being more touched by its pain expression in the idle condition compared to the still condition, indicating a greater empathic response towards the virtual human's pain in the presence of natural postural oscillations. These findings are discussed in relation to models of empathy and biological motion processing. Future investigations will help determine to what extent such naturalistic idle motion could be a key ingredient in enhancing the anthropomorphism of a virtual human and making its emotion appear more genuine.
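When motion capture is unavailable, idle sway is often approximated procedurally. A common trick, sketched below under assumed (not published) frequencies and amplitudes, is to sum slow, incommensurate sinusoids so the oscillation never visibly repeats.

```python
import math

def idle_sway(t: float) -> float:
    """Postural oscillation (e.g. torso lean, in degrees) at time t seconds.
    Two slow sinusoids with incommensurate frequencies approximate the
    non-repetitive feel of motion-captured idle sway; the frequencies
    (0.25 Hz, 0.09 Hz) and amplitudes are illustrative guesses."""
    return (1.5 * math.sin(2 * math.pi * 0.25 * t)
            + 0.6 * math.sin(2 * math.pi * 0.09 * t + 1.3))

# Sample one minute at 10 Hz and confirm the sway stays subtle.
samples = [idle_sway(t / 10) for t in range(600)]
assert max(abs(s) for s in samples) <= 2.1  # bounded by the two amplitudes
```

The same signal would typically drive several joints with different phases so the whole body drifts coherently rather than pivoting at a single point.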


1988, Vol 82 (3), pp. 737-761
Author(s): George E. Marcus

Over the past two decades, psychological models of affect have changed from valence (one-dimensional) models to multiple-dimensional models. The most recent models, circumplex models, are two-dimensional. Feeling thermometer measures, which derive their theoretical logic from earlier (valence) models of emotional appraisal, are shown to be confounded. Underlying the variation obtained using feeling thermometer measures are two dimensions of emotional response: mastery (positive emotionality) and threat (negative emotionality). Analysis of the 1984 NES survey suggests that positive emotional response is twice as influential as negative emotional response in predicting vote disposition toward the presidential candidates. Reliance on emotional response is shown to be uniformly influential across various strata of the electorate. Policy considerations have little direct influence on vote disposition, though they are indirectly related to it through the influence of issues on the degree of threat evoked by the candidates.
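The confound can be made concrete with a toy calculation. Assuming (hypothetically) that a thermometer-style score behaves like positive emotionality minus negative emotionality, two voters with very different two-dimensional profiles collapse onto the same one-dimensional reading. The item names and 0-10 scale below are invented for illustration, not the NES instrument.

```python
def mastery(hope: float, pride: float) -> float:
    """Positive-emotionality dimension: mean of positive emotion reports (0-10)."""
    return (hope + pride) / 2

def threat(fear: float, anger: float) -> float:
    """Negative-emotionality dimension: mean of negative emotion reports (0-10)."""
    return (fear + anger) / 2

def thermometer(hope, pride, fear, anger):
    """A one-dimensional valence-style score: positives minus negatives.
    Distinct (mastery, threat) profiles can map to the same value, which
    is the confound the two-dimensional account exposes."""
    return mastery(hope, pride) - threat(fear, anger)

# Enthusiastic-but-anxious voter vs. mildly positive, unthreatened voter:
a = (9, 9, 6, 6)   # high mastery, high threat
b = (3, 3, 0, 0)   # low mastery, no threat
assert thermometer(*a) == thermometer(*b) == 3.0
assert (mastery(a[0], a[1]), threat(a[2], a[3])) != (mastery(b[0], b[1]), threat(b[2], b[3]))
```

Separating the two dimensions is what lets the analysis weigh positive and negative response differently, as the article reports for the 1984 data.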


2019, Vol 29 (5), pp. 676-696
Author(s): Sabrina Golonka, Andrew D. Wilson

In 2010, Bechtel and Abrahamsen defined and described what it means to be a dynamic causal mechanistic explanatory model. They discussed the development of a mechanistic explanation of circadian rhythms as an exemplar of the process and challenged cognitive science to follow this example. This article takes on that challenge. A mechanistic model is one that accurately represents the real parts and operations of the mechanism being studied. These real components must be identified by an empirical programme that decomposes the system at the correct scale and localises the components in space and time. Psychological behaviour emerges from the nature of our real-time interaction with our environments—here we show that the correct scale to guide decomposition is picked out by the ecological perceptual information that enables that interaction. As proof of concept, we show that a simple model of coordinated rhythmic movement, grounded in information, is a genuine dynamical mechanistic explanation of many key coordination phenomena.
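For readers unfamiliar with the coordination literature, the classic dynamical baseline for coordinated rhythmic movement is the Haken-Kelso-Bunz relative-phase equation. The sketch below simulates that standard equation only; the article's own model grounds the dynamics in perceptual information, which this sketch does not attempt to reproduce.

```python
import math

def hkb_step(phi: float, a: float, b: float, dt: float = 0.01) -> float:
    """One Euler step of the Haken-Kelso-Bunz relative-phase equation
        d(phi)/dt = -a*sin(phi) - 2*b*sin(2*phi),
    where phi is the relative phase between the two oscillating limbs."""
    return phi + dt * (-a * math.sin(phi) - 2 * b * math.sin(2 * phi))

def settle(phi0: float, a: float = 1.0, b: float = 1.0, steps: int = 5000) -> float:
    """Integrate from an initial relative phase until it reaches an attractor."""
    phi = phi0
    for _ in range(steps):
        phi = hkb_step(phi, a, b)
    return phi

# With strong coupling, both in-phase (0) and anti-phase (pi) coordination
# are stable, a key phenomenon any mechanistic model must reproduce.
assert abs(settle(0.5)) < 1e-3
assert abs(settle(2.9) - math.pi) < 1e-3
```

Raising movement frequency in the full model shrinks the anti-phase basin until only in-phase coordination survives, which is the phase-transition phenomenon the mechanistic programme must decompose and localise.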


Author(s): Xerxes D. Arsiwalla, Riccardo Zucca, Alberto Betella, Enrique Martinez, David Dalmazzo, ...

2006, Vol 5 (2), pp. 25-30
Author(s): Christian Knöpfle, Yvonne Jung

In this paper, we explain our approach to creating and animating virtual characters for real-time rendering applications in an easy and intuitive way. Furthermore, we show how to develop interactive storylines for such real-time environments involving the created characters. We outline useful extensions for character animation based on the VRML97 and X3D standards and describe how to incorporate commercial tools into an optimised workflow. These results were developed within the Virtual Human project; an overview of the project is included in this paper.
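Character animation in VRML97/X3D is driven by interpolator nodes that map a time fraction onto keyframed values via paired `key`/`keyValue` fields. A minimal sketch of that piecewise-linear semantics, using an invented elbow-flexion track rather than anything from the paper's extensions:

```python
from bisect import bisect_right

def interpolate(fraction: float, key: list, key_value: list) -> float:
    """Piecewise-linear keyframe interpolation, mirroring the semantics of a
    VRML97/X3D ScalarInterpolator node: `key` holds ascending fractions in
    [0, 1], `key_value` the corresponding outputs. Orientation tracks would
    spherically interpolate rotations instead of lerping scalars."""
    if fraction <= key[0]:
        return key_value[0]
    if fraction >= key[-1]:
        return key_value[-1]
    i = bisect_right(key, fraction) - 1           # keyframe segment index
    t = (fraction - key[i]) / (key[i + 1] - key[i])
    return key_value[i] + t * (key_value[i + 1] - key_value[i])

# Hypothetical elbow flexion (degrees) over one gesture cycle:
key       = [0.0, 0.5, 1.0]
key_value = [0.0, 90.0, 0.0]
assert interpolate(0.25, key, key_value) == 45.0
assert interpolate(0.75, key, key_value) == 45.0
```

In a scene graph the fraction would come from a TimeSensor routed into the interpolator, whose output is in turn routed to a joint's rotation or the field being animated.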

