Psychological Aspects in lifelike synthetic agents: Towards to the Personality Markup Language (A Brief Survey)

RENOTE ◽  
2009 ◽  
Vol 7 (3) ◽  
pp. 390-400
Author(s):  
Maria Augusta Silveira Netto Nunes

This paper describes how human psychological aspects have been used in lifelike synthetic agents to provide believability during human-computer interaction. We present a brief survey of applications in which Affective Computing scientists have applied psychological aspects such as Emotion and Personality. Based on those aspects, we describe the effort made by Affective Computing scientists to create a Markup Language to express and standardize Emotions. Because they have not yet concentrated their effort on Personality, we propose here a starting point for creating a Markup Language to express Personality.
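
Purely as an illustration of what a personality markup fragment might look like (the paper's proposed schema is not reproduced here), the sketch below serializes Big Five (OCEAN) trait scores as XML; every element and attribute name is an assumption.

```python
# Hypothetical sketch of a personality markup fragment. Element and attribute
# names are illustrative only, not the schema proposed in the paper.
import xml.etree.ElementTree as ET

BIG_FIVE = {  # assumed Big Five (OCEAN) trait scores in [0, 1]
    "openness": 0.72,
    "conscientiousness": 0.55,
    "extraversion": 0.31,
    "agreeableness": 0.80,
    "neuroticism": 0.25,
}

def personality_fragment(agent_id: str, traits: dict) -> str:
    """Serialize trait scores for a synthetic agent as an XML fragment."""
    root = ET.Element("personality", attrib={"agent": agent_id, "model": "big-five"})
    for name, value in traits.items():
        ET.SubElement(root, "trait", attrib={"name": name, "value": f"{value:.2f}"})
    return ET.tostring(root, encoding="unicode")

print(personality_fragment("agent-001", BIG_FIVE))
```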

Author(s):  
Lesley Axelrod ◽  
Kate Hone

In a culture which places increasing emphasis on happiness and wellbeing, multimedia technologies incorporate emotional design to gain a commercial edge. This chapter explores affective computing and illustrates how innovative technologies are capable of emotional recognition and display. Research in this domain has emphasised solving the technical difficulties involved, through the design of ever more complex recognition algorithms. But fundamental questions about the use of such technology remain neglected. Can it really improve human-computer interaction? For which types of application is it suitable? How is it best implemented? What ethical considerations are there? We review this field and discuss the need for user-centred design. We describe, and present evidence from, a study that explores some of the user issues in affective computing.


Sensors ◽  
2020 ◽  
Vol 20 (8) ◽  
pp. 2308 ◽  
Author(s):  
Dilana Hazer-Rau ◽  
Sascha Meudt ◽  
Andreas Daucher ◽  
Jennifer Spohrs ◽  
Holger Hoffmann ◽  
...  

In this paper, we present a multimodal dataset for affective computing research acquired in a human-computer interaction (HCI) setting. An experimental mobile and interactive scenario was designed and implemented based on a gamified generic paradigm for the induction of dialog-based HCI-relevant emotional and cognitive load states. It consists of six experimental sequences inducing Interest, Overload, Normal, Easy, Underload, and Frustration. Each sequence is followed by subjective feedback to validate the induction, a respiration baseline to level off the physiological reactions, and a summary of results. Furthermore, prior to the experiment, three questionnaires related to emotion regulation (ERQ), emotional control (TEIQue-SF), and personality traits (TIPI) were collected from each subject to evaluate the stability of the induction paradigm. Based on this HCI scenario, the University of Ulm Multimodal Affective Corpus (uulmMAC), consisting of two homogeneous samples of 60 participants and 100 recording sessions, was generated. We recorded 16 sensor modalities, including 4 × video, 3 × audio, and 7 × biophysiological, depth, and pose streams. Additional labels and annotations were also collected. After recording, all data were post-processed and checked for technical and signal quality, resulting in the final uulmMAC dataset of 57 subjects and 95 recording sessions. The evaluation of the reported subjective feedback shows significant differences between the sequences, consistent with the induced states, and the analysis of the questionnaires shows stable results. In summary, our uulmMAC database is a valuable contribution to the field of affective computing and multimodal data analysis: acquired in a mobile interactive scenario close to real HCI, it comprises a large number of subjects and allows transtemporal investigations. Validated via subjective feedback and checked for quality issues, it can be used for affective computing and machine learning applications.
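
A corpus organized around subjects and the six induced states could be indexed along the lines of the sketch below; the directory layout, file naming, and the Session structure are assumptions for illustration, not the documented structure of uulmMAC.

```python
# Hypothetical sketch: indexing uulmMAC-style recording sessions by subject
# and induced state. The directory layout and file names are assumptions,
# not the corpus' documented structure.
from dataclasses import dataclass
from pathlib import Path

INDUCED_STATES = ["Interest", "Overload", "Normal", "Easy", "Underload", "Frustration"]

@dataclass
class Session:
    subject_id: str
    state: str            # one of the six induced sequences
    modality_files: list  # paths to video/audio/biophysiological streams

def index_corpus(root: Path) -> list:
    """Walk an assumed <root>/<subject>/<state>/ layout and collect sessions."""
    sessions = []
    for subject_dir in sorted(root.iterdir()):
        if not subject_dir.is_dir():
            continue
        for state in INDUCED_STATES:
            state_dir = subject_dir / state
            if state_dir.is_dir():
                files = sorted(str(p) for p in state_dir.glob("*"))
                sessions.append(Session(subject_dir.name, state, files))
    return sessions

# Example: sessions = index_corpus(Path("/data/uulmMAC"))
```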


2021 ◽  
Author(s):  
Michael J Lyons

Twenty-five years ago, my colleagues Miyuki Kamachi and Jiro Gyoba and I designed and photographed JAFFE, a set of facial expression images intended for use in a study of face perception. In 2019, without seeking permission or informing us, Kate Crawford and Trevor Paglen exhibited JAFFE in two widely publicized art shows. In addition, they published a nonfactual account of the images in the essay “Excavating AI: The Politics of Images in Machine Learning Training Sets.” The present article recounts the creation of the JAFFE dataset and unravels each of Crawford and Paglen’s fallacious statements. I also discuss JAFFE more broadly in connection with research on facial expression, affective computing, and human-computer interaction.


2019 ◽  
pp. 298-313
Author(s):  
Aníbal Caixinha ◽  
Isabel Machado Alexandre

Dementia is, unfortunately, a well-known problem today, a product of generational transformations and of better living conditions. There are different kinds of dementia, but Alzheimer's disease is the most prevalent. Memory is the key factor in this type of illness, and understanding how it is constructed is central to hypothesising about, and trying to determine, how it degenerates. In this chapter, memory structures are presented as the starting point of the research; then, through the use of narrative intelligence, we devise a method to present small excerpts of a patient's history while simultaneously evaluating the progression of the illness. To do this, a small prototype of MEM+ has been developed, and a participatory design process was followed during its development. With this approach, we aim to devise the right application to be used by the patients themselves and by their caregivers. During this stage of the project, special attention was paid to usability issues, and some adaptations were made to improve the human-computer interaction.
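
As a loose illustration of pairing history excerpts with a progression measure (the chapter does not specify MEM+'s internals), a minimal sketch might look like this; the data model and the recall-rate scoring are assumptions.

```python
# Illustrative sketch only: pairing personal-history excerpts with a simple
# recall score over time. MEM+'s actual data model and evaluation are not
# described here; every name below is hypothetical.
from dataclasses import dataclass
from datetime import date

@dataclass
class Excerpt:
    excerpt_id: str
    text: str    # short fragment of the patient's history
    prompt: str  # question asked after presenting the excerpt

@dataclass
class RecallRecord:
    excerpt_id: str
    session_date: date
    recalled: bool

def recall_rate(records: list) -> float:
    """Fraction of presented excerpts the patient recalled in a session."""
    if not records:
        return 0.0
    return sum(r.recalled for r in records) / len(records)

# A caregiver could compare recall_rate() across sessions as one coarse,
# assumed indicator of how the illness is progressing.
```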


2021 ◽  
Vol 2 (1) ◽  
pp. 26-32
Author(s):  
Moe Moe Htay

Facial expression plays a significant role in affective computing and is one of the non-verbal communication channels for human-computer interaction. Automatic recognition of human affect has become a more challenging and interesting problem in recent years. Facial expressions are significant features for recognizing human emotion in daily life. A Facial Expression Recognition System (FERS) can be developed for applications such as human affect analysis, health care assessment, distance learning, driver fatigue detection, and human-computer interaction. Basically, there are three main components in recognizing a human facial expression: detection of the face or its components, feature extraction from the face image, and classification of the expression. The study proposes methods of feature extraction and classification for FER.
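
The three-stage pipeline named above can be sketched as follows; the specific choices here (a Haar-cascade detector, raw pixel features, and an SVM classifier) are assumptions for illustration, not the methods proposed in the study.

```python
# Minimal illustrative FER pipeline: face detection -> feature extraction ->
# expression classification. Detector, features, and classifier choices are
# assumptions, not the study's proposed methods.
import cv2
import numpy as np
from sklearn.svm import SVC

# 1. Face detection with OpenCV's bundled Haar cascade.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def extract_face_features(gray_image: np.ndarray):
    """Detect the largest face and return a flattened 48x48 patch as features."""
    faces = cascade.detectMultiScale(gray_image, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # keep the largest detection
    patch = cv2.resize(gray_image[y:y + h, x:x + w], (48, 48))
    return patch.flatten().astype(np.float32) / 255.0

# 2./3. Feature vectors from labeled images train a simple expression classifier.
def train_classifier(feature_vectors: np.ndarray, labels: np.ndarray) -> SVC:
    clf = SVC(kernel="rbf", C=1.0)
    clf.fit(feature_vectors, labels)  # labels: e.g. "happy", "sad", "angry", ...
    return clf
```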


2011 ◽  
Vol 2011 ◽  
pp. 1-11 ◽  
Author(s):  
Walter Ritter

Considerable effort has been directed toward enriching human-computer interaction to make the user experience more pleasing or efficient. In this paper, we briefly present work in the fields of subliminal perception and affective computing before outlining a new approach to adding analog communication channels to the human-computer interaction experience. In this approach, in addition to symbolic predefined mappings of input to output, a subliminal feedback loop is used that provides feedback in evolutionary subliminal steps. In two studies involving concentration-intensive games, we investigated the impact of this approach. In the first study, evolutionary feedback loops adjusted the user interface of a memory game, whereas in the second study the lighting of the test room was adjusted dynamically. The results show that in settings with an evolutionary feedback loop, test participants were able to reach significantly higher scores than with the static counterparts. Finally, we discuss the impact that such subliminally working applications might have on user acceptance.
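
The core idea of an evolutionary subliminal feedback loop (small, below-threshold adjustments kept only when they coincide with better performance) could be sketched roughly as below; the tuned parameter, mutation size, and acceptance rule are assumptions, not the paper's implementation.

```python
# Rough sketch of an evolutionary (1+1)-style feedback loop that nudges a
# single interface parameter in small, subliminal steps and keeps a change
# only if the user's score improves. All specifics are assumptions.
import random

def evolve_parameter(initial_value: float, measure_score, rounds: int = 20,
                     step: float = 0.02, low: float = 0.0, high: float = 1.0) -> float:
    """measure_score(value) -> float: user's performance with that setting."""
    best_value = initial_value
    best_score = measure_score(best_value)
    for _ in range(rounds):
        candidate = min(high, max(low, best_value + random.uniform(-step, step)))
        score = measure_score(candidate)
        if score > best_score:  # keep only improving mutations
            best_value, best_score = candidate, score
    return best_value

# Example with a synthetic score function standing in for real user data:
# best = evolve_parameter(0.5, lambda v: -(v - 0.7) ** 2 + random.gauss(0, 0.01))
```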

