nonverbal cues
Recently Published Documents


TOTAL DOCUMENTS

317
(FIVE YEARS 58)

H-INDEX

31
(FIVE YEARS 3)

2021 ◽  
Author(s):  
Darlene Barker ◽  
Haim Levkowitz

Touch is one of the first senses we develop at birth, and it is the one sense that can deepen our experience of many situations. In this paper we propose the use of emotions, including touch, within virtual reality (VR) to create a simulated closeness that currently can only be achieved through in-person interaction and communication. By simulating nonverbal cues, we can enhance a conversation or interaction in VR. Haptic devices deliver simulated touch between users, while sensors and machine learning perform emotion recognition on the collected data; together these work towards simulated closeness in communication despite distance or being in VR. We present a direction for further research on how to simulate in-person communication within VR using emotion recognition and touch to achieve a close-to-real interaction.
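The emotion-recognition step described above can be sketched in miniature. This is not the authors' system: the haptic features (mean contact pressure, contact duration), the emotion labels, and the training values below are all assumptions for illustration, and a real pipeline would learn from actual sensor streams. A simple nearest-centroid classifier shows the shape of the idea:

```python
# Illustrative sketch only: guessing the emotional intent of a touch
# gesture from two hypothetical haptic features. Feature names, labels,
# and numbers are invented for the example.

from statistics import mean

# Hypothetical training samples: (mean_pressure_kPa, contact_duration_s)
TRAINING = {
    "affection": [(1.2, 2.5), (1.0, 3.0), (1.4, 2.8)],
    "alarm":     [(4.5, 0.3), (5.0, 0.4), (4.2, 0.2)],
}

def centroid(samples):
    pressures, durations = zip(*samples)
    return (mean(pressures), mean(durations))

CENTROIDS = {label: centroid(s) for label, s in TRAINING.items()}

def classify_touch(pressure, duration):
    """Nearest-centroid guess at the emotion a touch gesture conveys."""
    def sq_dist(c):
        return (c[0] - pressure) ** 2 + (c[1] - duration) ** 2
    return min(CENTROIDS, key=lambda label: sq_dist(CENTROIDS[label]))

print(classify_touch(1.1, 2.7))  # soft, sustained contact -> "affection"
print(classify_touch(4.8, 0.3))  # sharp, brief contact    -> "alarm"
```

In a deployed system the classifier's output would drive the haptic rendering on the remote user's device, closing the loop the abstract describes.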


2021 ◽  
Author(s):  
Oya Celiktutan ◽  
Alexandra Livia Georgescu ◽  
Nicholas Cummins
Keyword(s):  

2021 ◽  
Author(s):  
Rahil Satyanarayan Vijay ◽  
Kumar Shubham ◽  
Laetitia Aurelie Renier ◽  
Emmanuelle P. Kleinlogel ◽  
Marianne Schmid Mast ◽  
...  

2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Philip Furley ◽  
Florian Klingner ◽  
Daniel Memmert

Abstract: The present research attempted to extend prior research showing that thin slices of pre-performance nonverbal behavior (NVB) of professional darts players give observers valid information about subsequent performance tendencies. Specifically, we investigated what kind of nonverbal cues were associated with success and informed thin-slice ratings. Participants (N = 61) were first asked to estimate the performance of a random sample of videos showing the preparatory NVB of professional darts players (N = 47) either performing well (470 clips) or poorly (470 clips). Preparatory NVB was assessed via preparation times and Active Appearance Modeling using Noldus FaceReader. Results showed that observers could distinguish between good and poor performance based on thin slices of preparatory NVB (p = 0.001, d = 0.87). Further analyses showed that facial expressions prior to poor performance showed more arousal (p = 0.011, ηp² = 0.10), sadness (p = 0.040, ηp² = 0.04), and anxiety (p = 0.009, ηp² = 0.09), and that preparation times were shorter (p = 0.001, ηp² = 0.36) prior to poor performance than prior to good performance. Lens model analyses showed preparation times (p = 0.001, rho = 0.18), neutral (p = 0.001, rho = 0.13), sad (rho = 0.12), and aroused facial expressions (p = 0.001, rho = 0.11) to be correlated with observers' performance ratings. Hence, preparation times and facial cues associated with a player's level of arousal, neutrality, and sadness seem to be valid nonverbal cues that observers utilize to infer information about subsequent perceptual-motor performance.
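For readers less familiar with the partial eta-squared (ηp²) values reported above, the statistic can be recovered from an F test: ηp² = (F · df₁) / (F · df₁ + df₂). The F value and degrees of freedom below are assumed for illustration, not taken from the paper; they are chosen only to show a combination that lands near the ηp² ≈ 0.10 magnitude reported for arousal expressions.

```python
# Partial eta-squared from an F statistic with df_effect and df_error
# degrees of freedom. Input values below are illustrative only.

def partial_eta_squared(f_stat, df_effect, df_error):
    """eta2_p = SS_effect / (SS_effect + SS_error), expressed via F."""
    return (f_stat * df_effect) / (f_stat * df_effect + df_error)

# A hypothetical F(1, 60) = 6.7 corresponds to eta2_p of about 0.10.
print(round(partial_eta_squared(6.7, 1, 60), 2))
```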


Author(s):  
Yi Lin ◽  
Hongwei Ding ◽  
Yang Zhang

Purpose: This study aimed to examine the Stroop effects of verbal and nonverbal cues and their relative impacts on gender differences in unisensory and multisensory emotion perception.

Method: Experiment 1 investigated how well 88 normal Chinese adults (43 women and 45 men) could identify emotions conveyed through face, prosody, and semantics as three independent channels. Experiments 2 and 3 further explored gender differences during multisensory integration of emotion through a cross-channel (prosody-semantics) and a cross-modal (face-prosody-semantics) Stroop task, respectively, in which 78 participants (41 women and 37 men) were asked to selectively attend to one of the two or three communication channels.

Results: The combined accuracy and reaction-time data indicated that paralinguistic cues (i.e., face and prosody) of emotions were consistently more salient than linguistic ones (i.e., semantics) throughout the study. Additionally, women demonstrated advantages in processing all three types of emotional signals in the unisensory task, but preserved their strengths only in paralinguistic processing and showed greater Stroop effects of nonverbal cues on verbal ones during multisensory perception.

Conclusions: These findings demonstrate clear gender differences in verbal and nonverbal emotion perception that are modulated by sensory channels, which have important theoretical and practical implications.

Supplemental Material: https://doi.org/10.23641/asha.16435599
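A cross-channel Stroop effect like the one measured here is typically quantified as the reaction-time cost when the unattended channel conflicts with the attended one. The trial values below are invented for illustration and the study's actual trial structure may differ; the sketch only shows the standard congruent-versus-incongruent contrast:

```python
# Sketch of a Stroop-effect computation: mean reaction-time (RT) cost
# of cross-channel conflict. All values are hypothetical.

from statistics import mean

# One participant attending to prosody; semantics either agrees with
# the prosodic emotion (congruent) or contradicts it (incongruent).
congruent_rt   = [612, 598, 630, 605]   # ms
incongruent_rt = [688, 702, 675, 690]   # ms

def stroop_effect(congruent, incongruent):
    """Positive values indicate interference from the unattended channel."""
    return mean(incongruent) - mean(congruent)

print(stroop_effect(congruent_rt, incongruent_rt))  # 77.5 ms of interference
```

Comparing the size of this cost across groups (e.g., women vs. men) and across conflict directions (nonverbal interfering with verbal vs. the reverse) is what supports the study's conclusions.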


Author(s):  
Maximilian Blomberg ◽  
Katja Schlegel ◽  
Linda Stoll ◽  
Hagen Febry ◽  
Wally Wünsch‐Leiteritz ◽  
...  

2021 ◽  
Author(s):  
Xiaoming Jiang

Communicative expression is a cross-species phenomenon. We investigated the perceptual attributes of social expressions encoded in human-like animal stickers commonly used as nonverbal communicative tools on social media (e.g., WeChat). One hundred and twenty animal stickers, varying across 12 categories of social expression (serving pragmatic or emotional functions), 5 kinds of animal (cats, dogs, ducks, rabbits, pigs), and 2 presented forms (real animal vs. cartoon animal), were shown to social media users, who were asked to rate the human likeness, cuteness, and expressiveness of each sticker and how well it matched its intended expression label. The data show that the kind of animal expected to best encode a certain expression is modulated by its presented form. The "cuteness" stereotype towards a certain kind of animal is sometimes violated as a function of the presented form. Moreover, users' gender, interpersonal sensitivity, and attitudes towards the ethical use of animals modulated various perceptual attributes. These findings highlight the factors underlying the decoding of social meanings in human-like animal stickers as nonverbal cues in virtual communication.

