Vocal Affect
Recently Published Documents

TOTAL DOCUMENTS: 54 (five years: 7)
H-INDEX: 17 (five years: 0)

Author(s): Thomas I. Vaughan-Johnston, Joshua J. Guyer, Leandre R. Fabrigar, Charlie Shen

Abstract: Past research has largely focused on how emotional expressions provide information about the speaker's emotional state, but it has generally neglected vocal affect's influence on communication effectiveness. This is surprising given that other nonverbal behaviors often influence communication between individuals. In the present theory paper, we develop a novel perspective called the Contextual Influences of Vocal Affect (CIVA) model to predict and explain the psychological processes by which vocal affect may influence communication through three broad categories of process: emotion origin/construal, changing emotions, and communication source inferences. We describe research that explores potential moderators (e.g., affective/cognitive message types, message intensity) and mechanisms (e.g., emotional assimilation, attributions, surprise) shaping the effects of vocally expressed emotions on communication. We discuss when and why emotions expressed through the voice can influence the effectiveness of communication. CIVA advances theoretical and applied psychology by providing a clear theoretical account of vocal affect's diverse impacts on communication.


2021
Author(s): Hardik Kothare, Vikram Ramanarayanan, Oliver Roesler, Michael Neumann, Jackson Liscombe, ...

We explore the utility of an on-demand multimodal conversational platform for extracting speech and facial metrics in children with Autism Spectrum Disorder (ASD). We investigate the extent to which these metrics correlate with objective clinical measures, particularly as they pertain to the interplay between the affective, phonatory, and motoric subsystems. Twenty-two participants diagnosed with ASD engaged with a virtual agent in conversational affect-production tasks designed to elicit facial and vocal affect. We found significant correlations between vocal pitch and loudness extracted by our platform during these tasks and accuracy in the recognition of facial and vocal affect, assessed via the Diagnostic Analysis of Nonverbal Accuracy-2 (DANVA-2) neuropsychological task. We also found significant correlations between jaw kinematic metrics extracted using our platform and motor speed of the dominant hand assessed via a standardised neuropsychological finger-tapping task. These findings offer preliminary evidence for the usefulness of these audiovisual analytic metrics and could help us better model the interplay between different physiological subsystems in individuals with ASD.
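To make the kind of metric extraction described above concrete, here is a minimal Python sketch, not the study's actual platform: per-recording median pitch and mean loudness computed with librosa, then correlated with a clinical score via scipy. The file paths and DANVA-2 scores are hypothetical placeholders.

```python
# A minimal sketch, not the study's platform. `wav_paths` and
# `danva_scores` are hypothetical placeholders.
import librosa
import numpy as np
from scipy.stats import pearsonr

def utterance_metrics(wav_path):
    """Return (median f0 in Hz, mean RMS loudness) for one recording."""
    y, sr = librosa.load(wav_path, sr=16000)
    f0, voiced_flag, voiced_prob = librosa.pyin(
        y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C7"), sr=sr
    )
    # nanmedian ignores unvoiced frames, where pyin returns NaN
    return float(np.nanmedian(f0)), float(librosa.feature.rms(y=y).mean())

# Hypothetical inputs: one recording per participant plus DANVA-2 accuracy.
wav_paths = ["p01.wav", "p02.wav", "p03.wav", "p04.wav"]
danva_scores = np.array([14.0, 18.0, 11.0, 16.0])

pitch, loudness = zip(*(utterance_metrics(p) for p in wav_paths))
r, p_val = pearsonr(pitch, danva_scores)   # pitch vs. affect recognition
print(f"pitch-DANVA correlation: r={r:.2f}, p={p_val:.3f}")
```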


2021, Vol 89 (3), pp. 227-239
Author(s): Adar Paz, Eshkol Rafaeli, Eran Bar-Kalifa, Eva Gilboa-Schectman, Sharon Gannot, ...

2021, pp. 1-21
Author(s): Barbra Zupan, Leah Dunn, Susanne Hackney, Bahtiyorhon Shamshidinova

Abstract: The purpose of this review was to explore how vocal affect recognition deficits impact the psychosocial functioning of people with moderate to severe traumatic brain injury (TBI). A systematic review following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines was conducted: six databases were searched, supplemented by hand searching of key journals. The search identified 1847 records after duplicates were removed, and 1749 were excluded through title and abstract screening. After full-text screening of 65 peer-reviewed articles published between January 1999 and August 2019, only five met the inclusion criteria. The methodological quality of the selected studies was assessed using the Mixed Methods Appraisal Tool (MMAT) Version 2018, with a fair level of agreement reached. A narrative synthesis of the results explored vocal affect recognition and the psychosocial functioning of people with moderate to severe TBI, including aspects of social cognition (i.e., empathy; Theory of Mind) and social behaviour. The results of the review were limited by a paucity of research in this area, a lack of high-level evidence, and wide variation in the outcome measures used. More rigorous study designs are required to establish more conclusive evidence regarding the degree and direction of the association between vocal affect recognition and aspects of psychosocial functioning. This review is registered with PROSPERO.


2020, Vol 6 (1)
Author(s): Alex S. Cohen, Christopher R. Cox, Thanh P. Le, Tovah Cowan, Michael D. Masucci, ...

Abstract: Negative symptoms are a transdiagnostic feature of serious mental illness (SMI) that can potentially be "digitally phenotyped" using objective vocal analysis. In prior studies, vocal measures have shown low convergence with clinical ratings, potentially because analyses have used small, constrained acoustic feature sets. We sought to evaluate (1) whether clinically rated blunted vocal affect (BvA)/alogia could be accurately modelled using machine learning (ML) with a large feature set from two separate tasks (i.e., a 20-s "picture" task and a 60-s "free-recall" task), (2) whether "predicted" BvA/alogia scores (computed from the ML model) are associated with demographics, diagnosis, psychiatric symptoms, and cognitive/social functioning, and (3) which key vocal features are central to BvA/alogia ratings. Accuracy was high (>90%) and improved when models were computed separately by speaking task. ML scores were associated with poor cognitive performance and social functioning and were higher in patients with schizophrenia than in those with depression or mania diagnoses. However, the features identified as most predictive of BvA/alogia were generally not considered critical to their operational definitions. Implications for validating and implementing digital phenotyping to reduce SMI burden are discussed.
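The pipeline the abstract describes, a large acoustic feature set fed to an ML classifier of clinician-rated BvA/alogia, can be sketched as follows. This is an illustrative reconstruction, not the authors' model: it assumes the eGeMAPS functionals from the `opensmile` Python package and a scikit-learn random forest, with placeholder file paths and ratings.

```python
# An illustrative reconstruction, not the authors' pipeline.
# `wav_paths` and `bva_labels` are hypothetical placeholders.
import opensmile
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

smile = opensmile.Smile(
    feature_set=opensmile.FeatureSet.eGeMAPSv02,       # 88 functionals per clip
    feature_level=opensmile.FeatureLevel.Functionals,
)

wav_paths = ["s01.wav", "s02.wav", "s03.wav", "s04.wav"]  # one clip per speaker
bva_labels = [1, 0, 1, 0]                                 # 1 = rated as blunted

X = pd.concat([smile.process_file(p) for p in wav_paths])
clf = RandomForestClassifier(n_estimators=300, random_state=0)
print(cross_val_score(clf, X, bva_labels, cv=2).mean())   # mean fold accuracy

# Feature importances give a rough analogue of the paper's third question:
# which acoustic features are central to BvA/alogia predictions.
clf.fit(X, bva_labels)
print(pd.Series(clf.feature_importances_, index=X.columns).nlargest(5))
```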


2020, pp. 175407392093079
Author(s): Gregory A. Bryant

Vocal affect is a subcomponent of emotion programs that coordinate a variety of physiological and psychological systems. Emotional vocalizations comprise a suite of vocal behaviors shaped by evolution to solve adaptive social communication problems. The acoustic forms of vocal emotions are often explicable with reference to the communicative functions they serve. An adaptationist approach to vocal emotions requires that we distinguish between evolved signals and byproduct cues, and understand vocal affect as a collection of multiple strategic communicative systems subject to the evolutionary dynamics described by signaling theory. We should expect variability across disparate societies in vocal emotion according to culturally evolved pragmatic rules, and universals in vocal production and perception to the extent that form–function relationships are present.


2019, Vol 22 (1), pp. 15-34
Author(s): Daniela Hekiert, Magdalena Igras-Cybulska

People use their voices to communicate not only verbally but also emotionally. This article presents theories and methodologies concerning emotional vocalizations at the intersection of psychology and digital signal processing. Specifically, it examines the encoding (production) and decoding (recognition) of emotional sounds, including a review and comparison of strategies for database design, parameterization, and classification. Whereas psychology predominantly focuses on the subjective recognition of emotional vocalizations, digital signal processing relies on automated and thus more objective measures of vocal affect. The article compares these two approaches and suggests methods of combining them to achieve a more complete insight into the vocal communication of emotions.
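To make the decoding side concrete, here is a minimal sketch of an automated recognition pipeline of the sort the article surveys: each vocalization is parameterized as MFCC statistics, then classified into an emotion label. The labelled corpus (`clips`) is a hypothetical placeholder; real work would use an emotional-vocalization database.

```python
# A minimal sketch of an automated decoding (recognition) pipeline.
# The `clips` database is a hypothetical placeholder.
import librosa
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def parameterize(wav_path):
    """Fixed-length descriptor: per-coefficient MFCC means and std devs."""
    y, sr = librosa.load(wav_path, sr=16000)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)   # shape (13, frames)
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

# Hypothetical database of labelled emotional vocalizations.
clips = [("laugh_01.wav", "joy"), ("sob_01.wav", "sadness"),
         ("laugh_02.wav", "joy"), ("sob_02.wav", "sadness")]

X = np.stack([parameterize(path) for path, _ in clips])
labels = [label for _, label in clips]

model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
model.fit(X, labels)                 # "decoding": automated emotion recognition
print(model.predict(X[:1]))          # expected: ['joy']
```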


2018, Vol 236 (7), pp. 1911-1918
Author(s): Martijn Baart, Jean Vroomen

2018, Vol 74, pp. 161-173
Author(s): Joshua J. Guyer, Leandre R. Fabrigar, Thomas I. Vaughan-Johnston, Clement Tang
