BACKGROUND
Emotions and mood are important for our overall well-being. The search for continuous, effortless methods of emotion prediction is, therefore, an important field of study. Mobile sensing offers a promising tool for this purpose, as it can capture one of the most telling signs of emotion: language.
OBJECTIVE
The aim of this study is to examine the separate and combined predictive value of mobile-sensed language data sources for detecting both momentary emotional experience and global individual differences in emotional traits and depression.
METHODS
In a two-week experience sampling method study, we collected self-report emotion ratings and voice recordings 10 times per day, continuous keyboard activity, and trait depression severity. We correlated state and trait emotions and depression with language features, distinguishing between speech content (spoken words), speech form (voice acoustics), writing content (written words), and writing form (typing dynamics). We also investigated how well these features predict state and trait emotions, using cross-validation to select features and a hold-out set for validation.
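The pipeline described above, selecting language features by cross-validated performance on training data and then evaluating the chosen features once on an untouched hold-out set, can be sketched as follows. This is a minimal illustration with synthetic stand-in data, not the study's actual code; the feature counts, the greedy forward-selection rule, and the ordinary-least-squares model are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic stand-in: 200 sampling moments, 10 language features,
# of which only the first 3 carry signal about the emotion rating.
n, p = 200, 10
X = rng.normal(size=(n, p))
y = X[:, :3] @ np.array([0.5, 0.3, 0.2]) + rng.normal(scale=1.0, size=n)

# Hold-out split: the last 25% of moments are reserved for final validation.
split = int(0.75 * n)
X_tr, y_tr, X_ho, y_ho = X[:split], y[:split], X[split:], y[split:]

def ols_r2(X_fit, y_fit, X_eval, y_eval):
    """Fit ordinary least squares (with intercept); return R^2 on eval data."""
    A = np.column_stack([np.ones(len(X_fit)), X_fit])
    beta, *_ = np.linalg.lstsq(A, y_fit, rcond=None)
    pred = np.column_stack([np.ones(len(X_eval)), X_eval]) @ beta
    ss_res = np.sum((y_eval - pred) ** 2)
    ss_tot = np.sum((y_eval - y_eval.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

def cv_score(features, k=5):
    """Mean R^2 of a feature subset across k cross-validation folds."""
    folds = np.array_split(np.arange(split), k)
    scores = []
    for fold in folds:
        mask = np.ones(split, dtype=bool)
        mask[fold] = False  # held-out fold
        scores.append(ols_r2(X_tr[mask][:, features], y_tr[mask],
                             X_tr[~mask][:, features], y_tr[~mask]))
    return float(np.mean(scores))

# Greedy forward selection driven only by cross-validated R^2 on training data.
selected, remaining, best = [], list(range(p)), -np.inf
while remaining:
    cand = {j: cv_score(selected + [j]) for j in remaining}
    j_best = max(cand, key=cand.get)
    if cand[j_best] <= best:
        break  # no candidate improves the CV score; stop
    best = cand[j_best]
    selected.append(j_best)
    remaining.remove(j_best)

# Final, one-shot evaluation of the selected features on the hold-out set.
holdout_r2 = ols_r2(X_tr[:, selected], y_tr, X_ho[:, selected], y_ho)
print(sorted(selected), round(holdout_r2, 3))
```

Because feature selection uses only the training folds, the hold-out R² is an honest estimate of out-of-sample predictive performance, which is the quantity reported in the results below.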
RESULTS
Overall, reported emotions and mobile-sensed language demonstrated weak correlations. The most significant correlations were found between state emotions and both speech content and speech form, reaching 0.25. Speech content yielded the best predictions of state emotions. None of the trait emotion-language correlations remained significant after correction. Among the emotions studied, valence and happiness displayed the most significant correlations and the highest predictive performance.
CONCLUSIONS
While using mobile-sensed language as an emotion marker shows some promise, the correlations and predictive R² values are low.