affective content
Recently Published Documents

TOTAL DOCUMENTS: 142 (FIVE YEARS: 36)
H-INDEX: 20 (FIVE YEARS: 2)
2022 ◽  
pp. 002224292210749
Author(s):  
Filippo Dall'Olio ◽  
Demetrios Vakratsas

This study provides a comprehensive assessment of the impact of Advertising Creative Strategy (ACS) on advertising elasticity, founded on an integrative framework that distinguishes between the Function (content) and the Form (execution) of an advertising creative. Function is evaluated using a three-dimensional representation of content (Experience, Affect, Cognition), whereas the representation of Form accounts for both executional elements and the use of creative templates. The distinction between Function and Form allows for the investigation of potential synergies between content and execution, previously unaccounted for in the literature. The ACS framework also facilitates the calculation of composite metrics that capture holistic aspects of the creative strategy, such as Focus (the extent of emphasis on a specific content dimension) and Variation (changes in content and execution over time). The empirical application focuses on a Dynamic Linear Model analysis of 2251 television advertising creatives from 91 brands in 16 consumer packaged goods categories. The findings suggest that in terms of Function, experiential content has the largest effect on elasticity, followed by cognitive and affective content. Function and Form produce synergies that advertisers can leverage to increase returns. Finally, Focus, Variation, and the use of templates increase advertising elasticity.
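The Dynamic Linear Model underlying this kind of analysis can be illustrated with a minimal scalar Kalman filter that tracks a time-varying regression coefficient (an "elasticity" that drifts over time). This is a generic sketch, not the authors' model: the data, the random-walk state equation, and all variance parameters below are hypothetical.

```python
import numpy as np

def kalman_tv_coeff(y, x, q=0.01, r=0.01):
    """Track a time-varying coefficient beta_t in y_t = beta_t * x_t + noise,
    where beta_t follows a random walk with state variance q and the
    observation noise has variance r (a minimal scalar DLM)."""
    beta, p = 0.0, 1.0                      # initial state mean and variance
    betas = []
    for yt, xt in zip(y, x):
        p += q                              # predict: random-walk evolution
        k = p * xt / (xt * xt * p + r)      # Kalman gain
        beta += k * (yt - xt * beta)        # update with the new observation
        p *= (1.0 - k * xt)                 # posterior state variance
        betas.append(beta)
    return np.array(betas)

# Hypothetical weekly sales response to advertising pressure
rng = np.random.default_rng(0)
x = rng.uniform(1, 3, 100)
true_beta = np.linspace(0.5, 1.5, 100)      # elasticity drifts upward
y = true_beta * x + rng.normal(0, 0.1, 100)
est = kalman_tv_coeff(y, x)
```

Because the state equation lets the coefficient evolve, the filtered path `est` follows the upward drift in the true coefficient instead of averaging it away, which is the core advantage a DLM has over a static regression here.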


2021 ◽  
pp. 73-98
Author(s):  
Hyunjin Seo

This chapter offers a detailed analysis of the online and offline interactions and information exchanges involved in organizing the 2016–2017 candlelight vigils that contributed to the impeachment of President Park Geun-hye. Interactions between agents and affordances resulted in the nation’s first removal of a president through impeachment processes. Key agents—in particular, journalists, social media influencers, citizens, activists, news organizations, and civic organizations—interacted to produce, share, and amplify cognitive and affective content, resulting in massive citizen participation in candlelight vigils for 20 consecutive weeks. The chapter provides an in-depth analysis of these and related issues based on interviews with journalists, activists, citizens, government officials, and technology company representatives and experts. The interview data are triangulated using analyses of news reports and social media posts.


i-Perception ◽  
2021 ◽  
Vol 12 (6) ◽  
pp. 204166952110485
Author(s):  
Laura J. Speed ◽  
Ilja Croijmans ◽  
Sarah Dolscheid ◽  
Asifa Majid

People associate information with different senses but the mechanism by which this happens is unclear. Such associations are thought to arise from innate structural associations in the brain, statistical associations in the environment, via shared affective content, or through language. A developmental perspective on crossmodal associations can help determine which explanations are more likely for specific associations. Certain associations with pitch (e.g., pitch–height) have been observed early in infancy, but others may only occur late into childhood (e.g., pitch–size). In contrast, tactile–chroma associations have been observed in children, but not adults. One modality that has received little attention developmentally is olfaction. In the present investigation, we explored crossmodal associations from sound, tactile stimuli, and odor to a range of stimuli by testing a broad range of participants. Across the three modalities, we found little evidence for crossmodal associations in young children. This suggests an account based on innate structures is unlikely. Instead, the number and strength of associations increased over the lifespan. This suggests that experience plays a crucial role in crossmodal associations from sound, touch, and smell to other senses.


Author(s):  
Shanu Sharma ◽  
Ashwani Kumar Dubey ◽  
Priya Ranjan ◽  
Alvaro Rocha

2021 ◽  
Vol 11 (7) ◽  
pp. 942
Author(s):  
Antonio Maffei ◽  
Jennifer Goertzen ◽  
Fern Jaspers-Fayer ◽  
Killian Kleffner ◽  
Paola Sessa ◽  
...  

Behavioral and electrophysiological correlates of the influence of task demands on the processing of happy, sad, and fearful expressions were investigated in a within-subjects study that compared a perceptual distraction condition with task-irrelevant faces (i.e., a covert emotion task) to an emotion task-relevant categorization condition (i.e., an overt emotion task). A state-of-the-art non-parametric mass univariate analysis method was used to address the limitations of previous studies. Behaviorally, participants responded faster to overtly categorized happy faces and were slower and less accurate when categorizing sad and fearful faces; there were no behavioral differences in the covert task. Event-related potential (ERP) responses to the emotional expressions included the N170 (140–180 ms), which was enhanced by emotion irrespective of task, with happy and sad expressions eliciting greater amplitudes than neutral expressions. EPN (200–400 ms) amplitude was modulated by task, with greater voltages in the overt condition, and by emotion; however, there was no interaction of emotion and task. ERP activity was modulated by emotion as a function of task only at a late processing stage, which included the LPP (500–800 ms), with fearful and sad faces showing greater amplitude enhancements than happy faces. This study reveals that affective content does not necessarily require attention in the early stages of face processing, supporting recent evidence that the core and extended parts of the face processing system act in parallel, rather than serially. The role of voluntary attention starts at an intermediate stage, and fully modulates the response to emotional content in the final stage of processing.
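The non-parametric mass univariate logic referred to here can be sketched with a toy paired permutation test that controls for multiple comparisons via the max-statistic. This is a generic illustration, not the authors' pipeline: the "ERP amplitudes" below are simulated, and real analyses typically operate over channels and time points with cluster-based corrections.

```python
import numpy as np

def max_stat_permutation(cond_a, cond_b, n_perm=1000, seed=0):
    """Paired permutation test over channels with max-statistic correction:
    randomly flip the sign of each subject's condition difference and
    compare the observed t-statistics against the permutation
    distribution of the maximum absolute t across channels."""
    rng = np.random.default_rng(seed)
    diff = cond_a - cond_b                                  # subjects x channels
    n = diff.shape[0]
    obs = diff.mean(0) / (diff.std(0, ddof=1) / np.sqrt(n))
    max_null = np.empty(n_perm)
    for i in range(n_perm):
        signs = rng.choice([-1, 1], size=(n, 1))            # one flip per subject
        d = diff * signs
        t = d.mean(0) / (d.std(0, ddof=1) / np.sqrt(n))
        max_null[i] = np.abs(t).max()
    # Family-wise corrected p-value per channel
    pvals = (np.abs(obs)[None, :] <= max_null[:, None]).mean(0)
    return obs, pvals

# Simulated amplitudes: 20 subjects x 8 channels, true effect on channel 0 only
rng = np.random.default_rng(1)
a = rng.normal(0, 1, (20, 8)); a[:, 0] += 1.5
b = rng.normal(0, 1, (20, 8))
obs, pvals = max_stat_permutation(a, b)
```

Because each channel's statistic is compared against the null distribution of the maximum, the test needs no parametric assumptions and keeps the family-wise error rate controlled across all channels simultaneously.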


Author(s):  
Sanda Ismail ◽  
Emily Dodd ◽  
Gary Christopher ◽  
Tim Wildschut ◽  
Constantine Sedikides ◽  
...  

Although dementia may affect the reliability of autobiographical memories, the psychological properties of nostalgic memories may be preserved. We compared the content of nostalgic ( n = 36) and ordinary ( n = 31) narratives of 67 participants living with dementia. Narratives were rated according to their self-oriented, social, and existential properties, as well as their affective content. Social properties and affective content were assessed using a linguistic word count procedure. Compared to the ordinary narratives described in the control condition, nostalgic narratives described atypical events, expressed more positive affect, and had more expressions of self-esteem and self-continuity. They were also rated higher on companionship, connectedness, and the closeness of relationships, and reflected life as being meaningful. Despite their cognitive impairment, people living with dementia experience nostalgia in similar ways to cognitively healthy adults, with their nostalgic narratives containing self-oriented, social, and existential properties.


2021 ◽  
Vol 12 ◽  
Author(s):  
Alessandra Nicoletta Cruz Yu ◽  
Pierpaolo Iodice ◽  
Giovanni Pezzulo ◽  
Laura Barca

According to embodied theories, the processing of emotions such as happiness or fear is grounded in emotion-specific perceptual, bodily, and physiological processes. Under these views, perceiving an emotional stimulus (e.g., a fearful face) re-enacts interoceptive and bodily states congruent with that emotion (e.g., increases heart rate); and in turn, interoceptive and bodily changes (e.g., increases of heart rate) influence the processing of congruent emotional content. A previous study by Pezzulo et al. (2018) provided evidence for this embodied congruence, reporting that experimentally increasing heart rate with physical exercise facilitated the processing of facial expressions congruent with that interoception (fear), but not those conveying incongruent states (disgust or neutrality). Here, we investigated whether the above (bottom-up) interoceptive manipulation and the (top-down) priming of affective content may jointly influence the processing of happy and fearful faces. The fact that happiness and fear are both associated with high heart rate but have different (positive and negative) valence permits testing the hypothesis that their processing might be facilitated by the same interoceptive manipulation (the increase of heart rate) but two opposite (positive and negative) affective primes. To test this hypothesis, we asked participants to perform a gender-categorization task of happy, fearful, and neutral faces, which were preceded by positive, negative, and neutral primes. Participants performed the same task in two sessions (after rest, with normal heart rate, or exercise, with faster heart rate) and we recorded their response times and mouse movements during the choices. We replicated the finding that when participants were in the exercise condition, they processed fearful faces faster than when they were in the rest condition. However, we did not find the same reduction in response time for happy (or neutral) faces. 
Furthermore, we found that when participants were in the exercise condition, they processed fearful faces faster in the presence of negative compared to positive or neutral primes; but we found no equivalent facilitation of positive (or neutral) primes during the processing of happy (or neutral) faces. While the asymmetries between the processing of fearful and happy faces require further investigation, our findings promisingly indicate that the processing of fearful faces is jointly influenced by both bottom-up interoceptive states and top-down affective primes that are congruent with the emotion.


Author(s):  
Claudia Kawai ◽  
Gáspár Lukács ◽  
Ulrich Ansorge

We introduce the Bicolor Affective Silhouettes and Shapes (BASS): a set of 583 normed black-and-white silhouette images that is freely available via https://osf.io/anej6/. Valence and arousal ratings were obtained for each image from US residents as a Western population (n = 777) and Chinese residents as an Asian population (n = 869). Importantly, the ratings demonstrate that, notwithstanding their visual simplicity, the images represent a wide range of affective content (from very negative to very positive, and from very calm to very intense). In addition, speaking to their cultural neutrality, the valence ratings correlated very highly between the US and Chinese samples. Arousal ratings were less consistent between the two samples, with larger discrepancies in the older age groups inviting further investigation. Due to their simple and abstract nature, our silhouette images may be useful for intercultural studies, color and shape perception research, and online stimulus presentation in particular. We demonstrate the versatility of the BASS with an example online experiment.
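Cross-sample consistency of the kind reported for the BASS valence ratings is usually quantified as a Pearson correlation over per-image mean ratings. A minimal sketch follows; the rating vectors are hypothetical stand-ins, not the published BASS norms.

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation between two per-image rating vectors."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc)))

# Hypothetical per-image mean valence ratings for 583 images (e.g., a 1-9 scale):
# the second sample agrees closely with the first, plus some rating noise.
rng = np.random.default_rng(2)
us_valence = rng.uniform(1, 9, 583)
cn_valence = us_valence + rng.normal(0, 0.5, 583)
r = pearson_r(us_valence, cn_valence)
```

A high `r` on per-image means indicates that the two populations rank the images' valence similarly, which is the sense in which the abstract speaks of cultural neutrality.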


Author(s):  
Christoforos Christoforou ◽  
Maria Theodorou

Emotions affect our decisions, experiences, preferences, and perceptions. Understanding the neural underpinning of human emotions is a fundamental goal of neuroscience research. Moreover, EEG-based emotion recognition is a key component in the development of affective-aware intelligent systems. However, characterizing the neural basis of emotions elicited during video viewing has proven a challenging task. In this paper, we propose a novel machine-learning approach to isolate neural components in EEG signals that are informative of the affective content of emotionally-loaded videos. Based on these components, we define a set of neural metrics and evaluate them as potential indicators of the overall emotional content of each video. We demonstrate the predictive power of the proposed metrics on the DEAP benchmark dataset for EEG-based emotion recognition. Our results provide novel empirical evidence that the neural components extracted by our method can serve as an informative metric in EEG-based emotion recognition during video viewing, achieving a 4-fold increase in predictive power compared to traditional frequency-based metrics. Moreover, each extracted component is associated with a spatial and a temporal profile, allowing researchers to inspect and interpret the spatiotemporal origins of the underlying neural activity. Thus, our method provides a framework that facilitates the study of neural correlates of emotion during video viewing.
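The "traditional frequency-based metrics" used as a baseline in EEG emotion recognition typically reduce to band power (e.g., alpha, beta) computed per channel. A minimal periodogram-based sketch is below; the sampling rate, epoch, and 10 Hz "alpha" signal are hypothetical, and real pipelines usually use Welch averaging over windowed segments rather than a single full-epoch FFT.

```python
import numpy as np

def band_power(signal, fs, f_lo, f_hi):
    """Mean periodogram power of `signal` within [f_lo, f_hi] Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= f_lo) & (freqs <= f_hi)
    return psd[mask].mean()

# Hypothetical 2-second EEG epoch at 256 Hz dominated by a 10 Hz alpha rhythm
fs = 256
t = np.arange(512) / fs
rng = np.random.default_rng(3)
eeg = np.sin(2 * np.pi * 10 * t) + 0.2 * rng.normal(size=512)

alpha_power = band_power(eeg, fs, 8, 13)    # alpha band
beta_power = band_power(eeg, fs, 13, 30)    # beta band
```

Features like these, concatenated across channels and bands, form the frequency-based baseline against which component-based metrics such as those proposed here are compared.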

