Age-Invariance in the Asymmetry of Stimulus-Evoked Emotional Facial Muscle Activity

2000 ◽  
Vol 7 (3) ◽  
pp. 156-168 ◽  
Author(s):  
Sheryl L. Reminger ◽  
Alfred W. Kaszniak ◽  
Patricia R. Dalby

2021 ◽  
Vol 70 ◽  
pp. 1-10
Author(s):  
Sara Casaccia ◽  
Erik J. Sirevaag ◽  
Mark G. Frank ◽  
Joseph A. O'Sullivan ◽  
Lorenzo Scalise ◽  
...  

2019 ◽  
Vol 16 (6) ◽  
pp. 066029
Author(s):  
Gizem Yilmaz ◽  
Abdullah Salih Budan ◽  
Pekcan Ungan ◽  
Betilay Topkara ◽  
Kemal S Türker

2019 ◽  
Vol 9 (1) ◽  
Author(s):  
Alexandre C. Fernandes ◽  
Teresa Garcia-Marques

Abstract: Time perception relies on the motor system. It involves core brain regions of this system, including those associated with feelings generated from sensorimotor states. Perceptual timing is also distorted when movement occurs during timing tasks, possibly by interfering with sensorimotor afferent feedback. However, it is unknown whether the perception of time is an active process associated with specific patterns of muscle activity. We explored this idea based on the phenomenon of electromyographic gradients, which consists of the dynamic increase of muscle activity during cognitive tasks that require sustained attention, a critical function in perceptual timing. We aimed to determine whether dynamic facial muscle activity indexes the subjective representation of time. We asked participants to judge stimulus durations (varying in familiarity) while we monitored the time course of activity in the zygomaticus-major and corrugator-supercilii muscles, both associated with cognitive and affective feelings. The dynamic electromyographic activity in the corrugator-supercilii over time reflected objective time, and this relationship predicted subjective judgments of duration. Furthermore, the zygomaticus-major muscle signaled the bias that familiarity introduces in duration judgments. This suggests that subjective duration may be an embodied process grounded in motor information changing over time and its associated feelings.
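
A minimal sketch of how an electromyographic gradient of the kind described above could be quantified: rectify the raw EMG trace, smooth it into an envelope, and fit a linear slope over trial time. The sampling rate, window length, variable names, and synthetic data below are illustrative assumptions, not the authors' actual pipeline.

```python
# Minimal sketch (not the authors' pipeline): estimating an electromyographic
# gradient from a facial-EMG trace. The 1000 Hz sampling rate, the 100 ms
# smoothing window, and the synthetic data are assumptions for illustration.
import numpy as np

def emg_gradient(raw_emg, fs=1000, win_ms=100):
    """Rectify and smooth a raw EMG trace, then fit a linear slope
    (the 'gradient') of activity over the course of the trial."""
    rectified = np.abs(raw_emg - raw_emg.mean())            # remove DC offset, full-wave rectify
    win = int(fs * win_ms / 1000)
    kernel = np.ones(win) / win
    envelope = np.convolve(rectified, kernel, mode="same")  # moving-average envelope
    t = np.arange(envelope.size) / fs                       # time in seconds
    slope, _intercept = np.polyfit(t, envelope, 1)          # linear trend = gradient
    return slope

# Toy example: simulate corrugator activity whose amplitude ramps up over a 3 s stimulus.
rng = np.random.default_rng(0)
t = np.arange(0, 3, 1 / 1000)
simulated_emg = (0.5 + 0.3 * t) * rng.standard_normal(t.size)
print(f"estimated EMG gradient: {emg_gradient(simulated_emg):.3f} a.u./s")
```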


2020 ◽  
Vol 1 (4) ◽  
pp. 208-224
Author(s):  
Kornelia Gentsch ◽  
Ursula Beermann ◽  
Lingdan Wu ◽  
Stéphanie Trznadel ◽  
Klaus R. Scherer

Abstract: Appraisal theories suggest that valence appraisal should be differentiated into micro-valences, such as intrinsic pleasantness and goal-/need-related appraisals. In contrast to a macro-valence approach, this dissociation explains, among other things, the emergence of mixed or blended emotions. Here, we extend earlier research showing that these valence types can be empirically dissociated. We examine the timing and response patterns of these two micro-valences by measuring facial muscle activity changes (electromyography, EMG) over the brow and cheek regions. In addition, we explore the effects of sensory stimulus modality (vision, audition, and olfaction) on these patterns. The two micro-valences were manipulated in a social judgment task: first, intrinsic un/pleasantness (IP) was manipulated by exposing participants to appropriate stimuli in different sensory domains; this was followed by a goal conduciveness/obstruction (GC) manipulation consisting of feedback on participants' judgments that was congruent or incongruent with their task-related goal. The results show significantly different EMG responses and timing patterns for the two types of micro-valence, confirming the prediction that they are independent, consecutive parts of the appraisal process. Moreover, the lack of interaction effects with sensory stimulus modality suggests high generalizability of the underlying appraisal mechanisms across perception channels.
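
As one way to make the timing comparison concrete, the sketch below baseline-corrects simulated EMG epochs and estimates the peak latency of the trial-averaged response, which could then be compared between an intrinsic-pleasantness phase and a goal-conduciveness feedback phase. The sampling rate, epoch layout, function names, and toy data are assumptions and do not reproduce the authors' analysis.

```python
# Minimal sketch (illustrative only, not the authors' analysis): baseline-correcting
# EMG epochs and comparing response timing between two appraisal phases.
import numpy as np

def peak_latency(epochs, fs=1000, baseline_s=0.5):
    """epochs: (n_trials, n_samples) EMG envelopes, with the first `baseline_s`
    seconds preceding stimulus/feedback onset. Returns the latency (s, post-onset)
    of the peak of the trial-averaged response."""
    n_base = int(baseline_s * fs)
    corrected = epochs - epochs[:, :n_base].mean(axis=1, keepdims=True)  # baseline correction
    average = corrected.mean(axis=0)                                     # grand average over trials
    peak_idx = int(np.argmax(average[n_base:]))                          # peak after onset
    return peak_idx / fs

# Toy data: simulated response envelopes for an early (IP) and a late (GC) response.
rng = np.random.default_rng(1)
fs, n_trials, n_samples = 1000, 40, 3500
noise = rng.standard_normal((n_trials, n_samples)) * 0.1
early_bump = np.exp(-0.5 * ((np.arange(n_samples) - 1200) / 150) ** 2)  # peaks ~0.7 s post-onset
late_bump = np.exp(-0.5 * ((np.arange(n_samples) - 2500) / 200) ** 2)   # peaks ~2.0 s post-onset
print("IP-phase peak latency:", peak_latency(noise + early_bump), "s")
print("GC-phase peak latency:", peak_latency(noise + late_bump), "s")
```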


Author(s):  
Sridhar Arjunan ◽  
Dinesh Kant Kumar ◽  
Hans Weghorn ◽  
Ganesh Naik

The need for reliable and flexible human-computer interfaces is growing, and HCI applications now span virtually every field. Human factors play an important role in such interfaces, so research and development of new human-computer interaction (HCI) techniques that enhance flexibility and reliability for the user are important. Research on new methods of computer control has focused on three types of body function: speech, bioelectrical activity, and mechanical sensors. Speech-operated systems have the advantage of giving the user flexibility and have the potential to make computer control effortless and natural. This chapter summarizes research investigating the use of facial muscle activity as a reliable interface for identifying voiceless speech-based commands without any audio signal. System performance and reliability were tested to study inter-subject and inter-day variation and the impact of the speaker's native language. The experimental results indicate that such a system shows a high degree of inter-subject and inter-day variation. They also indicate that variation in speaking style is low when the speaker uses their native language but high when speaking a foreign language, and that such a system is suitable only for a very small vocabulary. The authors therefore suggest that facial sEMG-based speech recognition systems may find only limited applications.
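
To illustrate the kind of small-vocabulary classification such a system implies, the sketch below extracts per-channel RMS features from multi-channel facial sEMG segments and assigns commands with a simple nearest-centroid rule. All names, channel counts, and toy data are assumptions for illustration; the chapter's actual features and classifier are not described here.

```python
# Minimal sketch (an assumption-laden illustration, not the authors' system):
# classifying a small vocabulary of voiceless commands from multi-channel facial
# sEMG using per-channel RMS features and a nearest-centroid rule.
import numpy as np

def rms_features(trial):
    """trial: (n_channels, n_samples) sEMG segment -> one RMS value per channel."""
    return np.sqrt(np.mean(trial ** 2, axis=1))

def train_centroids(trials, labels):
    """Average the feature vectors of each command to form class centroids."""
    feats = np.array([rms_features(t) for t in trials])
    labels = np.array(labels)
    return {c: feats[labels == c].mean(axis=0) for c in set(labels)}

def classify(trial, centroids):
    """Assign the command whose centroid is nearest in feature space."""
    feats = rms_features(trial)
    return min(centroids, key=lambda c: np.linalg.norm(feats - centroids[c]))

# Toy demo with 3 hypothetical channels and a 2-word vocabulary ("up", "down").
rng = np.random.default_rng(2)
make = lambda scale: scale[:, None] * rng.standard_normal((3, 500))
train_trials = [make(np.array([1.0, 0.3, 0.3])) for _ in range(10)] + \
               [make(np.array([0.3, 1.0, 0.3])) for _ in range(10)]
train_labels = ["up"] * 10 + ["down"] * 10
centroids = train_centroids(train_trials, train_labels)
print(classify(make(np.array([1.0, 0.3, 0.3])), centroids))  # expected: "up"
```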

