From Movement to Action

Author(s):  
Peggy Mason

Tracts descending from motor control centers in the brainstem and cortex target motor interneurons and, in select cases, motoneurons. The mechanisms and constraints of postural control are elaborated, and the effect of body mass on posture is discussed. Feed-forward reflexes that maintain posture during standing and other conditions of self-motion are described, as is the role of descending tracts in postural control and in pathological posturing. Pyramidal (corticospinal and corticobulbar) and extrapyramidal control of body and face movements is contrasted. Special emphasis is placed on cortical regions and tracts involved in deliberate control of facial expression; these pathways are contrasted with mechanisms for generating emotional facial expressions. The signs associated with lesions of either motoneurons or motor control centers are clearly detailed. The mechanisms and presentation of cerebral palsy are described. Finally, an account of how pre-motor cortical regions generate actions is used to introduce apraxia, a disorder of action.

2012 ◽  
Vol 110 (1) ◽  
pp. 338-350 ◽  
Author(s):  
Mariano Chóliz ◽  
Enrique G. Fernández-Abascal

Recognition of emotional facial expressions is a central area in the psychology of emotion. This study presents two experiments. The first experiment analyzed recognition accuracy for the basic emotions of happiness, anger, fear, sadness, surprise, and disgust. Thirty pictures (five for each emotion) were displayed to 96 participants to assess recognition accuracy. The results showed that recognition accuracy varied significantly across emotions. The second experiment analyzed the effects of contextual information on recognition accuracy. Information either congruent or incongruent with a facial expression was displayed before presenting pictures of facial expressions. The results of the second experiment showed that congruent information improved facial expression recognition, whereas incongruent information impaired it.


2021 ◽  
pp. 174702182199299
Author(s):  
Mohamad El Haj ◽  
Emin Altintas ◽  
Ahmed A Moustafa ◽  
Abdel Halim Boudoukha

Future thinking, the ability to project oneself forward in time to pre-experience an event, is intimately associated with emotions. We investigated whether emotional future thinking can activate emotional facial expressions. We invited 43 participants to imagine future scenarios cued by the words “happy,” “sad,” and “city.” Future thinking was video recorded and analysed with facial analysis software to classify participants’ facial expressions (happy, sad, angry, surprised, scared, disgusted, or neutral) as neutral or emotional. Analysis demonstrated higher levels of happy facial expressions during future thinking cued by the word “happy” than by “sad” or “city.” In contrast, higher levels of sad facial expressions were observed during future thinking cued by the word “sad” than by “happy” or “city.” Higher levels of neutral facial expressions were observed during future thinking cued by the word “city” than by “happy” or “sad.” In all three conditions, levels of neutral facial expressions were high compared with happy and sad facial expressions. Together, these results suggest that emotional future thinking, at least for future scenarios cued by “happy” and “sad,” triggers the corresponding facial expression. Our study provides an original physiological window into the subjective emotional experience during future thinking.


Perception ◽  
2016 ◽  
Vol 46 (5) ◽  
pp. 624-631 ◽  
Author(s):  
Andreas M. Baranowski ◽  
H. Hecht

Almost a hundred years ago, the Russian filmmaker Lev Kuleshov conducted his now-famous editing experiment in which different objects were added to a given film scene featuring a neutral face. The audience reportedly interpreted the unchanged facial expression as a function of the added object (e.g., an added bowl of soup made the face express hunger). This interaction effect has been dubbed the “Kuleshov effect.” In the current study, we explored the role of sound in the evaluation of facial expressions in films. Thirty participants watched different clips of faces that were intercut with neutral scenes, featuring happy music, sad music, or no music at all. This was crossed with happy, sad, or neutral facial expressions. We found that the music significantly influenced participants’ emotional judgments of facial expression. Thus, the intersensory effects of music are more specific than previously thought: they alter the evaluation of film scenes and can give meaning to ambiguous situations.


2007 ◽  
Vol 38 (10) ◽  
pp. 1475-1483 ◽  
Author(s):  
K. S. Kendler ◽  
L. J. Halberstadt ◽  
F. Butera ◽  
J. Myers ◽  
T. Bouchard ◽  
...  

Background: While the role of genetic factors in self-report measures of emotion has been frequently studied, we know little about the degree to which genetic factors influence emotional facial expressions.

Method: Twenty-eight pairs of monozygotic (MZ) and dizygotic (DZ) twins from the Minnesota Study of Twins Reared Apart were shown three emotion-inducing films and their facial responses recorded. These recordings were blindly scored by trained raters. Ranked correlations between twins were calculated controlling for age and sex.

Results: Twin pairs were significantly correlated for facial expressions of general positive emotions, happiness, surprise, and anger, but not for general negative emotions, sadness, disgust, or average emotional intensity. MZ pairs (n = 18) were more correlated than DZ pairs (n = 10) for most but not all emotional expressions.

Conclusions: Since these twin pairs had minimal contact with each other prior to testing, these results support significant genetic effects on the facial display of at least some human emotions in response to standardized stimuli. The small sample size resulted in estimated twin correlations with very wide confidence intervals.


2012 ◽  
Vol 60 (4) ◽  
pp. 419-429 ◽  
Author(s):  
Brian A. Silvey

The purpose of this study was to explore whether conductor facial expression affected the expressivity ratings assigned to music excerpts by high school band students. Three actors were videotaped while portraying approving, neutral, and disapproving facial expressions. Each video was duplicated twice and then synchronized with one of three professional wind ensemble recordings. Participants ( N = 133) viewed nine 1-min videos of varying facial expressions, actors, and excerpts and rated each ensemble’s expressivity on a 10-point rating scale. Results of a one-way repeated measures ANOVA indicated that conductor facial expression significantly affected ratings of ensemble expressivity ( p < .001, partial η2 = .15). Post hoc comparisons revealed that participants’ ensemble expressivity ratings were significantly higher for excerpts featuring approving facial expressions than for either neutral or disapproving expressions. Participants’ mean ratings were lowest for neutral facial expression excerpts, indicating that an absence of facial affect influenced evaluations of ensemble expressivity most negatively.


2021 ◽  
Author(s):  
Julie Anne Séguin

Activation and attention have opposite effects on time perception. Emotion can both increase physiological activation (which leads to overestimation of time) and attract attention (which leads to underestimation of time). Although the effect of emotion on time perception has received a growing amount of attention, the use of different time estimation tasks and stimuli makes it difficult to compare findings across studies. The effect of emotion on the temporal perception of complex stimuli (e.g., scenes) is particularly under-researched. This thesis presents a systematic assessment of the effect of two key emotional dimensions, arousal and valence, on time perception for visual stimuli. Studies were designed to control for factors that may modulate emotion effects, such as image repetition and carry-over from one emotion to another. The stimuli were complex images standardized for arousal (high or low) and valence (positive or negative) as well as neutral images. The first study compared three time estimation tasks to determine which were sensitive to emotion effects. The selected task, temporal bisection, was used to test time perception in three duration ranges: short (400 to 1600 ms), middle (1000 to 4000 ms), and long (2000 to 6000 ms). Results of bisection point analyses revealed that the duration of attention-capturing stimuli (e.g., high-arousal or negative images) was underestimated compared with that of other stimuli (e.g., low-arousal or neutral images). These findings are at odds with activational effects of emotion (overestimation of emotional stimuli), which are typically found in studies of time perception for facial expressions. Better temporal sensitivity in the long range than in the short and middle ranges suggests that participants used different timing strategies to perform the bisection task at longer stimulus durations.
To test the effect of emotion on time perception using a discrete rather than dimensional classification of emotion, the experiments were replicated using emotional facial expressions as stimuli. Time estimates in the short and middle ranges did not show attentional effects, but pointed to activational effects of emotion. Facial expression had no impact on time perception in the long duration range. Taken together, these experiments show that the effect of emotion on time perception varies according to both duration and stimulus type. Emotional facial expressions have short-lived activational effects whereby the duration of arousing stimuli is overestimated, whereas complex emotional scenes have protracted attentional effects through which the duration of attention-capturing stimuli is underestimated.


2021 ◽  
Vol 14 (4) ◽  
pp. 4-22
Author(s):  
O.A. Korolkova ◽  
E.A. Lobodinskaya

In an experimental study, we explored how the natural or artificial character of an expression and the speed of its exposure affect the recognition of emotional facial expressions during stroboscopic presentation. In Series 1, participants identified emotions represented as sequences of frames from a video of a natural facial expression; in Series 2, participants were shown sequences of linear morph images. The exposure speed was varied. The results showed that at any exposure speed, the expressions of happiness and disgust were recognized most accurately. Longer presentation increased the accuracy of assessments of happiness, disgust, and surprise. Expression of surprise, presented as a linear transformation, was recognized more efficiently than frames of natural expression of surprise. Happiness was perceived more accurately in video frames. The accuracy of disgust recognition did not depend on the type of images. The qualitative nature of the stimuli and the speed of their presentation did not affect the accuracy of sadness recognition. The categorical structure of the perception of expressions was stable across both types of exposed images. The obtained results suggest a qualitative difference in the perception of natural and artificial images of expressions, which can be observed under extreme exposure conditions.


Author(s):  
Xia Fang ◽  
Disa Sauter ◽  
Marc Heerdink ◽  
Gerben van Kleef

There is a growing consensus that culture influences the perception of facial expressions of emotion. However, little is known about whether and how culture shapes the production of emotional facial expressions, and even less so about whether culture differentially shapes the production of posed versus spontaneous expressions. Drawing on prior work on cultural differences in emotional communication, we tested the prediction that people from the Netherlands (a historically heterogeneous culture where people are prone to low-context communication) produce facial expressions that are more distinct across emotions compared to people from China (a historically homogeneous culture where people are prone to high-context communication). Furthermore, we examined whether the degree of distinctiveness varies across posed and spontaneous expressions. Dutch and Chinese participants were instructed to either pose facial expressions of anger and disgust, or to share autobiographical events that elicited spontaneous expressions of anger or disgust. Using the complementary approaches of supervised machine learning and information-theoretic analysis of facial muscle movements, we show that posed and spontaneous facial expressions of anger and disgust were more distinct when produced by Dutch compared to Chinese participants. These findings shed new light on the role of culture in emotional communication by demonstrating, for the first time, effects on the distinctiveness of production of facial expressions.

