Time perception in human movement: Effects of speed and agency on duration estimation

2020 ◽  
pp. 174702182097951
Author(s):  
Emma Allingham ◽  
David Hammerschmidt ◽  
Clemens Wöllner

While the effects of synthesised visual stimuli on time perception processes are well documented, very little research exists on time estimation for human movement stimuli. This study investigated the effects of movement speed and agency on duration estimation of human motion. Participants were recorded using optical motion capture while they performed dance-like movements at three different speeds. They later returned for a perceptual experiment in which they watched point-light displays of themselves and one other participant. Participants were asked to identify themselves, to estimate the duration of the recordings, and to rate the expressivity and quality of the movements. Results indicate that movement speed affected duration estimates such that faster movements were judged as longer, in accordance with previous findings for non-biological motion. The biasing effect of speed was stronger when watching others’ movements than when watching one’s own point-light movements. Duration estimates were longer after acting out a movement compared with watching it, and speed differentially affected ratings of expressivity and quality. The findings suggest that aspects of the temporal processing of visual stimuli may be modulated by inner motor representations of previously performed movements, and by physically carrying out an action compared with just watching it. The results also support the inner-clock and change theories of time perception for the processing of human motion stimuli, which can inform the temporal mechanisms of the hypothesised separate processor for human movement information.

2021 ◽  
Author(s):  
Julie Anne Séguin

Activation and attention have opposite effects on time perception. Emotion can both increase physiological activation (which leads to overestimation of time) and attract attention (which leads to underestimation of time). Although the effect of emotion on time perception has received a growing amount of attention, the use of different time estimation tasks and stimuli makes it difficult to compare findings across studies. The effect of emotion on the temporal perception of complex stimuli (e.g. scenes) is particularly under-researched. This thesis presents a systematic assessment of the effect of two key emotional dimensions, arousal and valence, on time perception for visual stimuli. Studies were designed to control for factors that may modulate emotion effects, such as image repetition and carry-over from one emotion to another. The stimuli were complex images standardized for arousal (high or low) and valence (positive or negative) as well as neutral images. The first study compared three time estimation tasks to determine which were sensitive to emotion effects. The selected task, temporal bisection, was used to test time perception in three duration ranges: short (400 to 1600 ms), middle (1000 to 4000 ms), and long (2000 to 6000 ms). Results of bisection point analyses revealed that the duration of attention-capturing stimuli (e.g. high-arousal or negative images) was underestimated compared to that of other stimuli (e.g. low-arousal or neutral images). These findings are at odds with activational effects of emotion (overestimation of emotional stimuli), which are typically found in studies of time perception for facial expression. Better temporal sensitivity in the long range than in the short and middle ranges suggests that participants used different timing strategies to perform the bisection task at longer stimulus durations.
To test the effect of emotion on time perception using a discrete rather than dimensional classification of emotion, the experiments were replicated using emotional facial expressions as stimuli. Time estimates in the short and middle ranges did not show attentional effects, but pointed to activational effects of emotion. Facial expression had no impact on time perception in the long duration range. Taken together, these experiments show that the effect of emotion on time perception varies according to both duration and stimulus type. Emotional facial expressions have short-lived activational effects whereby the duration of arousing stimuli is overestimated, whereas complex emotional scenes have protracted attentional effects through which the duration of attention-capturing stimuli is underestimated.
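The bisection-point measure this abstract relies on can be sketched in a few lines: the participant classifies each probe duration as closer to a trained "short" or "long" anchor, and the bisection point is the duration at which "long" responses reach 50%. The probe durations and response proportions below are invented for illustration, not the thesis data.

```python
# Hypothetical bisection-point estimate: the duration at which the
# proportion of "long" responses crosses 50%, found by linear
# interpolation between the two bracketing probe durations.

def bisection_point(durations, p_long):
    """Return the duration (ms) where p_long crosses 0.5."""
    for (d0, p0), (d1, p1) in zip(zip(durations, p_long),
                                  zip(durations[1:], p_long[1:])):
        if p0 <= 0.5 <= p1:
            return d0 + (0.5 - p0) * (d1 - d0) / (p1 - p0)
    raise ValueError("response proportions never cross 50%")

# Middle duration range (1000-4000 ms) with invented proportions:
durs = [1000, 1500, 2000, 2500, 3000, 3500, 4000]
props = [0.05, 0.15, 0.35, 0.55, 0.80, 0.92, 0.98]
bp = bisection_point(durs, props)  # ~2375 ms
```

A lower bisection point for one stimulus class than for another means its durations felt longer, i.e. were overestimated, relative to the other class.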


2007 ◽  
Vol 04 (02) ◽  
pp. 365-385 ◽  
Author(s):  
ODEST CHADWICKE JENKINS ◽  
GERMÁN GONZÁLEZ SERRANO ◽  
MATTHEW M. LOPER

There is currently a division between real-world human performance and the decision making of socially interactive robots. This circumstance is partially due to the difficulty of estimating human cues, such as pose and gesture, from robot sensing. Towards bridging this division, we present a method for kinematic pose estimation and action recognition from monocular robot vision through the use of dynamical human motion vocabularies. Our notion of a motion vocabulary comprises movement primitives that structure a human's action space for decision making and predict human movement dynamics. Through prediction, such primitives can be used both to generate motor commands for specific actions and to perceive humans performing those actions. In this paper, we focus specifically on the perception of human pose and performed actions using a known vocabulary of primitives. Given image observations over time, each primitive infers pose independently using its expected dynamics in the context of a particle filter. Pose estimates from a set of primitives inferring in parallel are arbitrated to estimate the action being performed. The efficacy of our approach is demonstrated through interactive-time pose and action recognition over extended motion trials. Results show that our approach requires small numbers of particles for tracking, is robust to unsegmented multi-action movement, variations in movement speed, and changes in camera viewpoint, and is able to recover from occlusions.
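The per-primitive inference described here can be illustrated in toy form: each primitive predicts its own dynamics, a particle filter per primitive tracks state under those dynamics, and arbitration picks the primitive whose particles best explain the observations. The 1-D "pose", constant-velocity primitives, and noise levels below are illustrative assumptions, not the authors' implementation.

```python
# Toy per-primitive particle filtering with arbitration. Each
# "primitive" is a hypothesized constant velocity for a 1-D pose;
# the primitive whose filter assigns the observations the highest
# average likelihood labels the action. All numbers are invented.
import math
import random

random.seed(0)

def run_filter(observations, velocity, n=200, obs_sigma=0.5):
    """Track a 1-D pose under one primitive's dynamics and return
    its average observation log-likelihood (the action score)."""
    particles = [random.gauss(0.0, 1.0) for _ in range(n)]
    avg_loglik = 0.0
    for z in observations:
        # Predict with the primitive's expected dynamics.
        particles = [p + velocity + random.gauss(0.0, 0.1) for p in particles]
        # Weight each particle by its agreement with the observed pose.
        weights = [math.exp(-(z - p) ** 2 / (2 * obs_sigma ** 2)) + 1e-300
                   for p in particles]
        avg_loglik += math.log(sum(weights) / n)
        # Resample in proportion to weight.
        particles = random.choices(particles, weights=weights, k=n)
    return avg_loglik / len(observations)

obs = [0.5 * t for t in range(10)]  # observed pose advancing at 0.5/step
primitives = {"slow": 0.1, "medium": 0.5, "fast": 1.0}
scores = {name: run_filter(obs, v) for name, v in primitives.items()}
recognized = max(scores, key=scores.get)  # arbitration step
```

The filter whose dynamics match the observed motion keeps its particles on track after each resampling step, so its likelihood dominates the mismatched primitives.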




2019 ◽  
Vol 38 (14) ◽  
pp. 1529-1537 ◽  
Author(s):  
Pauline Maurice ◽  
Adrien Malaisé ◽  
Clélie Amiot ◽  
Nicolas Paris ◽  
Guy-Junior Richard ◽  
...  

Improving work conditions in industry is a major challenge that can be addressed with new emerging technologies such as collaborative robots. Machine learning techniques can improve the performance of those robots by endowing them with a degree of awareness of the human state and ergonomic conditions. The availability of appropriate datasets to learn models and test prediction and control algorithms, however, remains an issue. This article presents a dataset of human motions in industry-like activities, fully labeled according to the ergonomics assessment worksheet EAWS, widely used in industries such as car manufacturing. Thirteen participants performed several series of activities, such as screwing and manipulating loads under different conditions, resulting in more than 5 hours of data. The dataset contains the participants’ whole-body kinematics recorded both with wearable inertial sensors and marker-based optical motion capture, finger pressure force, video recordings, and annotations by three independent annotators of the performed action and the adopted posture following the EAWS postural grid. Sensor data are available in different formats to facilitate their reuse. The dataset is intended for use by researchers developing algorithms for classifying, predicting, or evaluating human motion in industrial settings, as well as researchers developing collaborative robotics solutions that aim at improving workers’ ergonomics. The annotation of the whole dataset following an ergonomics standard makes it valuable for ergonomics-related applications, but we expect its use to be broader in the robotics, machine learning, and human movement communities.


Animals ◽  
2019 ◽  
Vol 9 (9) ◽  
pp. 661 ◽  
Author(s):  
Carla J. Eatherington ◽  
Lieta Marinelli ◽  
Miina Lõoke ◽  
Luca Battaglini ◽  
Paolo Mongillo

Visual perception remains an understudied area of dog cognition, particularly the perception of biological motion, where the small body of previous research has left an unclear picture of dogs’ visual preference for different types of point-light displays. To date, no thorough investigation has been conducted into which aspects of the motion contained in point-light displays attract dogs. To test this, pet dogs (N = 48) were presented with pairs of point-light displays with systematic manipulation of motion features (i.e., upright or inverted orientation, coherent or scrambled configuration, human or dog species). Results revealed a significant effect of inversion, with dogs directing significantly longer looking times towards upright than inverted dog point-light displays; no effect was found for scrambling or the scrambling-inversion interaction. No looking-time bias was found when dogs were presented with human point-light displays, regardless of their orientation or configuration. The results of the current study imply that dogs’ visual preference is driven by the motion of individual dots in accordance with gravity, rather than by the point-light display’s global arrangement, regardless of dogs’ long exposure to human motion.
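The two stimulus manipulations used here, inversion and scrambling, are simple geometric operations on point-light data: inversion mirrors each frame vertically, while scrambling displaces each dot's trajectory by a fixed random offset, destroying the global configuration but preserving each dot's local motion. A minimal sketch, with invented coordinates and display bounds:

```python
# Point-light display manipulations: vertical inversion and spatial
# scrambling. Frames are lists of (x, y) dot coordinates; the walker
# data and jitter range below are illustrative.
import random

def invert(frames, height=1.0):
    """Flip every (x, y) dot about the display's horizontal midline."""
    return [[(x, height - y) for (x, y) in frame] for frame in frames]

def scramble(frames, jitter=0.3, seed=1):
    """Give each dot a fixed random offset, destroying the global
    configuration while keeping each dot's own motion intact."""
    rng = random.Random(seed)
    n_dots = len(frames[0])
    offsets = [(rng.uniform(-jitter, jitter), rng.uniform(-jitter, jitter))
               for _ in range(n_dots)]
    return [[(x + dx, y + dy) for (x, y), (dx, dy) in zip(frame, offsets)]
            for frame in frames]

walker = [[(0.5, 0.9), (0.5, 0.5)], [(0.52, 0.9), (0.51, 0.5)]]  # 2 frames
upside_down = invert(walker)
scrambled = scramble(walker)
```

Because each scrambled dot keeps its own frame-to-frame displacement, local motion cues (such as acceleration consistent with gravity) survive scrambling, which is what makes the scrambling-inversion contrast diagnostic.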


2016 ◽  
Author(s):  
Jill Schmidt ◽  
Devin R. Berg

In the field of biomechanics, optical motion tracking systems are commonly used to record human motion and assist in surgical navigation. Recently, motion tracking systems have been used to track implant and bone motion on a micron-level. The present study evaluated four different Optotrak® motion tracking systems to determine the precision, repeatability and accuracy under static testing conditions. The distance between the camera systems and the rigid body, as well as the tilt angle of the rigid body, did affect the resulting precision, repeatability and accuracy of the camera systems. The precision and repeatability, calculated as the within-trial and between-trial standard deviations, respectively, were less than 30 µm; with some configurations producing precision and repeatability less than 1 µm. The accuracy was less than 0.53% of the total displacement for the in-plane motion and less than 1.56% of the total displacement for the out-of-plane motion.
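The precision and repeatability measures described here, within-trial and between-trial standard deviations, can be sketched directly. The marker readings below are invented; units are millimetres, so the study's 30 µm bound corresponds to 0.03 mm.

```python
# Precision as the (average) within-trial standard deviation of
# repeated static samples; repeatability as the standard deviation
# of the per-trial means, as the study describes. Readings are
# hypothetical static marker positions in mm.
from statistics import mean, stdev

trials = [
    [12.001, 12.003, 12.002, 12.004],   # trial 1 samples (mm)
    [12.005, 12.004, 12.006, 12.005],   # trial 2
    [12.002, 12.003, 12.001, 12.002],   # trial 3
]

precision = mean(stdev(t) for t in trials)        # within-trial SD
repeatability = stdev(mean(t) for t in trials)    # between-trial SD

# Accuracy in the study is reported as percent error over a known
# displacement, i.e. abs(measured - true) / true * 100.
```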


2020 ◽  
Author(s):  
G.V. Portnova ◽  
A. B. Rebreikina ◽  
O.V. Martynova

We aimed to investigate the ability of children aged 5–14 years old (preschoolers, primary schoolers, and preteens) to assess and anticipate time intervals. 287 Russian children aged 5–14 years old and a control group of 26 adults participated in our study. A neuropsychological assessment, the Wechsler Intelligence Scale for Children, and a battery of time-related tests were applied. All groups of children overestimated event durations, although the accuracy of the second estimations increased among participants aged 6–8 years after a prompt was offered. A zone of proximal development for the time anticipation task was detected for children aged 9–11 years, for whom the prompt could significantly improve the accuracy of time perception. The participants overestimated the duration of both upcoming and past events, with the degree of overestimation found to be negatively correlated with age. Further, higher accuracy in time estimation was correlated with higher scores on the attention and memory tests, and accuracy of time anticipation was associated with scores on the praxis test.


1999 ◽  
Vol 8 (2) ◽  
pp. 187-203 ◽  
Author(s):  
Tom Molet ◽  
Ronan Boulic ◽  
Daniel Thalmann

Motion-capture techniques are rarely based on orientation measurements, for two main reasons: (1) optical motion-capture systems are designed for tracking object position rather than orientation (which can be deduced from several trackers), and (2) known animation techniques, like inverse kinematics or geometric algorithms, require position targets constantly but orientation inputs only occasionally. We propose a complete human motion-capture technique based essentially on orientation measurements. The position measurement is used only for recovering the global position of the performer. This method allows fast tracking of human gestures for interactive applications as well as high-rate recording. Several motion-capture optimizations, including the multijoint technique, improve posture realism. This work is well suited to magnetic systems, which rely more on orientation registration than on position measurements; in our environment, the latter necessitate difficult system calibration.
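The core idea, posture recovered from orientation measurements with a single position measurement placing the performer globally, can be sketched as forward kinematics along a chain. The planar three-segment "arm", segment lengths, and angles below are illustrative assumptions, not the authors' body model.

```python
# Forward kinematics from orientation sensors: joint positions are
# accumulated from absolute segment orientations (world frame), and
# one measured root position anchors the whole chain globally.
import math

def chain_positions(root, segment_lengths, segment_angles):
    """Return joint positions along a planar chain given absolute
    segment orientations (radians) and a measured root position."""
    positions = [root]
    x, y = root
    for length, angle in zip(segment_lengths, segment_angles):
        x += length * math.cos(angle)
        y += length * math.sin(angle)
        positions.append((x, y))
    return positions

# Orientation trackers supply the angles; one position tracker
# supplies the root (illustrative values).
joints = chain_positions(root=(0.0, 1.0),
                         segment_lengths=[0.3, 0.25, 0.2],
                         segment_angles=[math.pi / 2, math.pi / 3, 0.0])
```

This is why position data is needed only once per body: segment orientations alone fix the posture, and the root measurement fixes where that posture sits in the world.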


Diagnostics ◽  
2020 ◽  
Vol 10 (6) ◽  
pp. 426
Author(s):  
I. Concepción Aranda-Valera ◽  
Antonio Cuesta-Vargas ◽  
Juan L. Garrido-Castro ◽  
Philip V. Gardiner ◽  
Clementina López-Medina ◽  
...  

Portable inertial measurement units (IMUs) are beginning to be used in human motion analysis. These devices can be useful for the evaluation of spinal mobility in individuals with axial spondyloarthritis (axSpA). The objectives of this study were to assess (a) concurrent criterion validity in individuals with axSpA by comparing spinal mobility measured by an IMU sensor-based system vs. optical motion capture as the reference standard; (b) discriminant validity comparing mobility with healthy volunteers; (c) construct validity by comparing mobility results with relevant outcome measures. A total of 70 participants with axSpA and 20 healthy controls were included. Individuals with axSpA completed function and activity questionnaires, and their mobility was measured using conventional metrology for axSpA, an optical motion capture system, and an IMU sensor-based system. The UCOASMI, a metrology index based on measures obtained by motion capture, and the IUCOASMI, the same index using IMU measures, were also calculated. Descriptive and inferential analyses were conducted to show the relationships between outcome measures. There was excellent agreement (ICC > 0.90) between both systems and a significant correlation between the IUCOASMI and conventional metrology (r = 0.91), activity (r = 0.40), function (r = 0.62), quality of life (r = 0.55) and structural change (r = 0.76). This study demonstrates the validity of an IMU system to evaluate spinal mobility in axSpA. These systems are more feasible than optical motion capture systems, and they could be useful in clinical practice.
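The agreement statistic quoted here (ICC > 0.90) is typically an intraclass correlation such as ICC(2,1): two-way random effects, absolute agreement, single measurement. A minimal sketch, with invented paired mobility scores rather than the study's data:

```python
# ICC(2,1) from the standard mean-squares decomposition: rows are
# subjects, columns are the two measurement systems. The optical
# and IMU scores below are invented for illustration.
def icc_2_1(data):
    """data: list of per-subject rows, one score per rater/system."""
    n, k = len(data), len(data[0])
    grand = sum(sum(row) for row in data) / (n * k)
    row_means = [sum(row) / k for row in data]
    col_means = [sum(data[i][j] for i in range(n)) / n for j in range(k)]
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    ss_total = sum((x - grand) ** 2 for row in data for x in row)
    ss_err = ss_total - ss_rows - ss_cols
    msr = ss_rows / (n - 1)                 # between-subjects
    msc = ss_cols / (k - 1)                 # between-systems
    mse = ss_err / ((n - 1) * (k - 1))      # residual
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical spinal mobility scores, optical vs. IMU system:
optical = [10.0, 20.0, 30.0, 40.0, 50.0]
imu = [10.5, 19.5, 30.2, 40.1, 49.8]
icc = icc_2_1(list(zip(optical, imu)))  # close to 1 for tight agreement
```

Unlike a plain correlation, the absolute-agreement ICC penalizes a systematic offset between the two systems, which is the relevant property when validating one device against a reference standard.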

