Continuous recognition of one-handed and two-handed gestures using 3D full-body motion tracking sensors

Author(s):  
Per Ola Kristensson ◽  
Thomas Nicholson ◽  
Aaron Quigley


Author(s):  
Simon Biggs

This paper discusses the immersive full-body motion tracking installation Dark Matter, developed by the author and completed in early 2016. The paper outlines the conceptual focus of the project, including the use of the metaphor of dark matter to explore questions around interactive systems and assemblage. The primary technical considerations involved in the project are also outlined. ‘Co-reading’ is proposed as a framework for a generative ontology, grounded in assemblage theory and deployed within a multimodal, multi-agent interactive system.


PLoS ONE ◽  
2021 ◽  
Vol 16 (1) ◽  
pp. e0245717
Author(s):  
Shlomi Haar ◽  
Guhan Sundar ◽  
A. Aldo Faisal

Motor-learning literature focuses on simple laboratory tasks because they are well controlled and manipulations that induce learning and adaptation are easy to apply. Recently, we introduced a billiards paradigm and demonstrated the feasibility of real-world neuroscience using wearables for naturalistic full-body motion tracking and mobile brain imaging. Here we developed an embodied virtual-reality (VR) environment for our real-world billiards paradigm, which allows the visual feedback of this complex real-world task to be controlled while maintaining a sense of embodiment. The setup was validated by comparing real-world ball trajectories with the trajectories of the virtual balls calculated by the physics engine. We then ran our short-term motor-learning protocol in the embodied VR. Subjects played billiard shots while holding the physical cue and hitting a physical ball on the table, seeing it all in VR. We found short-term motor-learning trends in the embodied VR comparable to those we previously reported in the physical real-world task. Embodied VR can be used for learning real-world tasks in a highly controlled environment, which allows visual manipulations common in laboratory tasks and rehabilitation to be applied to a real-world full-body task. Embodied VR makes it possible to manipulate feedback and apply perturbations that isolate and assess interactions between specific motor-learning components, and thereby to address current questions of motor learning in real-world tasks. Such a setup can potentially be used for rehabilitation, where VR is gaining popularity but transfer to the real world is currently limited, presumably due to a lack of embodiment.
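The validation step described above amounts to comparing each tracked real-ball trajectory against the physics-engine trajectory for the same shot. A minimal sketch of such a comparison, assuming 2D table coordinates and a simple RMSE metric (the abstract does not specify the authors' actual validation metric, so both the resampling and the metric here are assumptions):

```python
import numpy as np

def trajectory_rmse(real_xy, virtual_xy):
    """Root-mean-square deviation between a tracked real-ball trajectory
    and a physics-engine trajectory, after resampling both to a common
    number of samples along the path."""
    real_xy = np.asarray(real_xy, dtype=float)
    virtual_xy = np.asarray(virtual_xy, dtype=float)
    n = min(len(real_xy), len(virtual_xy))

    def resample(traj, n):
        # Linearly interpolate the trajectory at n evenly spaced indices.
        idx = np.linspace(0, len(traj) - 1, n)
        lo = np.floor(idx).astype(int)
        hi = np.ceil(idx).astype(int)
        frac = (idx - lo)[:, None]
        return traj[lo] * (1 - frac) + traj[hi] * frac

    r = resample(real_xy, n)
    v = resample(virtual_xy, n)
    # Mean squared Euclidean distance between corresponding samples.
    return np.sqrt(np.mean(np.sum((r - v) ** 2, axis=1)))
```

A small per-shot RMSE across many shots would indicate that the virtual balls faithfully reproduce the real-world ball behavior.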


Sensors ◽  
2021 ◽  
Vol 21 (6) ◽  
pp. 2108
Author(s):  
Maik Boltes ◽  
Juliane Adrian ◽  
Anna-Katharina Raytarowski

For our understanding of the dynamics inside crowds, reliable empirical data are needed; such data could improve safety and comfort for pedestrians and support the design of models that reflect the real dynamics. A well-calibrated camera system can extract absolute head positions with high accuracy. The inclusion of inertial sensors, or even self-contained full-body motion capturing systems, allows the relative tracking of occluded people or body parts and captures the locomotion of the whole body even in dense crowds. The newly introduced hybrid system maps the trajectory of the top of the head, coming from a full-body motion tracking system, onto the head trajectory of a camera system in global space. The fused data enable the analysis of possible correlations between all observables. In this paper we present an experiment in which people pass through a bottleneck and show, by example, the influence of bottleneck width and motivation on the overall movement, velocity, stepping locomotion and rotation of the pelvis. The hybrid tracking system opens up new possibilities for analyzing pedestrian dynamics inside crowds, such as the space requirement while passing through a bottleneck. The system allows any body motion to be linked to characteristics describing a person's situation inside the crowd, such as the density or the movements of other participants nearby.
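Mapping the motion-capture head trajectory onto the camera system's head trajectory in global space is, at its core, a trajectory-alignment problem. One standard approach (an assumption for illustration, not necessarily the authors' procedure) is a least-squares similarity transform, Umeyama-style, fitted on time-synchronized 2D point pairs:

```python
import numpy as np

def fit_similarity_2d(src, dst):
    """Least-squares similarity transform (scale, rotation, translation)
    mapping 2D points `src` onto `dst`, via SVD of the cross-covariance
    (Umeyama-style alignment)."""
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    s0, d0 = src - mu_s, dst - mu_d
    # Cross-covariance between centered destination and source points.
    H = d0.T @ s0 / len(src)
    U, S, Vt = np.linalg.svd(H)
    # Guard against a reflection in the recovered rotation.
    D = np.diag([1.0, np.sign(np.linalg.det(U @ Vt))])
    R = U @ D @ Vt
    scale = np.trace(np.diag(S) @ D) / s0.var(axis=0).sum()
    t = mu_d - scale * R @ mu_s
    return scale, R, t

def apply_transform(points, scale, R, t):
    """Map points from the source frame into the destination frame."""
    return scale * (np.asarray(points, dtype=float) @ R.T) + t
```

Fitting on overlapping head samples and then applying the transform to all motion-capture trajectories would place every tracked body part in the camera system's global frame, enabling the fused analysis described above.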


2020 ◽  
Author(s):  
Karl K. Kopiske ◽  
Daniel Koska ◽  
Thomas Baumann ◽  
Christian Maiwald ◽  
Wolfgang Einhäuser

Most humans can walk effortlessly across uniform terrain, even without paying much attention to it. Most natural terrain, however, is far from uniform, and we need visual information to maintain a stable gait. In a controlled yet naturalistic environment, we simulated terrain difficulty through slip-like perturbations that were either unpredictable (experiment 1) or sometimes preceded by visual cues (experiment 2), while recording eye and body movements using mobile eye tracking and full-body motion tracking. We quantified the distinct roles of eye and head movements in adjusting gaze on different time scales. While motor perturbations mainly influenced head movements, eye movements were primarily affected by visual cues, both immediately following slips and, to a lesser extent, over 5-minute blocks. Gaze parameters were already adapted after the first perturbation in each block, with little transfer between blocks. In conclusion, gaze-gait interactions in experimentally perturbed naturalistic walking are adaptive, flexible, and effector-specific.
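Separating the roles of eye and head movements rests on the identity that gaze-in-world is the sum of head-in-world orientation and eye-in-head rotation. A minimal illustration of splitting a gaze shift into effector contributions, assuming 1-D angular series in degrees (a hypothetical helper for exposition, not the authors' analysis code):

```python
import numpy as np

def effector_contributions(eye_in_head_deg, head_in_world_deg):
    """Split a gaze-in-world angle series into its eye and head components
    and report each effector's share of the total gaze displacement.
    Both inputs are 1-D angle series sampled at the same times (degrees)."""
    eye = np.asarray(eye_in_head_deg, dtype=float)
    head = np.asarray(head_in_world_deg, dtype=float)
    gaze = eye + head  # gaze-in-world = eye-in-head + head-in-world
    d_gaze = gaze[-1] - gaze[0]
    d_eye = eye[-1] - eye[0]
    d_head = head[-1] - head[0]
    # Fraction of the overall gaze shift carried by each effector.
    return {
        "gaze_shift": d_gaze,
        "eye_share": d_eye / d_gaze if d_gaze else np.nan,
        "head_share": d_head / d_gaze if d_gaze else np.nan,
    }
```

Computed over short windows around each slip versus whole 5-minute blocks, such shares make the effector-specific adaptation described above directly measurable.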

