Kinematically Admissible Editing of the Measured Sensor Motion Data for Virtual Reconstruction of Plausible Human Movements

Author(s):  
Adithya Balasubramanyam ◽  
Ashok Kumar Patil ◽  
Bharatesh Chakravarthi ◽  
Jaeyeong Ryu ◽  
Young Ho Chai
2017 ◽  
Vol 13 (2) ◽  
pp. 155014771769608 ◽  
Author(s):  
Yejin Kim

Dynamic human movements such as dance are difficult to capture without using external markers due to the high complexity of a dancer’s body. This article introduces a marker-free motion capture and composition system for dance motion that uses multiple RGB and depth sensors. Our motion capture system utilizes a set of high-speed RGB and depth sensors to generate skeletal motion data from an expert dancer. During the motion acquisition process, a skeleton tracking method based on a particle filter estimates the motion parameters for each frame from a sequence of color images and depth features retrieved from the sensors. The expert motion data are archived in a database. The authoring methods in our composition system automate most of the motion editing processes for general users by providing an online motion search with an input posture and then performing motion synthesis on an arbitrary motion path. Using the proposed system, we demonstrate that various dance performances can be composed in an intuitive and efficient way on client devices such as tablets and kiosk PCs.
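The particle-filter tracking step described in the abstract can be sketched generically: propose candidate poses, weight them against the current observation, and resample. This is a minimal sketch, not the authors' implementation; `observe_likelihood` is a placeholder for their (unspecified) color-image and depth-feature matching.

```python
import numpy as np

def particle_filter_step(particles, observe_likelihood, motion_noise=0.05, rng=None):
    """One predict-weight-resample step over candidate pose parameters.

    particles: (N, D) array; each row is a candidate pose (e.g. joint angles).
    observe_likelihood: scores a pose against the current frame's color/depth
    features (a placeholder for the paper's unspecified likelihood).
    Returns the resampled particles and the posterior-mean pose estimate.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    # Predict: diffuse each candidate pose with Gaussian motion noise.
    particles = particles + rng.normal(0.0, motion_noise, particles.shape)
    # Weight: likelihood of each candidate given the observation.
    weights = np.array([observe_likelihood(p) for p in particles])
    weights = weights / weights.sum()
    # Resample: keep poses in proportion to their weight.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    particles = particles[idx]
    # Estimate: mean of the resampled cloud is the pose for this frame.
    return particles, particles.mean(axis=0)
```

Run once per frame, the resampled cloud tracks the mode of the observation likelihood over time.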


2018 ◽  
Vol 7 (3.34) ◽  
pp. 521
Author(s):  
Yejin Kim

Background/Objectives: Human movements in dance are difficult to train without taking an actual class. In this paper, an interactive system of dance guidance is proposed to teach dance motions using examples. Methods/Statistical analysis: In the proposed system, a set of example motions is captured from experts through a method of marker-free motion capture that uses multiple Kinect cameras. The captured motions are calibrated and optimally reconstructed into a motion database. For the efficient exchange of motion data between a student and an instructor, a posture-based motion search and multi-mode views are provided for online lessons. Findings: To capture accurate example motions, the proposed system solves the joint-occlusion problem by using multiple Kinect cameras. An iterative closest point (ICP) method unifies the multiple camera data into the same coordinate system, which generates an output motion in real time. Compared to a commercial system, our system captures various dance motions with an average accuracy of over 85%, as shown in the experimental results. Using touch-screen devices, a student can browse a desired motion from the database to start a dance practice and send their own motion to an instructor for feedback. By conducting online dance lessons such as ballet, K-pop, and traditional Korean dance, our experimental results show that the participating students can improve their dance skills over a given period. Improvements/Applications: Our system is applicable to any student who wants to learn dance motions without taking an actual class and to receive online feedback from a distant instructor.
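The ICP step that unifies the multiple Kinect coordinate systems can be illustrated with a minimal rigid point-set alignment. The nearest-neighbour matching and Kabsch least-squares fit below are the textbook form of ICP; the article's real-time, calibrated variant is not reproduced here.

```python
import numpy as np

def icp_align(src, dst, iters=10):
    """Rigidly align point set src to dst (both (N, 3)) by iterating
    nearest-neighbour matching with a Kabsch least-squares fit.
    A stand-in for fusing data from two Kinect coordinate systems.
    Returns the accumulated rotation R, translation t, and aligned points.
    """
    R, t = np.eye(3), np.zeros(3)
    cur = src.copy()
    for _ in range(iters):
        # Correspondences: brute-force nearest neighbour in dst for each point.
        d = np.linalg.norm(cur[:, None, :] - dst[None, :, :], axis=2)
        matched = dst[d.argmin(axis=1)]
        # Kabsch: optimal rigid transform for the current correspondences.
        mu_s, mu_d = cur.mean(axis=0), matched.mean(axis=0)
        H = (cur - mu_s).T @ (matched - mu_d)
        U, _, Vt = np.linalg.svd(H)
        sign = np.sign(np.linalg.det(Vt.T @ U.T))
        R_step = Vt.T @ np.diag([1.0, 1.0, sign]) @ U.T
        t_step = mu_d - R_step @ mu_s
        # Apply the incremental transform and accumulate it.
        cur = cur @ R_step.T + t_step
        R, t = R_step @ R, R_step @ t + t_step
    return R, t, cur
```

With skeleton joints from a second camera as `src` and the reference camera's joints as `dst`, the recovered `(R, t)` maps every subsequent frame into the shared coordinate system.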


Author(s):  
Jibum Jung et al.

Development of wearable robots is accelerating. Walking robots mimic human behavior and must operate without accidents. Human motion data are needed to train these robots. We developed a system for extracting human motion data and displaying them graphically. We extracted motion data using a Perception Neuron motion capture system and used the Unity engine for the simulation. Several experiments were performed to demonstrate the accuracy of the extracted motion data. Of the various methods used to collect human motion data, markerless motion capture is highly inaccurate, while optical motion capture is very expensive, requiring several high-resolution cameras and a large number of markers. Motion capture using a magnetic field sensor is subject to environmental interference. Therefore, we used an inertial motion capture system. Each movement sequence involved four and was repeated 10 times. The data were stored and standardized. The motions of three individuals were compared to those of a reference person; the similarity exceeded 90% in all cases. Our rehabilitation robot accurately simulated human movements: individually tailored wearable robots could be designed based on our data. Safe and stable robot operation can be verified in advance via simulation. Walking stability can be increased using walking robots trained via machine-learning algorithms.
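As a rough illustration of how a similarity percentage between a subject's motion and a reference motion might be computed, the sketch below scores time-aligned joint-angle sequences on a 0-100 scale. The 180-degree full scale and the linear error-to-score mapping are assumptions made here for illustration; the article does not specify its metric.

```python
import numpy as np

def motion_similarity(ref, test):
    """Similarity (0-100) between two standardized motion clips.

    ref, test: (frames, joints) arrays of joint angles in degrees, assumed
    time-aligned and equal length after standardization. The 180-degree
    full scale and linear mapping are illustrative assumptions, not the
    article's published metric.
    """
    err = np.abs(ref - test)                  # per-frame, per-joint error
    score = 100.0 * (1.0 - err / 180.0)       # linear map: 0 error -> 100
    return float(np.clip(score, 0.0, 100.0).mean())
```

Identical clips score 100; a uniform 18-degree offset at every joint scores 90 under these assumptions.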


2011 ◽  
Vol 131 (3) ◽  
pp. 267-274 ◽  
Author(s):  
Noboru Tsunashima ◽  
Yuki Yokokura ◽  
Seiichiro Katsura

1988 ◽  
Author(s):  
Kenneth W. Campbell ◽  
Sylvester Theodore Algermissen
