DeepBBWAE-Net: A CNN-RNN Based Deep SuperLearner For Estimating Lower Extremity Sagittal Plane Joint Kinematics Using Shoe-Mounted IMU Sensors In Daily Living

2021 ◽  
Author(s):  
Md Sanzid Bin Hossain ◽  
Joseph Drantez ◽  
Hwan Choi ◽  
Zhishan Guo

Measurement of human body movement is an essential step in biomechanical analysis. The current standard for human motion capture uses infrared cameras to track reflective markers placed on the subject. While these systems can accurately track joint kinematics, the analyses are spatially limited to the lab environment. Although Inertial Measurement Units (IMUs) can eliminate the spatial limitations of motion capture, such systems are impractical for use in daily living because they require many sensors, typically one per body segment. To obtain practical and accurate estimates of joint kinematics, this study uses a reduced number of IMU sensors and employs a machine learning algorithm to map sensor data to joint angles. Our algorithm estimates hip, knee, and ankle angles in the sagittal plane from two shoe-mounted IMU sensors under practical walking conditions: treadmill, level overground, stair, and slope walking. Specifically, we propose five deep learning networks that combine Convolutional Neural Networks (CNNs) and Gated Recurrent Unit (GRU) based Recurrent Neural Networks (RNNs) as base learners for our framework. Building on these five base models, we propose a novel framework, DeepBBWAE-Net, that applies ensemble techniques such as bagging, boosting, and weighted averaging to improve kinematic predictions. DeepBBWAE-Net predicts the three joint angles under all walking conditions with a Root Mean Square Error (RMSE) 6.93-29.0% lower than the individual base models. This is the first study to use a reduced number of IMU sensors to estimate kinematics across multiple walking environments.
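As an illustration of the ensemble step described in this abstract, the sketch below shows one simple way to weight and average base-learner predictions by their validation RMSE. It is not the authors' implementation; the synthetic base-model outputs and the inverse-RMSE weighting are assumptions made only for the example.

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root Mean Square Error between reference and predicted joint angles."""
    return np.sqrt(np.mean((y_true - y_pred) ** 2))

def weighted_average_ensemble(preds, y_val):
    """Combine base-learner predictions with weights inversely proportional
    to each model's validation RMSE (a simple heuristic; the paper's exact
    weighting scheme may differ).

    preds : list of arrays, each (n_samples, n_angles), one per base model
    y_val : array (n_samples, n_angles) of reference joint angles
    """
    errors = np.array([rmse(y_val, p) for p in preds])
    weights = (1.0 / errors) / np.sum(1.0 / errors)   # lower RMSE -> higher weight
    combined = sum(w * p for w, p in zip(weights, preds))
    return combined, weights

# Synthetic example: three hypothetical base models predicting hip, knee, ankle angles
rng = np.random.default_rng(0)
truth = rng.normal(size=(1000, 3))
preds = [truth + rng.normal(scale=s, size=truth.shape) for s in (0.5, 0.8, 1.2)]
ensemble, w = weighted_average_ensemble(preds, truth)
print("base RMSEs:", [round(rmse(truth, p), 3) for p in preds])
print("ensemble  :", round(rmse(truth, ensemble), 3), "weights:", np.round(w, 2))
```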


Diagnostics ◽  
2020 ◽  
Vol 10 (6) ◽  
pp. 426
Author(s):  
I. Concepción Aranda-Valera ◽  
Antonio Cuesta-Vargas ◽  
Juan L. Garrido-Castro ◽  
Philip V. Gardiner ◽  
Clementina López-Medina ◽  
...  

Portable inertial measurement units (IMUs) are beginning to be used in human motion analysis. These devices can be useful for the evaluation of spinal mobility in individuals with axial spondyloarthritis (axSpA). The objectives of this study were to assess (a) concurrent criterion validity in individuals with axSpA, by comparing spinal mobility measured by an IMU sensor-based system with optical motion capture as the reference standard; (b) discriminant validity, by comparing mobility with that of healthy volunteers; and (c) construct validity, by comparing mobility results with relevant outcome measures. A total of 70 participants with axSpA and 20 healthy controls were included. Individuals with axSpA completed function and activity questionnaires, and their mobility was measured using conventional metrology for axSpA, an optical motion capture system, and an IMU sensor-based system. The UCOASMI, a metrology index based on measures obtained by motion capture, and the IUCOASMI, the same index using IMU measures, were also calculated. Descriptive and inferential analyses were conducted to show the relationships between outcome measures. There was excellent agreement (ICC > 0.90) between the two systems and a significant correlation between the IUCOASMI and conventional metrology (r = 0.91), activity (r = 0.40), function (r = 0.62), quality of life (r = 0.55) and structural change (r = 0.76). This study demonstrates the validity of an IMU-based system for evaluating spinal mobility in axSpA. These systems are more feasible than optical motion capture systems, and they could be useful in clinical practice.
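For readers unfamiliar with the agreement statistic quoted above, the sketch below computes a standard two-way, absolute-agreement, single-measure ICC(2,1) for paired measurements from two systems. The data and variable names are illustrative only; this is not the study's analysis code, and the authors' exact ICC model may differ.

```python
import numpy as np

def icc_2_1(data):
    """Two-way random, absolute-agreement, single-measure ICC(2,1).

    data : array (n_subjects, k_raters), e.g. one column per measurement
           system (IMU-based vs. optical motion capture).
    """
    data = np.asarray(data, dtype=float)
    n, k = data.shape
    grand = data.mean()
    row_means = data.mean(axis=1)
    col_means = data.mean(axis=0)

    ss_rows = k * np.sum((row_means - grand) ** 2)   # between subjects
    ss_cols = n * np.sum((col_means - grand) ** 2)   # between systems
    ss_total = np.sum((data - grand) ** 2)
    ss_err = ss_total - ss_rows - ss_cols            # residual

    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Illustrative paired mobility measures (degrees) from the two systems
imu     = np.array([31.0, 45.2, 52.1, 38.4, 60.3])
optical = np.array([30.5, 44.8, 53.0, 38.9, 59.7])
print("ICC(2,1) =", round(icc_2_1(np.column_stack([imu, optical])), 3))
```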


2019 ◽  
Vol 13 (4) ◽  
pp. 506-516 ◽  
Author(s):  
Tsubasa Maruyama ◽  
Mitsunori Tada ◽  
Haruki Toda

The measurement of human motion is an important aspect of ergonomic mobility design, in which the mobility product is evaluated based on human factors obtained by digital human (DH) technologies. The optical motion-capture (MoCap) system has been widely used for measuring human motion in laboratories. However, it is generally difficult to measure human motion on mobility products in real-world scenarios, e.g., riding a bicycle on an outdoor slope, owing to unstable lighting conditions and camera arrangements. On the other hand, the inertial-measurement-unit (IMU)-based MoCap system does not require any optical devices, offering the potential to measure riding motion even in outdoor environments. In general, however, the estimated motion is not necessarily accurate because of errors inherent in the IMU itself, such as drift and calibration errors, making it difficult to apply the IMU-based system directly to riding motion estimation. In this study, we develop a new riding MoCap system using IMUs. The proposed system estimates product and human riding motions by combining IMU orientations with contact constraints between the product and the DH, e.g., the DH hands in contact with the handles. The system is demonstrated with a bicycle ergometer, which includes handles, a seat, a backrest, and foot pedals, as in general mobility products. It is further validated by comparing the estimated joint angles and positions with those of optical MoCap for three subjects. The experiment reveals both the effectiveness and the limitations of the proposed system. The proposed system improves joint position estimation accuracy compared with a system using only IMUs, and angle estimation accuracy also improves for joints near the contact points. However, angle accuracy decreases for a few joints. This is explained by the fact that the proposed system modifies the orientations of all body segments to satisfy the contact constraints, even when the orientations of a few segments are already correct. It is also confirmed that the elapsed time of the proposed system is short enough for real-time application.
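The contact-constraint idea can be illustrated with a deliberately simplified sketch: forward kinematics driven by IMU orientations places a hand somewhere in space, and a known handle position is then used to correct the chain. The real system adjusts segment orientations over the whole body; the toy below only translates a planar two-segment chain, and all names and numbers are assumptions.

```python
import numpy as np

def forward_kinematics(root, segment_vectors, orientations):
    """Planar chain: each IMU orientation (2-D rotation here for simplicity)
    rotates its segment vector; joint positions accumulate from the root."""
    pos = np.array(root, dtype=float)
    joints = [pos.copy()]
    for v, theta in zip(segment_vectors, orientations):
        c, s = np.cos(theta), np.sin(theta)
        R = np.array([[c, -s], [s, c]])
        pos = pos + R @ v
        joints.append(pos.copy())
    return joints

# Toy arm: upper arm and forearm lengths (m), orientations from "IMUs" (rad)
segments = [np.array([0.30, 0.0]), np.array([0.25, 0.0])]
orientations = [0.6, 1.1]
joints = forward_kinematics([0.0, 0.0], segments, orientations)

# Contact constraint: the hand must coincide with the known handle position,
# so translate the whole chain by the residual (a crude root correction).
handle = np.array([0.20, 0.45])
correction = handle - joints[-1]
corrected = [p + correction for p in joints]
print("hand before:", np.round(joints[-1], 3), "after:", np.round(corrected[-1], 3))
```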


Sensors ◽  
2021 ◽  
Vol 21 (12) ◽  
pp. 4240
Author(s):  
Byong Hun Kim ◽  
Sung Hyun Hong ◽  
In Wook Oh ◽  
Yang Woo Lee ◽  
In Ho Kee ◽  
...  

Gait analysis has historically been performed only in laboratory settings with expensive instruments; recently, however, efforts have been made to develop and integrate wearable sensors into clinical applications. Only a limited number of previous studies have validated inertial measurement units (IMUs) for measuring ankle joint kinematics, especially over small movement ranges. Therefore, the purpose of this study was to validate the ability of available IMUs to accurately measure ankle joint angles by comparing the angles measured with a wearable device against those obtained with a motion capture system during running. Ten healthy subjects participated in the study. The intraclass correlation coefficient (ICC) and standard error of measurement were calculated for reliability, and the Pearson correlation coefficient was computed for validity. The results showed that day-to-day reliability was excellent (ICC = 0.974 and 0.900 for the sagittal and frontal planes, respectively), and validity was good in both the sagittal (r = 0.821, p < 0.001) and frontal (r = 0.835, p < 0.001) planes for the ankle joint. In conclusion, we suggest that the developed device could be used as an alternative to a 3D motion capture system for assessing ankle joint kinematics.
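The validity and reliability measures named above (Pearson correlation and the standard error of measurement) can be sketched as follows. The angle values are invented for illustration, and the SEM definition shown (SD * sqrt(1 - ICC)) is one common formulation that may differ from the study's exact computation.

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation between two measurement series."""
    return np.corrcoef(x, y)[0, 1]

def standard_error_of_measurement(scores, icc):
    """SEM = SD * sqrt(1 - ICC); the paper may use a variant
    (e.g. a pooled SD across measurement days)."""
    return np.std(scores, ddof=1) * np.sqrt(1.0 - icc)

# Illustrative sagittal-plane ankle angles (degrees) during running
wearable = np.array([12.1, 15.3, 9.8, 14.0, 11.5, 13.2])
mocap    = np.array([11.8, 15.9, 10.2, 13.6, 11.1, 13.7])
print("validity r      =", round(pearson_r(wearable, mocap), 3))
print("reliability SEM =", round(standard_error_of_measurement(wearable, icc=0.974), 2), "deg")
```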


2021 ◽  
Author(s):  
Mazen Al Borno ◽  
Johanna O'Day ◽  
Vanessa Ibarra ◽  
James Dunne ◽  
Ajay Seth ◽  
...  

Background: The ability to measure joint kinematics in natural environments over long durations using inertial measurement units (IMUs) could enable at-home monitoring and personalized treatment of neurological and musculoskeletal disorders. However, drift, or the accumulation of error over time, inhibits the accurate measurement of movement over long durations. We sought to develop an open-source workflow to estimate lower extremity joint kinematics from IMU data that was accurate and capable of assessing and mitigating drift. Methods: We computed IMU-based estimates of kinematics using sensor fusion and an inverse kinematics approach with a constrained biomechanical model. We measured kinematics for 11 subjects as they performed two 10-minute trials: walking and a repeated sequence of varied lower-extremity movements. To validate the approach, we compared the joint angles computed from IMU orientations to the joint angles computed from optical motion capture using root mean square (RMS) differences and Pearson correlations, and estimated drift using a linear regression on each subject's RMS differences over time. Results: IMU-based kinematic estimates agreed with optical motion capture; median RMS differences over all subjects and all minutes were between 3 and 6 degrees for all joint angles except hip rotation, and correlation coefficients were moderate to strong (r = 0.60 to 0.87). We observed minimal drift in the RMS differences over ten minutes; the average slopes of the linear fits to these data were near zero (-0.14 to 0.17 deg/min). Conclusions: Our workflow produced joint kinematics consistent with those estimated by optical motion capture and could mitigate kinematic drift even in trials of continuous walking without rest, obviating the need for the explicit sensor recalibration (e.g., sitting or standing still for a few seconds, or zero-velocity updates) used in current drift-mitigation approaches. This could enable long-duration measurements, bringing the field one step closer to estimating kinematics in natural environments.
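The drift assessment described above, a linear fit to per-minute RMS differences whose slope approximates drift in deg/min, can be sketched as follows. The RMS values are synthetic, and the sensor-fusion and inverse-kinematics workflow itself is not shown.

```python
import numpy as np

def rms_difference(imu_angles, mocap_angles):
    """RMS difference (degrees) between IMU- and mocap-derived joint angles."""
    return np.sqrt(np.mean((imu_angles - mocap_angles) ** 2))

def drift_slope(per_minute_rms):
    """Slope (deg/min) of a linear fit to per-minute RMS differences;
    a slope near zero indicates minimal drift over the trial."""
    minutes = np.arange(1, len(per_minute_rms) + 1)
    slope, intercept = np.polyfit(minutes, per_minute_rms, deg=1)
    return slope

# Synthetic example: ten minutes of RMS differences for one joint angle
rms_per_minute = np.array([4.1, 4.3, 4.0, 4.4, 4.2, 4.5, 4.3, 4.1, 4.4, 4.2])
print("drift slope:", round(drift_slope(rms_per_minute), 3), "deg/min")
```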


Author(s):  
Manuel Trinidad-Fernández ◽  
Antonio Cuesta-Vargas ◽  
Peter Vaes ◽  
David Beckwée ◽  
Francisco-Ángel Moreno ◽  
...  

A human motion capture system using an RGB-D camera could be a good option for understanding trunk limitations in spondyloarthritis. The aim of this study is to validate a human motion capture system using an RGB-D camera to analyse trunk movement limitations in spondyloarthritis patients. A cross-sectional study was performed in which spondyloarthritis patients were diagnosed by a rheumatologist. The RGB-D camera analysed the kinematics of each participant during seven functional tasks based on rheumatologic assessment. The OpenNI2 library collected the depth data, the NiTE2 middleware detected a virtual skeleton, and the MRPT library recorded the trunk positions. The gold standard was registered using an inertial measurement unit. The outcome variables were angular displacement, angular velocity and linear acceleration of the trunk. Criterion validity and reliability were calculated. Seventeen subjects (54.35 (11.75) years) were measured. The Bending task obtained moderate results for validity (r = 0.55–0.62) and successful results for reliability (ICC = 0.80–0.88), while the validity and reliability of the angular kinematic results in the Chair task were moderate (r = 0.60–0.74, ICC = 0.61–0.72). The kinematic results in the Timed Up and Go (TUG) test were less consistent. The RGB-D camera was shown to be a reliable tool for assessing movement limitations in spondyloarthritis depending on the functional task: the Bending task was validated, the Chair task needs further research, and the TUG analysis was not validated. Graphical abstract: comparison of both systems, the required software for camera analysis, the outcomes, and the final results of validity and reliability for each test.


Author(s):  
Kan Kanjanapas ◽  
Yizhou Wang ◽  
Wenlong Zhang ◽  
Lauren Whittingham ◽  
Masayoshi Tomizuka

A human motion capture system is becoming one of the most useful tools in rehabilitation applications because it can record and reconstruct a patient's motion accurately for motion analysis. In this paper, a human motion capture system based on inertial sensing is proposed. An on-board microprocessor obtains raw sensing data from the inertial measurement unit (IMU) and transmits the raw data to the central processing unit. To reject noise in the accelerometer, drift in the gyroscope, and magnetic distortion in the magnetometer, a time-varying complementary filter (TVCF) is implemented in the central processing unit to provide accurate attitude estimation. A forward kinematic model of the human arm is developed to create an animation for patients and physical therapists. The performance of the hardware and the filtering algorithm is verified by experimental results.
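The time-varying complementary filter is not specified in detail in this abstract; the sketch below shows a basic single-axis complementary filter whose blending gain could in principle be made time-varying (for example, trusting the accelerometer less when the measured acceleration departs from 1 g). The gain value and signal model are assumptions, not the authors' design.

```python
import numpy as np

def complementary_filter(gyro_rate, accel_angle, dt, alpha=0.98):
    """Single-axis attitude estimate blending integrated gyroscope rate
    (responsive but drift-prone) with accelerometer-derived tilt (drift-free
    but noisy). A time-varying version would adapt `alpha` at each step."""
    angle = accel_angle[0]
    estimates = []
    for omega, theta_acc in zip(gyro_rate, accel_angle):
        angle = alpha * (angle + omega * dt) + (1.0 - alpha) * theta_acc
        estimates.append(angle)
    return np.array(estimates)

# Synthetic 1 s of data at 100 Hz: slow tilt with gyro bias and accelerometer noise
dt, n = 0.01, 100
true_angle = np.linspace(0.0, 10.0, n)                           # degrees
gyro = np.gradient(true_angle, dt) + 0.5                         # deg/s with bias
accel = true_angle + np.random.default_rng(1).normal(0, 2.0, n)  # noisy tilt
est = complementary_filter(gyro, accel, dt)
print("final error:", round(abs(est[-1] - true_angle[-1]), 2), "deg")
```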


PeerJ ◽  
2019 ◽  
Vol 7 ◽  
pp. e6365 ◽  
Author(s):  
Kyle J. Boddy ◽  
Joseph A. Marsh ◽  
Alex Caravan ◽  
Kyle E. Lindley ◽  
John O. Scheffey ◽  
...  

Background: Improvements in data processing, increased understanding of the biomechanical background behind kinetics and kinematics, and technological advancements in inertial measurement unit (IMU) sensors have enabled high precision in the measurement of joint angles and acceleration on human subjects. This has resulted in new devices that reportedly measure joint angles, arm speed, and stresses to the pitching arms of baseball players. This study seeks to validate one such sensor, the MotusBASEBALL unit, against a marker-based motion capture laboratory. Hypothesis: We hypothesize that the joint angle measurements ("arm slot" and "shoulder rotation") of the MotusBASEBALL device will hold a statistically significant level of reliability and accuracy, but that the "arm speed" and "stress" metrics will not be accurate due to limitations in IMU technology. Methods: A total of 10 healthy subjects threw five to seven fastballs followed by five to seven breaking pitches (slider or curveball) in the motion capture lab, wearing retroreflective markers and the MotusBASEBALL sensor simultaneously. Results: The arm slot (R = 0.975, P < 0.001), shoulder rotation (R = 0.749, P < 0.001), and stress (R = 0.667, P = 0.001 when compared to elbow torque; R = 0.653, P = 0.002 when compared to shoulder torque) measurements were all significantly correlated with the results from the motion capture lab. Arm speed showed significant correlations to shoulder internal rotation speed (R = 0.668, P = 0.001) and shoulder velocity magnitude (R = 0.659, P = 0.002). For the entire sample, arm slot and shoulder rotation measurements were on a similar scale to the motion capture measurements (within 5–15% in absolute value), averaging eight degrees (12.9% relative difference) and nine degrees (5.4%) less, respectively. Arm speed had a much larger difference, averaging 3,745 deg/s (80.2%) lower than shoulder internal rotation velocity and 3,891 deg/s (80.8%) lower than shoulder velocity magnitude. The stress metric was 41 Newton meters (Nm; 38.7%) less when compared to elbow torque and 42 Nm (39.3%) less when compared to shoulder torque. Despite the differences in magnitude, the correlations were extremely strong, indicating that the MotusBASEBALL sensor has high reliability for casual use. Conclusion: This study attempts to validate the use of the MotusBASEBALL for future studies that examine the arm slot, shoulder rotation, arm speed, and stress measurements from the MotusBASEBALL sensor. Except for elbow extension velocity, all metrics from the MotusBASEBALL unit showed significant correlations to their corresponding metrics from motion capture; while some magnitudes differ substantially and therefore fall short in validity, the link between the metrics is strong enough to indicate reliable casual use. Further research should investigate the validity and reliability of the arm speed metric.


2013 ◽  
Vol 711 ◽  
pp. 500-505 ◽  
Author(s):  
Song Shan Wang ◽  
Yan Qing Qi

This article presents a method for driving a virtual human in the Jack software with motion capture data. First, Jack's skeleton model is simplified according to the skeleton of the captured BVH data. Second, an Euler angle rotation equation is set up to map joint angles between BVH and Jack. Finally, the method is implemented in a program and an example is given to show that it can improve Jack's human motion simulation using captured human motion data.
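The Euler angle rotation equation itself is not given in this abstract; the sketch below shows one common building block of such a mapping, composing a BVH-style Z-X-Y Euler rotation into a rotation matrix from which angles in another skeleton's convention (such as Jack's joint model) could be re-extracted. The rotation order and function names are assumptions.

```python
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def bvh_euler_to_matrix(z_deg, x_deg, y_deg):
    """Compose a rotation matrix from BVH-style Z-X-Y Euler angles.
    Joint angles for a different skeleton convention could then be
    re-extracted from this matrix in that skeleton's rotation order."""
    z, x, y = np.radians([z_deg, x_deg, y_deg])
    return rot_z(z) @ rot_x(x) @ rot_y(y)

# Example: one joint rotation read from a BVH frame
print(np.round(bvh_euler_to_matrix(5.0, 42.0, -3.0), 3))
```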

