An open-source and wearable system for measuring 3D human motion in real-time

2021 ◽  
Author(s):  
Patrick Slade ◽  
Ayman Habib ◽  
Jennifer L. Hicks ◽  
Scott L. Delp

Analyzing human motion is essential for diagnosing movement disorders and guiding rehabilitation interventions for conditions such as osteoarthritis, stroke, and Parkinson’s disease. Optical motion capture systems are the current standard for estimating kinematics but require expensive equipment located in a predefined space. While wearable sensor systems can estimate kinematics in any environment, existing systems are generally less accurate than optical motion capture. Further, many wearable sensor systems require a computer in close proximity and rely on proprietary software, making it difficult for researchers to reproduce experimental findings. Here, we present OpenSenseRT, an open-source and wearable system that estimates upper and lower extremity kinematics in real time by using inertial measurement units and a portable microcontroller. We compared the OpenSenseRT system to optical motion capture and found an average RMSE of 4.4 degrees across 5 lower-limb joint angles during three minutes of walking (n = 5) and an average RMSE of 5.6 degrees across 8 upper extremity joint angles during a Fugl-Meyer task (n = 5). The open-source software and hardware are scalable, tracking between 1 and 14 body segments, with one sensor per segment. Kinematics are estimated in real time using a musculoskeletal model and inverse kinematics solver. The computation frequency depends on the number of tracked segments, but is sufficient for real-time measurement for many tasks of interest; for example, the system can track up to 7 segments at 30 Hz in real time. The system uses off-the-shelf parts costing approximately $100 USD plus $20 for each tracked segment. The OpenSenseRT system is accurate, low-cost, and simple to replicate, enabling movement analysis in labs, clinics, homes, and free-living settings.
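The RMSE comparison against optical motion capture described above can be sketched as follows; this is a minimal illustration with synthetic joint-angle traces (the actual OpenSenseRT pipeline uses a musculoskeletal model and an inverse kinematics solver, not shown here):

```python
import numpy as np

def joint_angle_rmse(imu_angles, mocap_angles):
    """Root-mean-square error (degrees) between two joint-angle traces."""
    imu = np.asarray(imu_angles, dtype=float)
    ref = np.asarray(mocap_angles, dtype=float)
    return float(np.sqrt(np.mean((imu - ref) ** 2)))

# Hypothetical example: a constant 3-degree offset yields an RMSE of exactly 3.
t = np.linspace(0, 2 * np.pi, 100)
reference = 20 * np.sin(t)   # optical motion capture trace
estimate = reference + 3.0   # IMU estimate with a constant bias
print(joint_angle_rmse(estimate, reference))  # → 3.0
```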

2021 ◽  
Author(s):  
Mazen Al Borno ◽  
Johanna O'Day ◽  
Vanessa Ibarra ◽  
James Dunne ◽  
Ajay Seth ◽  
...  

Background: The ability to measure joint kinematics in natural environments over long durations using inertial measurement units (IMUs) could enable at-home monitoring and personalized treatment of neurological and musculoskeletal disorders. However, drift, or the accumulation of error over time, inhibits the accurate measurement of movement over long durations. We sought to develop an open-source workflow to estimate lower extremity joint kinematics from IMU data that was accurate and capable of assessing and mitigating drift. Methods: We computed IMU-based estimates of kinematics using sensor fusion and an inverse kinematics approach with a constrained biomechanical model. We measured kinematics for 11 subjects as they performed two 10-minute trials: walking and a repeated sequence of varied lower-extremity movements. To validate the approach, we compared the joint angles computed with IMU orientations to the joint angles computed from optical motion capture using root mean square (RMS) difference and Pearson correlations, and estimated drift using a linear regression on each subject's RMS differences over time. Results: IMU-based kinematic estimates agreed with optical motion capture; median RMS differences over all subjects and all minutes were between 3-6 degrees for all joint angles except hip rotation, and correlation coefficients were moderate to strong (r = 0.60 to 0.87). We observed minimal drift in the RMS differences over ten minutes; the average slopes of the linear fits to these data were near zero (-0.14 to 0.17 deg/min). Conclusions: Our workflow produced joint kinematics consistent with those estimated by optical motion capture, and could mitigate kinematic drift even in trials of continuous walking without rest, obviating the need for the explicit sensor recalibration (e.g., sitting or standing still for a few seconds, or zero-velocity updates) used in current drift-mitigation approaches. This could enable long-duration measurements, bringing the field one step closer to estimating kinematics in natural environments.
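The drift assessment described above (a linear fit to per-minute RMS differences, with a near-zero slope indicating no accumulating error) can be sketched as follows, using hypothetical per-minute values:

```python
import numpy as np

def drift_slope(rms_per_minute):
    """Slope (deg/min) of a linear fit to per-minute RMS differences.

    A slope near zero indicates that error is not accumulating over time."""
    minutes = np.arange(len(rms_per_minute), dtype=float)
    slope, _intercept = np.polyfit(minutes, np.asarray(rms_per_minute, float), 1)
    return float(slope)

# Hypothetical ten-minute trial: the RMS difference hovers around 4 degrees
# with no upward trend, so the fitted slope is close to zero.
rms = [4.1, 3.9, 4.0, 4.2, 3.8, 4.0, 4.1, 3.9, 4.0, 4.0]
print(round(drift_slope(rms), 3))
```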


1999 ◽  
Vol 8 (2) ◽  
pp. 187-203 ◽  
Author(s):  
Tom Molet ◽  
Ronan Boulic ◽  
Daniel Thalmann

Motion-capture techniques are rarely based on orientation measurements, for two main reasons: (1) optical motion-capture systems are designed for tracking object positions rather than their orientations (which must be deduced from several trackers), and (2) well-known animation techniques, such as inverse kinematics or geometric algorithms, require position targets constantly but orientation inputs only occasionally. We propose a complete human motion-capture technique based essentially on orientation measurements. Position measurement is used only to recover the global position of the performer. This method allows fast tracking of human gestures for interactive applications as well as high-rate recording. Several motion-capture optimizations, including the multijoint technique, improve posture realism. This work is well suited to magnetic-based systems, which (in our environment) rely more on orientation registration than on position measurements that necessitate difficult system calibration.
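The core operation in an orientation-driven capture pipeline of this kind is expressing each body segment's sensed orientation relative to its parent segment. A minimal quaternion sketch (hypothetical sensor frames; a real system would also handle sensor-to-segment calibration):

```python
import numpy as np

def q_mul(a, b):
    """Hamilton product of quaternions stored as [w, x, y, z]."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def q_conj(q):
    """Conjugate (inverse for unit quaternions)."""
    return np.array([q[0], -q[1], -q[2], -q[3]])

def joint_orientation(q_parent, q_child):
    """Rotation of the child segment expressed in the parent segment's frame."""
    return q_mul(q_conj(q_parent), q_child)

# Hypothetical trackers: parent segment at identity, child rotated 90° about z.
identity = np.array([1.0, 0.0, 0.0, 0.0])
half = np.deg2rad(90) / 2
child = np.array([np.cos(half), 0.0, 0.0, np.sin(half)])
q = joint_orientation(identity, child)
print(round(np.degrees(2 * np.arccos(q[0])), 1))  # joint rotation → 90.0
```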


Diagnostics ◽  
2020 ◽  
Vol 10 (6) ◽  
pp. 426
Author(s):  
I. Concepción Aranda-Valera ◽  
Antonio Cuesta-Vargas ◽  
Juan L. Garrido-Castro ◽  
Philip V. Gardiner ◽  
Clementina López-Medina ◽  
...  

Portable inertial measurement units (IMUs) are beginning to be used in human motion analysis. These devices can be useful for the evaluation of spinal mobility in individuals with axial spondyloarthritis (axSpA). The objectives of this study were to assess (a) concurrent criterion validity in individuals with axSpA by comparing spinal mobility measured by an IMU sensor-based system vs. optical motion capture as the reference standard; (b) discriminant validity comparing mobility with healthy volunteers; (c) construct validity by comparing mobility results with relevant outcome measures. A total of 70 participants with axSpA and 20 healthy controls were included. Individuals with axSpA completed function and activity questionnaires, and their mobility was measured using conventional metrology for axSpA, an optical motion capture system, and an IMU sensor-based system. The UCOASMI, a metrology index based on measures obtained by motion capture, and the IUCOASMI, the same index using IMU measures, were also calculated. Descriptive and inferential analyses were conducted to show the relationships between outcome measures. There was excellent agreement (ICC > 0.90) between both systems and a significant correlation between the IUCOASMI and conventional metrology (r = 0.91), activity (r = 0.40), function (r = 0.62), quality of life (r = 0.55) and structural change (r = 0.76). This study demonstrates the validity of an IMU system to evaluate spinal mobility in axSpA. These systems are more feasible than optical motion capture systems, and they could be useful in clinical practice.
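The construct-validity correlations reported above (e.g., r = 0.91 between the IMU-based index and conventional metrology) are plain Pearson coefficients between paired outcome measures. A minimal sketch with hypothetical paired scores:

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two paired outcome measures."""
    return float(np.corrcoef(x, y)[0, 1])

# Hypothetical paired scores: an IMU-based mobility index vs. conventional
# metrology for the same individuals (values are illustrative only).
imu_index = np.array([2.1, 3.4, 1.8, 4.0, 2.9, 3.7])
conventional = np.array([2.0, 3.6, 1.9, 4.2, 2.7, 3.9])
r = pearson_r(imu_index, conventional)
print(r > 0.9)  # near-identical measures correlate strongly → True
```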


Author(s):  
David Lunardi Flam ◽  
Daniel Pacheco de Queiroz ◽  
Thatyene Louise Alves de Souza Ramos ◽  
Arnaldo de Albuquerque Araujo ◽  
Joao Victor Boechat Gomide

Sensors ◽  
2020 ◽  
Vol 20 (23) ◽  
pp. 6933
Author(s):  
Georgios Giarmatzis ◽  
Evangelia I. Zacharaki ◽  
Konstantinos Moustakas

Conventional biomechanical modelling approaches involve the solution of large systems of equations that encode the complex mathematical representation of human motion and skeletal structure. To improve stability and computational speed, a common bottleneck in current approaches, we apply machine learning to train surrogate models that predict, in near real time, previously calculated medial and lateral knee contact forces (KCFs) of 54 young and elderly participants during treadmill walking in a speed range of 3 to 7 km/h. Predictions are obtained by fusing optical motion capture and musculoskeletal-modeling-derived kinematic and force variables into regression models using artificial neural networks (ANNs) and support vector regression (SVR). Training schemes included either data from all subjects (LeaveTrialsOut) or only from a portion of them (LeaveSubjectsOut), in combination with inclusion or omission of ground reaction forces (GRFs) in the dataset. Results identify ANNs as the best-performing predictor of KCFs, both in terms of Pearson R (0.89–0.98 for LeaveTrialsOut and 0.45–0.85 for LeaveSubjectsOut) and percentage normalized root mean square error (0.67–2.35 for LeaveTrialsOut and 1.6–5.39 for LeaveSubjectsOut). When GRFs were omitted from the dataset, no substantial decrease in the prediction power of either model was observed. Our findings showcase the strength of ANNs in simultaneously predicting multi-component KCFs during walking at different speeds, even in the absence of GRFs, which is particularly applicable in real-time applications that make use of knee loading conditions to guide and treat patients.
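The ANN-vs-SVR surrogate comparison described above can be sketched with scikit-learn. Everything below is synthetic: five hypothetical kinematic/force features stand in for the paper's inputs, and a noisy linear target stands in for a knee contact force; the point is only the train-then-score workflow:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Synthetic stand-in data: 300 samples, five features, one target.
X = rng.normal(size=(300, 5))
y = X @ np.array([1.2, -0.8, 0.5, 0.0, 2.0]) + rng.normal(scale=0.1, size=300)
X_train, X_test, y_train, y_test = X[:200], X[200:], y[:200], y[200:]

models = {
    "ANN": MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0),
    "SVR": SVR(kernel="rbf", C=10.0),
}
scores = {}
for name, model in models.items():
    model.fit(X_train, y_train)
    # Pearson R between predicted and held-out "contact forces".
    scores[name] = float(np.corrcoef(model.predict(X_test), y_test)[0, 1])
    print(name, round(scores[name], 2))
```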


2019 ◽  
Vol 13 (4) ◽  
pp. 506-516 ◽  
Author(s):  
Tsubasa Maruyama ◽  
Mitsunori Tada ◽  
Haruki Toda ◽  

The measurement of human motion is an important aspect of ergonomic mobility design, in which the mobility product is evaluated based on human factors obtained by digital human (DH) technologies. The optical motion-capture (MoCap) system has been widely used for measuring human motion in laboratories. However, it is generally difficult to measure human motion using mobility products in real-world scenarios, e.g., riding a bicycle on an outdoor slope, owing to unstable lighting conditions and camera arrangements. On the other hand, the inertial-measurement-unit (IMU)-based MoCap system does not require any optical devices, providing the potential for measuring riding motion even in outdoor environments. However, in general, the estimated motion is not necessarily accurate, as there are many errors due to the nature of the IMU itself, such as drift and calibration errors. Thus, it is infeasible to apply the IMU-based system directly to riding motion estimation. In this study, we develop a new riding MoCap system using IMUs. The proposed system estimates product and human riding motions by combining the IMU orientation with contact constraints between the product and DH, e.g., DH hands in contact with handles. The proposed system is demonstrated with a bicycle ergometer, including the handles, seat, backrest, and foot pedals, as in general mobility products. The proposed system is further validated by comparing the estimated joint angles and positions with those of the optical MoCap for three different subjects. The experiment reveals both the effectiveness and limitations of the proposed system. It is confirmed that the proposed system improves the joint position estimation accuracy compared with a system using only IMUs. The angle estimation accuracy is also improved for near joints. However, it is observed that the angle accuracy decreases for a few joints. This is explained by the fact that the proposed system modifies the orientations of all body segments to satisfy the contact constraints, even if the orientations of a few segments are already correct. It is also confirmed that the elapsed time of the proposed system is short enough for real-time application.


2017 ◽  
Vol 33 (6-8) ◽  
pp. 993-1003 ◽  
Author(s):  
Shihong Xia ◽  
Le Su ◽  
Xinyu Fei ◽  
Han Wang

Sensors ◽  
2021 ◽  
Vol 21 (11) ◽  
pp. 3748
Author(s):  
Leticia González ◽  
Juan C. Álvarez ◽  
Antonio M. López ◽  
Diego Álvarez

In the context of human–robot collaborative shared environments, there has been an increase in the use of optical motion capture (OMC) systems for human motion tracking. The accuracy and precision of OMC technology need to be assessed in order to ensure safe human–robot interactions, but the accuracy specifications provided by manufacturers are easily influenced by various factors affecting the measurements. This article describes a new methodology for the metrological evaluation of a human–robot collaborative environment based on OMC systems. Inspired by the ASTM E3064 test guide, and taking advantage of an existing industrial robot in the production cell, the system is evaluated for mean error, error spread, and repeatability. A detailed statistical study of the error distribution across the capture area is carried out, supported by a Mann–Whitney U-test for median comparisons. Based on the results, optimal capture areas for the use of the capture system are suggested. The results of the proposed method show that the metrological characteristics obtained are compatible and comparable in quality to other methods that do not require the intervention of an industrial robot.
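The median comparison described above uses a Mann–Whitney U-test. A minimal sketch with scipy, using hypothetical positional-error samples from two regions of a capture area (the error magnitudes are illustrative only):

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(1)

# Hypothetical positional errors (mm) measured in two regions of the capture
# area: near the center (low error) and near the edge (higher error).
center_errors = rng.normal(loc=0.4, scale=0.1, size=50)
edge_errors = rng.normal(loc=0.9, scale=0.2, size=50)

# Two-sided test for a difference in median error between the regions.
stat, p = mannwhitneyu(center_errors, edge_errors, alternative="two-sided")
print(p < 0.05)  # clearly separated error distributions → True
```

A significant result of this kind is what would motivate recommending one region of the capture area over another.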


Sensors ◽  
2021 ◽  
Vol 21 (9) ◽  
pp. 3145
Author(s):  
Jan P. Vox ◽  
Anika Weber ◽  
Karen Insa Wolf ◽  
Krzysztof Izdebski ◽  
Thomas Schüler ◽  
...  

The reproduction and simulation of workplaces, and the analysis of body postures during work processes, are parts of ergonomic risk assessments. A commercial virtual reality (VR) system offers the possibility to model complex work scenarios as virtual mock-ups and to evaluate their ergonomic designs by analyzing motion behavior while performing work processes. In this study, a VR tracking sensor system (HTC Vive tracker) combined with an inverse kinematic model (Final IK) was compared with a marker-based optical motion capture system (Qualisys). Marker-based optical motion capture systems are considered the gold standard for motion analysis. Therefore, Qualisys was used as the ground truth in this study. The research question to be answered was how accurately the HTC Vive system combined with Final IK can measure joint angles used for ergonomic evaluation. Twenty-six subjects were observed simultaneously with both tracking systems while performing 20 defined movements. Sixteen joint angles were analyzed. Joint angle deviations between ±6° and ±42° were identified. These high deviations must be considered in ergonomic risk assessments when using a VR system. The results show that commercial low-budget tracking systems have the potential to map joint angles. Nevertheless, substantial weaknesses and inaccuracies in some body regions must be taken into account. Recommendations are provided to improve tracking accuracy and avoid systematic errors.
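The per-joint deviation ranges reported above (between ±6° and ±42°) can be computed as the minimum and maximum signed difference between the two systems' angle traces. A minimal sketch with hypothetical elbow-flexion traces:

```python
import numpy as np

def deviation_range(vr_angles, reference_angles):
    """Signed deviation range (min, max), in degrees, between a VR-tracked
    joint-angle trace and a marker-based reference trace."""
    d = np.asarray(vr_angles, float) - np.asarray(reference_angles, float)
    return float(d.min()), float(d.max())

# Hypothetical elbow-flexion samples from the optical reference and the
# VR tracker over the same movement (values are illustrative only).
reference = np.array([10.0, 25.0, 40.0, 55.0, 40.0, 25.0])
vr = np.array([12.0, 22.0, 46.0, 50.0, 44.0, 27.0])
print(deviation_range(vr, reference))  # → (-5.0, 6.0)
```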

