Measuring Spinal Mobility Using an Inertial Measurement Unit System: A Validation Study in Axial Spondyloarthritis

Diagnostics ◽  
2020 ◽  
Vol 10 (6) ◽  
pp. 426
Author(s):  
I. Concepción Aranda-Valera ◽  
Antonio Cuesta-Vargas ◽  
Juan L. Garrido-Castro ◽  
Philip V. Gardiner ◽  
Clementina López-Medina ◽  
...  

Portable inertial measurement units (IMUs) are beginning to be used in human motion analysis. These devices can be useful for the evaluation of spinal mobility in individuals with axial spondyloarthritis (axSpA). The objectives of this study were to assess (a) concurrent criterion validity in individuals with axSpA by comparing spinal mobility measured by an IMU sensor-based system vs. optical motion capture as the reference standard; (b) discriminant validity comparing mobility with healthy volunteers; (c) construct validity by comparing mobility results with relevant outcome measures. A total of 70 participants with axSpA and 20 healthy controls were included. Individuals with axSpA completed function and activity questionnaires, and their mobility was measured using conventional metrology for axSpA, an optical motion capture system, and an IMU sensor-based system. The UCOASMI, a metrology index based on measures obtained by motion capture, and the IUCOASMI, the same index using IMU measures, were also calculated. Descriptive and inferential analyses were conducted to show the relationships between outcome measures. There was excellent agreement (ICC > 0.90) between both systems and a significant correlation between the IUCOASMI and conventional metrology (r = 0.91), activity (r = 0.40), function (r = 0.62), quality of life (r = 0.55) and structural change (r = 0.76). This study demonstrates the validity of an IMU system to evaluate spinal mobility in axSpA. These systems are more feasible than optical motion capture systems, and they could be useful in clinical practice.
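The agreement statistic reported above (ICC > 0.90) can be reproduced from paired measurements of the two systems. Below is a minimal sketch of a two-way random, absolute-agreement ICC(2,1), assuming each row holds one participant measured by both systems; the specific ICC variant used in the study is an assumption here, not stated in the abstract:

```python
import numpy as np

def icc_2_1(data):
    # data: array of shape (n_subjects, k_raters), one column per system.
    n, k = data.shape
    grand = data.mean()
    row_means = data.mean(axis=1)
    col_means = data.mean(axis=0)
    # Mean squares from a two-way ANOVA without replication.
    ss_rows = k * np.sum((row_means - grand) ** 2)
    ss_cols = n * np.sum((col_means - grand) ** 2)
    ss_err = np.sum((data - grand) ** 2) - ss_rows - ss_cols
    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    # ICC(2,1): absolute agreement, single measurement.
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n
    )
```

Two identical columns give an ICC of exactly 1; a constant offset between the systems lowers the absolute-agreement ICC even though the Pearson correlation would remain 1.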

2019 ◽  
Vol 13 (4) ◽  
pp. 506-516 ◽  
Author(s):  
Tsubasa Maruyama ◽  
Mitsunori Tada ◽  
Haruki Toda ◽  

The measurement of human motion is an important aspect of ergonomic mobility design, in which the mobility product is evaluated based on human factors obtained by digital human (DH) technologies. The optical motion-capture (MoCap) system has been widely used for measuring human motion in laboratories. However, it is generally difficult to measure human motion on mobility products in real-world scenarios, e.g., riding a bicycle on an outdoor slope, owing to unstable lighting conditions and camera arrangements. On the other hand, the inertial-measurement-unit (IMU)-based MoCap system does not require any optical devices, offering the potential to measure riding motion even in outdoor environments. However, the estimated motion is not necessarily accurate, owing to errors inherent in the IMU itself, such as drift and calibration errors, so applying an IMU-only system directly to riding motion estimation is not straightforward. In this study, we develop a new riding MoCap system using IMUs. The proposed system estimates product and human riding motions by combining IMU orientations with contact constraints between the product and the DH, e.g., DH hands in contact with handles. The proposed system is demonstrated with a bicycle ergometer that includes handles, a seat, a backrest, and foot pedals, as found in typical mobility products. The system is further validated by comparing the estimated joint angles and positions with those of optical MoCap for three different subjects. The experiment reveals both the effectiveness and the limitations of the proposed system. The proposed system improves joint position estimation accuracy compared with a system using only IMUs, and angle estimation accuracy is also improved for joints near the contact points. However, angle accuracy decreases for a few joints. This is explained by the fact that the proposed system modifies the orientations of all body segments to satisfy the contact constraints, even when the orientations of some segments are already correct. It is also confirmed that the elapsed time of the proposed system is short enough for real-time applications.
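The core idea, combining IMU-estimated segment orientations with a known contact point on the product, can be sketched as a forward-kinematics pass followed by a rigid translation that places the end effector on the contact point. The function names, the single-contact simplification, and the segment parameterization are assumptions for illustration, not the paper's implementation:

```python
import numpy as np

def fk_positions(root, segment_vectors, rotations):
    # Chain forward kinematics: each body-frame segment vector is
    # rotated by its IMU-estimated orientation and accumulated
    # outward from the root joint.
    pts = [np.asarray(root, float)]
    for v, R in zip(segment_vectors, rotations):
        pts.append(pts[-1] + R @ v)
    return np.array(pts)

def apply_contact_constraint(root, segment_vectors, rotations, contact_pos):
    # Translate the whole chain so its end effector (e.g., the hand)
    # coincides with the known contact point (e.g., a handle).
    pts = fk_positions(root, segment_vectors, rotations)
    offset = np.asarray(contact_pos, float) - pts[-1]
    return pts + offset
```

A rigid translation resolves only the global-position ambiguity; the paper's system additionally adjusts segment orientations to satisfy several simultaneous contacts, which is what can perturb otherwise-correct joint angles.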


Sensors ◽  
2019 ◽  
Vol 19 (3) ◽  
pp. 596 ◽  
Author(s):  
Nimsiri Abhayasinghe ◽  
Iain Murray ◽  
Shiva Sharif Bidabadi

Inertial measurement units are commonly used to estimate the orientation of segments of the human body in inertial navigation systems. Most of the algorithms used for orientation estimation are computationally expensive, and it is difficult to implement them in real-time embedded systems with restricted capabilities. This paper discusses a computationally inexpensive orientation estimation algorithm (Gyro Integration-Based Orientation Filter, GIOF) that is used to estimate the forward and backward swing angle of the thigh (thigh angle) for a vision-impaired navigation aid. The algorithm fuses the accelerometer and gyroscope readings to derive the single-dimension orientation in such a way that the orientation is corrected using the accelerometer reading when it reads gravity only, or otherwise integrates the gyro reading to estimate the orientation. This strategy reduces the drift caused by gyro integration. The thigh angle estimated by GIOF was compared against a Vicon optical motion capture system and showed a mean correlation of 99.58% over 374 walking trials, with a standard deviation of 0.34%. The root mean square error (RMSE) of the thigh angle estimated by GIOF relative to the Vicon measurement was 1.8477°. The computation time of GIOF on an 8-bit microcontroller running at 8 MHz is about half that of a complementary filter implementation. Although GIOF was implemented and tested only for estimating the pitch of the IMU, it can easily be extended to 2D to estimate both pitch and roll.
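The fusion rule described, trusting the accelerometer when it reads gravity only and integrating the gyro otherwise, can be sketched for a single (pitch) axis as follows. The axis convention and the gravity-detection tolerance are assumptions; the paper's exact thresholds are not given in the abstract:

```python
import math

G = 9.81  # gravity magnitude, m/s^2

def giof_step(theta, gyro_rate, accel, dt, tol=0.05):
    """One single-axis (pitch) GIOF update, angle in radians.

    accel: (a_forward, a_down) specific-force components in the
    sagittal plane (hypothetical axis convention).
    """
    a_fwd, a_down = accel
    mag = math.hypot(a_fwd, a_down)
    if abs(mag - G) < tol * G:
        # Accelerometer reads gravity only: take the tilt directly,
        # which also cancels any drift accumulated so far.
        return math.atan2(a_fwd, a_down)
    # Otherwise integrate the gyro rate over the time step.
    return theta + gyro_rate * dt
```

The per-step cost is one square root, one comparison, and either one `atan2` or one multiply-add, which is consistent with the abstract's claim of low computational cost on an 8-bit microcontroller.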


1999 ◽  
Vol 8 (2) ◽  
pp. 187-203 ◽  
Author(s):  
Tom Molet ◽  
Ronan Boulic ◽  
Daniel Thalmann

Motion-capture techniques are rarely based on orientation measurements, for two main reasons: (1) optical motion-capture systems are designed to track object positions rather than orientations (which must be deduced from several trackers), and (2) well-known animation techniques, such as inverse kinematics or geometric algorithms, require position targets constantly but orientation inputs only occasionally. We propose a complete human motion-capture technique based essentially on orientation measurements. Position measurement is used only for recovering the global position of the performer. This method allows fast tracking of human gestures for interactive applications as well as high-rate recording. Several motion-capture optimizations, including the multijoint technique, improve posture realism. This work is well suited to magnetic-based systems, which rely more on orientation registration (in our environment) than on position measurements, which require difficult system calibration.


Sensors ◽  
2021 ◽  
Vol 21 (11) ◽  
pp. 3748
Author(s):  
Leticia González ◽  
Juan C. Álvarez ◽  
Antonio M. López ◽  
Diego Álvarez

In the context of human–robot collaborative shared environments, there has been an increase in the use of optical motion capture (OMC) systems for human motion tracking. The accuracy and precision of OMC technology need to be assessed in order to ensure safe human–robot interactions, but the accuracy achieved in practice can fall short of manufacturers' specifications, since many factors influence the measurements. This article describes a new methodology for the metrological evaluation of a human–robot collaborative environment based on OMC systems. Inspired by the ASTM E3064 test guide, and taking advantage of an existing industrial robot in the production cell, the system is evaluated for mean error, error spread, and repeatability. A detailed statistical study of the error distribution across the capture area is carried out, supported by a Mann–Whitney U-test for median comparisons. Based on the results, optimal capture areas for the use of the capture system are suggested. The results of the proposed method show that the metrological characteristics obtained are comparable in quality to those of other methods that do not require the intervention of an industrial robot.
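The Mann–Whitney U-test used for the median comparisons can be sketched in a few lines. This version uses the large-sample normal approximation rather than the exact null distribution, omits the tie correction, and the error samples are invented for illustration:

```python
import math

def mann_whitney_u(x, y):
    # U statistic: number of (x_i, y_j) pairs with x_i < y_j,
    # counting ties as one half.
    u = 0.0
    for xi in x:
        for yj in y:
            if xi < yj:
                u += 1.0
            elif xi == yj:
                u += 0.5
    return u

def mann_whitney_p(x, y):
    # Two-sided p-value from the normal approximation to U
    # (adequate for moderate sample sizes, no tie correction).
    n, m = len(x), len(y)
    z = (mann_whitney_u(x, y) - n * m / 2.0) / math.sqrt(
        n * m * (n + m + 1) / 12.0
    )
    return math.erfc(abs(z) / math.sqrt(2.0))
```

Comparing per-zone error samples this way tests whether errors in one region of the capture area are systematically larger than in another, without assuming the errors are normally distributed.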


2021 ◽  
Author(s):  
Mazen Al Borno ◽  
Johanna O'Day ◽  
Vanessa Ibarra ◽  
James Dunne ◽  
Ajay Seth ◽  
...  

Background: The ability to measure joint kinematics in natural environments over long durations using inertial measurement units (IMUs) could enable at-home monitoring and personalized treatment of neurological and musculoskeletal disorders. However, drift, or the accumulation of error over time, inhibits the accurate measurement of movement over long durations. We sought to develop an open-source workflow to estimate lower extremity joint kinematics from IMU data that is accurate and capable of assessing and mitigating drift. Methods: We computed IMU-based estimates of kinematics using sensor fusion and an inverse kinematics approach with a constrained biomechanical model. We measured kinematics for 11 subjects as they performed two 10-minute trials: walking, and a repeated sequence of varied lower-extremity movements. To validate the approach, we compared the joint angles computed from IMU orientations to the joint angles computed from optical motion capture using root mean square (RMS) differences and Pearson correlations, and estimated drift using a linear regression on each subject's RMS differences over time. Results: IMU-based kinematic estimates agreed with optical motion capture; median RMS differences over all subjects and all minutes were between 3 and 6 degrees for all joint angles except hip rotation, and correlation coefficients were moderate to strong (r = 0.60 to 0.87). We observed minimal drift in the RMS differences over ten minutes; the average slopes of the linear fits to these data were near zero (-0.14 to 0.17 deg/min). Conclusions: Our workflow produced joint kinematics consistent with those estimated by optical motion capture, and mitigated kinematic drift even in trials of continuous walking without rest, obviating the need for the explicit sensor recalibration (e.g., sitting or standing still for a few seconds, or zero-velocity updates) used in current drift-mitigation approaches. This could enable long-duration measurements, bringing the field one step closer to estimating kinematics in natural environments.
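The drift assessment described, a linear fit to per-minute RMS differences, can be sketched as follows; the example values in the usage note are invented for illustration:

```python
import numpy as np

def rms_diff(a, b):
    # RMS difference between two joint-angle traces, in degrees.
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(np.sqrt(np.mean((a - b) ** 2)))

def drift_slope(minutes, rms_per_minute):
    # Slope of a linear fit to RMS difference vs. time (deg/min);
    # a slope near zero indicates that error does not accumulate.
    slope, _intercept = np.polyfit(minutes, rms_per_minute, 1)
    return float(slope)
```

For example, per-minute RMS differences that hover around 4 degrees yield a slope near 0 deg/min (minimal drift), while values climbing from 4 to 9 degrees over ten minutes yield a slope of about 0.5 deg/min.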


Sensors ◽  
2021 ◽  
Vol 21 (17) ◽  
pp. 5833
Author(s):  
Elke Warmerdam ◽  
Robbin Romijnders ◽  
Johanna Geritz ◽  
Morad Elshehabi ◽  
Corina Maetzler ◽  
...  

Healthy adults and neurological patients show unique mobility patterns over the course of their lifespan and disease. Quantifying these mobility patterns could support diagnosing, tracking disease progression and measuring response to treatment. This quantification can be done with wearable technology, such as inertial measurement units (IMUs). Before IMUs can be used to quantify mobility, algorithms need to be developed and validated with age and disease-specific datasets. This study proposes a protocol for a dataset that can be used to develop and validate IMU-based mobility algorithms for healthy adults (18–60 years), healthy older adults (>60 years), and patients with Parkinson’s disease, multiple sclerosis, a symptomatic stroke and chronic low back pain. All participants will be measured simultaneously with IMUs and a 3D optical motion capture system while performing standardized mobility tasks and non-standardized activities of daily living. Specific clinical scales and questionnaires will be collected. This study aims at building the largest dataset for the development and validation of IMU-based mobility algorithms for healthy adults and neurological patients. It is anticipated to provide this dataset for further research use and collaboration, with the ultimate goal to bring IMU-based mobility algorithms as quickly as possible into clinical trials and clinical routine.


2020 ◽  
pp. 1-8
Author(s):  
Jonathan S. Dufour ◽  
Alexander M. Aurand ◽  
Eric B. Weston ◽  
Christopher N. Haritos ◽  
Reid A. Souchereau ◽  
...  

The objective of this study was to test the feasibility of using a pair of wearable inertial measurement unit (IMU) sensors to accurately capture dynamic joint motion data during simulated occupational conditions. Eleven subjects (5 males and 6 females) performed repetitive neck, low-back, and shoulder motions simulating low- and high-difficulty occupational tasks in a laboratory setting. Kinematics for each of the 3 joints were measured via IMU sensors in addition to a “gold standard” passive marker optical motion capture system. The IMU accuracy was benchmarked relative to the optical motion capture system, and IMU sensitivity to low- and high-difficulty tasks was evaluated. The accuracy of the IMU sensors was found to be very good on average, but significant positional drift was observed in some trials. In addition, IMU measurements were shown to be sensitive to differences in task difficulty in all 3 joints (P < .05). These results demonstrate the feasibility for using wearable IMU sensors to capture kinematic exposures as potential indicators of occupational injury risk. Velocities and accelerations demonstrate the most potential for developing risk metrics since they are sensitive to task difficulty and less sensitive to drift than rotational position measurements.
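The closing observation, that velocities are less sensitive to drift than rotational positions, can be illustrated numerically: a slowly accumulating positional drift grows without bound in the angle trace but contributes only a small constant bias to its derivative. The motion signal and drift rate below are invented:

```python
import numpy as np

dt = 0.01                                          # sample period, s
t = np.arange(0.0, 10.0, dt)
true_angle = 30.0 * np.sin(2 * np.pi * 0.5 * t)    # deg, invented motion
drift = 0.5 * t                                    # deg, slow sensor drift
measured = true_angle + drift

# Positional error grows linearly with time...
pos_err = np.abs(measured - true_angle)

# ...while the velocity error is only the constant drift rate (deg/s).
vel_err = np.abs(np.gradient(measured, dt) - np.gradient(true_angle, dt))
```

After ten seconds the angle error has reached several degrees, while the angular-velocity error stays at roughly 0.5 deg/s throughout, which is why velocity- and acceleration-based exposure metrics are more robust to drift.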


2021 ◽  
Vol 8 (1) ◽  
Author(s):  
Agnieszka Szczęsna ◽  
Monika Błaszczyszyn ◽  
Magdalena Pawlyta

Human motion capture is commonly used in various fields, including sport, to analyze, understand, and synthesize kinematic and kinetic data. Specialized computer vision and marker-based optical motion capture techniques constitute the gold standard for accurate and robust human motion capture. The dataset presented consists of recordings of 37 Kyokushin karate athletes of different ages (children, young people, and adults) and skill levels (from 4th dan to 9th kyu) executing the following techniques: reverse lunge punch (Gyaku-Zuki), front kick (Mae-Geri), roundhouse kick (Mawashi-Geri), and spinning back kick (Ushiro-Mawashi-Geri). Each technique was performed approximately three times per recording (i.e., to create a single data file), and under three conditions where participants kicked or punched (i) in the air, (ii) a training shield, or (iii) an opponent. Each participant undertook a minimum of two trials per condition. The data presented was captured using a Vicon optical motion capture system with Plug-In Gait software. Three-dimensional trajectories of 39 reflective markers were recorded. The resultant dataset contains a total of 1,411 recordings, with 3,229 single kicks and punches. The recordings are available in C3D file format. The dataset provides the opportunity for kinematic analysis of different combat sport techniques in attacking and defensive situations.


2012 ◽  
Vol 198-199 ◽  
pp. 1062-1066
Author(s):  
Xu Dong Wu ◽  
Hu Liu ◽  
Song Mo ◽  
Zhe Wu ◽  
Ying Li

Maintainability is a key design attribute of a civil aircraft, so maintainability evaluation is an important part of civil aircraft design. Human motion capture is a promising technique for such evaluation. This article presents a maintainability evaluation method based on an optical motion capture system, along with a test case developed to demonstrate the method's feasibility. The test case focuses mainly on the accessibility of civil aircraft components.

