Accuracy Evaluation Method of Inertial Measurement Unit Using Optical Motion Capture System

Author(s):  
Koji Fujita ◽  
Daisuke Kubo ◽  
Akira Oyama ◽  
Hiroki Nagai

Diagnostics ◽  
2020 ◽  
Vol 10 (6) ◽  
pp. 426
Author(s):  
I. Concepción Aranda-Valera ◽  
Antonio Cuesta-Vargas ◽  
Juan L. Garrido-Castro ◽  
Philip V. Gardiner ◽  
Clementina López-Medina ◽  
...  

Portable inertial measurement units (IMUs) are beginning to be used in human motion analysis. These devices can be useful for the evaluation of spinal mobility in individuals with axial spondyloarthritis (axSpA). The objectives of this study were to assess (a) concurrent criterion validity in individuals with axSpA by comparing spinal mobility measured by an IMU sensor-based system vs. optical motion capture as the reference standard; (b) discriminant validity comparing mobility with healthy volunteers; (c) construct validity by comparing mobility results with relevant outcome measures. A total of 70 participants with axSpA and 20 healthy controls were included. Individuals with axSpA completed function and activity questionnaires, and their mobility was measured using conventional metrology for axSpA, an optical motion capture system, and an IMU sensor-based system. The UCOASMI, a metrology index based on measures obtained by motion capture, and the IUCOASMI, the same index using IMU measures, were also calculated. Descriptive and inferential analyses were conducted to show the relationships between outcome measures. There was excellent agreement (ICC > 0.90) between both systems and a significant correlation between the IUCOASMI and conventional metrology (r = 0.91), activity (r = 0.40), function (r = 0.62), quality of life (r = 0.55) and structural change (r = 0.76). This study demonstrates the validity of an IMU system to evaluate spinal mobility in axSpA. These systems are more feasible than optical motion capture systems, and they could be useful in clinical practice.


2019 ◽  
Vol 13 (4) ◽  
pp. 506-516 ◽  
Author(s):  
Tsubasa Maruyama ◽  
Mitsunori Tada ◽  
Haruki Toda ◽  

The measurement of human motion is an important aspect of ergonomic mobility design, in which the mobility product is evaluated based on human factors obtained by digital human (DH) technologies. The optical motion-capture (MoCap) system has been widely used for measuring human motion in laboratories. However, it is generally difficult to measure human motion using mobility products in real-world scenarios, e.g., riding a bicycle on an outdoor slope, owing to unstable lighting conditions and camera arrangements. On the other hand, the inertial-measurement-unit (IMU)-based MoCap system does not require any optical devices, offering the potential to measure riding motion even in outdoor environments. In general, however, the estimated motion is not necessarily accurate, owing to errors inherent in the IMU itself, such as drift and calibration errors. Thus, it is not straightforward to apply the IMU-based system, as is, to riding motion estimation. In this study, we develop a new riding MoCap system using IMUs. The proposed system estimates product and human riding motions by combining IMU orientations with contact constraints between the product and the DH, e.g., the DH hands in contact with the handles. The proposed system is demonstrated with a bicycle ergometer that includes handles, a seat, a backrest, and foot pedals, as in general mobility products. The system is further validated by comparing the estimated joint angles and positions with those of the optical MoCap for three different subjects. The experiment reveals both the effectiveness and the limitations of the proposed system. It is confirmed that the proposed system improves joint position estimation accuracy compared with a system using only IMUs. The angle estimation accuracy is also improved for joints near the contacts. However, it is observed that the angle accuracy decreases for a few joints. This is explained by the fact that the proposed system modifies the orientations of all body segments to satisfy the contact constraints, even when the orientations of some segments are already correct. It is also confirmed that the elapsed time of the proposed system is short enough for real-time applications.
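The core idea of combining IMU-estimated poses with contact constraints can be illustrated with a deliberately simplified sketch. All names below are hypothetical, and the correction is a rigid translation only; the published system instead re-optimises the orientations of all body segments to satisfy the constraints:

```python
import numpy as np

def enforce_contact(joint_positions, end_effector_idx, contact_point):
    """Shift an IMU-estimated kinematic chain so that its end effector
    (e.g. a digital-human hand) coincides with a known contact point on
    the product (e.g. a bicycle handle).

    joint_positions  : (N, 3) array of joint positions from IMU orientations
    end_effector_idx : index of the joint in contact with the product
    contact_point    : (3,) known position of the contact on the product
    """
    # Residual between where the hand is estimated to be and where the
    # contact constraint says it must be.
    residual = contact_point - joint_positions[end_effector_idx]
    # Rigid shift of the whole chain removes the positional drift at the
    # contact while preserving all relative segment geometry.
    return joint_positions + residual
```

Even this crude version shows why contact constraints suppress position drift: the contact point is known exactly from the product geometry, so any IMU position error accumulated along the chain is cancelled at the contact.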


Sensors ◽  
2018 ◽  
Vol 18 (11) ◽  
pp. 4003 ◽  
Author(s):  
Jung Keun Lee ◽  
Woo Chang Jung

Local frame alignment between an inertial measurement unit (IMU) system and an optical motion capture system (MCS) is necessary to combine the two systems for motion analysis and to validate the accuracy of IMU-based motion data against references obtained through the MCS. In this study, we propose a new quaternion-based local frame alignment method in which equations of angular velocity transformation are used to determine the frame alignment orientation in the form of a quaternion. The performance of the proposed method was compared with those of three other methods using data with different angular velocities, noise levels, and alignment orientations. Furthermore, the effects of the following three factors on estimation performance were investigated for the first time: (i) transformation concept, i.e., angular velocity transformation vs. angle transformation; (ii) orientation representation, i.e., quaternion vs. direction cosine matrix (DCM); and (iii) applied solver, i.e., nonlinear least squares vs. least squares through the pseudoinverse. Within our limited test data, we obtained the following results: (i) the methods using angular velocity transformation outperformed the method using angle transformation; (ii) the quaternion is more suitable than the DCM; and (iii) the choice of solver was not critical in general. The proposed method performed the best among the four methods. We surmise that the smaller number of components and constraints of the quaternion, compared with those of the DCM-based methods, may result in better accuracy. Owing to its high accuracy and easy setup, the proposed method can be effectively used for local frame alignment between an IMU and a motion capture system.
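The angular-velocity-transformation idea underlying these methods can be sketched as follows. Given paired angular-velocity samples from the two systems, the alignment rotation R satisfying ω_mcs ≈ R ω_imu can be recovered by least squares. The sketch below solves for a rotation matrix via SVD rather than the paper's quaternion-based nonlinear least squares, and the function name is an assumption:

```python
import numpy as np

def align_frames(omega_imu, omega_mcs):
    """Estimate the rotation R such that omega_mcs ≈ R @ omega_imu,
    from paired angular-velocity samples (both N x 3 arrays).

    Least-squares solution of the orthogonal Procrustes problem via SVD
    (Kabsch algorithm); the paper's own method estimates a quaternion
    instead of a rotation matrix.
    """
    H = omega_imu.T @ omega_mcs              # 3x3 cross-correlation matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    D = np.diag([1.0, 1.0, d])
    return Vt.T @ D @ U.T
```

Because angular velocity is a vector quantity measured simultaneously in both frames, no temporal integration is involved, which is one reason angular-velocity transformation can be more robust to gyro drift than angle transformation.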


2020 ◽  
pp. 1-8
Author(s):  
Jonathan S. Dufour ◽  
Alexander M. Aurand ◽  
Eric B. Weston ◽  
Christopher N. Haritos ◽  
Reid A. Souchereau ◽  
...  

The objective of this study was to test the feasibility of using a pair of wearable inertial measurement unit (IMU) sensors to accurately capture dynamic joint motion data during simulated occupational conditions. Eleven subjects (5 males and 6 females) performed repetitive neck, low-back, and shoulder motions simulating low- and high-difficulty occupational tasks in a laboratory setting. Kinematics for each of the 3 joints were measured via IMU sensors in addition to a “gold standard” passive marker optical motion capture system. The IMU accuracy was benchmarked relative to the optical motion capture system, and IMU sensitivity to low- and high-difficulty tasks was evaluated. The accuracy of the IMU sensors was found to be very good on average, but significant positional drift was observed in some trials. In addition, IMU measurements were shown to be sensitive to differences in task difficulty in all 3 joints (P < .05). These results demonstrate the feasibility for using wearable IMU sensors to capture kinematic exposures as potential indicators of occupational injury risk. Velocities and accelerations demonstrate the most potential for developing risk metrics since they are sensitive to task difficulty and less sensitive to drift than rotational position measurements.


Sensors ◽  
2019 ◽  
Vol 19 (3) ◽  
pp. 596 ◽  
Author(s):  
Nimsiri Abhayasinghe ◽  
Iain Murray ◽  
Shiva Sharif Bidabadi

Inertial measurement units are commonly used to estimate the orientation of human body segments in inertial navigation systems. Most algorithms used for orientation estimation are computationally expensive, and it is difficult to implement them in real-time embedded systems with restricted capabilities. This paper discusses a computationally inexpensive orientation estimation algorithm (Gyro Integration-Based Orientation Filter, GIOF) that is used to estimate the forward and backward swing angle of the thigh (thigh angle) for a vision-impaired navigation aid. The algorithm fuses the accelerometer and gyroscope readings to derive the single-dimension orientation in such a way that the orientation is corrected using the accelerometer reading when it measures gravity only, and the gyro reading is integrated otherwise. This strategy reduces the drift caused by gyro integration. The thigh angle estimated by GIOF was compared against the Vicon optical motion capture system, reporting a mean correlation of 99.58% for 374 walking trials with a standard deviation of 0.34%. The root mean square error (RMSE) of the thigh angle estimated by GIOF compared with the Vicon measurement was 1.8477°. The computation time of GIOF on an 8-bit microcontroller running at 8 MHz is about half that of a complementary filter implementation. Although GIOF was only implemented and tested for estimating the pitch of the IMU, it can easily be extended to 2D to estimate both pitch and roll.
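A minimal single-axis sketch of the switching strategy described above, not the published implementation: when the accelerometer magnitude is close to gravity, the pitch is reset from the gravity direction, which cancels accumulated gyro drift; otherwise the gyro rate is integrated. The threshold, function name, and axis convention are assumptions:

```python
import math

G = 9.81          # gravitational acceleration (m/s^2)
TOL = 0.05 * G    # tolerance for the "gravity only" test (assumed value)

def giof_update(theta, gyro_y, acc_x, acc_z, dt):
    """One update step of a gyro-integration-based orientation filter.

    theta        : current pitch estimate (rad)
    gyro_y       : pitch-rate gyro reading (rad/s)
    acc_x, acc_z : accelerometer readings in the sagittal plane (m/s^2)
    dt           : sample period (s)
    """
    acc_norm = math.hypot(acc_x, acc_z)
    if abs(acc_norm - G) < TOL:
        # Sensor is (nearly) static: the accelerometer measures gravity
        # only, so re-derive the pitch from the gravity direction.
        return math.atan2(acc_x, acc_z)
    # Otherwise the accelerometer also senses motion; trust the gyro
    # and integrate its rate over one sample period.
    return theta + gyro_y * dt
```

Compared with a complementary filter, which blends both sensors at every step, this hard switch needs only one trigonometric call when static and a single multiply-add otherwise, consistent with the reported low computation time on an 8-bit microcontroller.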

