Accuracy Evaluation of Human Gait Estimation by a Sparse Set of Inertial Measurement Units

Author(s):  
Tsubasa Maruyama ◽  
Haruki Toda ◽  
Suguru Kanoga ◽  
Mitsunori Tada ◽  
Yui Endo
Author(s):  
Pratima Saravanan ◽  
Jiyun Yao ◽  
Jessica Menold

Clinical gait analysis is used to diagnose, assess, and monitor patients by analyzing their kinetics, kinematics, and electromyography while walking. Traditionally, gait analysis is performed in a formal laboratory environment using several high-resolution video or infrared cameras. The subject walks on a force platform or a treadmill with several markers attached to their body, allowing the cameras to capture joint coordinates over time. The space required for such a laboratory is non-trivial, and the associated cost of the experimental setup is often prohibitive. The current work investigates the coupled use of a Microsoft Kinect and inertial measurement units (IMUs) as a portable and cost-efficient gait analysis system. Past studies assessing gait with either the Kinect or IMUs alone concluded that each achieves only medium reliability due to sensor-specific drawbacks. In this study, we propose that a combined system can efficiently detect the different phases of human gait, with the sensors complementing each other and compensating for their individual drawbacks. Preliminary findings indicate that the IMU sensors efficiently provide gait kinematics such as step length, stride length, velocity, and cadence, whereas the Kinect sensor helps in studying gait asymmetries by comparing the right and left joints, such as the hips, knees, and ankles.
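
To illustrate the kind of output such a hybrid system could produce, the following is a minimal sketch in Python, assuming heel-strike times and positions have already been detected (for example, from IMU shank angular velocity and the Kinect skeleton). The function names, example values, and the use of the Robinson symmetry index are illustrative assumptions, not the study's actual pipeline.

```python
import numpy as np

def gait_parameters(heel_strike_times, heel_positions):
    """Spatiotemporal gait parameters from consecutive heel strikes of the same foot
    (times in s, positions in m along the walking direction)."""
    times = np.asarray(heel_strike_times, dtype=float)
    pos = np.asarray(heel_positions, dtype=float)
    stride_times = np.diff(times)              # time between successive heel strikes
    stride_lengths = np.diff(pos)              # distance covered per stride
    cadence = 120.0 / stride_times.mean()      # steps/min (two steps per stride)
    velocity = stride_lengths.mean() / stride_times.mean()
    return {
        "stride_length_m": stride_lengths.mean(),
        "step_length_m": stride_lengths.mean() / 2.0,   # rough symmetric estimate
        "cadence_steps_per_min": cadence,
        "velocity_m_per_s": velocity,
    }

def symmetry_index(left_value, right_value):
    """Robinson symmetry index (%) for a scalar gait variable, e.g. peak knee flexion."""
    return 100.0 * abs(left_value - right_value) / (0.5 * (left_value + right_value))

# Hypothetical heel strikes of the right foot (times in s, forward positions in m).
params = gait_parameters([0.0, 1.1, 2.2, 3.3], [0.0, 1.25, 2.52, 3.80])
print(params, symmetry_index(62.0, 58.0))
```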


Author(s):  
Ahmed Halim ◽  
A. Abdellatif ◽  
Mohammed I Awad ◽  
Mostafa RA Atia

This paper aims to enhance the accuracy of human gait prediction using machine learning algorithms. Three classifiers are used: XGBoost, Random Forest, and SVM. A predefined dataset is used for feature extraction and classification. Gait prediction is evaluated across several locomotion activities: sitting (S), level walking (LW), ramp ascent (RA), ramp descent (RD), stair ascent (SA), stair descent (SD), and standing (ST). Results are reported for the steady-state (SS) portion and for the overall (full) gait cycle. Two sensor sets are used: the first uses inertial measurement units only; the second uses inertial measurement units, electromyography, and electro-goniometers. The comparison is based on prediction accuracy and prediction time. In addition, the prediction times of XGBoost on CPU and GPU are compared, given how easily XGBoost can be run on a GPU. The results can help in choosing a gait-prediction classifier that achieves acceptable accuracy with fewer types of sensors.
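
A minimal sketch of such a comparison is shown below, using synthetic placeholder features in place of the predefined dataset; the feature dimensions, hyperparameters, and the device="cuda" GPU flag (available in XGBoost 2.0 and later) are assumptions, not the paper's configuration.

```python
import time
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from xgboost import XGBClassifier

# Placeholder IMU feature matrix: one row per gait window, columns are statistics of
# accelerometer/gyroscope channels; labels are the seven activities (S, LW, RA, RD,
# SA, SD, ST) encoded as 0..6.
rng = np.random.default_rng(0)
X = rng.normal(size=(3000, 48))
y = rng.integers(0, 7, size=3000)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

classifiers = {
    "XGBoost (CPU)": XGBClassifier(tree_method="hist", n_estimators=200),
    # GPU variant (XGBoost >= 2.0) would use the same tree method with device="cuda":
    # "XGBoost (GPU)": XGBClassifier(tree_method="hist", device="cuda", n_estimators=200),
    "Random Forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "SVM": SVC(kernel="rbf", C=1.0),
}

for name, clf in classifiers.items():
    t0 = time.perf_counter()
    clf.fit(X_tr, y_tr)                 # training time
    t1 = time.perf_counter()
    acc = clf.score(X_te, y_te)         # prediction accuracy and prediction time
    t2 = time.perf_counter()
    print(f"{name}: accuracy={acc:.3f}, train={t1 - t0:.2f}s, predict={t2 - t1:.2f}s")
```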


2017 ◽  
Vol 3 (1) ◽  
pp. 7-10 ◽  
Author(s):  
Jan Kuschan ◽  
Henning Schmidt ◽  
Jörg Krüger

Abstract: This paper presents an analysis of two distinct human lifting movements with respect to acceleration and angular velocity. For the first, ergonomic movement, the test persons generated the lifting power by squatting down, bending only at the hips and knees; for the unergonomic one, they bent forward and lifted the box mainly with their backs. The measurements were taken with a vest equipped with five inertial measurement units (IMUs) with 9 degrees of freedom (DOF) each. The IMU data captured for these two movements are evaluated statistically, visualized, and discussed with respect to their suitability as features for subsequent machine learning classification. The motivation for studying these movements is that occupational diseases of the musculoskeletal system reduce workers' quality of life and create extra costs for companies. A vest, called CareJack, was therefore designed to give the worker real-time feedback on their ergonomic state while working. The CareJack is an approach to reducing the risk of spinal and back diseases. This paper also presents the idea behind it as well as its main components.
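
A minimal sketch of the kind of per-window statistics that could serve as classification features is shown below; the window length, sampling rate, and chosen statistics are illustrative assumptions rather than the features used for the CareJack.

```python
import numpy as np

def window_features(acc, gyro):
    """Summary statistics of one IMU over a lift window.

    acc, gyro: arrays of shape (n_samples, 3) with acceleration [m/s^2] and angular
    velocity [rad/s]. Returns a flat feature vector that could serve as input to a
    movement classifier (ergonomic vs. unergonomic lift).
    """
    feats = []
    for signal in (acc, gyro):
        mag = np.linalg.norm(signal, axis=1)          # magnitude per sample
        feats.extend([
            signal.mean(axis=0), signal.std(axis=0),  # per-axis mean and spread
            signal.min(axis=0), signal.max(axis=0),   # per-axis extremes
            [mag.mean(), mag.max(), np.sqrt((mag ** 2).mean())],  # magnitude stats incl. RMS
        ])
    return np.concatenate([np.atleast_1d(f) for f in feats])

# Hypothetical 2-second window at 100 Hz from one of the five vest IMUs.
acc = np.random.randn(200, 3) + np.array([0.0, 0.0, 9.81])
gyro = np.random.randn(200, 3) * 0.5
print(window_features(acc, gyro).shape)   # one feature vector per IMU and window
```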


2021 ◽  
pp. 1-19
Author(s):  
Thomas Rietveld ◽  
Barry S. Mason ◽  
Victoria L. Goosey-Tolfrey ◽  
Lucas H. V. van der Woude ◽  
Sonja de Groot ◽  
...  

2020 ◽  
Vol 6 (3) ◽  
pp. 237-240
Author(s):  
Simon Beck ◽  
Bernhard Laufer ◽  
Sabine Krueger-Ziolek ◽  
Knut Moeller

Abstract: Demographic changes and increasing air pollution mean that the monitoring of respiratory parameters is a focus of current research. In this study, two commercially available inertial measurement units (IMUs) are used to measure the breathing rate via quaternions. One IMU was positioned ventrally and one dorsally on the thorax, attached with a belt. The relative angle between the quaternions of the two IMUs was calculated and compared with the respiratory frequency obtained by a spirometer, which served as the reference. A frequency analysis of both signals showed that the respiratory rates obtained by the two systems differ only slightly (by less than 0.2/min). The introduced belt can determine the respiratory rate and could be used for monitoring tasks in clinical settings.
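
A minimal sketch of this quaternion-based estimation is given below, assuming time-synchronized, scalar-last quaternion streams from the two IMUs; the sampling rate, respiratory band limits, and the FFT-peak approach are assumptions, not necessarily the authors' processing chain.

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def breathing_rate_from_quaternions(q_ventral, q_dorsal, fs):
    """Estimate respiratory rate (breaths/min) from two synchronized quaternion
    streams of shape (n, 4), scalar-last (x, y, z, w), sampled at fs Hz."""
    rot_v = R.from_quat(q_ventral)
    rot_d = R.from_quat(q_dorsal)
    # Relative rotation between the two sensors; its angle oscillates with
    # thorax expansion and contraction.
    angle = (rot_v.inv() * rot_d).magnitude()      # rotation angle in rad, per sample
    angle = angle - angle.mean()                   # remove the static mounting offset
    spectrum = np.abs(np.fft.rfft(angle))
    freqs = np.fft.rfftfreq(len(angle), d=1.0 / fs)
    # Restrict the search to a plausible respiratory band (0.1-1.0 Hz, i.e. 6-60/min).
    band = (freqs >= 0.1) & (freqs <= 1.0)
    f_resp = freqs[band][np.argmax(spectrum[band])]
    return 60.0 * f_resp

# Synthetic 60 s recording at 50 Hz: a fixed 0.1 rad relative mounting angle plus a
# 0.25 Hz (15 breaths/min) oscillation about the x-axis.
t = np.arange(0, 60, 1 / 50)
sway = 0.1 + 0.05 * np.sin(2 * np.pi * 0.25 * t)
q_v = R.from_euler("x", np.zeros_like(t)).as_quat()
q_d = R.from_euler("x", sway).as_quat()
print(breathing_rate_from_quaternions(q_v, q_d, fs=50))   # ~15 breaths/min
```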


2021 ◽  
Vol 32 (4) ◽  
Author(s):  
Luigi D’Alfonso ◽  
Emanuele Garone ◽  
Pietro Muraca ◽  
Paolo Pugliese

Abstract: In this work, we address the problem of estimating the relative position and orientation of a camera and an object when both are equipped with inertial measurement units (IMUs) and the object exhibits a set of n landmark points with known coordinates (the so-called pose estimation or PnP problem). We present two algorithms that fuse the information provided by the camera and the IMUs to solve the PnP problem with good accuracy. These algorithms use only the measurements given by the IMUs' inclinometers, since magnetometers usually give inaccurate estimates of the Earth's magnetic field vector. The effectiveness of the proposed methods is assessed by numerical simulations and experimental tests, and the results are compared with the most recent methods proposed in the literature.
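
To illustrate how inclinometer measurements constrain the problem, the sketch below assumes each inclinometer reports the gravity direction in its own sensor frame (taken to coincide with the camera and object frames); the rotation then reduces to a single unknown angle about gravity. The brute-force search over that angle and the linear least-squares translation solve are illustrative choices, not the paper's two algorithms.

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def pnp_with_gravity(landmarks_obj, pixels_norm, g_obj, g_cam, n_steps=3600):
    """Gravity-constrained PnP sketch.

    landmarks_obj: (n, 3) landmark coordinates in the object frame.
    pixels_norm:   (n, 2) normalized image coordinates (after multiplying by K^-1).
    g_obj, g_cam:  gravity directions measured by the object and camera inclinometers.
    Any rotation satisfying R @ g_obj = g_cam can be written as Rot(g_cam, theta) @ R0,
    so only the angle theta about gravity remains unknown.
    """
    g_obj = np.asarray(g_obj) / np.linalg.norm(g_obj)
    g_cam = np.asarray(g_cam) / np.linalg.norm(g_cam)
    R0 = R.align_vectors([g_cam], [g_obj])[0].as_matrix()   # one rotation aligning gravity

    best = (np.inf, None, None)
    for theta in np.linspace(0.0, 2.0 * np.pi, n_steps, endpoint=False):
        rot = R.from_rotvec(theta * g_cam).as_matrix() @ R0
        X_cam = landmarks_obj @ rot.T           # rotated landmarks, translation still unknown
        # Reprojection constraints are linear in t:
        #   t_x - u*t_z = u*z_i - x_i,   t_y - v*t_z = v*z_i - y_i
        n = len(landmarks_obj)
        A = np.zeros((2 * n, 3))
        b = np.zeros(2 * n)
        u, v = pixels_norm[:, 0], pixels_norm[:, 1]
        A[0::2, 0], A[0::2, 2], b[0::2] = 1.0, -u, u * X_cam[:, 2] - X_cam[:, 0]
        A[1::2, 1], A[1::2, 2], b[1::2] = 1.0, -v, v * X_cam[:, 2] - X_cam[:, 1]
        t, *_ = np.linalg.lstsq(A, b, rcond=None)
        proj = X_cam + t
        err = np.sum((proj[:, :2] / proj[:, 2:3] - pixels_norm) ** 2)
        if err < best[0]:
            best = (err, rot, t)
    return best[1], best[2]   # rotation matrix and translation minimizing reprojection error
```

In practice, the one-dimensional search over theta could be replaced by a closed-form or iterative refinement; the point of the sketch is only that the inclinometers remove two of the three rotational degrees of freedom.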

