Context Awareness and Step Length Estimation by Shape Distance and H-Features

2020 ◽  
Vol 18 (12) ◽  
pp. 3051-3061
Author(s):  
Daehyun Kim ◽  
Yonghyeon Lee ◽  
Chan Gook Park

Sensors ◽  
2021 ◽  
Vol 21 (10) ◽  
pp. 3527
Author(s):  
Melanija Vezočnik ◽  
Roman Kamnik ◽  
Matjaz B. Juric

Inertial sensor-based step length estimation has become increasingly important with the emergence of pedestrian-dead-reckoning-based (PDR-based) indoor positioning. Many refined step length estimation models have been proposed to overcome inaccuracy in estimating the distance walked, yet both the kinematics of the human body during walking and actual step lengths are rarely used in their derivation. Our paper presents a new step length estimation model that utilizes acceleration magnitude. To the best of our knowledge, we are the first to employ principal component analysis (PCA) to characterize the experimental data for the derivation of the model. These data were collected from anatomical landmarks on the human body during walking using a highly accurate optical measurement system. We evaluated the performance of the proposed model for four typical smartphone positions during long-term walking and obtained promising results: the proposed model outperformed all acceleration-based models selected for comparison, producing an overall mean absolute stride length estimation error of 6.44 cm. Among acceleration-based models, the proposed model was also least affected by walking speed and smartphone position, and it is unaffected by smartphone orientation. Therefore, the proposed model can be used in PDR-based indoor positioning with the important advantage that no special care regarding orientation is needed when attaching the smartphone to a particular body segment. All the sensory data acquired by smartphones that we used for evaluation are publicly available and include more than 10 h of walking measurements.
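The abstract does not give the model's closed form, so the following is only a minimal sketch of the general class of acceleration-magnitude-based step length estimators it is compared against: a Weinberg-style estimator operating on the orientation-free acceleration magnitude. The gain k and the per-step windowing are illustrative assumptions, not values or methods taken from the paper.

```python
import numpy as np

def step_length_weinberg(acc_xyz, k=0.48):
    """Weinberg-style step length estimate from one step's accelerometer samples.

    acc_xyz : (N, 3) array of accelerometer readings (m/s^2) covering a single step.
    k       : empirical gain; 0.48 is a placeholder, normally calibrated per user.
    """
    # Orientation-free feature: magnitude of the acceleration vector.
    a_mag = np.linalg.norm(acc_xyz, axis=1)
    # Fourth root of the peak-to-valley acceleration range within the step.
    return k * (a_mag.max() - a_mag.min()) ** 0.25

# Example: synthetic samples for one step at 100 Hz.
rng = np.random.default_rng(0)
acc = rng.normal([0.0, 0.0, 9.81], 1.5, size=(60, 3))
print(f"estimated step length: {step_length_weinberg(acc):.2f} m")
```

Because only the acceleration magnitude is used, such estimators are insensitive to how the smartphone is oriented, which is the property the abstract highlights.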


Sensors ◽  
2020 ◽  
Vol 20 (19) ◽  
pp. 5570
Author(s):  
Yiming Ding ◽  
Zhi Xiong ◽  
Wanling Li ◽  
Zhiguo Cao ◽  
Zhengchun Wang

The combination of biomechanics and inertial pedestrian navigation research provides a very promising approach for pedestrian positioning in environments where the Global Positioning System (GPS) signal is unavailable. However, in practical applications such as fire rescue and indoor security, inertial sensor-based pedestrian navigation systems face various challenges, especially step length estimation errors and heading drift during running and sprinting. In this paper, a trinal-node method, comprising two thigh-worn inertial measurement units (IMUs) and one waist-worn IMU, is proposed for simultaneous localization and occupancy grid mapping. Specifically, gait detection and segmentation are realized by zero-crossing detection of the difference between the pitch angles of the two thighs. A piecewise function between the step length and the probability distribution of the waist horizontal acceleration is established to achieve accurate step length estimation both in regular walking and in drastic motions. In addition, a simultaneous localization and mapping method based on occupancy grids, which uses the historical trajectory to improve the pedestrian's pose estimation, is introduced. The experiments show that the proposed trinal-node pedestrian inertial odometer can identify and segment each gait cycle during walking, running, and sprinting. The average step length estimation error is no more than 3.58% of the total travel distance at motion speeds from 1.23 m/s to 3.92 m/s. In combination with the proposed occupancy-grid-based simultaneous localization and mapping method, the localization error is less than 5 m in a single-story building of 2643.2 m².
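As a rough illustration of the gait segmentation idea described above (zero crossings of the difference between the two thighs' pitch angles mark gait events), here is a minimal sketch. It assumes the pitch angles have already been estimated from each thigh-worn IMU; the sampling rate, minimum-cycle threshold, and signal names are illustrative, not taken from the paper.

```python
import numpy as np

def segment_gait_cycles(pitch_left, pitch_right, fs=100.0, min_cycle_s=0.25):
    """Return sample indices of gait events from two thigh pitch-angle signals.

    A gait event is taken where the left-minus-right pitch difference crosses
    zero from negative to positive; a minimum interval suppresses jitter.
    """
    diff = np.asarray(pitch_left) - np.asarray(pitch_right)
    # Indices where the sign changes from negative to non-negative.
    crossings = np.where((diff[:-1] < 0) & (diff[1:] >= 0))[0] + 1

    events, last = [], -np.inf
    min_gap = int(min_cycle_s * fs)
    for idx in crossings:
        if idx - last >= min_gap:   # reject crossings closer than min_cycle_s
            events.append(idx)
            last = idx
    return np.array(events)

# Example: two out-of-phase sinusoids standing in for thigh pitch angles.
t = np.arange(0, 5, 0.01)
left, right = 20 * np.sin(2 * np.pi * t), 20 * np.sin(2 * np.pi * t + np.pi)
print(segment_gait_cycles(left, right))  # roughly one event per 1 s cycle
```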


Sensors ◽  
2019 ◽  
Vol 20 (1) ◽  
pp. 214 ◽  
Author(s):  
Itzik Klein

One of the approaches for indoor positioning using smartphones is pedestrian dead reckoning, in which the user's step length is estimated using empirical or biomechanical formulas. Such calculation was shown to be very sensitive to the smartphone location on the user. In addition, knowledge of the smartphone location can also help with direct step-length estimation and heading determination. From a wider point of view, smartphone location recognition is part of human activity recognition, which is employed in many fields and applications, such as health monitoring. In this paper, we propose to use deep learning approaches to classify the smartphone location on the user while walking, and we require robustness in terms of the ability to cope with recordings that differ (in sampling rate, user dynamics, sensor type, and more) from those available in the training dataset. The contributions of the paper are: (1) definition of the smartphone location recognition framework using accelerometers, gyroscopes, and deep learning; (2) examination of the proposed approach on 107 people and 31 h of recorded data obtained from eight different datasets; and (3) enhanced algorithms for using only accelerometers in the classification process. The experimental results show that the smartphone location can be classified with high accuracy using only the smartphone's accelerometers.
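The abstract does not specify the network architecture, so the following is only a minimal sketch of the kind of accelerometer-only deep classifier such a framework could use: a small 1-D CNN over fixed-length accelerometer windows. The window length, layer sizes, and the four example location classes (pocket, hand, bag, body) are assumptions for illustration, not details from the paper.

```python
import torch
import torch.nn as nn

class SmartphoneLocationCNN(nn.Module):
    """Toy 1-D CNN over accelerometer windows (3 channels: x, y, z)."""

    def __init__(self, n_classes=4, window=128):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(3, 32, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
        )
        # After two poolings the window length is divided by 4.
        self.classifier = nn.Linear(64 * (window // 4), n_classes)

    def forward(self, x):                # x: (batch, 3, window)
        z = self.features(x)
        return self.classifier(z.flatten(1))

model = SmartphoneLocationCNN()
dummy = torch.randn(8, 3, 128)          # batch of 8 accelerometer windows
print(model(dummy).shape)               # torch.Size([8, 4])
```

Training on windows drawn from several datasets with different sampling rates (after resampling to a common rate) is one plausible way to pursue the robustness requirement the abstract describes.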

