Probabilistic Locomotion Mode Recognition with Wearable Sensors

Author(s):  
Uriel Martinez-Hernandez ◽  
Imran Mahmood ◽  
Abbas A. Dehghani-Sanij


Author(s):  
Baojun Chen ◽  
Vito Papapicco ◽  
Andrea Parri ◽  
Simona Crea ◽  
Marko Munih ◽  
...


2020 ◽  
Author(s):  
Chaoming Fang ◽  
Yixuan Wang ◽  
Shuo Gao

To quantify the manipulation process of acupuncture, this article presents a piezoelectric-glove-based wearable stress sensing system. Serving as the sensitive element, PVDF, with its small volume and high tensile resistance, meets the needs of quantitative analysis well. Through the piezoelectric force-sensing glove, the system is capable of detecting both perpendicular and shear stress. In addition, key parameters, including the peak stress at the needle, are detected and extracted, potentially allowing for higher learning efficiency and thus advancing the development of acupuncture.
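
The abstract does not describe how the peak stress at the needle is extracted; the sketch below illustrates one plausible approach, using threshold-based event segmentation on a single calibrated glove channel. The function name, threshold value, and units are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def extract_peak_stress(stress, fs, threshold):
    """Return (time, peak stress) for each manipulation event.

    stress    -- 1-D array of calibrated stress samples (e.g., kPa); hypothetical
    fs        -- sampling rate in Hz
    threshold -- stress level marking the start/end of an event
    """
    active = stress > threshold                 # samples inside an event
    edges = np.diff(active.astype(int))         # +1 = event start, -1 = event end
    starts = np.where(edges == 1)[0] + 1
    ends = np.where(edges == -1)[0] + 1
    peaks = []
    for s, e in zip(starts, ends):
        i = s + np.argmax(stress[s:e])          # index of the event maximum
        peaks.append((i / fs, stress[i]))
    return peaks

# Synthetic demo: two needle thrusts on a quiet baseline.
rng = np.random.default_rng(1)
fs = 1000.0
signal = 0.05 * rng.standard_normal(2000)
signal[300:400] += 2.0 * np.hanning(100)        # first thrust, peak ~2.0
signal[1200:1300] += 3.5 * np.hanning(100)      # second thrust, peak ~3.5
print(extract_peak_stress(signal, fs, threshold=0.5))
```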


Sensors ◽  
2021 ◽  
Vol 21 (2) ◽  
pp. 526
Author(s):  
Yang Han ◽  
Chunbao Liu ◽  
Lingyun Yan ◽  
Lei Ren

Smart wearable robotic systems, such as exoskeleton assistive devices and powered lower-limb prostheses, can realize rapid and accurate human–machine interaction through a locomotion mode recognition system. However, previous locomotion mode recognition studies usually adopted additional sensors for higher accuracy and relied on effective intelligent algorithms to recognize multiple locomotion modes simultaneously. To reduce the sensor burden on users and recognize more locomotion modes, we designed a novel decision tree structure (DTS) that uses an improved backpropagation neural network (IBPNN) as its judgment nodes, named IBPNN-DTS. After analyzing the experimental locomotion mode data, we used the raw values in a 200-ms time window from a single inertial measurement unit to hierarchically identify nine common locomotion modes (level walking at three speeds, ramp ascent/descent, stair ascent/descent, sitting, and standing). In addition, we reduced the number of parameters in the IBPNN for structure optimization and adopted the artificial bee colony (ABC) algorithm to perform a global search for the initial weights and threshold values, eliminating the system uncertainty that arises because randomly generated initial values tend to fail to converge or fall into local optima. Experimental results demonstrate that the recognition accuracy of the IBPNN-DTS with ABC optimization (ABC-IBPNN-DTS) was up to 96.71% (97.29% for the IBPNN-DTS). Compared to the IBPNN-DTS without optimization, the number of parameters in the ABC-IBPNN-DTS shrank by 66% with only a 0.58% reduction in accuracy, while the classification model remained highly robust.
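
The tree-of-classifiers structure can be illustrated with a short sketch: each judgment node routes a 200-ms sensor window down one branch until a mode label is reached. The stub judgment functions, thresholds, and the reduced set of modes below are hypothetical stand-ins for the trained IBPNN nodes, not the paper's model.

```python
from dataclasses import dataclass
from typing import Callable, Union
import numpy as np

@dataclass
class Leaf:
    label: str                                   # final locomotion mode

@dataclass
class Node:
    judge: Callable[[np.ndarray], int]           # stand-in for a trained IBPNN node
    children: list                               # subtrees, indexed by the judgment

def classify(tree: Union[Node, Leaf], x: np.ndarray) -> str:
    while isinstance(tree, Node):
        tree = tree.children[tree.judge(x)]      # descend one level per judgment
    return tree.label

# Toy judgment functions in place of trained networks; thresholds are invented.
is_static = lambda x: int(np.var(x) < 0.1)       # 1 -> static posture branch
is_sitting = lambda x: int(np.mean(x) < 0.0)
walk_speed = lambda x: int(np.clip(np.mean(np.abs(x)) // 0.5, 0, 2))

tree = Node(is_static, [
    Node(walk_speed, [Leaf("slow walk"), Leaf("normal walk"), Leaf("fast walk")]),
    Node(is_sitting, [Leaf("stand"), Leaf("sit")]),
])

window = np.random.default_rng(0).normal(size=200) * 0.05  # quiet IMU window
print(classify(tree, window))                    # low variance -> "stand" or "sit"
```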


Sensors ◽  
2021 ◽  
Vol 21 (22) ◽  
pp. 7473
Author(s):  
Binbin Su ◽  
Yi-Xing Liu ◽  
Elena M. Gutierrez-Farewik

People walk on different types of terrain daily; for instance, level-ground walking, ramp and stair ascent and descent, and stepping over obstacles are common activities in daily life. Movement patterns change as people move from one terrain to another. Predicting transitions between locomotion modes is important for developing assistive devices, such as exoskeletons, as the optimal assistive strategies may differ across locomotion modes. The prediction of locomotion mode transitions is often accompanied by gait-event detection, which provides important information about critical events during locomotion, such as foot contact (FC) and toe off (TO). In this study, we introduce a method that integrates locomotion mode prediction and gait-event identification into one machine learning framework comprising two multilayer perceptrons (MLPs). Input features to the framework were fused data from wearable sensors, specifically electromyography (EMG) sensors and inertial measurement units (IMUs). The first MLP successfully identified FC and TO; FC events were identified accurately, and a small number of misclassifications occurred only near TO events. A small time difference (2.5 ms and −5.3 ms for FC and TO, respectively) was found between predicted and true gait events. The second MLP correctly identified walking, ramp ascent, and ramp descent transitions with best aggregate accuracies of 96.3%, 90.1%, and 90.6%, respectively, with sufficient prediction time prior to the critical events. The models in this study demonstrate high accuracy in predicting transitions between different locomotion modes, using EMG and IMU data from the same side's mid-to-late stance of the stride prior to the step into the new mode. Our results may help assistive devices achieve smooth and seamless transitions between different locomotion modes for people with motor disorders.
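
A minimal sketch of the two-network arrangement using scikit-learn's MLPClassifier; the feature layout, network sizes, and synthetic labels below are assumptions for illustration, not the authors' trained models.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
n, n_emg, n_imu = 600, 8, 6                      # frames, EMG channels, IMU channels
X = rng.normal(size=(n, n_emg + n_imu))          # stand-in for fused EMG+IMU features

# Synthetic targets; a real pipeline would use labeled gait data instead.
y_event = rng.choice(["FC", "TO", "none"], size=n, p=[0.1, 0.1, 0.8])
y_mode = rng.choice(["level walk", "ramp ascent", "ramp descent"], size=n)

# First MLP labels gait events; second predicts the upcoming locomotion mode.
event_mlp = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500,
                          random_state=0).fit(X, y_event)
mode_mlp = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500,
                         random_state=0).fit(X, y_mode)

# At run time, each incoming feature frame passes through both networks.
frame = X[:1]
print(event_mlp.predict(frame)[0], mode_mlp.predict(frame)[0])
```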


2017 ◽  
Vol 14 (5) ◽  
pp. 172988141773032 ◽  
Author(s):  
Hongchul Kim ◽  
Young June Shin ◽  
Jung Kim

This article presents a kinematics-based method for locomotion mode recognition, for use in the control of an exoskeleton for power augmentation, to implement natural and smooth locomotion transitions. The difference in vertical foot position between the foot already in contact with the ground and the foot newly in contact with the ground was calculated via the kinematics of the entire exoskeleton and used to identify the locomotion mode, together with other sensor data including the knee joint angle and the inclinations of the thigh, shank, and foot. Locomotion on five different types of terrain (level-ground walking, stair ascent, stair descent, ramp ascent, and ramp descent) was identified using a two-layer decision tree classifier. An updating process is proposed to improve transition identification and accuracy using the foot inclination at mid-stance. An average identification accuracy of more than 99% was achieved in experiments with eight subjects on both single terrains (no terrain transitions) and hybrid terrains. The experimental results show that the proposed method can achieve high accuracy without significant misrecognition and can minimize the delay in the exoskeleton's locomotion mode recognition.
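
The core kinematic rule, in which a large vertical offset of the newly contacting foot indicates stairs while a tilted foot on a nearly level step indicates a ramp, can be sketched as follows. The function name and thresholds are invented for illustration and are not taken from the paper.

```python
def identify_terrain(dz: float, foot_incline_deg: float,
                     step_thresh: float = 0.08, incline_thresh: float = 5.0) -> str:
    """Classify terrain from foot kinematics at initial contact.

    dz               -- height of the new foot contact relative to the stance foot (m)
    foot_incline_deg -- foot segment inclination at contact (degrees)
    """
    if abs(dz) > step_thresh:                    # large height change -> stairs
        return "stair ascent" if dz > 0 else "stair descent"
    if abs(foot_incline_deg) > incline_thresh:   # level step, tilted foot -> ramp
        return "ramp ascent" if foot_incline_deg > 0 else "ramp descent"
    return "level walking"

print(identify_terrain(0.17, 2.0))    # -> stair ascent
print(identify_terrain(0.01, -8.0))   # -> ramp descent
```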


2017 ◽  
Vol 22 (6) ◽  
pp. 2480-2491 ◽  
Author(s):  
Andrea Parri ◽  
Kebin Yuan ◽  
Dario Marconi ◽  
Tingfang Yan ◽  
Simona Crea ◽  
...  
