Real-Time Gait Phase Recognition Based on Time Domain Features of Multi-MEMS Inertial Sensors

2021, Vol 70, pp. 1-12. Author(s): Meiyan Zhang, Qisong Wang, Dan Liu, Boqi Zhao, Jiaze Tang, et al.
Sensors, 2018, Vol 18 (3), pp. 919. Author(s): Ilaria Mileti, Marco Germanotta, Enrica Di Sipio, Isabella Imbimbo, Alessandra Pacilli, et al.

Author(s): Yue-Peng Zhang, Guang-Zhong Cao, Zi-Qin Ling, Bin-Bin He, Hao-Ran Cheng, et al.

2014, Vol 53 (7S), pp. 07KC14. Author(s): Tan Yiyu, Yasushi Inoguchi, Yukinori Sato, Makoto Otani, Yukio Iwaya, et al.

2017, Vol 2017, pp. 1-11. Author(s): Chunjie Chen, Xinyu Wu, Du-xin Liu, Wei Feng, Can Wang

The wearable full-body exoskeleton robot developed in this study is an application of a mobile cyber-physical system (CPS), a complex mobile system integrating mechanics, electronics, computer science, and artificial intelligence. Steel wire was used as the flexible transmission medium, and a group of special wire-locking structures was designed. Additionally, we designed passive joints for some of the exoskeleton's joints. Finally, we proposed a novel gait phase recognition method for full-body exoskeletons that uses only joint angle sensors, plantar pressure sensors, and inclination sensors. The method consists of four procedures. First, we classified the three main motion patterns: normal walking on the ground, stair-climbing and stair-descending, and sit-to-stand movement. Second, we segmented the experimental data into individual gait cycles. Third, we divided each gait cycle into eight gait phases. Finally, we built a gait phase recognition model based on the k-Nearest Neighbor (kNN) algorithm and trained it with the phase-labeled gait data. The experimental results show that, on the testing set, the model achieves a 98.52% average correct classification rate for the main motion patterns and a 95.32% average correct rate for gait phase recognition. The exoskeleton robot can therefore recognize human motion intention in real time and coordinate its movement with the wearer.
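To illustrate the classification step, a minimal kNN sketch follows. The feature layout (knee angle, heel pressure, trunk inclination), the phase labels, and all data values are hypothetical stand-ins for the paper's sensor features; the majority-vote mechanism itself is the standard kNN procedure.

```python
import math
from collections import Counter

def knn_predict(train, labels, query, k=3):
    """Classify `query` by majority vote among its k nearest training samples.
    `train` holds one feature vector per time step; `labels` holds the
    corresponding gait-phase label."""
    dists = sorted((math.dist(x, query), y) for x, y in zip(train, labels))
    votes = Counter(y for _, y in dists[:k])
    return votes.most_common(1)[0][0]

# Hypothetical toy data: feature = (knee angle deg, heel pressure, inclination deg)
train = [(5.0, 0.9, 2.0), (6.0, 0.8, 2.1), (40.0, 0.0, 8.0), (42.0, 0.1, 7.5)]
labels = ["heel-strike", "heel-strike", "swing", "swing"]

print(knn_predict(train, labels, (41.0, 0.05, 7.8)))  # -> swing
```

In a real system each query vector would be assembled from the joint angle, plantar pressure, and inclination sensors at every control tick, so the per-query cost of kNN matters for real-time use.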


Sensors, 2018, Vol 18 (10), pp. 3521. Author(s): Funa Zhou, Po Hu, Shuai Yang, Chenglin Wen

Rotating machinery often suffers from a type of fault whose signature is significant in the frequency domain but insignificant in the time domain. For this type of fault, a deep learning-based diagnosis method developed in the frequency domain can reach high accuracy but cannot run in real time, whereas one developed in the time domain provides real-time diagnosis at lower accuracy. In this paper, a multimodal feature fusion-based deep learning method for accurate, real-time online diagnosis of rotating machinery is proposed. The method directly extracts the latent frequency characteristics of abnormal features from time domain data. First, multimodal features corresponding to the original data, the slope data, and the curvature data are extracted by three separate deep neural networks. Then, multimodal feature fusion is performed to obtain a new fused feature that characterizes the latent frequency feature contained in the time domain data. Lastly, the fused feature is used as the input of a Softmax classifier to produce a real-time online diagnosis for frequency-type fault data. A simulation experiment and a case study of bearing fault diagnosis confirm the effectiveness of the proposed method.
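The three input modalities can be sketched with finite differences: the slope modality as the first-order difference of the raw signal and the curvature modality as the second-order difference. The concatenation step below is only a stand-in for the paper's learned neural-network fusion, and the sample signal is an arbitrary toy sequence.

```python
def slope(x):
    """First-order difference of the signal (the "slope" modality)."""
    return [b - a for a, b in zip(x, x[1:])]

def curvature(x):
    """Second-order difference of the signal (the "curvature" modality)."""
    return slope(slope(x))

def fuse(*feature_sets):
    """Naive fusion by concatenation; the paper instead learns the fusion
    from the outputs of three separate deep neural networks."""
    out = []
    for fs in feature_sets:
        out.extend(fs)
    return out

signal = [0.0, 1.0, 4.0, 9.0, 16.0]   # toy vibration samples
s = slope(signal)                      # [1.0, 3.0, 5.0, 7.0]
c = curvature(signal)                  # [2.0, 2.0, 2.0]
fused = fuse(signal, s, c)             # 12-element fused feature vector
print(len(fused))                      # -> 12
```

Differencing amplifies high-frequency content relative to the raw samples, which is consistent with the paper's goal of exposing frequency-type fault signatures while staying entirely in the time domain.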
