Feature Fusion: H-ELM based Learned Features and Hand-Crafted Features for Human Activity Recognition

Author(s):  
Nouar AlDahoul ◽  
Rini Akmeliawati ◽  
Zaw Zaw
Sensors ◽  
2021 ◽  
Vol 21 (24) ◽  
pp. 8294
Author(s):  
Chih-Ta Yen ◽  
Jia-Xian Liao ◽  
Yi-Kai Huang

This paper presents a wearable device, fitted on a participant's waist, that recognizes six activities of daily living (walking, walking upstairs, walking downstairs, sitting, standing, and lying) through a deep-learning human activity recognition (HAR) algorithm. The wearable device comprises a single-board computer (SBC) and six-axis sensors. The deep-learning algorithm employs three parallel convolutional neural networks for local feature extraction, whose outputs are then concatenated to form feature-fusion models of varying kernel size. By using kernels of different sizes, relevant local features of varying lengths were identified, thereby increasing the accuracy of human activity recognition. For the experiments, the University of California, Irvine (UCI) dataset and self-recorded data were used separately. The self-recorded data were obtained by having 21 participants wear the device on their waist and perform the six common activities in the laboratory; these data were used to verify the performance of the proposed deep-learning algorithm on the wearable device. The accuracies for the six activities on the UCI dataset and the self-recorded data were 97.49% and 96.27%, respectively, and the accuracies under tenfold cross-validation were 99.56% and 97.46%, respectively. The experimental results verify the proposed convolutional neural network (CNN) architecture, which can be used in rehabilitation assessment for people who are unable to exercise vigorously.
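As an illustration of the multi-kernel fusion idea described in this abstract, the sketch below builds three parallel 1-D convolutional branches with different kernel sizes over a fixed window of six-axis sensor data and concatenates their outputs before a softmax classifier. It is a minimal Keras sketch, not the authors' reported configuration: the 128-sample window, the filter counts, and the kernel sizes 3/7/11 are assumptions for illustration only.

```python
# Minimal sketch (assumed hyperparameters, not the paper's configuration) of
# three parallel 1-D CNN branches with different kernel sizes over six-axis
# sensor windows, fused by concatenation before a softmax classifier.
import tensorflow as tf
from tensorflow.keras import layers, models

WINDOW_LEN = 128   # samples per sliding window (assumed)
N_CHANNELS = 6     # 3-axis accelerometer + 3-axis gyroscope
N_CLASSES = 6      # walking, upstairs, downstairs, sitting, standing, lying

def conv_branch(inputs, kernel_size):
    """One local-feature-extraction branch with its own kernel size."""
    x = layers.Conv1D(64, kernel_size, padding="same", activation="relu")(inputs)
    x = layers.MaxPooling1D(2)(x)
    x = layers.Conv1D(64, kernel_size, padding="same", activation="relu")(x)
    x = layers.GlobalAveragePooling1D()(x)
    return x

inputs = layers.Input(shape=(WINDOW_LEN, N_CHANNELS))
branches = [conv_branch(inputs, k) for k in (3, 7, 11)]  # fine / medium / coarse
fused = layers.Concatenate()(branches)                   # feature fusion
fused = layers.Dense(128, activation="relu")(fused)
outputs = layers.Dense(N_CLASSES, activation="softmax")(fused)

model = models.Model(inputs, outputs)
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

Giving each branch a distinct kernel size is what lets the network respond to motion patterns of different temporal extents, which is the rationale the abstract gives for the improved recognition accuracy.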


2020 ◽  
Vol 69 (7) ◽  
pp. 3992-4001 ◽  
Author(s):  
Zhenghua Chen ◽  
Chaoyang Jiang ◽  
Shili Xiang ◽  
Jie Ding ◽  
Min Wu ◽  
...  

Sensors ◽  
2019 ◽  
Vol 19 (7) ◽  
pp. 1556 ◽  
Author(s):  
Carlos Avilés-Cruz ◽  
Andrés Ferreyra-Ramírez ◽  
Arturo Zúñiga-López ◽  
Juan Villegas-Cortéz

In the last decade, deep-learning techniques have further improved human activity recognition (HAR) performance on several benchmark datasets. This paper presents a novel framework to classify and analyze human activities. A new convolutional neural network (CNN) strategy is applied to single-user movement recognition using a smartphone. Three parallel CNNs are used for local feature extraction, and their outputs are later fused at the classification stage. The whole CNN scheme is based on feature fusion of a fine-CNN, a medium-CNN, and a coarse-CNN. A tri-axial accelerometer and a tri-axial gyroscope embedded in a smartphone are used to record the acceleration and angle signals. The six human activities successfully classified are walking, walking upstairs, walking downstairs, sitting, standing, and lying. A performance evaluation of the proposed CNN is presented.
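The sketch below shows one plausible way to segment the raw tri-axial accelerometer and gyroscope streams into fixed-length windows before feeding them to the fine/medium/coarse branches. The 128-sample window, the 50% overlap, and the helper name make_windows are assumptions for illustration and are not taken from the paper.

```python
# Minimal sketch of preparing smartphone signals for a coarse/medium/fine
# fusion CNN: stack accelerometer and gyroscope into six channels and cut
# them into fixed-length, 50%-overlapping windows (assumed values).
import numpy as np

def make_windows(acc: np.ndarray, gyro: np.ndarray,
                 window_len: int = 128, step: int = 64) -> np.ndarray:
    """Stack (T, 3) accelerometer and (T, 3) gyroscope signals into
    overlapping (n_windows, window_len, 6) segments for the CNN."""
    signal = np.concatenate([acc, gyro], axis=1)        # shape (T, 6)
    starts = range(0, signal.shape[0] - window_len + 1, step)
    return np.stack([signal[s:s + window_len] for s in starts])

# Example with synthetic data standing in for a recorded session.
rng = np.random.default_rng(0)
acc = rng.normal(size=(1024, 3))    # tri-axial accelerometer samples
gyro = rng.normal(size=(1024, 3))   # tri-axial gyroscope samples
windows = make_windows(acc, gyro)
print(windows.shape)                # (15, 128, 6) -> model input batches
```

Each resulting (window_len, 6) segment can then be passed as one input sample to the parallel CNN branches, whose extracted local features are concatenated for the classification stage.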


Author(s):  
Lidia Bajenaru ◽  
Ciprian Dobre ◽  
Radu-Ioan Ciobanu ◽  
Georgiana Dedu ◽  
Silviu-George Pantelimon ◽  
...  
