Internet of things sensors assisted physical activity recognition and health monitoring of college students
Measurement, 2020, Vol. 159, pp. 107774
Author(s): Chun-Li Zhong, Yuan-le Li

2020, Vol. 55, pp. 269-280
Author(s): Jun Qi, Po Yang, Lee Newcombe, Xiyang Peng, Yun Yang, ...

2020, Vol. 10 (20), pp. 7122
Author(s): Ahmad Jalal, Mouazma Batool, Kibum Kim

The classification of human activity is becoming one of the most important areas of human health monitoring and physical fitness. With physical activity recognition applications, people suffering from various diseases can be efficiently monitored and medical treatment administered in a timely fashion. Such applications could improve remote services for health care monitoring and delivery. However, the fixed health monitoring devices provided in hospitals limit the subjects' movement. Our work therefore reports on wearable sensors for remote monitoring that periodically check human health through different postures and activities, so that people receive timely and effective treatment. In this paper, we propose a novel human activity recognition (HAR) system with multiple combined features to monitor human physical movements from continuous sequences captured by tri-axial inertial sensors. The proposed HAR system filters the 1D signals using a notch filter, examining the lower and upper cutoff frequencies to obtain optimal wearable sensor data. It then calculates multiple combined features: statistical features, Mel Frequency Cepstral Coefficients, and Gaussian Mixture Model features. For the classification and recognition engine, we propose a Decision Tree classifier optimized by the Binary Grey Wolf Optimization algorithm. The system is applied and tested on three challenging benchmark datasets to assess the feasibility of the model. The experimental results show that the proposed system outperforms conventional solutions, achieving accuracy rates of 88.25%, 93.95%, and 96.83% on MOTIONSENSE, MHEALTH, and the proposed self-annotated IM-AccGyro human-machine dataset, respectively.
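
Although the abstract gives no implementation details, the pipeline it describes (notch filtering of the 1D signals, combined statistical/MFCC/GMM features, and a Decision Tree classifier) can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the authors' code: the 50 Hz sampling rate, 128-sample windows, 1 Hz notch frequency, and all feature dimensions are assumed values, and a plain scikit-learn Decision Tree stands in for the Binary Grey Wolf Optimization tuning, which is omitted here.

```python
# Minimal sketch of the HAR pipeline described in the abstract.
# Assumptions (not from the paper): 50 Hz sampling rate, 128-sample
# windows, a 1 Hz notch, 8 MFCCs, a 2-component GMM per window, and
# a default DecisionTreeClassifier in place of the BGWO-tuned tree.
import numpy as np
from scipy.signal import iirnotch, filtfilt
import librosa
from sklearn.mixture import GaussianMixture
from sklearn.tree import DecisionTreeClassifier

FS = 50.0  # assumed inertial-sensor sampling rate (Hz)

def notch_filter(signal_1d, freq=1.0, q=30.0):
    """Suppress a narrow frequency band from one 1D sensor channel."""
    b, a = iirnotch(freq, q, fs=FS)
    return filtfilt(b, a, signal_1d)

def window_features(window):
    """Combined features for one window: statistical + MFCC + GMM."""
    stats = [window.mean(), window.std(), window.min(), window.max()]
    mfcc = librosa.feature.mfcc(y=window, sr=int(FS), n_mfcc=8,
                                n_fft=64, hop_length=32, n_mels=10)
    gmm = GaussianMixture(n_components=2, random_state=0)
    gmm.fit(window.reshape(-1, 1))
    gmm_feats = np.concatenate([gmm.means_.ravel(),
                                gmm.covariances_.ravel()])
    return np.concatenate([stats, mfcc.mean(axis=1), gmm_feats])

def extract_features(windows):
    """Filter each raw window, then stack its combined feature vector."""
    return np.vstack([window_features(notch_filter(w)) for w in windows])

# Hypothetical usage with synthetic data standing in for sensor windows.
rng = np.random.default_rng(0)
X_windows = rng.standard_normal((40, 128))  # 40 windows, 128 samples each
y = rng.integers(0, 3, size=40)             # 3 assumed activity classes
clf = DecisionTreeClassifier(max_depth=5, random_state=0)
clf.fit(extract_features(X_windows), y)
print(clf.predict(extract_features(X_windows[:5])))
```

In the paper's full system, the tree's hyperparameters would be selected by the Binary Grey Wolf Optimization algorithm rather than fixed by hand; the sketch only shows where that optimized classifier plugs into the feature pipeline.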

