Recognition of Daily Human Activity Using an Artificial Neural Network and Smartwatch

2018 ◽  
Vol 2018 ◽  
pp. 1-9 ◽  
Author(s):  
Min-Cheol Kwon ◽  
Sunwoong Choi

Human activity recognition using wearable devices has been actively investigated for a wide range of applications. Most existing approaches, however, either focus on simple activities involving whole-body movement or require a variety of sensors to identify daily activities. In this study, we propose a human activity recognition system that collects data from an off-the-shelf smartwatch and uses an artificial neural network for classification. The proposed system is further enhanced using location information. We consider 11 activities, including both simple and daily activities. Experimental results show that these activities can be classified with an accuracy of 95%.
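The abstract above describes classifying activities from smartwatch data with a neural network. A minimal sketch of that idea, assuming a small feed-forward classifier over hand-crafted window features and using synthetic data in place of the authors' real smartwatch recordings:

```python
# Hypothetical sketch: a small feed-forward network classifying windowed
# smartwatch accelerometer features into activities. The feature layout and
# activity labels are illustrative assumptions, not the paper's actual setup.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Two toy activities: "sitting" (low variance) and "walking" (high variance).
# Each sample: [mean_x, std_x, mean_y, std_y] of one sensor window.
sitting = rng.normal([0.0, 0.1, 0.0, 0.1], 0.05, size=(100, 4))
walking = rng.normal([0.5, 1.0, 0.5, 1.0], 0.05, size=(100, 4))
X = np.vstack([sitting, walking])
y = np.array([0] * 100 + [1] * 100)

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
clf.fit(X, y)
accuracy = clf.score(X, y)  # training accuracy on well-separated toy data
```

On real data one would extract such features from sliding windows of the raw sensor stream and hold out test subjects for evaluation.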

2021 ◽  
Vol 11 (1) ◽  
pp. 411-422
Author(s):  
Jozsef Suto

In the last decade, many researchers have applied shallow and deep networks to human activity recognition (HAR). The current trend in HAR research is to apply deep learning to extract features and classify activities from raw data. However, we observed that the authors of previous studies did not perform an efficient hyperparameter search on their artificial neural network (shallow or deep) classifiers. Therefore, in this article, we demonstrate the effect of random and Bayesian parameter search on a shallow neural network using five HAR databases. The results of this work show that a shallow neural network with proper hyperparameter optimization can achieve similar or even better recognition accuracy than the previous best deep classifiers on all databases. In addition, we draw conclusions from the results about the advantages and disadvantages of the two hyperparameter search techniques.
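Of the two techniques the abstract compares, random search is the simpler to sketch. Below is a minimal, self-contained illustration in which a toy surrogate function stands in for cross-validated HAR accuracy; a Bayesian search would instead fit a probabilistic model of the objective (typically via a library such as scikit-optimize) to choose each next trial:

```python
# Hypothetical sketch of random hyperparameter search for a shallow network.
# The "objective" is a stand-in for cross-validated accuracy, not a real model.
import math
import random

random.seed(42)

def objective(hidden_units, learning_rate):
    # Toy surrogate that peaks near 128 hidden units and lr = 1e-3.
    return math.exp(-((hidden_units - 128) / 100) ** 2
                    - (math.log10(learning_rate) + 3) ** 2)

best_score, best_params = -1.0, None
for _ in range(200):
    params = {
        "hidden_units": random.randint(8, 512),
        "learning_rate": 10 ** random.uniform(-5, -1),  # log-uniform sampling
    }
    score = objective(**params)
    if score > best_score:
        best_score, best_params = score, params
```

Sampling the learning rate log-uniformly, as above, is the usual choice because plausible values span several orders of magnitude.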


2021 ◽  
Vol 3 (2) ◽  
Author(s):  
Charles Gbenga Williams ◽  
Oluwapelumi O. Ojuri

As a result of the heterogeneous nature of soils and the variation of their hydraulic conductivity over several orders of magnitude across soil types, from fine-grained to coarse-grained, predictive methods that estimate the hydraulic conductivity of soils from more easily obtainable properties have received appropriate consideration. This study evaluates the performance of the artificial neural network (ANN), one of the popular computational intelligence techniques, in predicting the hydraulic conductivity of a wide range of soil types, and compares it with traditional multiple linear regression (MLR). ANN and MLR models were developed using six input variables. Results revealed that only three input variables were statistically significant in the MLR model. Performance evaluation of the developed models using the coefficient of determination and mean square error shows that the predictive capability of the ANN is far better than that of the MLR. In addition, a comparative study with available existing models shows that the ANN and MLR models developed in this study perform relatively better.
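The MLR baseline mentioned above is easy to sketch. The snippet below fits a linear model by least squares and scores it with the coefficient of determination; the three predictors and the synthetic data are illustrative assumptions, not the study's actual variables:

```python
# Hypothetical MLR baseline: fit log-hydraulic-conductivity from three
# assumed soil predictors by least squares and score with R^2.
import numpy as np

rng = np.random.default_rng(1)
n = 200
# Stand-in predictors: void ratio, fines content (%), liquid limit (%).
X = rng.uniform([0.3, 0.0, 20.0], [1.2, 80.0, 60.0], size=(n, 3))
true_coef = np.array([2.0, -0.05, -0.01])
y = X @ true_coef + 1.5 + rng.normal(0, 0.1, n)   # synthetic log10(k)

A = np.hstack([X, np.ones((n, 1))])               # add intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)      # ordinary least squares

y_hat = A @ coef
ss_res = np.sum((y - y_hat) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot                        # coefficient of determination
```

An ANN would replace the single linear map with a trained nonlinear model, which is where the study reports its accuracy gain.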


Electronics ◽  
2021 ◽  
Vol 10 (14) ◽  
pp. 1715
Author(s):  
Michele Alessandrini ◽  
Giorgio Biagetti ◽  
Paolo Crippa ◽  
Laura Falaschetti ◽  
Claudio Turchetti

Photoplethysmography (PPG) is a common and practical technique to detect human activity and other physiological parameters and is commonly implemented in wearable devices. However, the PPG signal is often severely corrupted by motion artifacts. The aim of this paper is to address the human activity recognition (HAR) task directly on the device, implementing a recurrent neural network (RNN) in a low-cost, low-power microcontroller while ensuring the required performance in terms of accuracy and low complexity. To reach this goal, (i) we first develop an RNN that integrates PPG and tri-axial accelerometer data, where the latter can be used to compensate for motion artifacts in PPG and thus accurately detect human activity; (ii) then, we port the RNN to an embedded device, Cloud-JAM L4, based on an STM32 microcontroller, optimizing it to maintain an accuracy of over 95% while requiring modest computational power and memory resources. The experimental results show that such a system can be effectively implemented on a resource-constrained system, allowing the design of a fully autonomous wearable embedded system for human activity recognition and logging.
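The fusion idea in step (i) can be sketched as a single recurrent layer reading a window of four channels (one PPG plus three accelerometer axes) and classifying from the final hidden state. This is a minimal Elman-style forward pass with random weights, not the authors' trained or quantized network:

```python
# Hypothetical sketch of PPG + accelerometer fusion with a minimal
# Elman-style RNN forward pass. Weights are random; a real deployment
# would train them and then quantize for the microcontroller.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden, n_classes, T = 4, 8, 5, 50   # assumed sizes

W_xh = rng.normal(0, 0.1, (n_hidden, n_in))
W_hh = rng.normal(0, 0.1, (n_hidden, n_hidden))
W_hy = rng.normal(0, 0.1, (n_classes, n_hidden))

def rnn_forward(x):                          # x: (T, n_in) sensor window
    h = np.zeros(n_hidden)
    for t in range(x.shape[0]):
        h = np.tanh(W_xh @ x[t] + W_hh @ h)  # recurrent state update
    logits = W_hy @ h                        # classify from final state
    e = np.exp(logits - logits.max())
    return e / e.sum()                       # softmax over activity classes

window = rng.normal(size=(T, n_in))          # stand-in PPG + accel window
probs = rnn_forward(window)
```

Keeping only one small recurrent layer is what makes the per-window cost and memory footprint plausible for an STM32-class device.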


Author(s):  
Muhammad Muaaz ◽  
Ali Chelli ◽  
Martin Wulf Gerdes ◽  
Matthias Pätzold

A human activity recognition (HAR) system acts as the backbone of many human-centric applications, such as active assisted living and in-home monitoring for elderly and physically impaired people. Although existing Wi-Fi-based human activity recognition methods report good results, their performance is affected by changes in the ambient environment. In this work, we present Wi-Sense, a human activity recognition system that uses a convolutional neural network (CNN) to recognize human activities based on environment-independent fingerprints extracted from the Wi-Fi channel state information (CSI). First, Wi-Sense captures the CSI using a standard Wi-Fi network interface card. Wi-Sense applies the CSI ratio method to reduce the noise and the impact of the phase offset. In addition, it applies principal component analysis to remove redundant information. This step not only reduces the data dimension but also removes the environmental impact. Thereafter, we compute the spectrogram of the processed data, which reveals environment-independent, time-variant micro-Doppler fingerprints of the performed activity. We use these spectrogram images to train a CNN. We evaluate our approach using a human activity data set collected from nine volunteers in an indoor environment. Our results show that Wi-Sense can recognize these activities with an overall accuracy of 97.78%. To demonstrate the applicability of the proposed Wi-Sense system, we provide an overview of the standards involved in health information systems and systematically describe how the Wi-Sense HAR system can be integrated into the eHealth infrastructure.
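The preprocessing chain described above (CSI ratio, PCA, spectrogram) can be sketched end to end on synthetic data. The snippet below is an assumption-laden illustration, not the paper's pipeline: it fabricates complex CSI for two receive antennas sharing a common phase offset, cancels that offset with the antenna ratio, reduces dimension with PCA via SVD, and computes a short-time Fourier magnitude spectrogram:

```python
# Hypothetical sketch: CSI ratio -> PCA -> spectrogram, on synthetic CSI.
import numpy as np

rng = np.random.default_rng(0)
T, S = 1024, 30                      # time samples, subcarriers (assumed)

# Synthetic CSI for two receive antennas sharing a random per-sample phase
# offset; dividing the two streams cancels the common offset.
phase_offset = np.exp(1j * rng.uniform(0, 2 * np.pi, (T, 1)))
csi_a = (rng.normal(size=(T, S)) + 1j * rng.normal(size=(T, S))) * phase_offset
csi_b = (rng.normal(size=(T, S)) + 1j * rng.normal(size=(T, S))) * phase_offset
csi_ratio = csi_a / csi_b            # common phase offset divides out

# PCA via SVD on the mean-centered real part; keep the first component.
X = csi_ratio.real - csi_ratio.real.mean(axis=0)
U, s, Vt = np.linalg.svd(X, full_matrices=False)
pc1 = X @ Vt[0]                      # strongest time-varying component

# Short-time Fourier transform -> magnitude spectrogram.
win, hop = 128, 64
frames = [pc1[i:i + win] * np.hanning(win)
          for i in range(0, len(pc1) - win + 1, hop)]
spectrogram = np.abs(np.fft.rfft(np.array(frames), axis=1)).T
```

In the system described above, such spectrogram images (computed from real captures, with micro-Doppler structure the synthetic noise here lacks) are what the CNN is trained on.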

