Sensitivity of Sensor Locations in a Wearable IMU-based Human Activity Recognition System

Author(s):  
Mehdi Ejtehadi ◽  
Amin M. Nasrabadi ◽  
Saeed Behzadipour

Background: The advent of inertial measurement unit (IMU) sensors has significantly extended the application domain of human activity recognition (HAR) systems to healthcare, tele-rehabilitation, and daily life monitoring. IMUs are body-worn sensors, and therefore their output signals and the resulting HAR performance naturally depend on their exact locations on the body segments. Objectives: This research introduces a methodology to investigate the effects of sensor misplacement on the performance of HAR systems. Methods: The properly placed sensors and their misplaced variants were modeled on a kinematic model of the human body. The model was actuated using motions measured from human subjects and was then used to run a sensitivity analysis. Results: The results indicated that transverse misplacement of the sensors on the left arm and right thigh, and rotation of the left thigh sensor, significantly decrease the activity recognition rate. It was also shown that longitudinal displacements of the sensors (along the body segments) have minor impacts on HAR performance. A Monte Carlo simulation indicated that if the sensitive sensors are mounted with extra care, the performance can be maintained above the 95% level. Conclusions: Accurate mounting of the IMUs on the body affects HAR performance; in particular, the transverse position and rotation of the IMUs are the more sensitive parameters. Users of such systems need to be informed about the more sensitive sensors and directions in order to maintain acceptable HAR performance.
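As a rough illustration of the Monte Carlo step described above, the sketch below perturbs the inputs of a trained HAR classifier with random misplacements drawn from a mounting tolerance. It is a minimal sketch, not the authors' pipeline: the perturbation model (`perturb_placement`), the tolerances `tol_mm` and `tol_deg`, and the flattened-feature classifier `clf` are all illustrative assumptions.

```python
# Minimal sketch (not the paper's code): Monte Carlo estimate of how random
# sensor misplacements within a mounting tolerance affect a trained classifier.
import numpy as np

rng = np.random.default_rng(0)

def perturb_placement(signal, transverse_mm, rotation_deg):
    """Hypothetical stand-in for the effect of shifting/rotating an IMU:
    a small gain change plus mixing with a shifted copy of the signal."""
    gain = 1.0 + 0.002 * transverse_mm
    phase = np.deg2rad(rotation_deg)
    return gain * (signal * np.cos(phase) + np.roll(signal, 1, axis=0) * np.sin(phase))

def monte_carlo_accuracy(clf, X, y, n_trials=500, tol_mm=10.0, tol_deg=5.0):
    """X: (n_windows, n_samples, n_channels) IMU windows; y: activity labels."""
    accs = []
    for _ in range(n_trials):
        d_mm = rng.uniform(-tol_mm, tol_mm)      # one random misplacement per trial
        d_deg = rng.uniform(-tol_deg, tol_deg)
        X_pert = np.stack([perturb_placement(x, d_mm, d_deg) for x in X])
        accs.append((clf.predict(X_pert.reshape(len(X), -1)) == y).mean())
    return float(np.mean(accs)), float(np.percentile(accs, 5))
```

The 5th-percentile accuracy gives a pessimistic view of performance under the assumed tolerance, analogous to asking whether careful mounting keeps recognition above the 95% level.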

Sensors ◽  
2021 ◽  
Vol 21 (3) ◽  
pp. 885 ◽  
Author(s):  
Zhongzheng Fu ◽  
Xinrun He ◽  
Enkai Wang ◽  
Jun Huo ◽  
Jian Huang ◽  
...  

Human activity recognition (HAR) based on wearable devices has attracted increasing attention from researchers as sensor technology has developed in recent years. However, achieving the high recognition accuracy required for personalized HAR while maintaining the model's generalization capability remains a major challenge in this field. This paper presents a compact wireless wearable sensor node that combines an air pressure sensor and an inertial measurement unit (IMU) to provide multi-modal information for HAR model training. To address personalized recognition of user activities, we propose a new transfer learning algorithm, a joint probability domain adaptation method with improved pseudo-labels (IPL-JPDA). This method adds the improved pseudo-label strategy to the JPDA algorithm to avoid cumulative errors caused by inaccurate initial pseudo-labels. To verify the equipment and method, we used the newly designed sensor node to collect data on seven daily activities from seven subjects. Nine different HAR models were trained with traditional machine learning and transfer learning methods. The experimental results show that the multi-modal data improve the accuracy of the HAR system. The proposed IPL-JPDA algorithm achieves the best performance among the five HAR models, with an average recognition accuracy of 93.2% across subjects.
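The core idea of refining pseudo-labels before adaptation can be illustrated with a much simpler self-training loop. This is a minimal sketch under stated assumptions, not the IPL-JPDA implementation: the logistic-regression classifier, confidence threshold, and iteration count are illustrative, and the joint probability distribution alignment of JPDA is not reproduced.

```python
# Minimal sketch (assumption, not IPL-JPDA): confidence-gated pseudo-labelling
# for adapting a source-trained HAR model to a new user's unlabelled data.
import numpy as np
from sklearn.linear_model import LogisticRegression

def pseudo_label_adapt(X_src, y_src, X_tgt, threshold=0.9, n_iter=5):
    clf = LogisticRegression(max_iter=1000).fit(X_src, y_src)
    for _ in range(n_iter):
        proba = clf.predict_proba(X_tgt)
        keep = proba.max(axis=1) >= threshold        # only trust confident pseudo-labels
        if not keep.any():
            break
        pseudo = clf.classes_[proba[keep].argmax(axis=1)]
        X_train = np.vstack([X_src, X_tgt[keep]])
        y_train = np.concatenate([y_src, pseudo])
        clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    return clf
```

Gating on confidence is one simple way to limit the cumulative error from inaccurate initial pseudo-labels that the abstract refers to.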


Sensors ◽  
2021 ◽  
Vol 21 (3) ◽  
pp. 692
Author(s):  
Jingcheng Chen ◽  
Yining Sun ◽  
Shaoming Sun

Human activity recognition (HAR) is essential in many health-related fields. A variety of technologies based on different sensors have been developed for HAR. Among them, fusion of heterogeneous wearable sensors has been developed because it is portable, non-invasive, and accurate for HAR. To be applied in real time with limited resources, the activity recognition system must be compact and reliable, a requirement that can be met through feature selection (FS). By eliminating irrelevant and redundant features, the system burden is reduced while good classification performance (CP) is preserved. This manuscript proposes a two-stage genetic-algorithm-based feature selection algorithm with a fixed activation number (GFSFAN), which is implemented on datasets containing a variety of time-, frequency-, and time-frequency-domain features extracted from raw time series collected for nine activities of daily living (ADLs). Six classifiers are used to evaluate the effects of feature subsets selected by different FS algorithms on HAR performance. The results indicate that GFSFAN can achieve good CP with a small feature-subset size. A sensor-to-segment coordinate calibration algorithm and a lower-limb joint angle estimation algorithm are also introduced. Experiments on the effects of the calibration and of introducing joint angles show that both can improve the CP.
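A toy version of genetic feature selection with a fixed number of active features is sketched below. It is an assumption-laden illustration, not the two-stage GFSFAN algorithm: the population size, swap mutation, KNN scorer, and fixed count `k` are all illustrative choices.

```python
# Minimal sketch (assumption, not GFSFAN): a genetic search over binary feature
# masks with exactly k active features, scored by cross-validated accuracy.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

def random_mask(n_features, k):
    mask = np.zeros(n_features, dtype=bool)
    mask[rng.choice(n_features, size=k, replace=False)] = True
    return mask

def fitness(mask, X, y):
    return cross_val_score(KNeighborsClassifier(n_neighbors=5), X[:, mask], y, cv=3).mean()

def ga_select(X, y, k=10, pop_size=20, generations=30):
    pop = [random_mask(X.shape[1], k) for _ in range(pop_size)]
    for _ in range(generations):
        scores = np.array([fitness(m, X, y) for m in pop])
        parents = [pop[i] for i in scores.argsort()[-pop_size // 2:]]   # keep best half
        children = []
        for p in parents:
            child = p.copy()
            on, off = np.flatnonzero(child), np.flatnonzero(~child)
            child[rng.choice(on)] = False       # swap mutation keeps exactly k features on
            child[rng.choice(off)] = True
            children.append(child)
        pop = parents + children
    scores = np.array([fitness(m, X, y) for m in pop])
    return pop[int(scores.argmax())]
```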


Author(s):  
Muhammad Muaaz ◽  
Ali Chelli ◽  
Martin Wulf Gerdes ◽  
Matthias Pätzold

A human activity recognition (HAR) system acts as the backbone of many human-centric applications, such as active assisted living and in-home monitoring for elderly and physically impaired people. Although existing Wi-Fi-based human activity recognition methods report good results, their performance is affected by changes in the ambient environment. In this work, we present Wi-Sense, a human activity recognition system that uses a convolutional neural network (CNN) to recognize human activities based on environment-independent fingerprints extracted from the Wi-Fi channel state information (CSI). First, Wi-Sense captures the CSI using a standard Wi-Fi network interface card. It then applies the CSI ratio method to reduce the noise and the impact of the phase offset, and applies principal component analysis to remove redundant information. This step not only reduces the data dimension but also removes the environmental impact. Thereafter, we compute the spectrogram of the processed data, which reveals environment-independent, time-variant micro-Doppler fingerprints of the performed activity. We use these spectrogram images to train a CNN. We evaluate our approach on a human activity data set collected from nine volunteers in an indoor environment. Our results show that Wi-Sense can recognize these activities with an overall accuracy of 97.78%. To highlight the applicability of the proposed system, we provide an overview of the standards involved in health information systems and systematically describe how the Wi-Sense HAR system can be integrated into an eHealth infrastructure.
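The signal-processing chain described above (dimensionality reduction followed by a time-frequency representation) can be sketched compactly. This is a minimal sketch under assumptions, not the Wi-Sense code: it works on CSI amplitudes only, omits the CSI ratio step, and the sampling rate and STFT parameters are illustrative.

```python
# Minimal sketch (assumption, not Wi-Sense): PCA over CSI subcarriers followed by
# a spectrogram of the first principal component, used as a CNN input image.
import numpy as np
from scipy.signal import spectrogram
from sklearn.decomposition import PCA

def csi_to_spectrogram(csi_amplitude, fs=1000, nperseg=256, noverlap=224):
    """csi_amplitude: array of shape (n_time_samples, n_subcarriers)."""
    principal = PCA(n_components=1).fit_transform(
        csi_amplitude - csi_amplitude.mean(axis=0)).ravel()
    f, t, Sxx = spectrogram(principal, fs=fs, nperseg=nperseg, noverlap=noverlap)
    return f, t, 10 * np.log10(Sxx + 1e-12)   # log-power image fed to the CNN
```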


Author(s):  
Anirban Mukherjee ◽  
Amitrajit Bose ◽  
Debdeep Paul Chaudhuri ◽  
Akash Kumar ◽  
Aiswarya Chatterjee ◽  
...  

Author(s):  
Chaudhari Shraddha

Human activity recognition is an active research challenge with applications in numerous fields, such as medical health care, the military, manufacturing, assistive technologies, and gaming. Due to advancements in technology, the use of smartphones in daily life has become ubiquitous. The sensors in smartphones allow us to measure essential parameters, and these measurements enable us to monitor human activities, which is what we call human activity recognition. We applied machine learning techniques, namely the K-Nearest Neighbors (KNN) and Random Forest classification algorithms, to a publicly available dataset. In this paper, we design and implement an automatic human activity recognition system that independently recognizes human actions. The system is able to recognize activities such as Laying, Sitting, Standing, Walking, Walking Downstairs, and Walking Upstairs. The results show that the KNN and Random Forest algorithms achieve overall accuracies of 90.22% and 92.70%, respectively, in detecting these activities.
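The classification setup described above maps directly onto standard scikit-learn estimators. The sketch below is an assumption about the details: the dataset paths refer to the public UCI "Human Activity Recognition Using Smartphones" files (whose class labels match those listed above), and the hyperparameters (9 neighbours, 200 trees) are illustrative rather than those used by the authors.

```python
# Minimal sketch (assumption): KNN and Random Forest on pre-extracted HAR features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.neighbors import KNeighborsClassifier

X_train = np.loadtxt("UCI HAR Dataset/train/X_train.txt")
y_train = np.loadtxt("UCI HAR Dataset/train/y_train.txt")
X_test = np.loadtxt("UCI HAR Dataset/test/X_test.txt")
y_test = np.loadtxt("UCI HAR Dataset/test/y_test.txt")

for name, clf in [("KNN", KNeighborsClassifier(n_neighbors=9)),
                  ("Random Forest", RandomForestClassifier(n_estimators=200, random_state=0))]:
    clf.fit(X_train, y_train)
    print(name, accuracy_score(y_test, clf.predict(X_test)))
```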


2018 ◽  
Vol 232 ◽  
pp. 04024
Author(s):  
Yuchen Wang ◽  
Mantao Wang ◽  
Zhouyu Tan ◽  
Jie Zhang ◽  
Zhiyong Li ◽  
...  

With the growth of building monitoring networks, increasing human resources and funds have been invested in building monitoring systems. Computer vision technology has recently been widely used in image recognition and is gradually being applied to action recognition as well, yet traditional monitoring systems still have many disadvantages. In this paper, a human activity recognition system based on a convolutional neural network is proposed. Using a 3D convolutional neural network and transfer learning, the human activity recognition engine is constructed. The Spring MVC framework is used to build the server side, and the system pages are designed in HBuilder. The system not only enhances the efficiency and functionality of building monitoring but also improves building safety.
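As a rough illustration of such a recognition engine, the sketch below defines a small 3D convolutional network over short video clips. It is a minimal sketch under assumptions, not the authors' model or their transfer-learning setup: the PyTorch framework, clip length, channel sizes, and class count are all illustrative.

```python
# Minimal sketch (assumption, not the paper's engine): a small 3D CNN for
# clip-level action recognition in monitoring footage.
import torch
import torch.nn as nn

class Small3DCNN(nn.Module):
    def __init__(self, num_classes=6):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool3d(2),                      # pool over time and space
            nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),              # global spatio-temporal pooling
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, clips):                     # clips: (batch, 3, frames, height, width)
        return self.classifier(self.features(clips).flatten(1))

# Example forward pass on a dummy batch of two 16-frame RGB clips
logits = Small3DCNN()(torch.randn(2, 3, 16, 112, 112))
```

In a transfer-learning setting, the convolutional backbone would typically be initialized from a model pre-trained on a large action-recognition dataset and only the classifier head retrained.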

