Sensor Fusion-Based Activity Recognition for Parkinson Patients

Author(s): Majid Bahrepour, Nirvana Meratnia, Zahra Taghikhaki, Paul J. M. Havinga
Sensors, 2012, Vol. 12 (6), pp. 8039-8054
Author(s): Oresti Banos, Miguel Damas, Hector Pomares, Ignacio Rojas
2018, Vol. 11 (8), pp. 3073-3087
Author(s): Mohd Halim Mohd Noor, Zoran Salcic, Kevin I-Kai Wang
2014, Vol. 42 (1), pp. 5-26
Author(s): Oresti Banos, Miguel Damas, Alberto Guillen, Luis-Javier Herrera, Hector Pomares, ...

Sensors, 2019, Vol. 19 (7), pp. 1716
Author(s): Seungeun Chung, Jiyoun Lim, Kyoung Ju Noh, Gague Kim, Hyuntae Jeong

In this paper, we present a systematic study of on-body sensor positioning and data-acquisition details for Human Activity Recognition (HAR) systems. We build a testbed consisting of eight body-worn Inertial Measurement Unit (IMU) sensors and an Android mobile device for activity data collection. We develop a Long Short-Term Memory (LSTM) network framework to train a deep learning model on human activity data acquired in both real-world and controlled environments. From the experimental results, we find that activity data with a sampling rate as low as 10 Hz from four sensors, at both wrists, the right ankle, and the waist, is sufficient for recognizing Activities of Daily Living (ADLs), including eating and driving. We adopt a two-level ensemble model to combine the class probabilities of multiple sensor modalities, and demonstrate that a classifier-level sensor fusion technique can improve classification performance. By analyzing the accuracy of each sensor on different types of activity, we derive custom weights for multimodal sensor fusion that reflect the characteristics of the individual activities.
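The classifier-level fusion with per-activity weights described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the sensor names, the activity set, the weight values, and the normalization step are all assumptions. Each sensor's classifier is taken to emit a class-probability vector, and each sensor carries a weight per activity (e.g., derived from its validation accuracy on that activity).

```python
# Hypothetical sketch of classifier-level (late) sensor fusion with
# per-activity weights. Names and the weighting rule are assumptions.

ACTIVITIES = ["eating", "driving", "walking"]

def fuse(prob_per_sensor, weights):
    """Combine per-sensor class-probability vectors into one decision.

    prob_per_sensor: {sensor: [P(activity) for each activity]}
    weights: {sensor: [weight for each activity]}, e.g. each sensor's
             per-activity validation accuracy, normalized per activity.
    Returns (index of the highest fused score, normalized fused scores).
    """
    n = len(ACTIVITIES)
    fused = [0.0] * n
    for sensor, probs in prob_per_sensor.items():
        w = weights[sensor]
        for a in range(n):
            fused[a] += w[a] * probs[a]  # weight each sensor's vote per activity
    total = sum(fused)
    fused = [f / total for f in fused]  # normalize; argmax is unchanged
    return max(range(n), key=fused.__getitem__), fused
```

As a usage example, suppose the wrist sensor is trusted for eating and the waist sensor for walking: with `fuse({"wrist": [0.6, 0.1, 0.3], "waist": [0.2, 0.3, 0.5]}, {"wrist": [0.8, 0.5, 0.3], "waist": [0.2, 0.5, 0.7]})`, an unweighted average would leave eating and walking tied at 0.4, while the per-activity weights let the wrist's confidence on eating dominate.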

