Rethinking modeling Alzheimer's disease progression from a multi-task learning perspective with deep recurrent neural network

2021, Vol 138, pp. 104935
Author(s): Wei Liang, Kai Zhang, Peng Cao, Xiaoli Liu, Jinzhu Yang, ...
Sensors, 2020, Vol 20 (24), pp. 7212
Author(s): Jungryul Seo, Teemu H. Laine, Gyuhwan Oh, Kyung-Ah Sohn

As the number of patients with Alzheimer’s disease (AD) increases, the effort needed to care for these patients increases as well. At the same time, advances in information and sensor technologies have reduced the cost of care, providing a potential pathway for developing healthcare services for AD patients. For instance, if a virtual reality (VR) system can provide emotion-adaptive content, the time that AD patients spend interacting with VR content is expected to be extended, allowing caregivers to focus on other tasks. As a first step towards this goal, in this study, we develop a classification model that detects AD patients’ emotions (e.g., happy, peaceful, or bored). We first collected electroencephalography (EEG) data from 30 Korean female AD patients who watched emotion-evoking videos at a medical rehabilitation center. We applied conventional machine learning algorithms, such as a multilayer perceptron (MLP) and a support vector machine (SVM), along with deep learning models based on recurrent neural network (RNN) architectures. The best performance was obtained with the MLP, which achieved an average accuracy of 70.97%; the RNN model’s accuracy reached only 48.18%. Our study results open a new stream of research in the field of EEG-based emotion detection for patients with neurological disorders.
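The abstract compares MLP, SVM, and RNN classifiers on EEG recordings. As a rough illustration only, the sketch below shows how MLP and SVM baselines of that kind could be cross-validated with scikit-learn; the synthetic feature matrix, segment count, feature dimensionality, and hyperparameters are assumptions for demonstration, not the authors' actual data, pipeline, or results.

# Illustrative baseline only: MLP and SVM classifiers on placeholder EEG
# feature vectors, standing in for the kind of models the abstract describes.
# The data here are random; accuracies will be near chance level.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Placeholder: 300 EEG segments x 64 features (e.g., band powers per channel),
# with 3 emotion labels (e.g., happy / peaceful / bored).
X = rng.normal(size=(300, 64))
y = rng.integers(0, 3, size=300)

models = {
    "MLP": make_pipeline(StandardScaler(),
                         MLPClassifier(hidden_layer_sizes=(64,),
                                       max_iter=500, random_state=0)),
    "SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0)),
}

for name, model in models.items():
    # 5-fold cross-validated accuracy for each baseline
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean accuracy = {scores.mean():.3f}")

Standardizing features before both classifiers is a common choice for EEG band-power inputs; with real segment-level data, subject-wise splitting would be preferable to plain k-fold to avoid leakage across segments from the same patient.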


2001, Vol 112 (8), pp. 1378-1387
Author(s): A.A. Petrosian, D.V. Prokhorov, W. Lajara-Nanson, R.B. Schiffer
