Fall Detection with Bioradar Using Wavelet Analysis and Deep Learning

Author(s):  
Lesya Anishchenko ◽  
Evgeniya Smirnova
2020 ◽  
Vol 20 (16) ◽  
pp. 9408-9416
Author(s):  
Xiaoye Qian ◽  
Huan Chen ◽  
Haotian Jiang ◽  
Justin Green ◽  
Haoyou Cheng ◽  
...  

Author(s):  
Sagar Chhetri ◽  
Abeer Alsadoon ◽  
Thair Al‐Dala'in ◽  
P. W. C. Prasad ◽  
Tarik A. Rashid ◽  
...  

Author(s):  
Henrique D.P. dos Santos ◽  
Amanda P. Silva ◽  
Maria Carolina O. Maciel ◽  
Haline Maria V. Burin ◽  
Janete S. Urbanetto ◽  
...  

With the emergence of new concepts such as smart hospitals, video surveillance cameras should be introduced in each room of the hospital for safety and security. These surveillance cameras can also be used to assist patients and hospital staff. In particular, a patient's fall can be detected in real time with the help of these cameras, and assistance can be provided accordingly. Researchers have already developed different models to detect a human fall using a camera. This paper proposes a vision-based deep learning model to detect a human fall. Alongside this model, two further models are proposed that use pre-trained YOLO FCNN and Faster R-CNN architectures to detect the fall. The paper concludes with a comparative study of these models to determine which method provides the most accurate results.
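The abstract does not specify how detector output is turned into a fall decision. A common post-processing heuristic in camera-based fall detection (a hypothetical illustration here, not necessarily the authors' method) is to flag a fall when the person's bounding box, as returned by a detector such as YOLO or Faster R-CNN, becomes wider than it is tall:

```python
def is_fall(box, ratio_threshold=1.0):
    """Flag a fall when a detected person's bounding box is wider than tall.

    box: (x1, y1, x2, y2) in pixel coordinates, as produced by an
    object detector. ratio_threshold is a tunable assumption.
    """
    x1, y1, x2, y2 = box
    width, height = x2 - x1, y2 - y1
    if height <= 0:
        raise ValueError("invalid box: non-positive height")
    return (width / height) > ratio_threshold

# Upright person: tall, narrow box -> not a fall
print(is_fall((100, 50, 180, 300)))   # False
# Person on the floor: wide, short box -> fall
print(is_fall((60, 250, 360, 330)))   # True
```

In practice such a rule is combined with temporal smoothing over several frames to avoid false alarms from bending or sitting.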


2020 ◽  
Vol 184 ◽  
pp. 105265
Author(s):  
Rubén Delgado-Escaño ◽  
Francisco M. Castro ◽  
Julián R. Cózar ◽  
Manuel J. Marín-Jiménez ◽  
Nicolás Guil ◽  
...  

Author(s):  
Neeraj Varshney

Elderly people living alone face a serious risk of falls while moving about the home, and such falls can sometimes be life-threatening. To address this, several fall monitoring systems based on sensor data have been proposed. However, these systems sometimes misclassify falls as activities of daily living, and routine activities as falls. To this end, this paper proposes a deep learning model that uses heart rate, blood pressure and blood sugar data to distinguish falls from daily activities such as walking, running and jogging. A publicly accessible dataset and a lightweight CNN model are used for accurate identification of fall accidents. The proposed model reports 98.21% precision.
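The abstract describes a CNN over vital-sign time series but gives no architecture details. As a minimal sketch of the underlying idea, a single hand-set 1D convolution filter (standing in for a learned convolutional layer; the signal values and threshold are hypothetical) can flag windows where heart rate spikes abruptly, as it might after a fall:

```python
import numpy as np

def conv1d(x, kernel):
    """Valid-mode 1D convolution (cross-correlation) over a signal window."""
    k = len(kernel)
    return np.array([np.dot(x[i:i + k], kernel) for i in range(len(x) - k + 1)])

def classify_window(hr, threshold=25.0):
    """Flag a window as a possible fall when the heart-rate signal
    shows a sharp jump (large high-pass filter response).

    hr: 1D list/array of heart-rate samples for one window.
    """
    # A simple difference kernel acts as an "edge detector",
    # playing the role a learned CNN filter would in the paper.
    response = conv1d(np.asarray(hr, dtype=float), np.array([-1.0, 1.0]))
    return bool(np.max(np.abs(response)) > threshold)

resting = [72, 73, 72, 74, 73, 72]      # steady heart rate
fall    = [74, 75, 76, 118, 120, 119]   # sudden spike after an impact
print(classify_window(resting))  # False
print(classify_window(fall))     # True
```

A trained model would learn many such filters jointly over all three vital signs rather than relying on one hand-tuned threshold.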


Author(s):  
Anahita Shojaei-Hashemi ◽  
Panos Nasiopoulos ◽  
James J. Little ◽  
Mahsa T. Pourazad

2021 ◽  
Vol 11 (24) ◽  
pp. 11938
Author(s):  
Denis Zherdev ◽  
Larisa Zherdeva ◽  
Sergey Agapov ◽  
Anton Sapozhnikov ◽  
Artem Nikonorov ◽  
...  

Human pose and behaviour estimation for different activities in virtual/augmented reality (VR/AR) could have numerous beneficial applications. Human fall monitoring is especially important for elderly people and for non-typical activities in VR/AR applications. There are many approaches to improving the fidelity of fall monitoring systems through novel sensors and deep learning architectures; however, there is still a lack of detailed and diverse datasets for training deep learning fall detectors on monocular images. Synthetic data generation based on digital human simulation was implemented and examined using the Unreal Engine. The proposed pipeline provides automatic "playback" of various scenarios for digital human behaviour simulation, and this paper demonstrates the result of a modular pipeline for synthetic data generation of digital human interaction with 3D environments. We used the generated synthetic data to train Mask R-CNN-based segmentation of the falling person's interaction area. It is shown that, by training the model with simulation data, it is possible to recognize a falling person with an accuracy of 97.6% and classify the type of the person's interaction impact. The proposed approach also covers a variety of scenarios that can benefit the deep learning training stage in other human action estimation tasks in a VR/AR environment.

