Activity Recognition Based on Streaming Sensor Data for Assisted Living in Smart Homes

Author(s):  
Beichen Chen ◽  
Zhong Fan ◽  
Fengming Cao
2017 ◽  
Vol 47 (3) ◽  
pp. 368-379 ◽  
Author(s):  
Joseph Rafferty ◽  
Chris D. Nugent ◽  
Jun Liu ◽  
Liming Chen

Sensors ◽  
2020 ◽  
Vol 20 (3) ◽  
pp. 879 ◽  
Author(s):  
Uwe Köckemann ◽  
Marjan Alirezaie ◽  
Jennifer Renoux ◽  
Nicolas Tsiftes ◽  
Mobyen Uddin Ahmed ◽  
...  

As research in smart homes and activity recognition increases, it is of ever-increasing importance to have benchmark systems and data upon which researchers can compare methods. While synthetic data can be useful for certain method developments, real data sets that are open and shared are equally important. This paper presents the E-care@home system, its installation in a real home setting, and a series of data sets that were collected using the E-care@home system. Our first contribution, the E-care@home system, is a collection of software modules for data collection, labeling, and various reasoning tasks such as activity recognition, person counting, and configuration planning. It supports a heterogeneous set of sensors that can be extended easily and connects collected sensor data to higher-level Artificial Intelligence (AI) reasoning modules. Our second contribution is a series of open data sets which can be used to recognize activities of daily living. In addition to these data sets, we describe the technical infrastructure that we have developed to collect the data and the physical environment. Each data set is annotated with ground-truth information, making it relevant for researchers interested in benchmarking different algorithms for activity recognition.


2009 ◽  
Vol 5 (3) ◽  
pp. 236-252 ◽  
Author(s):  
Xin Hong ◽  
Chris Nugent ◽  
Maurice Mulvenna ◽  
Sally McClean ◽  
Bryan Scotney ◽  
...  

2019 ◽  
Vol 16 (2) ◽  
pp. 678-690 ◽  
Author(s):  
Chao-Lin Wu ◽  
Ya-Hung Chen ◽  
Yi-Wei Chien ◽  
Ming-Je Tsai ◽  
Ting-Ying Li ◽  
...  

2018 ◽  
Vol 28 (10) ◽  
pp. 2933-2945 ◽  
Author(s):  
Fadi Al Machot ◽  
Ahmad Haj Mosa ◽  
Mouhannad Ali ◽  
Kyandoghere Kyamakya

2014 ◽  
Vol 53 (03) ◽  
pp. 149-151 ◽  
Author(s):  
L. Schöpe ◽  
P. Knaup

Summary
Introduction: This editorial is part of the Focus Theme of Methods of Information in Medicine on “Using Data from Ambient Assisted Living and Smart Homes in Electronic Health Records”.
Background: To increase efficiency in the health care of the future, data from innovative technology such as that used for ambient assisted living (AAL) or smart homes should be available for individual health decisions. Integrating and aggregating data from different medical devices and health records enables a comprehensive view of health data.
Objectives: The objective of this paper is to present examples of the state of the art in research on information management that leads to a sustainable use and long-term storage of health data provided by innovative assistive technologies in daily living.
Results: Current research deals with the perceived usefulness of sensor data, the participatory design of visual displays for presenting monitoring data, and communication architectures for integrating sensor data from home health care environments with health care providers, either via a regional health record bank or via a telemedical center.
Conclusions: Integrating data from AAL systems and smart homes with data from electronic patient or health records is still at an early stage. Several projects are in an advanced conceptual phase, some of them exploring feasibility with the help of prototypes. General comprehensive solutions are hardly available and should become a major issue of medical informatics research in the near future.


Sensors ◽  
2021 ◽  
Vol 21 (3) ◽  
pp. 768
Author(s):  
Caetano Mazzoni Ranieri ◽  
Scott MacLeod ◽  
Mauro Dragone ◽  
Patricia Amancio Vargas ◽  
Roseli Aparecida Francelin Romero 

Worldwide demographic projections point to a progressively older population. This fact has fostered research on Ambient Assisted Living, which includes developments on smart homes and social robots. To endow such environments with truly autonomous behaviours, algorithms must extract semantically meaningful information from whichever sensor data is available. Human activity recognition is one of the most active fields of research within this context. Proposed approaches vary according to the input modality and the environments considered. Unlike others, this paper addresses the problem of recognising heterogeneous activities of daily living in home environments by simultaneously considering data from videos, wearable IMUs and ambient sensors. For this, two contributions are presented. The first is the creation of the Heriot-Watt University/University of Sao Paulo (HWU-USP) activities dataset, which was recorded at the Robotic Assisted Living Testbed at Heriot-Watt University. This dataset differs from other multimodal datasets because it consists of daily living activities with either periodical patterns or long-term dependencies, captured in a very rich and heterogeneous sensing environment. In particular, this dataset combines data from a humanoid robot’s RGBD (RGB + depth) camera with inertial sensors from wearable devices and ambient sensors from a smart home. The second contribution is the proposal of a Deep Learning (DL) framework, which provides multimodal activity recognition based on videos, inertial sensors and ambient sensors from the smart home, on their own or fused with each other. The classification DL framework has also been validated on our dataset and on the University of Texas at Dallas Multimodal Human Activities Dataset (UTD-MHAD), a widely used benchmark for activity recognition based on videos and inertial sensors, providing a comparative analysis between the results on the two datasets considered. Results demonstrate that the introduction of data from ambient sensors markedly improved the accuracy results.
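The multimodal fusion described in this abstract can be illustrated with a minimal score-level (late) fusion sketch: each modality's classifier produces per-class scores, which are combined before the final decision. The modality and class values below are illustrative placeholders, not taken from the paper's framework.

```python
import numpy as np

def late_fusion(scores_per_modality, weights=None):
    """Combine per-class score vectors from several modalities.

    Averages the (softmax-like) scores, optionally weighted per modality,
    and returns the index of the winning class plus the fused scores.
    """
    scores = np.stack(scores_per_modality)  # shape: (n_modalities, n_classes)
    if weights is None:
        weights = np.full(len(scores), 1.0 / len(scores))
    fused = np.average(scores, axis=0, weights=weights)
    return int(np.argmax(fused)), fused

# Hypothetical per-class scores for 3 activity classes from each modality:
video   = np.array([0.6, 0.3, 0.1])  # e.g. RGBD video classifier
imu     = np.array([0.2, 0.5, 0.3])  # wearable inertial sensors
ambient = np.array([0.5, 0.4, 0.1])  # smart-home ambient sensors

label, fused = late_fusion([video, imu, ambient])
```

Adding the ambient-sensor scores shifts the fused decision relative to the IMU stream alone, which is the intuition behind the abstract's finding that ambient data improves accuracy.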


Sensors ◽  
2020 ◽  
Vol 20 (6) ◽  
pp. 1779 ◽  
Author(s):  
Hans W. Guesgen

Activity recognition plays a central role in many sensor-based applications, such as smart homes. Given a stream of sensor data, the goal is to determine the activities that triggered the sensor data. This article shows how spatial information can be used to improve the process of recognizing activities in smart homes. The sensors that are used in smart homes are in most cases installed in fixed locations, which means that when a particular sensor is triggered, we know approximately where the activity takes place. However, since different sensors may be involved in different occurrences of the same type of activity, the set of sensors associated with a particular activity is not precisely defined. In this article, we use rough sets rather than standard sets to model the sensors involved in an activity, which enables us to deal with this imprecision. Using publicly available data sets, we will demonstrate that rough sets can adequately capture useful information to assist with the activity recognition process. We will also show that rough sets lend themselves to creating Explainable Artificial Intelligence (XAI).
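The rough-set idea in this abstract can be sketched concretely: across several observed occurrences of one activity, the sensors triggered in every occurrence form the lower approximation, and those triggered in at least one form the upper approximation. The sensor and activity names below are hypothetical, chosen only to illustrate the construction.

```python
def rough_set(occurrences):
    """Return (lower, upper) approximations of an activity's sensor set.

    occurrences: list of sets, each the sensors triggered during one
    observed occurrence of the activity.
    """
    lower = set.intersection(*occurrences)  # sensors present in every occurrence
    upper = set.union(*occurrences)         # sensors present in at least one
    return lower, upper

# Three hypothetical occurrences of "preparing breakfast":
occ = [
    {"kitchen_motion", "fridge_door", "toaster"},
    {"kitchen_motion", "fridge_door", "kettle"},
    {"kitchen_motion", "toaster", "kettle"},
]
lower, upper = rough_set(occ)
# lower contains only the sensor common to all occurrences;
# upper contains every sensor seen in any occurrence.
```

The gap between the two approximations captures exactly the imprecision the abstract describes: sensors in the upper but not the lower approximation may or may not indicate the activity.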

