assistive robot
Recently Published Documents


TOTAL DOCUMENTS

389
(FIVE YEARS 142)

H-INDEX

23
(FIVE YEARS 5)

2021 ◽  
Vol 10 (4) ◽  
pp. 1-31
Author(s):  
Moojan Ghafurian ◽  
Jesse Hoey ◽  
Kerstin Dautenhahn

Intelligent assistive robots can enhance the quality of life of people with dementia and their caregivers. They can increase the independence of older adults, reduce tensions between a person with dementia and their caregiver, and increase social engagement. This article provides a review of assistive robots designed for and evaluated by persons with dementia. Assistive robots that only increase mobility, as well as brain–computer interfaces, were excluded. Google Scholar, IEEE Digital Library, PubMed, and ACM Digital Library were searched. A final set of 53 articles, covering research in 16 different countries, is reviewed. Assistive robots are categorized into five different applications and evaluated for their effectiveness, as well as for their social and emotional capabilities. Our findings show that robots used in the context of therapy or for increasing engagement received the most attention in the literature, whereas robots that assist by providing health guidance or helping with an activity of daily living received relatively limited attention. PARO was the most commonly used robot in dementia care studies. The effectiveness of each assistive robot and the outcome of the studies are discussed, and, in particular, the social/emotional capabilities of each assistive robot are summarized. Gaps in the research literature are identified, and we provide directions for future work.


Sensors ◽  
2021 ◽  
Vol 21 (24) ◽  
pp. 8414
Author(s):  
João Antonio Campos Panceri ◽  
Éberte Freitas ◽  
Josiany Carlos de Souza ◽  
Sheila da Luz Schreider ◽  
Eliete Caldeira ◽  
...  

This work introduces a new socially assistive robot termed MARIA T21 ("Mobile Autonomous Robot for Interaction with Autistics"; the acronym T21 stands for "Trisomy 21", which designates individuals with Down syndrome). The robot is used in psychomotor therapies for children with Down syndrome (helping to improve their proprioception, postural balance, and gait) as well as in psychosocial and cognitive therapies for children with autism spectrum disorder (ASD). As a novelty, the robot carries an embedded mini video projector able to project Serious Games onto the floor or onto tables, making already-established therapies more enjoyable for these children and thus creating a motivating and facilitating effect for both children and therapists. The Serious Games were developed in Python with the Pygame library, following theoretical bases of behavioral psychology for these children, and are integrated into the robot through the Robot Operating System (ROS). Encouraging results from the child–robot interaction are shown, based on outcomes obtained with the Goal Attainment Scale. The Serious Games themselves were considered suitable according to both the "Guidelines for Game Design of Serious Games for Children" and the "Evaluation of the Psychological Bases" used during the games' development. This pilot study thus seeks to demonstrate that a robot used as a therapeutic tool, together with the concept of Serious Games, is an innovative and promising way to help health professionals conduct therapies with children with autism spectrum disorder and Down syndrome. Due to health restrictions imposed by the COVID-19 pandemic, the sample was limited to eight children aged 4 to 9 years (one child with typical development and one with Trisomy 21, both female, and six children with ASD, one girl and five boys).
For the non-typically developing children, the inclusion criteria were a conclusive diagnosis and completion of at least one year of therapy. The protocol was carried out in an infant psychotherapy room with three video cameras, supervised by a group of researchers and a therapist. The experiments were separated into four stages. In the first stage, the robot was introduced and approached the child to establish eye contact and to assess proxemics and child–robot interaction. In the second stage, the robot projected Serious Games on the floor and emitted verbal commands, in order to evaluate the child's willingness to perform the proposed tasks. In the third stage, the games were played for a set time, with the robot sending messages of positive reinforcement to encourage the child to complete the game. Finally, in the fourth stage, the robot ended the games and said goodbye to the child, using messages aimed at building a closer relationship with the child.
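Outcomes were measured with the Goal Attainment Scale. As an illustration of how GAS outcomes are typically aggregated (the paper's own scoring details are not given here), the standard Kiresuk–Sherman T-score can be sketched as follows; equal goal weights and the inter-goal correlation ρ = 0.3 are conventional assumptions, not values from the study:

```python
import math

def gas_t_score(scores, weights=None, rho=0.3):
    """Kiresuk-Sherman Goal Attainment Scale T-score.

    scores: attainment level per goal, each in -2..+2 (0 = expected outcome).
    weights: relative importance of each goal (defaults to equal weights).
    rho: assumed inter-goal correlation (0.3 by convention).
    """
    if weights is None:
        weights = [1.0] * len(scores)
    num = 10.0 * sum(w * x for w, x in zip(weights, scores))
    den = math.sqrt((1.0 - rho) * sum(w * w for w in weights)
                    + rho * sum(weights) ** 2)
    return 50.0 + num / den

# All goals attained exactly at the expected level -> T = 50
print(gas_t_score([0, 0, 0]))  # 50.0
```

A score above 50 indicates better-than-expected attainment across goals, below 50 worse than expected.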


2021 ◽  
Author(s):  
N. Shandybina ◽  
S. Ananyev ◽  
А. Aliev ◽  
I. Shalmiev ◽  
S. Kozureva ◽  
...  

Millions of people around the world suffer from disorders caused by injuries and diseases of the brain and spinal cord. Combining brain-computer interfaces with neuromodulation technologies is a new approach that could revolutionize the treatment of these disorders. In this study, we tested the effectiveness of a technique in which a patient with a spinal cord injury first undergoes spinal cord stimulation and then takes part in a rehabilitation session that combines virtual reality with a P300-based brain-computer interface, which decodes the visuomotor transformation and drives an assistive robot that moves the patient's arm. All healthy participants in the study were able to combine these two techniques without any undesirable effects; studies on patients with spinal cord injury are ongoing. System integration of the two methods has already been performed, and in the future, upon completion of this work, the neural interface will be able to control the stimulation parameters. We propose such integrated systems as a new approach to neurorehabilitation. Key words: brain-computer interface, neuromodulation, spinal cord stimulation, spinal cord trauma, P300, visuomotor transformation.
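The abstract does not detail the decoder, but the core of a P300-style interface is epoch averaging: the EEG response following each candidate stimulus is averaged over repetitions, and the stimulus whose average shows the largest positive deflection around 300 ms post-onset is taken as the attended target. A single-channel sketch under those simplifying assumptions (sampling rate and window are illustrative):

```python
def select_p300_target(epochs_by_stimulus, fs=250, window=(0.25, 0.45)):
    """Pick the attended stimulus from averaged post-stimulus EEG epochs.

    epochs_by_stimulus: {stimulus_id: [epoch, ...]}, each epoch a list of
    single-channel samples time-locked to that stimulus's onset.
    fs: sampling rate in Hz; window: search window in seconds post-onset.
    """
    lo, hi = int(window[0] * fs), int(window[1] * fs)
    best_id, best_amp = None, float("-inf")
    for stim, epochs in epochs_by_stimulus.items():
        n = len(epochs)
        # Average the repetitions sample-by-sample to suppress noise.
        avg = [sum(e[i] for e in epochs) / n for i in range(len(epochs[0]))]
        # Peak positive deflection inside the P300 window.
        amp = max(avg[lo:hi])
        if amp > best_amp:
            best_id, best_amp = stim, amp
    return best_id
```

A real decoder would use multiple channels, band-pass filtering, and a trained classifier rather than a raw peak, but the averaging-then-compare structure is the same.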


Author(s):  
Md Samiul Haque Sunny ◽  
Md Ishrak Islam Zarif ◽  
Ivan Rulik ◽  
Javier Sanjuan ◽  
Mohammad Habibur Rahman ◽  
...  

Abstract Background Building a control architecture that balances assistive manipulation with the benefits of direct human control is a crucial challenge of human–robot collaboration. It promises to help people with disabilities control a wheelchair and a wheelchair-mounted robot arm more efficiently to accomplish activities of daily living. Methods Our research objective in this study is to design an eye-tracking assistive robot control system capable of providing targeted engagement and motivating individuals with a disability to use the developed method for self-assistance in activities of daily living. A graphical user interface is designed and integrated with the developed control architecture to achieve this goal. Results We evaluated the system in a user study: ten healthy participants performed five trials of three manipulation tasks using the graphical user interface and the developed control framework. The 100% task success rate demonstrates the potential of our system to let individuals with motor impairments control a wheelchair and a wheelchair-mounted assistive robotic manipulator. Conclusions We demonstrated the usability of this eye-gaze system for controlling a robotic arm mounted on a wheelchair during activities of daily living for people with disabilities. The system was well accepted, receiving high ratings in the evaluation with healthy participants.
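The paper's control architecture is not reproduced here, but a common pattern in gaze-driven GUIs of this kind is dwell-time selection: a command fires once the gaze rests inside a button's screen rectangle for a fixed time, so the user can trigger robot motions without a physical switch. A minimal sketch, with the button layout, command names, and dwell time as illustrative assumptions:

```python
class DwellSelector:
    """Dwell-time gaze selection over rectangular GUI buttons (sketch)."""

    def __init__(self, buttons, dwell_s=1.0):
        # buttons: {command_name: (x, y, w, h)} screen rectangles (assumed layout)
        self.buttons = buttons
        self.dwell_s = dwell_s
        self.current = None      # button currently under the gaze
        self.entered_at = None   # time the gaze entered that button

    def _hit(self, x, y):
        for name, (bx, by, bw, bh) in self.buttons.items():
            if bx <= x < bx + bw and by <= y < by + bh:
                return name
        return None

    def update(self, x, y, t):
        """Feed one gaze sample (x, y) at time t (seconds).

        Returns the selected command name once the dwell completes, else None.
        """
        name = self._hit(x, y)
        if name != self.current:
            # Gaze moved to a different button (or off all buttons): restart timer.
            self.current, self.entered_at = name, t
            return None
        if name is not None and t - self.entered_at >= self.dwell_s:
            self.entered_at = t  # re-arm so the command does not repeat instantly
            return name
        return None
```

Usage: feed gaze samples each frame; `update(10, 10, 0.0)` then `update(11, 10, 1.0)` on a button covering those points returns the button's command on the second call.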


Author(s):  
Mubashar Nawaz ◽  
◽  
Xianhua Li ◽  
Sohaib Latif ◽  
Sadaf Irshad ◽  
...  

More than 110 million people in the world live with some form of disability that makes eating difficult. Eating assistive robots could meet the needs of the elderly and of people with upper-limb disabilities or dysfunctions by helping them regain independence in eating. We are researching a robot that can assist people with disabilities in eating their meals. Our eating assistive robot detects the user's face and determines whether the mouth is open or closed, then iteratively brings a pre-prepared, replaceable spoon of food to the mouth for as long as food remains in the container. The method works as follows: a live camera feed is used to detect the human face, after which the Affectiva library computes how far the mouth is open. When this value exceeds a set threshold, the program starts the stepper motor, which brings the pre-filled spoon of food to the user's mouth.
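The trigger logic described above (a mouth-openness metric compared against a fixed threshold, which then starts the stepper motor) can be sketched as a small state machine. The threshold value, the 0–100 metric range, and the cooldown length are illustrative assumptions, not values from the paper; firing only on the rising edge prevents one long mouth opening from starting several spoon cycles:

```python
class FeedingTrigger:
    """Start a spoon-delivery cycle when the mouth opens past a threshold."""

    def __init__(self, threshold=50.0, cooldown_frames=90):
        self.threshold = threshold
        self.cooldown_frames = cooldown_frames  # frames one spoon cycle takes (assumed)
        self.cooldown = 0
        self.was_open = False

    def update(self, mouth_open_metric):
        """Feed one per-frame mouth-openness value (assumed 0-100 scale).

        Returns True exactly when a new spoon delivery should start.
        """
        if self.cooldown > 0:
            # Spoon is still moving: ignore the face metric until the cycle ends.
            self.cooldown -= 1
            return False
        is_open = mouth_open_metric >= self.threshold
        fire = is_open and not self.was_open  # rising edge only
        self.was_open = is_open
        if fire:
            self.cooldown = self.cooldown_frames
        return fire
```

In the real system the `True` result would command the stepper motor; here it is just a boolean so the logic can be tested without hardware.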


2021 ◽  
Vol 8 ◽  
Author(s):  
Joseline Raja Vora ◽  
Ameer Helmi ◽  
Christine Zhan ◽  
Eliora Olivares ◽  
Tina Vu ◽  
...  

Background: Play is critical for children's physical, cognitive, and social development. Technology-based toys like robots are especially of interest to children. This pilot study explores the affordances of a play area equipped with developmentally appropriate toys and a mobile socially assistive robot (SAR). The objective of this study is to assess the role of the SAR in the physical activity, play behavior, and toy-use behavior of children during free play. Methods: Six children (5 females, Mage = 3.6 ± 1.9 years) participated in the majority of our pilot study's seven 30-minute weekly play sessions (4 baseline and 3 intervention). During baseline sessions, the SAR was powered off. During intervention sessions, the SAR was teleoperated to move in the play area and offered rewards of lights, sounds, and bubbles to the children. Thirty-minute videos of the play sessions were annotated using a momentary time sampling observation system. Mean percentages of time spent in behaviors of interest in baseline and intervention sessions were calculated. Paired Wilcoxon signed-rank tests were conducted to assess differences between baseline and intervention sessions. Results: There was a significant increase in children's standing (∼15%; Z = −2.09; p = 0.037) and a tendency toward less time sitting (∼19%; Z = −1.89; p = 0.059) in the intervention phase as compared to the baseline phase. There was also a significant decrease (∼4.5%, Z = −2.70; p = 0.007) in peer interaction play and a tendency toward greater (∼4.5%, Z = −1.89; p = 0.059) interaction with adults in the intervention phase as compared to the baseline phase. There was a significant increase in children's interaction with the robot (∼11.5%, Z = −2.52; p = 0.012) in the intervention phase as compared to the baseline phase. Conclusion: These results may indicate that a mobile SAR provides affordances through rewards that elicit children's interaction with the SAR and more time standing during free play.
This pilot study lays a foundation for exploring the role of SARs in inclusive play environments for children with and without mobility disabilities in real-world settings like day-care centers and preschools.
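The annotation scheme used here is momentary time sampling: at the end of each observation interval the coder records the behavior observed at that instant, and the outcome measure is the percentage of intervals in which each behavior occurred. A minimal sketch of that aggregation step (interval length and behavior codes are illustrative, not the study's coding scheme):

```python
from collections import Counter

def percent_time(samples):
    """Percentage of observation intervals per behavior code.

    samples: one behavior code per interval, as recorded at each
    interval's endpoint under momentary time sampling.
    """
    counts = Counter(samples)
    n = len(samples)
    return {behavior: 100.0 * c / n for behavior, c in counts.items()}

# Hypothetical 6-interval session coded as sitting / standing / robot interaction
session = ["sit", "sit", "stand", "robot", "stand", "sit"]
print(percent_time(session))
```

Per-session percentages like these, one per child per phase, are what a paired Wilcoxon signed-rank test would then compare between baseline and intervention.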


2021 ◽  
Vol 2089 (1) ◽  
pp. 012056
Author(s):  
K.A. Sunitha ◽  
Ganti Sri Giri Sai Suraj ◽  
G Atchyut Sriram ◽  
N Savitha Sai

Abstract The proposed robot aims to serve as a personal assistant for visually impaired people: avoiding obstacles, identifying the person (known or unknown) with whom they are interacting, and navigating. A special feature of the robot is tracking the subject's location using GPS; its novel feature is identifying the people with whom the subject interacts. Real-time face detection and identification, a long-standing challenge, is achieved with image processing based on the Viola–Jones and SURF algorithms. An obstacle avoidance design with multiple sensors has been implemented in the system to guide the user along a safe path. The robot thus combines comfort and safety at minimal cost.
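The abstract does not specify the obstacle avoidance controller. A minimal reactive rule of the kind often built on ultrasonic range sensors (go forward while the front is clear, otherwise turn toward the side with more free space) might look like this; the three-sensor layout and the distance threshold are assumptions, not details from the paper:

```python
def avoid(front_cm, left_cm, right_cm, threshold_cm=40.0):
    """Return a motion command from three ultrasonic distance readings (cm)."""
    if front_cm >= threshold_cm:
        return "forward"
    # Front is blocked: turn toward whichever side reports more free space.
    if left_cm > right_cm:
        return "turn_left"
    return "turn_right"

print(avoid(100.0, 50.0, 50.0))  # forward
print(avoid(20.0, 80.0, 30.0))   # turn_left
```

A deployed controller would add hysteresis and sensor filtering, but this captures the threshold-based decision the abstract alludes to.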

