A Proposal of a Motion Measurement System to Support Visually Impaired People in Rehabilitation Using Low-Cost Inertial Sensors

Entropy ◽  
2021 ◽  
Vol 23 (7) ◽  
pp. 848
Author(s):  
Karla Miriam Reyes Leiva ◽  
Milagros Jaén-Vargas ◽  
Miguel Ángel Cuba ◽  
Sergio Sánchez Lara ◽  
José Javier Serrano Olmedo

The rehabilitation of a visually impaired person (VIP) is a systematic process in which the person is provided with tools to deal with the impairment and achieve personal autonomy and independence, such as training in the use of the long cane as a tool for orientation and mobility (O&M). This training must be delivered in person by specialists, which strains human, technological and structural resources in some regions, especially those in narrow economic circumstances. A system was developed to obtain information about the motion of the long cane and the leg using low-cost inertial sensors, providing an overview of quantitative parameters such as sweeping coverage and gait that are currently assessed visually during rehabilitation. The system was tested with 10 blindfolded volunteers under laboratory conditions following the constant-contact, two-point-touch, and three-point-touch travel techniques. The results indicate that the quantification system is reliable for measuring grip rotation, safety zone, sweeping amplitude and hand position using orientation angles, with an accuracy of around 97.62%. However, a new method or improved hardware must be developed to improve the gait measurements, since the step-length measurement presented a mean accuracy of only 94.62%. The system requires further development before it can serve as an aid in the rehabilitation process of the VIP; at present, it is a simple, low-cost technological aid with the potential to improve current O&M practice.
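The abstract reports sweep-related parameters derived from the cane's orientation angles. A minimal illustrative sketch of that idea, assuming a simple peak-to-peak definition of sweeping amplitude over one cane cycle (the function names and the 40° body-coverage threshold are assumptions, not the authors' method):

```python
# Hypothetical sketch: estimating long-cane sweeping amplitude from IMU yaw
# angles. Peak-to-peak amplitude over one sweep cycle is an illustrative
# simplification of the orientation-angle analysis described in the abstract.

def sweeping_amplitude(yaw_deg):
    """Peak-to-peak sweep amplitude (degrees) over one sweeping cycle."""
    return max(yaw_deg) - min(yaw_deg)

def within_safety_zone(yaw_deg, coverage_deg=40.0):
    """True if the sweep spans at least the assumed body-width threshold."""
    return sweeping_amplitude(yaw_deg) >= coverage_deg

# One simulated sweep cycle of the long cane, in degrees of yaw
cycle = [-25.0, -10.0, 0.0, 12.0, 24.0, 11.0, -3.0, -18.0]
print(sweeping_amplitude(cycle))   # 49.0
print(within_safety_zone(cycle))   # True
```

In a real system the yaw samples would come from the sensor-fusion output of the cane-mounted IMU rather than a hand-written list.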

2020 ◽  
Vol 24 (03) ◽  
pp. 515-520
Author(s):  
Vattumilli Komal Venugopal ◽  
Alampally Naveen ◽  
Rajkumar R ◽  
Govinda K ◽  
Jolly Masih

Sensors ◽  
2021 ◽  
Vol 21 (14) ◽  
pp. 4767
Author(s):  
Karla Miriam Reyes Leiva ◽  
Milagros Jaén-Vargas ◽  
Benito Codina ◽  
José Javier Serrano Olmedo

A diverse array of assistive technologies has been developed to help Visually Impaired People (VIP) face many basic daily autonomy challenges. Inertial measurement unit sensors, for their part, have been used for navigation, guidance, and localization, but especially for full-body motion tracking, because their low cost and miniaturization allow the estimation of kinematic parameters and biomechanical analysis in different fields of application. The aim of this work was to present a comprehensive review of assistive technologies for VIP that use inertial sensors as input, summarizing the technical characteristics of the inertial sensors, the methodologies applied, and their specific role in each developed system. The results show that there are only a few inertial-sensor-based systems; however, these sensors provide essential information when combined with optical sensors and radio signals for navigation and special application fields. The discussion includes new avenues of research, missing elements, and usability analysis, since a limitation evidenced in the selected articles is the lack of user-centered designs. Finally, regarding application fields, a gap exists in the literature on aids for rehabilitation and biomechanical analysis of VIP: most of the findings focus on navigation and obstacle detection, and this should be considered for future applications.


2018 ◽  
Vol 7 (3.12) ◽  
pp. 116
Author(s):  
N Vignesh ◽  
Meghachandra Srinivas Reddy.P ◽  
Nirmal Raja.G ◽  
Elamaram E ◽  
B Sudhakar

Eyes play an important role in our day-to-day lives and are perhaps the most valuable gift we have: the world is visible to us because we are blessed with eyesight. But some people lack this ability to visualize their surroundings and consequently face considerable trouble moving comfortably in public places. Hence, a wearable device should be designed for such visually impaired people. The smart shoe is a wearable system designed to provide directional information to visually impaired people. The system has great potential to provide smart and sensible navigation guidance, especially when integrated with visual processing units. During operation, the user wears the shoes; when the sensors detect an obstacle, the user is informed through the Android system they carry. The smart shoes, together with the application on the Android system, help the user move around independently.
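The operation described above, shoe sensors detecting an obstacle and the Android app being informed, suggests a simple event message from shoe to phone. A hedged sketch under the assumption of a JSON payload (the message format, field names, and 80 cm alert threshold are purely illustrative, not from the paper):

```python
# Illustrative sketch (not the paper's implementation): a shoe-mounted sensor
# reading encoded as a message the Android companion app could act on.

import json

def obstacle_event(sensor_id, distance_cm, alert_cm=80):
    """Encode one shoe-sensor reading; 'alert' flags obstacles within range."""
    return json.dumps({
        "sensor": sensor_id,
        "distance_cm": distance_cm,
        "alert": distance_cm < alert_cm,
    })

print(obstacle_event("front-left", 55))
```

In practice such a message would travel over Bluetooth to the phone, which would then notify the user by speech or vibration.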


2019 ◽  
Vol 16 (1) ◽  
pp. 13-32 ◽  
Author(s):  
Hana Porkertová

This article thematizes relations between visual impairment and urban space, drawing on the analytical perspective of actor-network theory (ANT). It traces the ways in which visually impaired people create specific connections with space and how they transform it. Urban space is configured for use by able-bodied persons, for whom movement within it is easy and seems disembodied. For those who defy the standardization of space, however, the materiality of movement is constantly present and visible, because the passages are difficult to make and are not ready in advance. These materialities, as well as the strategies that people use to make connections with urban space, differ according to the assemblages that visually impaired people create. A route is different with a cane, a human companion, a guide dog, or a combination of such assistance; the visually impaired person attends to different cues, follows specific lines, and different information becomes relevant and available. Each configuration makes it possible or impossible to do something; this shows disability as dynamic and demonstrates the collective nature of action, which is more visible and palpable in the case of a disabled person.


Author(s):  
Tejal Adep ◽  
Rutuja Nikam ◽  
Sayali Wanewe ◽  
Dr. Ketaki B. Naik

Blind people face problems in daily life; they often cannot even walk without an aid and many times must rely on others for help. Several technologies for the assistance of visually impaired people have been developed. Among the various technologies being utilized to assist the blind, computer-vision-based solutions are emerging as one of the most promising options due to their affordability and accessibility. This paper proposes a system for visually impaired people: a wearable visual aid that accepts speech commands from the user. Its functionality addresses the identification of objects and signboards, helping the visually impaired person manage day-to-day activities and navigate their surroundings. A Raspberry Pi is used to implement artificial vision in the Python language on the OpenCV platform.


2015 ◽  
Vol 5 (3) ◽  
pp. 801-804
Author(s):  
M. Abdul-Niby ◽  
M. Alameen ◽  
O. Irscheid ◽  
M. Baidoun ◽  
H. Mourtada

In this paper, we present a low-cost, hands-free detection and avoidance system designed to provide mobility assistance for visually impaired people. An ultrasonic sensor attached to the user's jacket detects obstacles in front; the information obtained is conveyed to the user through audio messages and vibration. The detection range is user-defined, and a text-to-speech module is employed for the voice signal. The proposed obstacle avoidance device is cost-effective, easy to use, and easily upgraded.
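The pipeline above, an ultrasonic echo compared against a user-defined range and announced by voice, can be sketched as follows. This is a minimal illustration, not the authors' code; the HC-SR04-style timing conversion and the 100 cm default range are assumptions:

```python
# Illustrative sketch: converting an ultrasonic echo time to distance and
# checking it against a user-defined detection range, as the abstract describes.

SPEED_OF_SOUND_CM_PER_US = 0.0343  # cm per microsecond at ~20 degrees C

def echo_to_distance_cm(echo_time_us):
    """Round-trip echo time (microseconds) -> one-way distance in cm."""
    return echo_time_us * SPEED_OF_SOUND_CM_PER_US / 2

def obstacle_alert(echo_time_us, user_range_cm=100.0):
    """Return an alert string when an obstacle is within range, else None."""
    d = echo_to_distance_cm(echo_time_us)
    if d <= user_range_cm:
        return f"Obstacle at {d:.0f} cm"  # would be spoken via the TTS module
    return None

print(obstacle_alert(2915))   # Obstacle at 50 cm
print(obstacle_alert(10000))  # None (beyond the user-defined range)
```

In the real device the alert string would be routed to the text-to-speech module and the vibration motor rather than printed.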


Entropy ◽  
2020 ◽  
Vol 22 (9) ◽  
pp. 941
Author(s):  
Rakesh Chandra Joshi ◽  
Saumya Yadav ◽  
Malay Kishore Dutta ◽  
Carlos M. Travieso-Gonzalez

Visually impaired people face numerous difficulties in their daily life, and technological interventions may assist them in meeting these challenges. This paper proposes an artificial-intelligence-based, fully automatic assistive technology that recognizes different objects and provides auditory feedback to the user in real time, giving the visually impaired person a better understanding of their surroundings. A deep-learning model is trained with multiple images of objects that are highly relevant to the visually impaired person; training images are augmented and manually annotated to bring more robustness to the trained model. In addition to computer-vision-based object recognition, a distance-measuring sensor is integrated to make the device more comprehensive by recognizing obstacles while navigating from one place to another. The auditory information conveyed to the user after scene segmentation and obstacle identification is optimized to deliver more information in less time and faster processing of video frames. The average accuracy of the proposed method is 95.19% for object detection and 99.69% for recognition. The time complexity is low, allowing the user to perceive the surrounding scene in real time.
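The abstract combines a vision-based recognizer with a distance sensor and then optimizes the auditory messages. A hedged sketch of that fusion step, assuming obstacle warnings take priority and only confident detections are spoken (the labels, the 0.5 confidence cut-off, and the 150 cm warning range are illustrative assumptions):

```python
# Illustrative sketch of the feedback-fusion step described in the abstract:
# merge object-detector output with a distance-sensor reading, most urgent first.

def prioritise_feedback(detections, obstacle_cm, warn_cm=150):
    """detections: list of (label, confidence); returns ordered spoken messages."""
    messages = []
    if obstacle_cm is not None and obstacle_cm < warn_cm:
        messages.append(f"obstacle ahead, {obstacle_cm} centimetres")
    for label, conf in sorted(detections, key=lambda d: -d[1]):
        if conf >= 0.5:  # only confident recognitions are spoken
            messages.append(label)
    return messages

out = prioritise_feedback([("chair", 0.91), ("door", 0.42)], obstacle_cm=120)
print(out)  # ['obstacle ahead, 120 centimetres', 'chair']
```

Keeping the message list short, as here, is one way to realize the abstract's goal of conveying more information in less time.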


2016 ◽  
Vol 2 (1) ◽  
pp. 727-730
Author(s):  
Nora Loepthien ◽  
Tanja Jehnichen ◽  
Josephine Hauser ◽  
Benjamin Schullcke ◽  
Knut Möller

The aim of the project is the development of an aid for blind or visually impaired people, considering economic aspects as well as easy adaptability to various daily situations. Distance sensors were attached to a walking frame (rollator) to detect the distance to obstacles. The information from the sensors is transmitted to the user via tactile feedback, realized with a number of vibration motors located at the upper belly area of the subject. To test the functionality of the aid, a testing track with obstacles was traversed by a number of volunteers. Over five passes of the track, the time needed to pass through, as well as the number of collisions, was recorded. The results showed a decline in the average time needed to pass through the testing track, which indicates a learning process in which the operator comes to interpret the signals given by the tactile feedback.
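The distance-to-vibration mapping described above can be sketched minimally: nearer obstacles drive the belly-mounted motors harder. The linear mapping, the 20-200 cm range, and the 8-bit PWM scale are illustrative assumptions, not the authors' design:

```python
# Minimal sketch of the tactile-feedback mapping: map an obstacle distance
# reported by a rollator-mounted sensor to a vibration-motor PWM duty cycle.

def vibration_duty(distance_cm, min_cm=20, max_cm=200):
    """Map obstacle distance to a PWM duty cycle in [0, 255]."""
    if distance_cm >= max_cm:
        return 0                      # out of range: motor off
    if distance_cm <= min_cm:
        return 255                    # very close: full vibration
    span = max_cm - min_cm
    return round(255 * (max_cm - distance_cm) / span)

print(vibration_duty(200))  # 0
print(vibration_duty(20))   # 255
```

With several motors, one such mapping per sensor direction would let the wearer feel both where an obstacle is and how close it is.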

