Prototype Development of a Low-Cost Vibro-Tactile Navigation Aid for the Visually Impaired

Author(s):  
Vanessa Petrausch ◽  
Thorsten Schwarz ◽  
Rainer Stiefelhagen
2020 ◽  
Vol 24 (03) ◽  
pp. 515-520
Author(s):  
Vattumilli Komal Venugopal ◽  
Alampally Naveen ◽  
Rajkumar R ◽  
Govinda K ◽  
Jolly Masih

Entropy ◽  
2021 ◽  
Vol 23 (7) ◽  
pp. 848
Author(s):  
Karla Miriam Reyes Leiva ◽  
Milagros Jaén-Vargas ◽  
Miguel Ángel Cuba ◽  
Sergio Sánchez Lara ◽  
José Javier Serrano Olmedo

The rehabilitation of a visually impaired person (VIP) is a systematic process in which the person is provided with tools for coping with the impairment and achieving personal autonomy and independence, such as training in the use of the long cane for orientation and mobility (O&M). This training must be delivered in person by specialists, which strains the limited human, technological and structural resources of some regions, especially economically disadvantaged ones. A system using low-cost inertial sensors was developed to capture the motion of the long cane and the leg, yielding quantitative parameters such as sweeping coverage and gait measures that are currently assessed visually during rehabilitation. The system was tested with 10 blindfolded volunteers under laboratory conditions using the constant-contact, two-point-touch, and three-point-touch travel techniques. The results indicate that the quantification system reliably measures grip rotation, safety zone, sweeping amplitude and hand position from orientation angles, with an accuracy of around 97.62%. However, a new method or improved hardware is needed to improve the gait measurements, since step-length estimation reached a mean accuracy of only 94.62%. The system requires further development before it can serve as an aid in the rehabilitation process of the VIP. At present, it is a simple, low-cost technological aid with the potential to improve current O&M practice.
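The sweeping-amplitude measurement described in this abstract can be sketched in a few lines: the amplitude of one cane-sweep cycle is simply the angular span covered by the IMU yaw angle. The function name and the simulated yaw series below are illustrative assumptions, not data or code from the paper:

```python
def sweep_amplitude(yaw_angles_deg):
    # Angular span covered by the cane tip during one sweep cycle,
    # derived from the IMU yaw (heading) angle in degrees. O&M
    # instructors check that the sweep is wide enough to cover the
    # walker's body width.
    return max(yaw_angles_deg) - min(yaw_angles_deg)

# One simulated sweep cycle: the cane swings from -35 to +35 degrees.
cycle = [-35, -20, 0, 18, 35, 20, -10, -35]
print(sweep_amplitude(cycle))  # 70
```

In practice the yaw series would be segmented into individual sweep cycles first; this sketch assumes one cycle has already been isolated.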


Author(s):  
Romain Nicot ◽  
Edwige Hurteloup ◽  
Sébastien Joachim ◽  
Charles Druelle ◽  
Jean-Marc Levaillant

2017 ◽  
Vol 139 (03) ◽  
pp. 36-41
Author(s):  
John Kosowatz

This article provides an overview of high-tech sensors, visual detection software, and mobile computing applications being developed to help visually impaired people navigate. By adapting technology developed for robots, automobiles, and other products, researchers and developers are creating wearable devices that can aid the visually impaired through their daily routines, even identifying people and places. The Eyeronman system, developed by NYU's Visuomotor Integration Laboratory and Tactile Navigation Tools, combines a sensor-laden outer garment or belt with a vest studded with vibrating actuators: the sensors detect objects in the immediate environment and relay their locations via buzzes on the wearer's torso. At OrCam, a computer vision company in Jerusalem, a team of programmers, computer engineers, and hardware designers has developed the MyEye device, which attaches to the temple of a pair of eyeglasses. The device instructs the user on how to store items in memory, including things such as credit cards and the faces of friends and family.


2015 ◽  
Vol 5 (3) ◽  
pp. 801-804
Author(s):  
M. Abdul-Niby ◽  
M. Alameen ◽  
O. Irscheid ◽  
M. Baidoun ◽  
H. Mourtada

In this paper, we present a low-cost, hands-free obstacle detection and avoidance system designed to provide mobility assistance for visually impaired people. An ultrasonic sensor attached to the user's jacket detects obstacles ahead. The information obtained is conveyed to the user through audio messages and vibration. The detection range is user-defined. A text-to-speech module generates the voice signal. The proposed obstacle avoidance device is cost-effective, easy to use and easily upgraded.
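The core decision logic of such a device can be sketched as follows. An ultrasonic sensor reports the round-trip time of an echo, which is converted to distance and compared against the user-defined range. The firmware itself is not shown in the abstract; the function names, the returned message text, and the alert format are illustrative assumptions:

```python
SPEED_OF_SOUND_M_S = 343.0  # speed of sound in air at roughly 20 C

def echo_to_distance_m(echo_time_s):
    # The echo travels to the obstacle and back, so halve the path.
    return echo_time_s * SPEED_OF_SOUND_M_S / 2

def obstacle_alert(echo_time_s, user_range_m):
    # Return the alert to issue, or None if the path is clear.
    # The real device would drive a vibration motor and pass the
    # message to a text-to-speech module; here both are represented
    # by a returned string.
    distance_m = echo_to_distance_m(echo_time_s)
    if distance_m <= user_range_m:
        return f"Obstacle ahead at {distance_m:.1f} meters"
    return None

# An echo arriving after ~5.8 ms corresponds to roughly one meter.
print(obstacle_alert(0.00583, user_range_m=1.5))
```

A longer echo time (e.g. 20 ms, about 3.4 m) falls outside a 1.5 m user range and produces no alert, keeping the feedback channel quiet when the path is clear.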


2020 ◽  
Vol 137 ◽  
pp. 27-36 ◽  
Author(s):  
Zuria Bauer ◽  
Alejandro Dominguez ◽  
Edmanuel Cruz ◽  
Francisco Gomez-Donoso ◽  
Sergio Orts-Escolano ◽  
...  

Sensors ◽  
2019 ◽  
Vol 19 (17) ◽  
pp. 3783 ◽  
Author(s):  
Petsiuk ◽  
Pearce

Nineteen million Americans have significant vision loss. Over 70% of them are not employed full-time, and more than a quarter live below the poverty line. Globally, there are 36 million blind people, but fewer than half use white canes or more costly commercial sensory substitutions. The resulting lack of independence hampers the quality of life of visually impaired people. To help alleviate these challenges, this study reports the development of a low-cost, open-source ultrasound-based navigational support system in the form of a wearable bracelet that allows people with lost vision to navigate, orient themselves in their surroundings and avoid obstacles while moving. The system can be largely produced by digitally distributed manufacturing using low-cost 3-D printing/milling. It conveys point-distance information through a natural active-sensing approach, modulating measurements into haptic feedback with various vibration patterns within a four-meter range. It requires no complex calibration or training, consists of a small number of readily available, inexpensive components, and can be used as an independent addition to traditional tools. Sighted blindfolded participants successfully demonstrated the device on nine primary everyday navigation and guidance tasks, including indoor and outdoor navigation and avoiding collisions with other pedestrians.
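The distance-to-vibration modulation mentioned in the abstract can be illustrated by mapping measured distance to a pulse interval within the four-meter range. The linear mapping and the 0.05 to 1.0 second interval bounds below are illustrative choices, not the paper's actual modulation scheme:

```python
MAX_RANGE_M = 4.0  # sensing range reported in the abstract

def pulse_interval_s(distance_m):
    # Map a measured distance to a vibration pulse interval: closer
    # obstacles pulse faster, giving an intuitive urgency cue. Beyond
    # the four-meter range the motor stays off (None). The linear
    # ramp between 0.05 s and 1.0 s is an assumed example, not the
    # device's published behavior.
    if distance_m > MAX_RANGE_M:
        return None  # out of range: no feedback
    fraction = distance_m / MAX_RANGE_M
    return 0.05 + fraction * (1.0 - 0.05)

print(pulse_interval_s(0.5))  # obstacle at arm's length: rapid buzzing
print(pulse_interval_s(5.0))  # None: nothing within range
```

Band-based schemes (a few discrete vibration patterns instead of a continuous ramp) are also common in haptic feedback design, since coarse patterns are easier to distinguish on the wrist.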

