Assistive Device for Locomotion of Visually Impaired and Physically Challenged People

2016 ◽  
Vol 852 ◽  
pp. 806-811 ◽  
Author(s):  
Sathish ◽  
Rajagopalan Nithya ◽  
N. Roshini ◽  
S. Nivethithaa

Dependence on others for mobility is a major issue for physically challenged and visually impaired people. To enable safe and independent movement, we have designed and developed a mobility aid to assist them in locomotion. The device is modelled in CATIA software. Our approach controls the navigation of the device in two modes. In the first mode, the unit is steered by voice commands from the user, namely right, left, forward, reverse and stop. In the second mode, the device provides reliable movement in a known environment by following a pre-defined layout fed into it. Both navigation modes are regulated by a control unit.
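The voice-command mode described above can be sketched as a simple dispatch table. This is an illustrative sketch only, not the authors' implementation; the word-to-motor mapping and the (left wheel, right wheel) convention are assumptions.

```python
# Hypothetical dispatch for the five voice commands; each maps to a
# (left wheel, right wheel) direction pair: 1 = forward, -1 = reverse,
# 0 = halt. Unrecognized words default to stop for safety.
COMMANDS = {
    "forward": (1, 1),
    "reverse": (-1, -1),
    "left":    (-1, 1),   # spin in place toward the left
    "right":   (1, -1),
    "stop":    (0, 0),
}

def dispatch(word):
    """Return the motor command for a recognized voice word, or stop."""
    return COMMANDS.get(word.strip().lower(), (0, 0))
```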

2015 ◽  
Vol 5 (3) ◽  
pp. 801-804
Author(s):  
M. Abdul-Niby ◽  
M. Alameen ◽  
O. Irscheid ◽  
M. Baidoun ◽  
H. Mourtada

In this paper, we present a low-cost, hands-free detection and avoidance system designed to provide mobility assistance for visually impaired people. An ultrasonic sensor attached to the user's jacket detects obstacles ahead. The information obtained is conveyed to the user through audio messages and vibration. The detection range is user-defined. A text-to-speech module generates the voice signal. The proposed obstacle avoidance device is cost-effective, easy to use and easily upgraded.
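The core of such an ultrasonic ranging scheme is converting the echo round-trip time to a distance and comparing it against the user-defined range. A minimal sketch, assuming a speed of sound of 343 m/s at room temperature; the actual device's thresholds and sensor interface are not specified in the abstract.

```python
# Illustrative sketch (not the authors' code): convert an ultrasonic
# echo round-trip time to distance and decide whether to alert the user.
SPEED_OF_SOUND = 343.0  # m/s at ~20 degrees C

def echo_to_distance(echo_time_s):
    """Distance in metres; the pulse travels out and back, hence / 2."""
    return echo_time_s * SPEED_OF_SOUND / 2

def should_alert(echo_time_s, user_range_m):
    """True when the obstacle is inside the user-defined detection range."""
    return echo_to_distance(echo_time_s) <= user_range_m
```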


Author(s):  
M. S. Heetha ◽  
M. Shenbagapriya ◽  
M. Bharanidharan

Visually impaired people face many challenges in society; in particular, students with visual impairments face unique challenges in the educational environment. They struggle to access information, so to remove this obstacle to reading and to allow visually impaired students to fully access and participate in the curriculum with the greatest possible level of independence, a Braille transliteration system is designed using VLSI. Braille input is given to an FPGA Virtex-4 kit via a Braille keyboard. The Braille code is converted into English by decoding logic written in VHDL/Verilog, and the corresponding letter is then converted into a speech signal by the algorithm. A speaker provides the voice output. This project helps visually impaired people become literate; it also gives the user confirmation of what is being typed each time a character is pressed, which prevents mistakes.
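The decoding step, implemented on the FPGA in VHDL/Verilog, amounts to a lookup from a 6-dot Braille cell to a Latin letter. A software analogue in Python (not the hardware description itself) can sketch the idea; only the first ten letters of standard English Braille are shown here.

```python
# A 6-dot Braille cell read as a bit pattern (bit 0 = dot 1) and
# looked up to obtain the Latin letter a TTS stage would then speak.
# Patterns follow standard English Braille for the letters a-j.
BRAILLE_TO_LATIN = {
    0b000001: "a", 0b000011: "b", 0b001001: "c", 0b011001: "d",
    0b010001: "e", 0b001011: "f", 0b011011: "g", 0b010011: "h",
    0b001010: "i", 0b011010: "j",
}

def decode_cell(dots):
    """Map a set of pressed dot numbers (1-6) to a letter, or '?'."""
    pattern = 0
    for d in dots:
        pattern |= 1 << (d - 1)
    return BRAILLE_TO_LATIN.get(pattern, "?")
```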


Electronics ◽  
2019 ◽  
Vol 8 (6) ◽  
pp. 697 ◽  
Author(s):  
Jinqiang Bai ◽  
Zhaoxiang Liu ◽  
Yimin Lin ◽  
Ye Li ◽  
Shiguo Lian ◽  
...  

Assistive devices for visually impaired people (VIP) which support daily traveling and improve social inclusion are developing fast. Most of them try to solve the problem of navigation or obstacle avoidance, while other works focus on helping VIP recognize their surrounding objects. However, very few couple both capabilities (i.e., navigation and recognition). Aiming at the above needs, this paper presents a wearable assistive device that allows VIP to (i) navigate safely and quickly in unfamiliar environments, and (ii) recognize objects in both indoor and outdoor environments. The device consists of a consumer Red, Green, Blue and Depth (RGB-D) camera and an Inertial Measurement Unit (IMU), which are mounted on a pair of eyeglasses, and a smartphone. The device leverages the continuity of ground height across adjacent image frames to segment the ground accurately and rapidly, and then searches for the moving direction on the segmented ground. A lightweight Convolutional Neural Network (CNN)-based object recognition system is developed and deployed on the smartphone to increase the perception ability of VIP and support the navigation system. It can provide the semantic information of surroundings, such as the categories, locations, and orientations of objects. Human–machine interaction is performed through an audio module (a beeping sound for obstacle alert, speech recognition for understanding user commands, and speech synthesis for expressing semantic information of surroundings). We evaluated the performance of the proposed system through many experiments conducted in both indoor and outdoor scenarios, demonstrating the efficiency and safety of the proposed assistive system.
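The ground-height-continuity idea can be illustrated in greatly simplified form: pixels whose estimated height stays close to the previous frame's ground-height estimate are labelled ground, and the estimate is updated for the next frame. The tolerance value and the height map are placeholders; the paper's actual segmentation is more involved.

```python
import numpy as np

def segment_ground(height_map, prev_ground_height, tol=0.05):
    """Toy ground segmentation by temporal height continuity.

    Pixels within `tol` metres of the previous frame's ground-height
    estimate are labelled ground; the estimate is refreshed from them.
    Returns (boolean ground mask, updated ground-height estimate).
    """
    mask = np.abs(height_map - prev_ground_height) < tol
    new_height = height_map[mask].mean() if mask.any() else prev_ground_height
    return mask, new_height
```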


Author(s):  
Puru Malhotra and Vinay Kumar Saini

The paper is aimed at the design of a mobility-assistive device to help the visually impaired. The traditional walking stick has its own drawbacks and limitations. Our research is motivated by the difficulty visually impaired people have in ambulating, and we attempt to restore their independence and spare them the trouble of carrying a stick around. We offer hands-free wearable glasses useful for real-time navigation. The design of the smart glasses integrates various sensors with a Raspberry Pi. The paper presents a detailed account of the various components and the structural design of the glasses. The novelty of our work lies in providing a complete pipeline for real-time analysis of the surroundings, and hence a better solution for navigating day-to-day activities, with audio instructions as output.
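The final stage of such a pipeline, turning one analysis result into a spoken instruction, can be sketched as below. The function name, the 2 m clearance threshold, and the steering rule are all hypothetical; the paper's actual pipeline is not specified at this level of detail.

```python
# Hypothetical last step of the glasses' real-time loop: map one
# (obstacle direction, distance) result to a short audio instruction,
# steering the user away from the obstacle's side.
def to_instruction(obstacle_direction, distance_m):
    """Turn one analysis result into a short spoken instruction."""
    if distance_m > 2.0:
        return "path clear"
    side = "left" if obstacle_direction == "left" else "right"
    return f"obstacle ahead, move {'right' if side == 'left' else 'left'}"
```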


Author(s):  
Syed Tehzeeb Alam ◽  
Sonal Shrivastava ◽  
Syed Tanzim Alam ◽  
R. Sasikala
