A Navigation and Augmented Reality System for Visually Impaired People

Sensors ◽  
2021 ◽  
Vol 21 (9) ◽  
pp. 3061
Author(s):  
Alice Lo Valvo ◽  
Daniele Croce ◽  
Domenico Garlisi ◽  
Fabrizio Giuliano ◽  
Laura Giarré ◽  
...  

In recent years, we have witnessed impressive advances in augmented reality systems and computer vision algorithms based on image processing and artificial intelligence. Thanks to these technologies, mainstream smartphones can estimate their own motion in 3D space with high accuracy. In this paper, we exploit such technologies to support the autonomous mobility of people with visual disabilities, identifying pre-defined virtual paths and providing context information, thus reducing the distance between the digital and real worlds. In particular, we present ARIANNA+, an extension of ARIANNA, a system explicitly designed for the indoor and outdoor localization and navigation of visually impaired people. While ARIANNA assumes that landmarks, such as QR codes, and physical paths (composed of colored tapes, painted lines, or tactile pavings) are deployed in the environment and recognized by the camera of a common smartphone, ARIANNA+ eliminates the need for any physical support thanks to the ARKit library, which we exploit to build a completely virtual path. Moreover, ARIANNA+ lets users interact more richly with the surrounding environment through convolutional neural networks (CNNs) trained to recognize objects or buildings, enabling access to content associated with them. By using a common smartphone as a mediation instrument with the environment, ARIANNA+ leverages augmented reality and machine learning to enhance physical accessibility. The proposed system allows visually impaired people to easily navigate indoor and outdoor scenarios simply by loading a previously recorded virtual path, and it provides automatic guidance along the route through haptic, speech, and sound feedback.
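The abstract does not give implementation details, but the core of following a "completely virtual path" from smartphone pose tracking can be sketched as plain geometry: project the user's current pose estimate onto the recorded polyline of waypoints and turn the deviation into a guidance cue. A minimal illustrative sketch, assuming the path is a list of 2D waypoints in the tracking frame and a left-handed turn-sign convention (all names hypothetical, not from the paper):

```python
import math

def nearest_segment_deviation(position, path):
    """Return (cross-track distance, heading of nearest path segment).

    position: (x, z) from the phone's pose estimate.
    path: list of (x, z) waypoints of the recorded virtual path.
    """
    best = (float("inf"), 0.0)
    px, pz = position
    for (ax, az), (bx, bz) in zip(path, path[1:]):
        dx, dz = bx - ax, bz - az
        seg_len2 = dx * dx + dz * dz or 1e-12
        # Project the user's position onto the segment, clamped to [0, 1].
        t = max(0.0, min(1.0, ((px - ax) * dx + (pz - az) * dz) / seg_len2))
        cx, cz = ax + t * dx, az + t * dz
        dist = math.hypot(px - cx, pz - cz)
        if dist < best[0]:
            best = (dist, math.atan2(dz, dx))
    return best

def feedback(position, heading, path, tolerance=0.3):
    """Map the deviation to the kind of haptic/speech cue a guidance
    app might issue (turn-sign convention is an assumption here)."""
    dist, seg_heading = nearest_segment_deviation(position, path)
    if dist > tolerance:
        return "off-path"
    turn = (seg_heading - heading + math.pi) % (2 * math.pi) - math.pi
    if turn > 0.3:
        return "turn-left"
    if turn < -0.3:
        return "turn-right"
    return "straight"
```

In a real app the `position`/`heading` inputs would come from the AR framework's per-frame camera transform, and the returned cue would drive vibration or speech synthesis.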

Electronics ◽  
2019 ◽  
Vol 8 (6) ◽  
pp. 697 ◽  
Author(s):  
Jinqiang Bai ◽  
Zhaoxiang Liu ◽  
Yimin Lin ◽  
Ye Li ◽  
Shiguo Lian ◽  
...  

Assistive devices for visually impaired people (VIP) which support daily traveling and improve social inclusion are developing fast. Most of them try to solve the problem of navigation or obstacle avoidance, while other works focus on helping VIP recognize their surrounding objects. However, very few of them couple both capabilities (i.e., navigation and recognition). Aiming at the above needs, this paper presents a wearable assistive device that allows VIP to (i) navigate safely and quickly in unfamiliar environments, and (ii) recognize objects in both indoor and outdoor environments. The device consists of a consumer Red, Green, Blue and Depth (RGB-D) camera and an Inertial Measurement Unit (IMU), which are mounted on a pair of eyeglasses, and a smartphone. The device leverages the ground height continuity among adjacent image frames to segment the ground accurately and rapidly, and then searches for a walkable direction over the segmented ground. A lightweight Convolutional Neural Network (CNN)-based object recognition system is developed and deployed on the smartphone to increase the perception ability of VIP and complement the navigation system. It provides semantic information about the surroundings, such as the categories, locations, and orientations of objects. Human–machine interaction is performed through an audio module (a beeping sound for obstacle alerts, speech recognition for understanding user commands, and speech synthesis for expressing semantic information about the surroundings). We evaluated the performance of the proposed system through many experiments conducted in both indoor and outdoor scenarios, demonstrating the efficiency and safety of the proposed assistive system.
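The ground-segmentation idea described above (exploit height continuity, then pick a moving direction over the free ground) can be illustrated with a toy grid version. This is a minimal sketch under my own simplifying assumptions, not the paper's algorithm: the depth image is reduced to a small per-cell height map, the bottom row is assumed to lie at the user's feet, and ground is grown upward while the height stays continuous.

```python
def segment_ground(height_map, tol=0.05):
    """Label cells as ground by growing upward from the bottom row,
    accepting a cell while its height stays continuous with the cell below."""
    rows, cols = len(height_map), len(height_map[0])
    ground = [[False] * cols for _ in range(rows)]
    for c in range(cols):
        ground[rows - 1][c] = True  # bottom row: assumed at the user's feet
        for r in range(rows - 2, -1, -1):
            if abs(height_map[r][c] - height_map[r + 1][c]) <= tol:
                ground[r][c] = True
            else:
                break  # discontinuity above this row: obstacle or drop-off
    return ground

def best_direction(ground):
    """Pick the column whose free-ground run from the bottom is longest,
    i.e. the direction with the most walkable space ahead."""
    rows, cols = len(ground), len(ground[0])
    def free_run(c):
        n = 0
        for r in range(rows - 1, -1, -1):
            if not ground[r][c]:
                break
            n += 1
        return n
    return max(range(cols), key=free_run)
```

The real system operates on full-resolution depth frames and uses continuity *across* adjacent frames as well, but the per-column growth above captures the basic decision: walk where the continuous ground extends furthest.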


2020 ◽  
Author(s):  
Zaiyan Khan ◽  
Rishikesh Varvade ◽  
Jinan Fiaidhi

Sight is regarded as the most important of the senses, and visually impaired people are often viewed with pity by others. Technology helps visually impaired people communicate with their surroundings: the communication process and the dissemination of information have become fast and broad enough to reach all parts of the world, greatly affecting human life by expanding the means of entertainment and comfort and reducing hardship in many respects. We have surveyed the existing solutions for the autonomous mobility of visually impaired people. In this paper, we propose a novel design, Smart Shoes, with sensors installed in them to guide a visually impaired person smoothly and to alert him/her to the obstacles lying ahead on the path. The design combines the easy-to-use processing power of an Arduino with the object detection capability of an ultrasonic sensor to accommodate these special needs and guide the wearer through the features of the Smart Shoes.
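The on-shoe logic an Arduino plus ultrasonic sensor enables boils down to two steps: convert the sensor's echo time into a distance, and map that distance to an alert. A minimal sketch of that decision logic, shown in Python for readability (the actual device would run equivalent C++ on the Arduino; the thresholds are illustrative, not from the paper). HC-SR04-style rangers report the round-trip echo time, so the conversion halves it and uses the speed of sound (~343 m/s):

```python
SPEED_OF_SOUND_CM_PER_US = 0.0343  # ~343 m/s at room temperature

def echo_to_cm(echo_us):
    """Ultrasonic ranging: the echo time covers the round trip,
    so halve it before converting microseconds to centimeters."""
    return echo_us * SPEED_OF_SOUND_CM_PER_US / 2

def alert_level(distance_cm, warn_cm=100, danger_cm=30):
    """Map distance to a buzzer cue: the closer the obstacle,
    the more urgent the alert."""
    if distance_cm <= danger_cm:
        return "fast-beep"
    if distance_cm <= warn_cm:
        return "slow-beep"
    return "silent"
```

On the microcontroller, the echo time would come from timing the sensor's echo pin, and the returned level would set the buzzer or vibration pattern.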




IEEE Access ◽  
2019 ◽  
Vol 7 ◽  
pp. 170406-170418 ◽  
Author(s):  
Daniele Croce ◽  
Laura Giarre ◽  
Federica Pascucci ◽  
Ilenia Tinnirello ◽  
Giovanni Ettore Galioto ◽  
...  
