Auditory augmented reality: Object sonification for the visually impaired

Author(s): Flavio Ribeiro, Dinei Florencio, Philip A. Chou, Zhengyou Zhang

Sensors, 2021, Vol. 21 (9), p. 3061
Author(s): Alice Lo Valvo, Daniele Croce, Domenico Garlisi, Fabrizio Giuliano, Laura Giarré, ...

In recent years, we have witnessed impressive advances in augmented reality systems and in computer vision algorithms based on image processing and artificial intelligence. Thanks to these technologies, mainstream smartphones can estimate their own motion in 3D space with high accuracy. In this paper, we exploit such technologies to support the autonomous mobility of people with visual disabilities, identifying pre-defined virtual paths and providing context information, thus reducing the distance between the digital and real worlds. In particular, we present ARIANNA+, an extension of ARIANNA, a system explicitly designed for the indoor and outdoor localization and navigation of visually impaired people. While ARIANNA assumes that landmarks, such as QR codes, and physical paths (composed of colored tapes, painted lines, or tactile pavings) are deployed in the environment and recognized by the camera of a common smartphone, ARIANNA+ eliminates the need for any physical support thanks to the ARKit library, which we exploit to build a completely virtual path. Moreover, ARIANNA+ lets users interact more richly with the surrounding environment, through convolutional neural networks (CNNs) trained to recognize objects or buildings, enabling access to the contents associated with them. By using a common smartphone as a mediation instrument with the environment, ARIANNA+ leverages augmented reality and machine learning to enhance physical accessibility. The proposed system allows visually impaired people to navigate easily in indoor and outdoor scenarios simply by loading a previously recorded virtual path; it provides automatic guidance along the route through haptic, speech, and sound feedback.
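The virtual-path guidance described above can be illustrated with a minimal sketch, assuming the AR tracker reports the user's position in the same planar coordinate frame as the pre-recorded waypoints; the function names and feedback cues below are illustrative placeholders, not ARIANNA+'s actual API:

```python
import math

def nearest_segment_deviation(path, pos):
    """Signed lateral deviation (m) of pos from the closest path segment.

    path: list of (x, y) waypoints in metres (the pre-recorded virtual path);
    pos:  current (x, y) position estimated by the AR tracker.
    Positive deviation means the user drifted to the left of the path.
    """
    best = None
    for (ax, ay), (bx, by) in zip(path, path[1:]):
        dx, dy = bx - ax, by - ay
        seg_len2 = dx * dx + dy * dy
        if seg_len2 == 0:  # skip duplicate waypoints
            continue
        # Project pos onto the segment, clamped to its endpoints.
        t = max(0.0, min(1.0, ((pos[0] - ax) * dx + (pos[1] - ay) * dy) / seg_len2))
        px, py = ax + t * dx, ay + t * dy
        dist = math.hypot(pos[0] - px, pos[1] - py)
        # Cross-product sign tells which side of the segment the user is on.
        side = math.copysign(1.0, dx * (pos[1] - ay) - dy * (pos[0] - ax))
        if best is None or dist < best[0]:
            best = (dist, side)
    return best[0] * best[1]

def feedback(deviation, threshold=0.5):
    """Map the deviation to a coarse cue (rendered as haptics/speech in the real system)."""
    if abs(deviation) <= threshold:
        return "on-path"
    return "steer right" if deviation > 0 else "steer left"
```

A guidance loop would call these functions on every tracking update, e.g. `feedback(nearest_segment_deviation(path, pos))`, and render the returned cue as vibration or speech.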


2021, Vol. 14 (2), pp. 125-151
Author(s): Siddharth Kalra, Sarika Jain, Amit Agarwal

This paper proposes an augmented reality interface for the visually impaired that enables haptic interaction with a computer system through a virtual workstation, providing a natural and intuitive way to accomplish a multitude of computer-based tasks (such as emailing, word processing, storing and retrieving files, making a phone call, and searching the web). The proposed system combines a haptic glove device, a gesture-based control system, and an augmented reality computer interface, creating an immersive interaction between the blind user and the computer. Gestures are recognized, and the user receives audio and vibratory haptic feedback. This user interface allows the user to actually "touch, feel, and physically interact" with the digital controls and virtual real estate of a computer system. A test of applicability was conducted, with promising positive results.
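The gesture-to-command loop such a system needs can be sketched as a simple dispatch table; the gesture labels, actions, and vibration patterns below are hypothetical placeholders, since the paper's implementation is not published:

```python
# Hypothetical gesture labels mapped to workstation actions. Each action
# returns the phrase that would be spoken back to the user; real device
# and OS APIs are deliberately not modeled here.
ACTIONS = {
    "swipe_right": lambda: "opening email",
    "swipe_left": lambda: "opening word processor",
    "pinch": lambda: "file saved",
    "double_tap": lambda: "placing phone call",
}

def handle_gesture(label):
    """Dispatch a recognized gesture; return (spoken text, vibration pattern in ms)."""
    action = ACTIONS.get(label)
    if action is None:
        # Unknown gesture: a short double buzz asks the user to retry.
        return ("gesture not recognized", [50, 50])
    # A single long buzz confirms that the command was accepted.
    return (action(), [200])
```

The two-channel return value mirrors the paper's design choice of always pairing audio with vibratory confirmation, so the user never depends on a single feedback modality.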


2012, Vol. 24 (2), pp. 163-178
Author(s): Brian F.G. Katz, Florian Dramas, Gaëtan Parseihian, Olivier Gutierrez, Slim Kammoun, ...

2012, Vol. 16 (4), pp. 253-269
Author(s): Brian F. G. Katz, Slim Kammoun, Gaëtan Parseihian, Olivier Gutierrez, Adrien Brilhault, ...

2021, Vol. 13 (17), p. 9973
Author(s): Edgar Herberto Medina-Sanchez, Miroslava Mikusova, Mauro Callejas-Cuervo

The increasing availability and reliability of open-source geographical resources and mobile application design options, together with high-quality smartphones featuring top cameras and numerous sensors, offer an extraordinary opportunity to provide visually impaired people with relevant and comprehensible information about their vicinity, and thus to improve their mobility in a sustainable environment. This paper presents an interactive tool based on a mobile application created for Android devices and on the use of augmented reality. It is a tool to support the safe and efficient mobility of blind people and people with severe visual limitations in a sustainable urban environment. Its essential benefit lies in helping users avoid risky, possibly dangerous, and hardly accessible places. The first part briefly presents the problem of visual impairment, including its forms, the personal and economic costs for society as a whole, and the importance of improving the mobility of this group of people. The second part introduces the current state of the problem as well as some basic tools developed to bring the surrounding environment closer to the visually impaired. The process of developing the mobile application is then described. The application provides information about the places where its visually impaired users find themselves while walking in an external environment (including the distance to their destination); pilot testing of the application by selected groups of visually impaired users is also presented.
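The geometric core of such a navigation aid, computing how far away and in which direction a point of interest lies from the current GPS fix, can be sketched with the standard haversine formula; the function names and the spoken phrasing are illustrative, not taken from the application described above:

```python
import math

def distance_bearing(lat1, lon1, lat2, lon2):
    """Great-circle distance (m) and initial bearing (deg) between two GPS fixes."""
    R = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    # Haversine formula for the central angle between the two fixes.
    a = math.sin(dlat / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
    dist = 2 * R * math.asin(math.sqrt(a))
    # Initial bearing, normalized to [0, 360) degrees clockwise from north.
    y = math.sin(dlon) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlon)
    bearing = (math.degrees(math.atan2(y, x)) + 360.0) % 360.0
    return dist, bearing

def announce(dist, bearing):
    """Turn a distance/bearing pair into a coarse spoken hint for text-to-speech."""
    dirs = ["north", "north-east", "east", "south-east",
            "south", "south-west", "west", "north-west"]
    sector = dirs[int((bearing + 22.5) % 360 // 45)]
    return f"destination {round(dist)} metres to the {sector}"
```

In an actual app the two fixes would come from the device's location service, and the returned phrase would be passed to the platform's text-to-speech engine rather than printed.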

