Smart Glasses Application System for Visually Impaired People Based on Deep Learning

Author(s):  
Jyun-You Lin ◽  
Chi-Lin Chiang ◽  
Meng-Jin Wu ◽  
Chih-Chiung Yao ◽  
Ming-Chiao Chen
Author(s):  
G. Touya ◽  
F. Brisebard ◽  
F. Quinton ◽  
A. Courtial

Abstract. Visually impaired people cannot use classical maps but can learn to use tactile relief maps. These tactile maps are crucial at school for learning geography and history just as other students do. They are produced manually by professional transcribers in a very long and costly process. A platform able to generate tactile maps from maps scanned from geography textbooks could be extremely useful to these transcribers, speeding up their production. As a first step towards such a platform, this paper proposes a method to infer the scale and the content of a map from its image. We used convolutional neural networks trained with a few hundred maps from French geography textbooks, and the results are promising both for inferring labels about the content of the map (e.g. "there are roads, cities and administrative boundaries") and for inferring the extent of the map (e.g. a map of France or of Europe).
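The abstract describes inferring several independent content labels from one map image, which is a multi-label image classification task. The following is a minimal sketch, not the authors' code, of how such a classifier could be set up in PyTorch with a pretrained backbone; the label set, hyperparameters, and helper functions are assumptions for illustration.

```python
# Sketch of a multi-label CNN classifier for map-content labels (assumed setup).
import torch
import torch.nn as nn
from torchvision import models, transforms

CONTENT_LABELS = ["roads", "cities", "admin_boundaries", "rivers", "relief"]  # assumed label set

# Pretrained backbone with a multi-label head: one logit per content label.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(CONTENT_LABELS))

criterion = nn.BCEWithLogitsLoss()   # independent binary decision per label
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

def train_step(images, targets):
    """images: (B, 3, 224, 224) tensor; targets: (B, num_labels) 0/1 tensor."""
    optimizer.zero_grad()
    logits = model(images)
    loss = criterion(logits, targets.float())
    loss.backward()
    optimizer.step()
    return loss.item()

def predict_labels(image_tensor, threshold=0.5):
    """Return the content labels whose sigmoid score exceeds the threshold."""
    model.eval()
    with torch.no_grad():
        probs = torch.sigmoid(model(image_tensor.unsqueeze(0)))[0]
    return [lbl for lbl, p in zip(CONTENT_LABELS, probs) if p > threshold]
```

Inferring the map extent (France vs. Europe, etc.) would be a second, single-label head trained with a standard cross-entropy loss on the same backbone.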


Author(s):  
Puru Malhotra and Vinay Kumar Saini

The paper is aimed at the design of a mobility assistive device to help the visually impaired. The traditional walking stick poses its own drawbacks and limitations. Our research is motivated by the difficulty visually impaired people face in moving about independently, and we attempt to restore their independence and remove the burden of carrying a stick around. We offer hands-free wearable glasses that find their utility in real-time navigation. The design of the smart glasses includes the integration of various sensors with a Raspberry Pi. The paper presents a detailed account of the various components and the structural design of the glasses. The novelty of our work lies in providing a complete pipeline for real-time analysis of the surroundings, with audio instructions as output, and hence a better solution for navigating during day-to-day activities.
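The pipeline described is essentially capture, analyse, and speak. Below is a minimal sketch, not the authors' implementation, of such a loop on a Raspberry Pi, assuming the camera is exposed as a standard video device; detect_objects() is a hypothetical placeholder for whatever vision model the glasses actually run.

```python
# Sketch of a capture -> analyse -> speak loop for wearable navigation aid (assumed setup).
import cv2
import pyttsx3

tts = pyttsx3.init()            # offline text-to-speech for audio instructions
camera = cv2.VideoCapture(0)    # Pi camera / USB camera exposed as device 0

def detect_objects(frame):
    """Hypothetical placeholder: return names of obstacles/objects in the frame."""
    return []                   # a real system would run an object detector here

def speak(message):
    tts.say(message)
    tts.runAndWait()

while True:
    ok, frame = camera.read()
    if not ok:
        break
    objects = detect_objects(frame)
    if objects:
        # Summarise the surroundings as a short spoken instruction.
        speak("Ahead: " + ", ".join(objects))
```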


IEEE Access ◽  
2020 ◽  
Vol 8 ◽  
pp. 63144-63161 ◽  
Author(s):  
Tuyen Danh Pham ◽  
Chanhum Park ◽  
Dat Tien Nguyen ◽  
Ganbayar Batchuluun ◽  
Kang Ryoung Park
