Relatively Lazy: Indoor-Outdoor Navigation Using Vision and GNSS

Author(s): Benjamin Congram, Timothy D. Barfoot
2010, Vol 29 (8), pp. 958-980
Author(s): Gabe Sibley, Christopher Mei, Ian Reid, Paul Newman

Author(s): Yoshinobu UZAWA, Shigemichi MATSUZAKI, Hiroaki MASUZAWA, Jun MIURA

Sensors, 2020, Vol 20 (8), pp. 2385
Author(s): George Dimas, Dimitris E. Diamantis, Panagiotis Kalozoumis, Dimitris K. Iakovidis

Every day, visually challenged people (VCP) face mobility restrictions and accessibility limitations. A short walk to a nearby destination, which other individuals take for granted, becomes a challenge. To tackle this problem, we propose a novel visual perception system for outdoor navigation that can evolve into an everyday visual aid for VCP. The proposed methodology is integrated into a wearable visual perception system (VPS). The approach combines deep learning object recognition models with an obstacle detection methodology based on human eye-fixation prediction using Generative Adversarial Networks. For robust obstacle detection, obstacle risk assessment and spatial localization are modeled in an uncertainty-aware manner, following a fuzzy logic approach. This combination translates the position and type of detected obstacles into descriptive linguistic expressions, allowing users to easily understand where obstacles lie in the environment and avoid them. The performance and capabilities of the proposed method are investigated in the context of safe navigation of VCP in outdoor environments of cultural interest, through obstacle recognition and detection. Additionally, the proposed system is compared with relevant state-of-the-art systems for the safe navigation of VCP, with a focus on design and user-requirement satisfaction.
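To illustrate the kind of fuzzy-logic mapping the abstract describes, here is a minimal sketch of how an obstacle's estimated distance and bearing could be turned into a descriptive linguistic expression. The triangular membership functions, term sets, and breakpoints below are illustrative assumptions, not the paper's actual parameters.

```python
def tri(x, a, b, c):
    """Triangular membership: rises from a to a peak at b, falls to c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Hypothetical linguistic term sets (parameters are assumptions).
DISTANCE_TERMS = {                     # distance in metres
    "very close": (-1.0, 0.0, 1.5),
    "close":      (0.5, 2.0, 3.5),
    "far":        (2.5, 5.0, 10.0),
}
BEARING_TERMS = {                      # bearing in degrees, 0 = straight ahead
    "to your left":  (-90.0, -45.0, 0.0),
    "ahead":         (-30.0, 0.0, 30.0),
    "to your right": (0.0, 45.0, 90.0),
}

def best_term(value, terms):
    """Pick the linguistic term with the highest membership degree."""
    return max(terms, key=lambda t: tri(value, *terms[t]))

def describe(obstacle_type, distance_m, bearing_deg):
    """Compose a descriptive linguistic expression for the user."""
    return (f"{obstacle_type} {best_term(distance_m, DISTANCE_TERMS)}, "
            f"{best_term(bearing_deg, BEARING_TERMS)}")

print(describe("bench", 1.2, 5.0))  # e.g. "bench close, ahead"
```

In a real uncertainty-aware system the crisp distance and bearing would themselves carry localization uncertainty, which the fuzzy memberships can absorb before defuzzifying to a single phrase.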

