Evaluation of an Accessible, Real-Time, and Infrastructure-Free Indoor Navigation System by Users Who Are Blind in the Mall of America

2019 ◽  
Vol 113 (2) ◽  
pp. 140-155 ◽  
Author(s):  
Nicholas A. Giudice ◽  
William E. Whalen ◽  
Timothy H. Riehle ◽  
Shane M. Anderson ◽  
Stacy A. Doore

Introduction: This article describes an evaluation of MagNav, a speech-based, infrastructure-free indoor navigation system. The research was conducted in the Mall of America, the largest shopping mall in the United States, to empirically investigate the impact of memory load on route-guidance performance. Method: Twelve participants who are blind and 12 age-matched sighted controls participated in the study. Route-guidance performance was compared between a system-aided condition, in which updated, real-time route instructions were provided, and a system-unaided (memory-based) condition, in which the same instructions were provided only in advance of route travel. The sighted controls (who navigated under typical visual perception but used the system for route guidance) represent a best-case benchmark against the blind participants who used the system. Results: Results across all three test measures provide compelling behavioral evidence that blind navigators receiving real-time verbal information from the MagNav system performed route travel faster (navigation time), more accurately (fewer errors in reaching the destination), and more confidently (fewer requests for bystander assistance) compared to conditions where the same route information was available to them only in advance of travel. In addition, no statistically reliable differences were observed for any measure in the system-aided conditions between the blind and sighted participants. Posttest survey results corroborate the empirical findings, further supporting the efficacy of the MagNav system. Discussion: This research provides compelling quantitative and qualitative evidence for the utility of an infrastructure-free, low-memory-demand navigation system for supporting route guidance through complex indoor environments, and it supports the theory that functionally equivalent navigation performance is possible when access to real-time environmental information is available, irrespective of visual status.
Implications for designers and practitioners: Findings highlight how important it is for developers of accessible navigation systems to employ interfaces that minimize memory demands.

Indoor navigation systems are gaining considerable importance. They are particularly useful for locating places inside large environments such as a university campus, airport, railway station, or museum, and many mobile applications using different techniques have been developed recently. The work proposed in this paper focuses on the needs of visually challenged people navigating indoor environments. The proposed approach implements the system using beacons, and the accompanying application gives the user audio guidance for navigation.
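The abstract does not detail the beacon logic, but the usual pattern is proximity-based: the strongest Bluetooth beacon signal determines the user's location, which maps to a spoken instruction. A minimal, hypothetical sketch of that idea follows; the beacon IDs, RSSI values, and messages are illustrative assumptions, not details from the paper.

```python
# Hypothetical beacon-to-audio-guidance mapping. The nearest beacon is
# taken to be the one with the strongest received signal (RSSI closest
# to 0 dBm); its ID is looked up to produce a spoken instruction.

BEACON_LOCATIONS = {
    "beacon-01": "Main entrance. Walk straight ahead for the elevators.",
    "beacon-02": "Library corridor. The reading room is on your right.",
    "beacon-03": "Cafeteria. The exit is behind you.",
}

def nearest_beacon(scans):
    """Pick the beacon ID with the strongest signal from (id, rssi) pairs."""
    return max(scans, key=lambda s: s[1])[0]

def audio_guidance(scans):
    """Return the instruction to speak for the user's estimated location."""
    beacon_id = nearest_beacon(scans)
    return BEACON_LOCATIONS.get(beacon_id, "Location unknown. Please continue.")

# Example: three beacons in range; beacon-02 is strongest at -58 dBm.
scans = [("beacon-01", -80), ("beacon-02", -58), ("beacon-03", -71)]
print(audio_guidance(scans))
```

In a real application the returned string would be passed to a text-to-speech engine rather than printed.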


2014 ◽  
Vol 68 (2) ◽  
pp. 253-273 ◽  
Author(s):  
Shifei Liu ◽  
Mohamed Maher Atia ◽  
Tashfeen B. Karamat ◽  
Aboelmagd Noureldin

Autonomous Unmanned Ground Vehicles (UGVs) require a reliable navigation system that works in all environments. However, indoor navigation remains a challenge because existing satellite-based navigation systems such as the Global Positioning System (GPS) are mostly unavailable indoors. In this paper, a tightly-coupled integrated navigation system that combines two-dimensional (2D) Light Detection and Ranging (LiDAR), an Inertial Navigation System (INS), and odometry is introduced. An efficient LiDAR-based line-feature detection/tracking algorithm is proposed to estimate the relative changes in orientation and displacement of the vehicle. Furthermore, an error model of the INS/odometry system is derived. LiDAR-estimated orientation/position changes are fused by an Extended Kalman Filter (EKF) with those predicted by INS/odometry using the developed error model. Errors estimated by the EKF are used to correct the position and orientation of the vehicle and to compensate for sensor errors. The proposed system is verified through simulation and a real experiment on a UGV equipped with LiDAR, a MEMS-based IMU, and an encoder. Both simulation and experimental results showed that sensor errors are accurately estimated and INS drift is significantly reduced, yielding sub-metre navigation accuracy.
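The core of the approach described above is an error-state filter: dead reckoning (INS/odometry) drifts, and an independent LiDAR-derived displacement estimate is used to observe and remove that drift. A deliberately simplified one-dimensional sketch of this idea follows; the noise values, drift magnitude, and ideal LiDAR measurement are illustrative assumptions, not the paper's actual models.

```python
# Minimal 1D error-state Kalman filter sketch: odometry predicts each
# displacement increment with a constant bias, LiDAR supplies an
# independent increment estimate, and the filter estimates the odometry
# error so it can be subtracted out (the "correction" step).

def predict(err, P, Q):
    """Error-state propagation: error estimate persists, uncertainty grows."""
    return err, P + Q

def update(err, P, innovation, R):
    """Fuse the odometry-minus-LiDAR displacement difference."""
    K = P / (P + R)                   # Kalman gain
    err = err + K * (innovation - err)
    P = (1.0 - K) * P
    return err, P

# Odometry over-reads by +0.05 m per step; LiDAR sees the true increment.
true_step, drift = 1.0, 0.05
err, P = 0.0, 1.0
position_odo = position_corrected = position_true = 0.0
for _ in range(50):
    position_true += true_step
    odo_meas = true_step + drift           # biased odometry increment
    lidar_meas = true_step                 # LiDAR-estimated increment (ideal here)
    position_odo += odo_meas
    err, P = predict(err, P, Q=0.01)
    err, P = update(err, P, innovation=odo_meas - lidar_meas, R=0.04)
    position_corrected += odo_meas - err   # apply estimated error as correction

print(round(position_odo - position_true, 2))         # uncorrected drift ≈ 2.5 m
print(abs(position_corrected - position_true) < 0.5)  # corrected drift stays small
```

The full system fuses 2D position and heading with a derived INS/odometry error model, but the structure (propagate the error state, correct it with LiDAR observations) is the same.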


2017 ◽  
Vol 2017 ◽  
pp. 1-11 ◽  
Author(s):  
Xiaoji Niu ◽  
Tong Yu ◽  
Jian Tang ◽  
Le Chang

Multisensor (LiDAR/IMU/camera) integrated Simultaneous Localization and Mapping (SLAM) technology is a promising solution for navigation and mobile mapping in GNSS-denied environments such as indoor areas, dense forests, or urban canyons. An online (real-time) version of such a system can greatly extend its applications, especially for indoor mobile mapping. However, real-time response across multiple sensors is a major challenge for an online SLAM system, owing to the different sampling frequencies and processing times of the different algorithms. This paper introduces an online Extended Kalman Filter (EKF) algorithm that integrates LiDAR scan matching and IMU mechanization for an Unmanned Ground Vehicle (UGV) indoor navigation system. Since LiDAR scan matching is considerably more time-consuming than IMU mechanization, the real-time synchronization issue is solved via a one-step error-state-transition method in the EKF. Stationary and dynamic field tests were performed using a UGV platform along a typical office-building corridor. Compared to the traditional sequential post-processed EKF algorithm, the proposed method significantly mitigates the time delay of navigation outputs while preserving positioning accuracy, making it usable as an online navigation solution for indoor mobile mapping.
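The synchronization problem described above arises because a LiDAR scan-match result refers to the time the scan was taken, yet arrives after the IMU has already advanced the state. One way to handle this, loosely in the spirit of the one-step error-state-transition idea, is to snapshot the state at scan time and propagate the delayed correction to the current epoch in a single step. The sketch below is an illustrative scalar toy, not the paper's algorithm; the class, its methods, and all numbers are assumptions.

```python
# Illustrative sketch of delayed-measurement fusion: the IMU mechanizes
# at a high rate while each LiDAR scan match arrives late. Rather than
# reprocessing past IMU data, the error observed at scan time is mapped
# to the current epoch in one step via the accumulated error-state
# transition (a scalar here, a matrix product in a real filter).

class DelayedFusion:
    def __init__(self):
        self.position = 0.0   # IMU-mechanized position (drifts)
        self.phi = 1.0        # accumulated error-state transition factor
        self.pending = []     # snapshots (position, phi) taken at scan time

    def imu_step(self, velocity, dt, decay=1.0):
        """High-rate IMU mechanization; also accumulate the transition."""
        self.position += velocity * dt
        self.phi *= decay     # scalar stand-in for the transition matrix

    def start_scan(self):
        """Snapshot the state at the instant a LiDAR scan is taken."""
        self.pending.append((self.position, self.phi))

    def finish_scan(self, lidar_position):
        """Delayed scan-match result: propagate the old error to now."""
        snap_pos, snap_phi = self.pending.pop(0)
        error_at_scan = snap_pos - lidar_position
        error_now = (self.phi / snap_phi) * error_at_scan  # one-step propagation
        self.position -= error_now

f = DelayedFusion()
for _ in range(10):
    f.imu_step(velocity=1.02, dt=0.1)   # biased IMU: true speed is 1.0 m/s
f.start_scan()                          # scan taken at t = 1.0 s (true pos 1.0 m)
for _ in range(5):
    f.imu_step(velocity=1.02, dt=0.1)   # IMU keeps running during scan matching
f.finish_scan(lidar_position=1.0)       # late result corrects the current state
print(round(f.position, 3))             # corrected ≈ 1.51 m (uncorrected: 1.53 m)
```

The key property is that the correction lands on the current state without replaying the IMU samples accumulated during the LiDAR processing delay.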


2020 ◽  
Vol 13 (1) ◽  
pp. 27
Author(s):  
Shaaban Ali Salman ◽  
Qais A. Khasawneh ◽  
Mohammad A. Jaradat ◽  
Mansour Y. Alramlawi
