3D Indoor Environment Modeling by a Mobile Robot with Omnidirectional Stereo and Laser Range Finder

Author(s):  
Suguru Ikeda ◽  
Jun Miura

2010 ◽  
Vol 22 (1) ◽  
pp. 28-35 ◽  
Author(s):  
Takashi Ogino ◽  
Masahiro Tomono ◽  
Toshinari Akimoto ◽  
Akihiro Matsumoto ◽  
...  

This paper deals with map building from laser range finder measurements in an unknown indoor environment and its application to human following by an omnidirectional mobile robot. After reviewing basic strategies for human following by a mobile robot based on simultaneous acquisition of an indoor map and the robot's location, we implemented "pseudo" odometry, rather than conventional odometry, for the omnidirectional mobile robot, and used this information to improve the accuracy of the scan-matching calculation. We then conducted experiments in which the robot followed a pedestrian. We confirmed that the robot could follow different pedestrian trajectories as long as the walking speed was slow, and that our approach effectively improved scan-matching accuracy.
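The odometry-seeded scan matching described in the abstract can be sketched roughly as follows. This is a minimal illustrative ICP-style 2D alignment, not the authors' implementation: the (pseudo-)odometry estimate supplies the initial guess that the paper reports improves matching accuracy. All function names and parameters here are our own assumptions.

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares 2D rigid transform (R, t) mapping src onto dst (Kabsch)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cd - R @ cs
    return R, t

def scan_match(prev_scan, curr_scan, odom_pose, iters=20):
    """Align curr_scan to prev_scan, seeded with a (pseudo-)odometry pose.

    odom_pose = (dx, dy, dtheta): estimated robot motion between scans.
    Scans are (N, 2) arrays of 2D points in the robot frame.
    """
    dx, dy, th = odom_pose
    R = np.array([[np.cos(th), -np.sin(th)],
                  [np.sin(th),  np.cos(th)]])
    pts = curr_scan @ R.T + np.array([dx, dy])   # odometry initial guess
    for _ in range(iters):
        # nearest-neighbour correspondences (brute force, for clarity)
        d2 = ((pts[:, None, :] - prev_scan[None, :, :]) ** 2).sum(-1)
        matches = prev_scan[d2.argmin(axis=1)]
        Ri, ti = rigid_transform(pts, matches)
        pts = pts @ Ri.T + ti
    return pts
```

A good initial guess matters because nearest-neighbour correspondences are only reliable when the two scans start out roughly aligned; this is exactly what the pseudo-odometry provides.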


2006 ◽  
Vol 24 (5) ◽  
pp. 605-613 ◽  
Author(s):  
Shinichi Okusako ◽  
Shigeyuki Sakane

2021 ◽  
Vol 33 (1) ◽  
pp. 33-43
Author(s):  
Kazuhiro Funato ◽  
Ryosuke Tasaki ◽  
Hiroto Sakurai ◽  
Kazuhiko Terashima ◽  
...  

The authors have been developing a mobile robot that assists doctors in hospitals by managing medical tools and patient electronic medical records. The robot follows behind a mobile medical worker while maintaining a constant distance from the worker. However, it was difficult to track the target when it entered the sensor's blind region, i.e., during occlusion. In this study, we propose a sensor fusion method that estimates the position of the robot's tracking target indirectly with an inertial measurement unit (IMU), in addition to the direct measurement by a laser range finder (LRF), and develop a human tracking system that allows a mobile robot to cope with occlusion. Based on this, we performed detailed experimental verification of tracking a specified person to confirm the validity of the proposed method.
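The LRF+IMU fusion idea can be illustrated with a minimal constant-velocity Kalman filter sketch (not the authors' implementation): an IMU-style acceleration input drives the prediction step, so the target estimate keeps moving while the LRF position update is suspended during occlusion. The class name, noise values, and interface are illustrative assumptions.

```python
import numpy as np

class OcclusionTolerantTracker:
    """Minimal constant-velocity Kalman filter: the LRF gives position
    fixes; an IMU-style acceleration drives prediction, so the estimate
    coasts through occlusions. All parameters here are illustrative."""

    def __init__(self, dt=0.1):
        self.dt = dt
        self.x = np.zeros(4)                      # state [px, py, vx, vy]
        self.P = np.eye(4)                        # state covariance
        self.F = np.eye(4)
        self.F[0, 2] = self.F[1, 3] = dt          # position += velocity * dt
        self.H = np.array([[1., 0, 0, 0], [0, 1., 0, 0]])
        self.Q = np.eye(4) * 0.01                 # process noise
        self.R = np.eye(2) * 0.05                 # LRF measurement noise

    def predict(self, accel=(0.0, 0.0)):
        """Propagate the state; accel is an IMU-derived acceleration input."""
        ax, ay = accel
        B = np.array([[0.5 * self.dt**2, 0], [0, 0.5 * self.dt**2],
                      [self.dt, 0], [0, self.dt]])
        self.x = self.F @ self.x + B @ np.array([ax, ay])
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, lrf_pos):
        """Measurement update; call only while the target is LRF-visible."""
        z = np.asarray(lrf_pos)
        y = z - self.H @ self.x                   # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)  # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
```

During an occlusion the caller simply skips `update`, and the prediction step alone carries the estimate until the LRF reacquires the person.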


Sensors ◽  
2020 ◽  
Vol 20 (14) ◽  
pp. 3948 ◽  
Author(s):  
Wenpeng Fu ◽  
Ran Liu ◽  
Heng Wang ◽  
Rashid Ali ◽  
Yongping He ◽  
...  

In an indoor environment, object identification and localization are paramount for human-object interaction. Visual or laser-based sensors can identify and localize an object from its appearance, but these approaches are computationally expensive and not robust in environments with obstacles. Radio Frequency Identification (RFID) provides a unique tag ID that identifies the object, but it cannot locate it accurately. Therefore, in this paper, data from RFID and a laser range finder are fused for better identification and localization of multiple dynamic objects in an indoor environment. The main idea is to use the laser range finder to estimate the radial velocities of objects in the environment and match them against the radial velocities estimated from the RFID phase. The method uses a fixed-length "sliding time window" to find the laser cluster with the highest similarity to each RFID tag in each window. Moreover, the Pearson correlation coefficient (PCC) is used in the update stage of a particle filter (PF) to estimate the moving path of each cluster, improving accuracy in complex environments with obstacles. The experiments were carried out on a SCITOS G5 robot. The results show that the method achieves a matching rate of 90.18% and a localization accuracy of 0.33 m in the presence of obstacles. Compared to a Bray-Curtis (BC) similarity matching-based approach and a particle filter-based approach, the method effectively improves the matching rate and localization accuracy of multiple objects in indoor scenes.
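The sliding-window matching step can be sketched as follows. This is a minimal illustration, not the paper's implementation: each laser cluster is scored by the Pearson correlation between its radial-velocity sequence and the tag's phase-derived sequence over the latest window, and the best-scoring cluster is taken as the tag's match. Function names and the interface are assumptions.

```python
import numpy as np

def pearson(a, b):
    """Pearson correlation coefficient of two equal-length sequences."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a @ a) * (b @ b))
    return 0.0 if denom == 0 else float(a @ b / denom)

def match_tag_to_cluster(tag_vr, cluster_vr, window=10):
    """Match one RFID tag to the laser cluster whose radial-velocity
    sequence correlates best with the tag's phase-derived sequence
    over the most recent `window` samples (hypothetical interface).

    tag_vr: (T,) radial velocities from RFID phase differences.
    cluster_vr: dict cluster_id -> (T,) radial velocities from the LRF.
    Returns (best_cluster_id, best_pcc).
    """
    seg = np.asarray(tag_vr)[-window:]
    scores = {cid: pearson(seg, np.asarray(vr)[-window:])
              for cid, vr in cluster_vr.items()}
    best = max(scores, key=scores.get)
    return best, scores[best]
```

Because the PCC is invariant to scale and offset, the match tolerates calibration differences between the phase-derived and laser-derived velocity estimates; it only requires that the two sequences co-vary within the window.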

