OFM-SLAM: A Visual Semantic SLAM for Dynamic Indoor Environments

2021 · Vol 2021 · pp. 1-16
Author(s): Xiong Zhao, Tao Zuo, Xinyu Hu

Most current visual Simultaneous Localization and Mapping (SLAM) algorithms are designed under the assumption of a static environment, so their robustness and accuracy degrade in dynamic scenes. Moving objects cause feature mismatches during pose estimation, which in turn degrades localization and mapping accuracy. Meanwhile, three-dimensional semantic maps play a key role in mobile robot navigation, path planning, and related tasks. In this paper, we present OFM-SLAM (Optical Flow combining Mask-RCNN SLAM), a novel visual SLAM system for semantic mapping in dynamic indoor environments. First, the Mask-RCNN network detects potentially moving objects and generates masks for them. Second, an optical flow method detects dynamic feature points. The two cues are then combined to cull all dynamic points, so the SLAM system tracks only on static features. Finally, the semantic labels obtained from Mask-RCNN are mapped onto the point cloud to generate a three-dimensional semantic map that contains only the static parts of the scene and their semantic information. We evaluate our system on the public TUM datasets. The experimental results demonstrate that OFM-SLAM is more effective in dynamic scenarios, estimating camera poses more accurately and localizing more precisely in highly dynamic environments.
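As a rough illustration of the culling step, the sketch below rejects a tracked feature point when it falls inside a Mask-RCNN dynamic-object mask or when its optical-flow displacement deviates strongly from the median displacement (a proxy for camera ego-motion). The function name, the median-flow heuristic, and the threshold are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np
import cv2

def cull_dynamic_points(prev_gray, curr_gray, points, dynamic_mask,
                        flow_thresh=2.0):
    """Drop feature points that are likely on moving objects.

    points:       (N, 2) float32 pixel coordinates in the previous frame.
    dynamic_mask: boolean image, True where Mask-RCNN flagged a potentially
                  dynamic object (e.g. a person).
    flow_thresh:  assumed tuning parameter, in pixels.
    """
    p0 = points.reshape(-1, 1, 2).astype(np.float32)
    # Track points into the current frame with pyramidal Lucas-Kanade flow.
    p1, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, p0, None)
    ok = status.ravel() == 1
    p0 = p0[ok].reshape(-1, 2)
    p1 = p1[ok].reshape(-1, 2)

    # The median displacement approximates camera ego-motion; points whose
    # flow deviates strongly from it are treated as independently moving.
    flow = p1 - p0
    residual = np.linalg.norm(flow - np.median(flow, axis=0), axis=1)
    keep = residual < flow_thresh

    # Also reject any point that lands inside a dynamic-object mask.
    x = np.clip(p1[:, 0].astype(int), 0, dynamic_mask.shape[1] - 1)
    y = np.clip(p1[:, 1].astype(int), 0, dynamic_mask.shape[0] - 1)
    keep &= ~dynamic_mask[y, x].astype(bool)

    return p1[keep]
```

Points surviving both tests would then be passed to the pose-estimation front end, so tracking relies only on static structure.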

Author(s): Yingnian Wu, Qi Yang, Xiaohang Zhou

The theory and technology of human–machine coordination and natural interaction have broad application prospects in future smart factories. This paper describes the design and implementation of a body-following wheeled robot system based on Kinect, with gesture recognition used to enhance interactivity. An improved optical flow method is proposed to obtain the direction and speed of the target's movement: the constant smoothing parameter of traditional optical flow is replaced by a variable one tied to the local gradient value. Compared with the traditional optical flow method, this reflects the state of moving objects more clearly, reduces noise, and preserves real-time performance, solving the tracking-state oscillation caused by skeleton-node drift when the target is occluded. Experiments on the wheeled robot confirm that the system accomplishes the tracking task reliably.
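A minimal sketch of the idea follows: a Horn-Schunck iteration in which the global smoothing weight is replaced by a per-pixel weight that shrinks where the local image gradient is large, so motion boundaries are smoothed less. The specific mapping alpha(x) = alpha0 / (1 + k·|grad I|) is an assumed illustrative choice, not necessarily the paper's formula.

```python
import numpy as np
from scipy.ndimage import convolve

def horn_schunck_adaptive(im1, im2, alpha0=10.0, k=1.0, n_iter=100):
    """Horn-Schunck optical flow with a gradient-adaptive smoothing weight."""
    im1 = im1.astype(np.float64)
    im2 = im2.astype(np.float64)

    # Spatio-temporal derivatives (classic Horn-Schunck finite differences).
    kx = np.array([[-1, 1], [-1, 1]]) * 0.25
    ky = np.array([[-1, -1], [1, 1]]) * 0.25
    Ix = convolve(im1, kx) + convolve(im2, kx)
    Iy = convolve(im1, ky) + convolve(im2, ky)
    It = convolve(im2, np.ones((2, 2)) * 0.25) - \
         convolve(im1, np.ones((2, 2)) * 0.25)

    # Per-pixel smoothing weight: less smoothing across strong edges.
    alpha = alpha0 / (1.0 + k * np.hypot(Ix, Iy))

    avg_kernel = np.array([[1, 2, 1], [2, 0, 2], [1, 2, 1]]) / 12.0
    u = np.zeros_like(im1)
    v = np.zeros_like(im1)
    for _ in range(n_iter):
        u_avg = convolve(u, avg_kernel)
        v_avg = convolve(v, avg_kernel)
        # Standard Horn-Schunck update, with alpha varying per pixel.
        num = Ix * u_avg + Iy * v_avg + It
        den = alpha ** 2 + Ix ** 2 + Iy ** 2
        u = u_avg - Ix * num / den
        v = v_avg - Iy * num / den
    return u, v
```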


2013 · Vol 376 · pp. 455-460
Author(s): Wei Zhu, Li Tian, Fang Di, Jian Li Li, Ke Jie Li

The optical flow method is an important and effective technique for detecting and tracking moving objects in robot inspection systems. Because the traditional Horn-Schunck and Lucas-Kanade (LK) optical flow methods cannot meet the demands of real-time performance and accuracy simultaneously, an improved optical flow method based on a Gaussian image pyramid is proposed. A layered structure is obtained by downsampling the original image sequence, so that fast motion at full resolution becomes slower, continuous motion at coarser levels. The optical flow of corner points is first computed at the coarsest pyramid layer with the LK method and then propagated layer by layer to finer resolutions, yielding the estimated optical flow vectors of the original image sequence. In this way, the accuracy and real-time requirements of robotic moving-obstacle recognition can be met.
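OpenCV's built-in pyramidal LK implements this coarse-to-fine scheme, so a minimal sketch of the approach can lean on it directly; the corner-detector settings, window size, and pyramid depth below are assumed defaults, not the paper's settings.

```python
import cv2
import numpy as np

def track_corners_pyramidal(prev_gray, curr_gray, max_corners=200, levels=3):
    """Coarse-to-fine corner tracking over a Gaussian image pyramid.

    cv2.calcOpticalFlowPyrLK builds the pyramid internally (maxLevel sets
    the number of downsampled layers) and seeds each finer level with the
    flow estimated at the coarser one, keeping fast motion tractable
    without sacrificing real-time performance.
    """
    # Detect Shi-Tomasi corner points on the previous frame.
    corners = cv2.goodFeaturesToTrack(prev_gray, maxCorners=max_corners,
                                      qualityLevel=0.01, minDistance=7)
    if corners is None:
        return np.empty((0, 2)), np.empty((0, 2))

    # Pyramidal Lucas-Kanade flow for the detected corners.
    next_pts, status, _ = cv2.calcOpticalFlowPyrLK(
        prev_gray, curr_gray, corners, None,
        winSize=(21, 21), maxLevel=levels)
    ok = status.ravel() == 1
    return corners[ok].reshape(-1, 2), next_pts[ok].reshape(-1, 2)
```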


2000 · Vol 24 (4) · pp. 531-538
Author(s): Nobuhiko Hata, Arya Nabavi, William M. Wells, Simon K. Warfield, Ron Kikinis, ...

2007 · Vol 188 (3) · pp. W276-W280
Author(s): Drew A. Torigian, Warren B. Gefter, John D. Affuso, Kiarash Emami, Lawrence Dougherty
