Fuzzy navigation strategy: application to two distinct autonomous mobile robots

Robotica, 1997, Vol 15 (6), pp. 609-615
Author(s): Mahieddine Benreguieg, Philippe Hoppenot, Hichem Maaref, Etienne Colle, Claude Barret

Most motion controllers for mobile robots follow the classical planning-navigation-piloting scheme. The navigation function, whose main task is obstacle avoidance, has to react within the shortest possible response time. This real-time constraint severely limits the complexity of sensor data processing. The navigator described here is built around fuzzy logic controllers. Besides the well-known ability to incorporate human know-how, the approach offers several advantages: low sensitivity to erroneous or inaccurate measurements and, if the controller inputs are normalised, effective portability across different platforms. To demonstrate these advantages, the same fuzzy navigator has been implemented on two mobile robots whose mechanical structures are similar, differing mainly in size and sensing system.
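The following is a minimal sketch of the kind of fuzzy controller the abstract describes, not the authors' actual rule base: three normalised range readings (a hypothetical left/front/right sensor layout) drive a small Mamdani-style rule set whose weighted-average defuzzification yields a steering command. Because the inputs are normalised by the sensor's maximum range, the same rule base can in principle be ported across robots with different sensing hardware, which is the portability argument made above.

```python
# Illustrative sketch (not the authors' rule base): a minimal Mamdani-style
# fuzzy controller for obstacle avoidance with normalised distance inputs.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-9),
                                 (c - x) / (c - b + 1e-9)), 0.0)

def fuzzy_steering(d_left, d_front, d_right, max_range):
    """Return a steering command in [-1, 1] (negative = turn left) from three
    range readings (hypothetical sensor layout, distances in metres)."""
    # Normalise inputs so the rule base is independent of the sensor's range.
    l, f, r = (min(d, max_range) / max_range for d in (d_left, d_front, d_right))

    # Linguistic terms over the normalised distance universe [0, 1].
    near = lambda x: tri(x, -0.1, 0.0, 0.5)
    far  = lambda x: tri(x, 0.3, 1.0, 1.1)

    # Rule base: (firing strength, crisp consequent on the steering universe).
    rules = [
        (min(near(f), far(l)),  -0.8),        # obstacle ahead, left clear  -> turn left
        (min(near(f), far(r)),  +0.8),        # obstacle ahead, right clear -> turn right
        (near(r),               -0.4),        # obstacle on the right       -> bear left
        (near(l),               +0.4),        # obstacle on the left        -> bear right
        (min(far(f), far(l), far(r)), 0.0),   # all clear                   -> go straight
    ]

    # Weighted-average (height) defuzzification.
    num = sum(w * c for w, c in rules)
    den = sum(w for w, _ in rules) + 1e-9
    return num / den

# Example: obstacle straight ahead, more clearance on the left -> negative command.
print(fuzzy_steering(d_left=2.0, d_front=0.4, d_right=0.9, max_range=3.0))
```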

Sensors, 2018, Vol 18 (8), pp. 2730
Author(s): Varuna De Silva, Jamie Roche, Ahmet Kondoz

Autonomous robots that assist humans in day-to-day living tasks are becoming increasingly popular. Autonomous mobile robots operate by sensing and perceiving their surrounding environment in order to make accurate driving decisions. A combination of several different sensors, such as LiDAR, radar, ultrasound sensors and cameras, is used to sense the surrounding environment of autonomous vehicles. These heterogeneous sensors simultaneously capture various physical attributes of the environment. Such multimodality and redundancy of sensing need to be positively exploited for reliable and consistent perception of the environment through sensor data fusion. However, these multimodal sensor data streams differ from each other in many ways, such as temporal and spatial resolution, data format, and geometric alignment. For subsequent perception algorithms to exploit the diversity offered by multimodal sensing, the data streams need to be spatially, geometrically and temporally aligned with each other. In this paper, we address the problem of fusing the outputs of a Light Detection and Ranging (LiDAR) scanner and a wide-angle monocular image sensor for free space detection. The outputs of the LiDAR scanner and the image sensor have different spatial resolutions and need to be aligned with each other. A geometrical model is used to spatially align the two sensor outputs, followed by a Gaussian Process (GP) regression-based resolution-matching algorithm that interpolates the missing data with quantifiable uncertainty. The results indicate that the proposed sensor data fusion framework significantly aids the subsequent perception steps, as illustrated by the performance improvement of an uncertainty-aware free space detection algorithm.
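As a rough illustration of the pipeline described above, and not the authors' implementation, the sketch below projects sparse LiDAR returns into the image plane with a pinhole model and then fits a Gaussian Process regressor (scikit-learn) to interpolate a dense depth map with a per-pixel predictive standard deviation. The calibration matrices, kernel parameters and synthetic point cloud are made-up placeholders.

```python
# A minimal sketch, assuming a pinhole camera and placeholder calibration:
# sparse LiDAR depths are projected to pixel coordinates, then GP regression
# produces a dense depth map together with a per-pixel uncertainty estimate.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Hypothetical calibration (replace with real intrinsics/extrinsics).
K = np.array([[600.0, 0.0, 320.0],
              [0.0, 600.0, 240.0],
              [0.0, 0.0, 1.0]])               # camera intrinsics
R, t = np.eye(3), np.array([0.0, 0.0, 0.0])   # LiDAR-to-camera extrinsics

def project_lidar(points_xyz):
    """Project 3-D LiDAR points (N x 3, metres) to pixel coordinates and depths."""
    cam = (R @ points_xyz.T).T + t            # transform into the camera frame
    depth = cam[:, 2]
    uv = (K @ cam.T).T
    uv = uv[:, :2] / uv[:, 2:3]               # perspective division
    keep = depth > 0.1                        # drop points behind the camera
    return uv[keep], depth[keep]

def densify_depth(uv, depth, grid_shape=(48, 64)):
    """Fit a GP to sparse (pixel, depth) samples and predict a dense grid
    together with the predictive standard deviation (the uncertainty)."""
    kernel = RBF(length_scale=30.0) + WhiteKernel(noise_level=0.05)
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    gp.fit(uv, depth)
    h, w = grid_shape
    gu, gv = np.meshgrid(np.linspace(0, 639, w), np.linspace(0, 479, h))
    query = np.column_stack([gu.ravel(), gv.ravel()])
    mean, std = gp.predict(query, return_std=True)
    return mean.reshape(h, w), std.reshape(h, w)

# Example with synthetic LiDAR returns (placeholder data, not a real scan).
rng = np.random.default_rng(0)
pts = np.column_stack([rng.uniform(-5, 5, 200),     # x (lateral)
                       rng.uniform(0.5, 1.5, 200),  # y (height below camera)
                       rng.uniform(2, 20, 200)])    # z (forward range)
uv, d = project_lidar(pts)
depth_map, depth_std = densify_depth(uv, d)
print(depth_map.shape, depth_std.mean())  # dense depth map + average uncertainty
```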


Author(s): Margot M. E. Neggers, Raymond H. Cuijpers, Peter A. M. Ruijten, Wijnand A. IJsselsteijn

Autonomous mobile robots that operate in environments with people are expected to be able to deal with human proxemics and social distances. Previous research has investigated how robots can approach persons and how to implement human-aware navigation algorithms. However, experimental research on how robots can avoid a person in a comfortable way is largely missing. The aim of the current work is to experimentally determine the shape and size of the personal space of a human passed by a robot. In two studies, both a humanoid and a non-humanoid robot were used to pass a person at different sides and distances, after which participants were asked to rate their perceived comfort. As expected, perceived comfort increases with distance. However, the shape was not circular: passing at the back of a person is more uncomfortable than passing at the front, especially in the case of the humanoid robot. These results give more insight into the shape and size of personal space in human–robot interaction. Furthermore, they can serve as input to human-aware navigation algorithms for autonomous mobile robots, in which human comfort is traded off against efficiency goals.
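A possible way to feed this finding into a human-aware planner is an asymmetric personal-space cost. The sketch below assumes a simple parametric form, not fitted to the study's data: a Gaussian fall-off with distance whose scale is larger behind the person than in front, so passing behind incurs a higher discomfort cost; sigma_front and sigma_back are hypothetical parameters.

```python
# A minimal sketch of an asymmetric personal-space cost (assumed parametric form,
# not the study's fitted model), for use inside a human-aware path planner.
import math

def discomfort(robot_xy, person_xy, person_heading,
               sigma_front=0.9, sigma_back=1.4):
    """Return a discomfort cost in [0, 1]. person_heading is the direction the
    person faces (radians); sigma_* are hypothetical scales (metres), with the
    rear lobe wider because passing behind was rated less comfortable."""
    dx = robot_xy[0] - person_xy[0]
    dy = robot_xy[1] - person_xy[1]
    dist = math.hypot(dx, dy)
    # Angle of the robot relative to the person's facing direction.
    rel = math.atan2(dy, dx) - person_heading
    in_front = math.cos(rel) > 0.0
    sigma = sigma_front if in_front else sigma_back
    # Gaussian fall-off with distance; the wider rear sigma raises the cost behind.
    return math.exp(-0.5 * (dist / sigma) ** 2)

# Example: robot 1 m in front of vs 1 m behind a person facing +x.
print(discomfort((1.0, 0.0), (0.0, 0.0), 0.0))   # front -> lower cost
print(discomfort((-1.0, 0.0), (0.0, 0.0), 0.0))  # back  -> higher cost
```

A planner could add this cost to its path-length objective so that trajectories trade off human comfort against efficiency, as the abstract suggests.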

