cluttered environment
Recently Published Documents

TOTAL DOCUMENTS: 333 (five years: 95)
H-INDEX: 20 (five years: 3)

2022, pp. 1-18
Author(s): Binghua Shi, Yixin Su, Cheng Lian, Chang Xiong, Yang Long, ...

Abstract: Recognition of obstacle type using visual sensors is important for navigation by unmanned surface vehicles (USVs), including path planning, obstacle avoidance and reactive control. Conventional detection techniques may fail to distinguish obstacles that are similar in visual appearance in a cluttered environment. This work proposes a novel obstacle type recognition approach for autonomous navigation that combines a dilated operator with the deep-level feature map of ResNet50. First, visual images are collected from a variety of USV navigation scenarios and annotated. Second, a deep learning model based on a dilated convolutional neural network is configured and trained. Dilated convolution allows the whole network to learn deep features with an enlarged receptive field, which further improves the performance of obstacle type recognition. Third, a series of evaluation metrics is utilised to assess the obtained model, including mean average precision (mAP), miss rate and detection speed. Finally, experiments are designed to verify the accuracy of the proposed approach using visual images of a cluttered environment. The experimental results demonstrate that the dilated convolutional neural network achieves better recognition performance than the other methods, with an mAP of 88%.
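The receptive-field growth that dilated convolution provides can be illustrated with a minimal sketch. The layer configuration below (three 3x3 convolutions with doubling dilation) is an assumption for illustration, not the paper's actual network:

```python
def receptive_field(layers):
    """Receptive field of stacked convolutions.

    Each layer is (kernel_size, dilation, stride); the standard
    recurrence adds (k - 1) * dilation * jump per layer.
    """
    rf, jump = 1, 1
    for k, d, s in layers:
        rf += (k - 1) * d * jump
        jump *= s
    return rf

# Three 3x3 convolutions, stride 1:
plain = receptive_field([(3, 1, 1)] * 3)                       # 7
dilated = receptive_field([(3, 1, 1), (3, 2, 1), (3, 4, 1)])   # 15
```

With the same parameter count, the dilated stack more than doubles the receptive field (15 vs. 7 pixels), which is the mechanism the abstract credits for better discrimination of visually similar obstacles.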


Author(s): Abhishek K. Kashyap, Anish Pandey, Dayal R. Parhi, Surjeet Singh Gour

2021, Vol 8 (1)
Author(s): Photchara Ratsamee, Yasushi Mae, Kazuto Kamiyama, Mitsuhiro Horade, Masaru Kojima, ...

Abstract: People with disabilities, such as patients with motor paralysis, lack independence and cannot move most parts of their bodies except for their eyes. Supportive robot technology is highly beneficial for these patients. We propose a gaze-informed, location-based (gaze-based) object segmentation method, which is a core module of successful patient-robot interaction in an object-search task (i.e., a situation where a robot has to search for and deliver a target object to the patient). We introduce the concepts of gaze tracing (GT) and gaze blinking (GB), which are integrated into our proposed object segmentation technique to achieve accurate visual segmentation of unknown objects in a complex scene. Gaze-tracing information serves as a clue to where the target object is located in the scene, and gaze blinking is then used to confirm the target object's position. The effectiveness of the proposed method is demonstrated with a humanoid robot in experiments on different types of highly cluttered scenes. Using only limited gaze guidance from the user, we achieved an F-score of 85% for unknown-object segmentation in an unknown environment.
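The two-stage gaze interaction described above (trace to locate, blink to confirm) can be sketched as a small selection routine. Everything below — region names, the vote counting, and the blink threshold — is an illustrative assumption, not the paper's actual pipeline:

```python
from collections import Counter

def select_target(gaze_points, regions, blinks, blink_threshold=2):
    """Toy sketch of gaze tracing + gaze blinking.

    gaze_points: list of (x, y) fixation coordinates.
    regions: {name: (xmin, ymin, xmax, ymax)} candidate object boxes.
    blinks: number of deliberate blinks observed after fixation.
    """
    votes = Counter()
    for x, y in gaze_points:
        for name, (x0, y0, x1, y1) in regions.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                votes[name] += 1  # gaze tracing: where is the user looking?
    if not votes:
        return None
    candidate, _ = votes.most_common(1)[0]
    # Gaze blinking: require explicit confirmation before the robot acts.
    return candidate if blinks >= blink_threshold else None
```

The design point is that fixation alone is ambiguous in a cluttered scene; a deliberate, countable signal (blinking) disambiguates intent from incidental glances.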


2021, Vol ahead-of-print
Author(s): Chittaranjan Paital, Saroj Kumar, Manoj Kumar Muni, Dayal R. Parhi, Prasant Ranjan Dhal

Abstract:
Purpose: Smooth, autonomous navigation of a mobile robot in a cluttered environment is the main purpose of the proposed technique. This includes localization and path planning, both important aspects of a mobile robot navigating autonomously in any workspace. Navigation requires reaching the target from the start point while avoiding obstacles in a static or dynamic environment. Researchers have already proposed several techniques for mobile robot navigation, but none guarantees that the resulting path is optimal.
Design/methodology/approach: A modified grey wolf optimization (MGWO) controller is therefore designed for autonomous navigation of a wheeled mobile robot (WMR). GWO is a nature-inspired algorithm that mimics the social hierarchy and hunting behavior of wolves; it is modified here to determine optimal positions and provide better control over the robot as it moves from source to target in a highly cluttered environment while negotiating obstacles. The controller is validated in the V-REP simulation platform together with real-time laboratory experiments using a Khepera-III robot.
Findings: In the experiments, the proposed technique proved efficient in motion control and path planning, as the robot reached its target position without any collision during its movement. The V-REP simulation results and the real-time experimental results were recorded and compared against each other, showing good agreement: the deviation between them is approximately 5%, an acceptable range for motion planning. Both path length and time taken to reach the target are recorded and shown in the respective tables.
Originality/value: The literature survey suggests that most approaches are implemented either as mathematical convergence studies or on a mobile robot, without real-time experimental validation. Given the lack of clear evidence on the use of an MGWO controller for mobile robot navigation in both simulation and real-time experimental platforms, this work can serve as a guide for applying similar approaches to other types of robots.
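The canonical GWO position update that the controller above builds on can be sketched in a few lines. This is the standard algorithm, not the authors' modified variant, and the sphere objective and parameter values are illustrative assumptions:

```python
import random

def gwo(f, dim, bounds, wolves=20, iters=200, seed=0):
    """Minimal standard grey wolf optimizer (minimization).

    Each wolf moves toward the average of pulls exerted by the three
    best wolves (alpha, beta, delta); coefficient `a` decays 2 -> 0,
    shifting the pack from exploration to exploitation.
    """
    rng = random.Random(seed)
    lo, hi = bounds
    pack = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(wolves)]
    for t in range(iters):
        pack.sort(key=f)
        leaders = [p[:] for p in pack[:3]]  # copies of alpha, beta, delta
        a = 2 - 2 * t / iters               # linearly decays from 2 to 0
        for w in pack:
            for j in range(dim):
                pull = 0.0
                for leader in leaders:
                    A = a * (2 * rng.random() - 1)  # |A|>1 explores, |A|<1 exploits
                    C = 2 * rng.random()
                    pull += leader[j] - A * abs(C * leader[j] - w[j])
                w[j] = min(hi, max(lo, pull / 3))   # average of the three pulls
    return min(pack, key=f)

sphere = lambda x: sum(v * v for v in x)
best = gwo(sphere, dim=2, bounds=(-5.0, 5.0))
```

In a navigation controller the objective would score candidate robot positions (distance to goal plus obstacle penalties) rather than a benchmark function.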


Drones, 2021, Vol 5 (4), pp. 107
Author(s): Xishuang Zhao, Jingzheng Chong, Xiaohan Qi, Zhihua Yang

Autonomous navigation of micro aerial vehicles in unknown environments requires not only exploring their time-varying surroundings but also ensuring complete flight safety at all times. Current research focuses on estimating the potential exploration value while neglecting safety issues, especially in cluttered environments with no prior knowledge. To address this, we propose a vision-based, object-oriented autonomous navigation method for environment exploration, which develops a B-spline-based local trajectory re-planning algorithm by extracting spatial-structure information and selecting temporary target points. The proposed method is evaluated in a variety of cluttered environments, such as forests, building areas and mines. The experimental results show that the proposed autonomous navigation system effectively completes the global trajectory while always maintaining an appropriate safe distance from the multiple obstacles in the environment.
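A uniform cubic B-spline segment, of the kind typically used for such local trajectory re-planning, can be sketched as follows. The `replan` helper that shifts a control point away from an obstacle is an illustrative assumption, not the paper's algorithm:

```python
def cubic_bspline_point(p0, p1, p2, p3, u):
    """Evaluate one uniform cubic B-spline segment at u in [0, 1].

    p0..p3 are consecutive control points (tuples of coordinates);
    the basis weights always sum to 1, so the curve stays inside
    the control points' convex hull, a property useful for safety.
    """
    b0 = (1 - u) ** 3 / 6
    b1 = (3 * u**3 - 6 * u**2 + 4) / 6
    b2 = (-3 * u**3 + 3 * u**2 + 3 * u + 1) / 6
    b3 = u**3 / 6
    return tuple(b0 * w + b1 * x + b2 * y + b3 * z
                 for w, x, y, z in zip(p0, p1, p2, p3))

def replan(ctrl, i, detour):
    """Shift control point i by `detour` to bend the path locally.

    Because each segment depends on only four control points, the
    change affects nearby segments only (B-spline locality), which
    is what makes cheap local re-planning possible.
    """
    new = [list(p) for p in ctrl]
    new[i] = [c + d for c, d in zip(new[i], detour)]
    return [tuple(p) for p in new]
```

Locality is the key design property here: when a new obstacle appears mid-flight, only the control points near it need to move, leaving the rest of the global trajectory untouched.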


BMC Biology, 2021, Vol 19 (1)
Author(s): Mor Taub, Yossi Yovel

Abstract:
Background: Learning to adapt to changes in the environment is highly beneficial. This is especially true for echolocating bats, which forage in diverse environments, moving from open spaces to highly complex ones. Bats are known for their ability to rapidly adjust their sensing to auditory information gathered from the environment within milliseconds, but can they also benefit from longer adaptive processes? In this study, we examined adult bats' ability to slowly adapt their sensing strategy to a new type of environment they had never experienced for such long durations, and to then maintain this learned echolocation strategy over time.
Results: We show that over a period of weeks, Pipistrellus kuhlii bats gradually adapt their pre-takeoff echolocation sequence when moved to a constantly cluttered environment. After adopting this improved strategy, the bats retained the ability to use it instantaneously when placed back in a similarly cluttered environment, even after spending many months in a significantly less cluttered one.
Conclusions: We demonstrate long-term adaptive flexibility in sensory acquisition in adult animals. Our study also provides further insight into the importance of sensory planning in the initiation of precise sensorimotor behavior, such as approaching for landing.

