Research on Automatic Cleaning Robot Based on Machine Vision

2014 ◽  
Vol 539 ◽  
pp. 648-652
Author(s):  
Zhi Guo Pan

The development and application of machine vision technology has greatly reduced manual labor, improved the level of production automation, and enhanced daily life, and it has very broad application prospects. The intelligent empty-bottle inspection robot studied in this paper is a typical application of machine vision in industrial inspection. This paper introduces the concept of machine vision, several key technologies related to automatic cleaning robots, and applications of machine vision across many areas of production and daily life.

2018 ◽  
Vol 2018 ◽  
pp. 1-16 ◽  
Author(s):  
Muhammad Ilyas ◽  
Shi Yuyao ◽  
Rajesh Elara Mohan ◽  
Manojkumar Devarassu ◽  
Manivannan Kalimuthu

The mechanical, electrical, and autonomy aspects of designing a novel, modular, and reconfigurable cleaning robot, dubbed sTetro (stair Tetro), are presented. The developed robotic platform uses a vertical conveyor mechanism to reconfigure itself and can navigate flat surfaces as well as staircases, significantly extending automated cleaning capabilities compared to conventional home cleaning robots. The mechanical design and system architecture are introduced first, followed by a detailed description of the system modelling and controller design efforts in sTetro. An autonomy algorithm is also proposed for self-reconfiguration, locomotion, and autonomous navigation of sTetro in controlled environments, for example homes or offices with a flat floor and a straight staircase. A staircase recognition algorithm is presented to distinguish the stairs from the surrounding environment. A technique for detecting misalignment between the robot and the front staircase riser is also given, with feedback from the IMU sensor used for corrective measures. The experiments performed with the sTetro robot demonstrated the efficacy and validity of the developed system models, control, and autonomy approaches.
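The IMU-based misalignment correction described above can be sketched as a simple feedback loop; the paper's actual controller is not given in the abstract, so the proportional form, gains, and wheel interface below are illustrative assumptions only.

```python
# Hypothetical sketch of yaw-misalignment correction before a stair climb:
# the IMU reports the yaw error between the robot heading and the normal
# of the front staircase riser, and differential wheel speeds reduce it.
# Gains and the wheel interface are illustrative, not from the paper.

def misalignment_correction(yaw_error_deg, base_speed=0.10, kp=0.005):
    """Return (left, right) wheel speeds that rotate the robot until its
    heading is square with the staircase riser (yaw_error -> 0)."""
    correction = kp * yaw_error_deg      # proportional steering term
    left = base_speed - correction
    right = base_speed + correction
    return left, right

# Example: robot skewed 10 degrees clockwise relative to the riser,
# so the right wheel speeds up to rotate the robot back into alignment.
left, right = misalignment_correction(10.0)
```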


2018 ◽  
Vol 8 (12) ◽  
pp. 2398 ◽  
Author(s):  
Shunsuke Nansai ◽  
Keichi Onodera ◽  
Prabakaran Veerajagadheswar ◽  
Mohan Rajesh Elara ◽  
Masami Iwase

Façade cleaning in high-rise buildings has always been considered a hazardous task when carried out by human labor. Even though numerous studies have focused on the development of glass-façade cleaning systems, the available technologies in this domain are limited, and their performance is broadly affected by the frames that connect the glass panels. These frames generally act as a barrier that prevents glass-façade cleaning robots from crossing from one glass panel to another, which degrades performance in terms of area coverage. We present a new class of façade cleaning robot with a biped mechanism that is able to overcome these obstacles and maximize its area coverage. The developed robot uses active suction cups to adhere to glass walls and a mechanical linkage to navigate the glass surface while cleaning. This research addresses the design challenges in realizing the developed robot. Its control system consists of inverse kinematics, fifth-order polynomial interpolation, and sequential control. Experiments were conducted in a real scenario, and the results indicate that the developed robot achieves significantly higher coverage by overcoming both negative and positive obstacles on a glass panel.
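The fifth-order polynomial interpolation mentioned in the control system is commonly used to generate smooth rest-to-rest foot trajectories. A minimal sketch, assuming zero velocity and acceleration at both ends of each step (the standard quintic profile; the paper's actual boundary conditions are not stated in the abstract):

```python
def quintic_rest_to_rest(x0, xf, T):
    """Quintic polynomial trajectory with zero velocity and acceleration
    at both endpoints:
        s(t) = x0 + (xf - x0) * (10*tau^3 - 15*tau^4 + 6*tau^5),
    where tau = t / T is normalized time."""
    def s(t):
        tau = max(0.0, min(1.0, t / T))
        return x0 + (xf - x0) * (10 * tau**3 - 15 * tau**4 + 6 * tau**5)
    return s

# Example: move a suction-cup foot 0.3 m across a frame in 2 s.
traj = quintic_rest_to_rest(0.0, 0.3, 2.0)
```

The profile starts and ends at rest, so the suction cups attach and detach without impulsive loads.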


2018 ◽  
Vol 8 (12) ◽  
pp. 2649 ◽  
Author(s):  
Balakrishnan Ramalingam ◽  
Anirudh Lakshmanan ◽  
Muhammad Ilyas ◽  
Anh Le ◽  
Mohan Elara

Debris detection and classification is an essential function for autonomous floor-cleaning robots. It enables floor-cleaning robots to identify and avoid hard-to-clean debris, specifically large liquid spillage debris. This paper proposes a debris detection and classification scheme for an autonomous floor-cleaning robot using a cascaded deep Convolutional Neural Network (CNN) and Support Vector Machine (SVM) technique. The SSD (Single-Shot MultiBox Detector) MobileNet CNN architecture is used to classify solid and liquid spill debris on the floor from the captured image. Then, an SVM model performs binary classification of liquid spillage regions based on size, which helps floor-cleaning devices identify the larger liquid spillage regions, considered hard-to-clean debris in this work. The experimental results show that the proposed technique can efficiently detect and classify the debris on the floor and achieves 95.5% classification accuracy. The cascaded approach takes approximately 71 milliseconds for the entire debris detection and classification process, which implies that the proposed technique is suitable for deployment in real-time selective floor-cleaning applications.
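The second stage of the cascade, size-based binary classification of spill regions, can be sketched as follows. A linear SVM trained on a single area feature reduces to a sign test on a linear decision function; the weights below are illustrative placeholders, not the paper's trained model.

```python
# Sketch of the cascade's second stage: label a detected liquid-spill
# region "large" (hard to clean) or "small" from its bounding-box area.
# w and b stand in for a trained linear SVM's weight and bias.

def classify_spill(box, w=1.0, b=-5000.0):
    """box = (x, y, width, height) in pixels, as produced by the
    SSD-MobileNet detection stage. Returns 'large' when the linear
    decision function w*area + b is positive, else 'small'."""
    _, _, bw, bh = box
    area = bw * bh
    return "large" if w * area + b > 0 else "small"

# Example: a 100x100 px region exceeds the decision boundary -> "large".
label = classify_spill((40, 60, 100, 100))
```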


2009 ◽  
Vol 18 (7) ◽  
pp. 929-941 ◽  
Author(s):  
Je-Keun Oh ◽  
Giho Jang ◽  
Semin Oh ◽  
Jeong Ho Lee ◽  
Byung-Ju Yi ◽  
...  

Sensors ◽  
2020 ◽  
Vol 20 (5) ◽  
pp. 1483 ◽  
Author(s):  
Manuel Vega-Heredia ◽  
Ilyas Muhammad ◽  
Sriharsha Ghanta ◽  
Vengadesh Ayyalusami ◽  
Siti Aisyah ◽  
...  

Glass-façade-cleaning robots are an emerging class of service robots. These cleaning robots are designed to operate on vertical surfaces, on which tracking position and orientation becomes more challenging. In this article, we present a glass-façade-cleaning robot, Mantis v2, which can shift from one window panel to another like others on the market. Because panel shifting is complex, we proposed and evaluated different methods for estimating the robot's orientation using several kinds of sensors working together on the Robot Operating System (ROS). For this application, we used an onboard Inertial Measurement Unit (IMU), wheel encoders, a beacon-based system, Time-of-Flight (ToF) range sensors, and an external vision sensor (camera) for angular position estimation of the Mantis v2 robot. The external camera monitors the robot's operation and tracks the coordinates of two colored markers attached along the longitudinal axis of the robot to estimate its orientation angle. ToF lidar sensors attached to both sides of the robot detect the window frame; they measure the distance to the frame, and the difference between beam readings gives the orientation angle of the robot. Differential-drive wheel encoder data are used to estimate the robot's heading angle on a 2D façade surface. An integrated heading-angle estimate is also provided by simple fusion techniques, i.e., a complementary filter (CF) and a 1D Kalman filter (KF) applied to the IMU sensor's raw data. The heading-angle information provided by the different sensory systems is then evaluated in static and dynamic tests against an off-the-shelf attitude and heading reference system (AHRS). It is observed that the ToF sensors work effectively from 0 to 30 degrees, the beacons have a delay of up to five seconds, and the odometry error grows with the navigation distance due to slippage and/or sliding on the glass. Among all tested orientation sensors and methods, the vision-based scheme performed best, with an orientation-angle error of less than 0.8 degrees for this application. The experimental results demonstrate the efficacy of the proposed orientation-tracking techniques, which had not previously been applied to this class of cleaning robots.
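Two of the estimation methods above have compact closed forms: the ToF orientation from the difference between two beam readings, and the complementary filter fusing gyro rate with an absolute heading reference. A minimal sketch, with the sensor baseline and filter weight as illustrative assumptions:

```python
import math

def tof_orientation_deg(d1, d2, baseline):
    """Orientation relative to the window frame from two ToF beams
    separated by `baseline` (m) along the robot body: the angle whose
    tangent is the reading difference over the baseline."""
    return math.degrees(math.atan2(d1 - d2, baseline))

def complementary_filter(headings_abs, gyro_rates, dt, alpha=0.98):
    """Fuse a drift-free but noisy absolute heading (e.g. vision markers)
    with a smooth but drifting gyro yaw rate. alpha weights the
    integrated-gyro path; (1 - alpha) weights the absolute reference."""
    est = headings_abs[0]
    out = [est]
    for z, w in zip(headings_abs[1:], gyro_rates[1:]):
        est = alpha * (est + w * dt) + (1.0 - alpha) * z
        out.append(est)
    return out
```

With equal beam readings the ToF angle is zero (robot square to the frame); a 1D Kalman filter would replace the fixed `alpha` with a gain updated from the estimate covariance.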


2014 ◽  
Vol 568-570 ◽  
pp. 994-1000
Author(s):  
Han Li ◽  
Shao Jun Liu ◽  
Ku Wang ◽  
Xia Liu

Distribution substations contain a great number of electronic meters, which were traditionally monitored by operators. On-site monitoring for risk assessment of these meters is very important. In this paper, we present an advanced machine-vision-based automatic meter detection method toward the development of an online automatic meter-reading inspection robot for substations. First, the image received from the inspection robot was enhanced using histogram equalization. Then, the image was segmented into two parts based on the threshold obtained by Otsu's method. A circular Hough transform was applied to these two parts and to the whole enhanced image, detecting the most probable circle in each. Normalized correlation coefficients were calculated between the corresponding areas of those three circles and the template image of an SF6 meter. Finally, the circle with the highest correlation coefficient, provided it exceeded a certain threshold, was determined to be the meter; if it fell below the threshold, the algorithm decided that no meter was found in the image. The method was tested with 222 images obtained in one substation in Xi'an, Shaanxi, China, and an 87.4% accuracy was achieved, which indicates the potential of this method.
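The Otsu segmentation step in the pipeline above picks the threshold that maximizes between-class variance of the intensity histogram. A minimal pure-Python sketch of Otsu's method (the paper presumably uses a library implementation; this shows the computation itself):

```python
def otsu_threshold(hist):
    """Return the gray level maximizing between-class variance for a
    256-bin intensity histogram (Otsu's method), as used to segment
    the meter image before the circular Hough transform."""
    total = sum(hist)
    sum_all = sum(i * h for i, h in enumerate(hist))
    sum_b = 0.0    # cumulative intensity mass of the background class
    w_b = 0        # background pixel count
    best_t, best_var = 0, -1.0
    for t in range(256):
        w_b += hist[t]
        if w_b == 0:
            continue
        w_f = total - w_b
        if w_f == 0:
            break
        sum_b += t * hist[t]
        m_b = sum_b / w_b                  # background mean
        m_f = (sum_all - sum_b) / w_f      # foreground mean
        var_between = w_b * w_f * (m_b - m_f) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t
```

For a bimodal histogram the returned threshold separates the two modes, splitting the enhanced image into the two parts the Hough transform is then run on.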


2020 ◽  
Author(s):  
Khandaker Foysal Haque ◽  
Rifat Zabin ◽  
Kumar Yelamarthi ◽  
Prasanth Yanambaka ◽  
Ahmed Abdelgawad

Waste collection and management is an integral part of both city and village life. The lack of an optimized and efficient waste collection system adversely affects public health and increases costs, and the prevailing traditional waste collection system is neither optimized nor efficient. The Internet of Things (IoT) has played a great role in making human life easier by making systems smart, adequate, and self-sufficient. Thus, this paper proposes an IoT-based efficient waste collection system with smart bins. It monitors the waste bins in real time and determines which bins are to be emptied in each cycle of waste collection. The system also presents an enhanced navigation system that shows the best route for collecting waste from the selected bins. Four waste bins are assumed at random locations in the city of Mount Pleasant, Michigan. In the assumed scenario, the proposed system decreases the travel distance by 30.76% on average compared to the traditional waste collection system, reducing fuel cost and human labor and making the system optimized and efficient through real-time monitoring and enhanced navigation.
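With only a handful of full bins per cycle, the best-route computation described above can be done by brute force. A minimal sketch, assuming Euclidean distances and a depot-to-depot tour (the paper's actual routing method is not specified in the abstract):

```python
import math
from itertools import permutations

def best_route(depot, bins):
    """Brute-force shortest depot -> bins -> depot tour over the bins
    that smart-bin telemetry flagged as full. Exponential in the number
    of bins, which is fine for the four-bin scenario considered here."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    best_order, best_len = None, float("inf")
    for order in permutations(bins):
        stops = [depot, *order, depot]
        length = sum(dist(stops[i], stops[i + 1])
                     for i in range(len(stops) - 1))
        if length < best_len:
            best_order, best_len = order, length
    return best_order, best_len

# Example: two full bins due north of the depot.
order, length = best_route((0, 0), [(0, 1), (0, 2)])
```

For larger fleets a heuristic (e.g. nearest-neighbor with 2-opt) would replace the exhaustive search.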


Sensors ◽  
2021 ◽  
Vol 21 (18) ◽  
pp. 6096
Author(s):  
Ash Wan Yaw Sang ◽  
Chee Gen Moo ◽  
S. M. Bhagya P. Samarakoon ◽  
M. A. Viraj J. Muthugala ◽  
Mohan Rajesh Elara

During a viral outbreak, such as COVID-19, autonomously operated robots are in high demand. Robots can effectively address the problem of contaminated surfaces in public spaces, such as airports, public transport areas, and hospitals, that are considered high-risk areas. Walls make up much of the indoor area in these public spaces and can easily become contaminated. Wall cleaning and disinfection processes are therefore critical for managing and mitigating the spread of viruses. Consequently, wall cleaning robots are preferred to address these demands. A wall cleaning robot needs to maintain a close and consistent distance from a given wall during cleaning and disinfection. In this paper, a reconfigurable wall cleaning robot with autonomous wall-following ability is proposed. The robot platform, Wasp, possesses inter-reconfigurability, which enables it to be physically reconfigured into a wall-cleaning robot. The wall-following ability has been implemented using a Fuzzy Logic System (FLS). The design of the robot and the FLS are presented in the paper. The platform and the FLS are tested and validated in several test cases. The experimental outcomes validate the real-world applicability of the proposed wall-following method for a wall cleaning robot.
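A fuzzy-logic wall follower of the kind described above can be sketched as a zero-order Sugeno controller: fuzzify the wall-distance error, fire a small rule base, and defuzzify by weighted average. The membership ranges, rule outputs, and sign convention below are illustrative assumptions, not Wasp's actual FLS.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b,
    falling to zero at c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def wall_follow_steering(dist_error):
    """Map the error between measured and desired wall distance (m) to a
    steering command in [-1, 1] via three rules:
    too close -> steer away, ok -> hold, too far -> steer toward wall."""
    mu = {
        "too_close": tri(dist_error, -0.30, -0.15, 0.0),
        "ok":        tri(dist_error, -0.15, 0.0, 0.15),
        "too_far":   tri(dist_error, 0.0, 0.15, 0.30),
    }
    out = {"too_close": -1.0, "ok": 0.0, "too_far": 1.0}
    num = sum(mu[k] * out[k] for k in mu)
    den = sum(mu.values())
    return num / den if den else 0.0
```

Overlapping memberships make the command vary smoothly with the error, which is what keeps the robot at a close and consistent distance instead of oscillating against a hard threshold.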

