A Robotic Prototype for Spraying and Pollinating Date Palm Trees

Author(s):  
Amir Shapiro ◽  
Eran Korkidi ◽  
Amit Rotenberg ◽  
Gilad Furst ◽  
Harel Namdar ◽  
...  

Spraying and pollinating date palm trees is currently done manually by a team of three workers from a platform lifted 18 meters or more above the ground. This method is extremely unsafe, and many accidents have occurred due to the platform's lack of stability in the raised position. Alternatively, date clusters are occasionally sprayed by a large pressurized sprayer directly from the ground, a method that is highly unselective and environmentally harmful. In this paper we present the concept of an automatic apparatus that can effectively and accurately spray and pollinate date clusters from a robotic system mounted on a standard tractor operated by a single driver. The apparatus consists of a robotic arm and a computer-controlled sprayer, guided by a computer vision system that detects and localizes date clusters with a camera. This system will minimize the risk of injury, significantly reduce manpower (from three workers to one per team), and deliver the spray with maximum accuracy, thereby reducing chemical exposure. A small-scale prototype has been built and is currently undergoing preliminary experiments.
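To make the vision step concrete, the sketch below shows one plausible way a camera-guided system could localize date clusters by colour before handing targets to the arm. The HSV thresholds, the function name and the minimum blob area are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of colour-based date-cluster detection (illustrative only;
# thresholds and camera setup are assumptions, not the authors' method).
import cv2
import numpy as np

def detect_clusters(frame_bgr, min_area=500):
    """Return pixel centroids of candidate date clusters in a camera frame."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Assumed yellow/orange hue range for date clusters.
    mask = cv2.inRange(hsv, (15, 80, 80), (35, 255, 255))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    centroids = []
    for c in contours:
        if cv2.contourArea(c) < min_area:
            continue
        m = cv2.moments(c)
        centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return centroids  # would then be mapped to arm coordinates for spraying
```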

2017 ◽  
Vol 23 (4) ◽  
pp. 653-659 ◽  
Author(s):  
Rafael Vidal Aroca ◽  
Carlos E.H. Ventura ◽  
Igor De Mello ◽  
Tatiana F.P.A.T. Pazelli

Purpose - This paper aims to present a monitoring system and the use of a robotic arm to remove finished parts from a three-dimensional (3D) printer build plate, enabling 3D printers to build a sequence of parts continuously.
Design/methodology/approach - The system relies on a 2-degree-of-freedom planar manipulator. The moment to remove printed parts from the build plate can be determined either through direct communication with the 3D printer control software or from a computer vision system that applies background subtraction and Speeded-Up Robust Features (SURF) methods.
Findings - The proposed system automatically detects the end of standard 3D print jobs and controls the robotic arm to remove the part.
Research limitations/implications - Lighting variation can degrade the response of the computer vision system, which can be minimized with a controlled illumination environment. In addition, the edges of the build plate must be free so that parts can slip off the plate when the robot pushes them out.
Practical implications - The system enables more practical and automated use of 3D printers, reducing the need for human operators.
Social implications - The proposed system can reduce the working hours of laboratory personnel, as printed parts no longer need to be removed manually before another job starts.
Originality/value - A computer vision system monitors the printing process, and the automation system enables continuous sequential 3D printing of parts. A prototype is described that can be easily replicated with low-cost parts.
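The following sketch illustrates the kind of end-of-print check the vision path could perform, combining a background subtractor with feature matching against a reference image of the empty plate. It assumes a fixed camera over the build plate, and it uses ORB as a freely available stand-in for the SURF features named in the paper; the thresholds are placeholders.

```python
# Sketch of a vision-based end-of-print check (assumptions: fixed camera over
# the plate; ORB substituted for SURF; thresholds are illustrative).
import cv2

bg_sub = cv2.createBackgroundSubtractorMOG2(history=200, detectShadows=False)
orb = cv2.ORB_create(nfeatures=500)
bf = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def plate_has_part(frame_gray, empty_plate_gray, fg_threshold=0.02):
    """Heuristic: foreground pixels plus few matches to the empty-plate
    reference suggest a finished part is sitting on the build plate.
    The subtractor is assumed to have been fed frames during printing."""
    fg_mask = bg_sub.apply(frame_gray)
    fg_ratio = cv2.countNonZero(fg_mask) / fg_mask.size
    kp1, des1 = orb.detectAndCompute(empty_plate_gray, None)
    kp2, des2 = orb.detectAndCompute(frame_gray, None)
    matches = bf.match(des1, des2) if des1 is not None and des2 is not None else []
    return fg_ratio > fg_threshold and len(matches) < 0.5 * max(len(kp1), 1)
```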


Author(s):  
Joy Iong-Zong Chen ◽  
Jen-Ting Chang

This paper proposes a robotic arm, fabricated with a 3D printer, that combines a computer vision system with a tracking algorithm. It also presents the design of an intelligent vehicle system with integrated electromechanics, planned for application to operations in various fields. The main purpose of this work is to avoid the complicated process of traditional manual adjustment or teaching. The goal is for the robotic arm to grasp the target automatically, classify it and place it in a specified area, and, through training, learn to distinguish the target's characteristics accurately. The arm's movement is corrected through a real-time image-feedback control system. In the experiments, the computer vision system assists the robotic arm in detecting the colour and position of the target. By adding colour features for algorithm training and through human-machine collaboration, the results show that the accuracy of target tracking depends on two parameters, the object's location and the illumination direction of the light source, with accuracies ranging from 75.2% to 89.0%.
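As a rough illustration of the real-time image-feedback correction described above, the snippet below shows a minimal proportional visual-servoing step: the pixel error between the detected target and the detected gripper nudges the arm's planar position. The gain, the camera-to-arm alignment and the interfaces are assumptions, not the authors' controller.

```python
# Minimal sketch of an image-feedback correction loop (proportional visual
# servoing; the gain and camera/arm interfaces are assumed, not the paper's).
import numpy as np

KP = 0.002  # assumed proportional gain, metres of arm motion per pixel of error

def correct_arm(target_px, gripper_px, arm_xy):
    """Nudge the arm's (x, y) position towards the target seen by the camera."""
    error_px = np.asarray(target_px, float) - np.asarray(gripper_px, float)
    # Assumes the image x/y axes are roughly parallel to the arm x/y axes.
    return np.asarray(arm_xy, float) + KP * error_px

# Usage: arm_xy = correct_arm(detected_target_px, detected_gripper_px, arm_xy)
```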


2021 ◽  
pp. 105084
Author(s):  
Bojana Milovanovic ◽  
Ilija Djekic ◽  
Jelena Miocinovic ◽  
Bartosz G. Solowiej ◽  
Jose M. Lorenzo ◽  
...  

Sensors ◽  
2021 ◽  
Vol 21 (2) ◽  
pp. 343
Author(s):  
Kim Bjerge ◽  
Jakob Bonde Nielsen ◽  
Martin Videbæk Sepstrup ◽  
Flemming Helsing-Nielsen ◽  
Toke Thomas Høye

Insect monitoring methods are typically very time-consuming and involve substantial investment in species identification following manual trapping in the field. Insect traps are often only serviced weekly, resulting in low temporal resolution of the monitoring data, which hampers the ecological interpretation. This paper presents a portable computer vision system capable of attracting and detecting live insects. More specifically, the paper proposes detection and classification of species by recording images of live individuals attracted to a light trap. An Automated Moth Trap (AMT) with multiple light sources and a camera was designed to attract and monitor live insects during twilight and night hours. A computer vision algorithm referred to as Moth Classification and Counting (MCC), based on deep learning analysis of the captured images, tracked and counted the number of insects and identified moth species. Observations over 48 nights resulted in the capture of more than 250,000 images with an average of 5675 images per night. A customized convolutional neural network was trained on 2000 labeled images of live moths represented by eight different classes, achieving a high validation F1-score of 0.93. The algorithm measured an average classification and tracking F1-score of 0.71 and a tracking detection rate of 0.79. Overall, the proposed computer vision system and algorithm showed promising results as a low-cost solution for non-destructive and automatic monitoring of moths.
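For readers interested in the classification stage, the sketch below shows a small convolutional network for eight moth classes, broadly in the spirit of the customized CNN described above; the layer sizes, input resolution and framework choice are assumptions, not the published MCC architecture.

```python
# Hedged sketch of a small CNN for eight moth classes (architecture and
# sizes are illustrative assumptions, not the paper's customized network).
import torch
import torch.nn as nn

class MothCNN(nn.Module):
    def __init__(self, num_classes=8):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x):  # x: (batch, 3, H, W) crops of tracked insects
        return self.classifier(self.features(x).flatten(1))

# Usage: class_probs = MothCNN()(torch.randn(1, 3, 128, 128)).softmax(dim=1)
```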


Metals ◽  
2021 ◽  
Vol 11 (3) ◽  
pp. 387
Author(s):  
Martin Choux ◽  
Eduard Marti Bigorra ◽  
Ilya Tyapin

The rapidly growing deployment of Electric Vehicles (EVs) puts strong demands not only on the development of Lithium-Ion Batteries (LIBs) but also on their dismantling process, a necessary step for the circular economy. The aim of this study is therefore to develop an autonomous task planner for dismantling EV Lithium-Ion Battery packs to module level through the design and implementation of a computer vision system. This research contributes to moving closer towards fully automated robotic dismantling of EV batteries, an inevitable step for a sustainable transition to an electric economy. The main functions of the proposed task planner are to identify LIB components and their locations, to create a feasible dismantling plan, and to move the robot to the detected dismantling positions. Results show that the proposed method has measurement errors lower than 5 mm and that the system performs all the steps in order, with a total average time of 34 s. Computer vision, robotics and battery disassembly have been successfully unified, resulting in a designed and tested task planner well suited for products with large variations and uncertainties.
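The sketch below illustrates only the planning step: detected components with robot-frame positions are ordered into a dismantling sequence by an assumed precedence (cover before busbars before modules) and turned into move targets. The component names, the precedence rule and the move_robot_to helper are hypothetical, not the authors' planner.

```python
# Illustrative task-planning sketch; labels, precedence and move_robot_to
# are assumptions for demonstration, not the paper's implementation.
from dataclasses import dataclass

PRECEDENCE = {"cover": 0, "busbar": 1, "module": 2}  # assumed dismantling order

@dataclass
class Detection:
    label: str             # component class reported by the vision system
    position_mm: tuple     # (x, y, z) in the robot frame

def plan_dismantling(detections):
    """Return detections sorted into a feasible dismantling sequence."""
    return sorted(detections, key=lambda d: PRECEDENCE.get(d.label, 99))

# Usage:
# plan = plan_dismantling([Detection("module", (120, 40, 15)),
#                          Detection("cover", (0, 0, 60))])
# for step in plan:
#     move_robot_to(step.position_mm)   # move_robot_to is hypothetical
```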


2019 ◽  
Vol 82 (1) ◽  
Author(s):  
Edicley Vander Machado ◽  
Priscila Cardoso Cristovam ◽  
Denise de Freitas ◽  
José Álvaro Pereira Gomes ◽  
Vagner Rogério dos Santos
