A 3D Computer Vision System for Automatic Detection of Sheep Standing and Lying Behaviour

2018 ◽ Author(s): Keni Ren, Grete H.M. Jørgensen, Inger Hansen, Johannes Karlsson
2000 ◽ Author(s): Xiang Peng, Weiqiang Shi, Zonghua Zhang, Xiaotang Hu

Robotica ◽ 1995 ◽ Vol 13 (4) ◽ pp. 327-337 ◽ Author(s): B. Preising, T. C. Hsia

Summary: Present-day robot systems are manufactured to perform within industry-accepted tolerances. However, to use such systems for tasks requiring high precision, various methods of robot calibration are generally required. These procedures can improve the accuracy of a robot within a small volume of the robot's workspace. The objective of this paper is to demonstrate the use of a single-camera 3D computer vision system as a position sensor in order to perform robot calibration. A vision feedback scheme, termed Vision-guided Robot Control (VRC), is described which can improve the accuracy of a robot in an on-line iterative manner. This system demonstrates the advantage that can be achieved by a Cartesian-space robot control scheme when end-effector position/orientation are actually sensed instead of calculated from the kinematic equations. The degree of accuracy is determined by setting a tolerance level for each of the six robot Cartesian-space coordinates. In general, a small tolerance level requires a large number of iterations to position the end effector, while a large tolerance level requires fewer. The viability of using a vision system for robot calibration is demonstrated by showing experimentally that the accuracy of a robot can be drastically improved. In addition, the vision system can also be used to determine the repeatability and accuracy of a robot in a simple, efficient, and quick manner. Experimental work with an IBM Electric Drive Robot (EDR) and the proposed vision system produced a 97-fold and a 145-fold improvement in the position and orientation accuracy of the robot, respectively.
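The iterative scheme the abstract describes (command a move, sense the pose actually reached, and repeat until all six Cartesian coordinates fall within tolerance) can be sketched as below. This is a minimal illustration, not the paper's actual controller: the `sense_pose`/`command_pose` callbacks and the simple step of folding the residual error back into the command are assumptions for the sketch.

```python
import numpy as np

def vrc_iterate(sense_pose, command_pose, target, tol, max_iters=50):
    """Iteratively correct the commanded pose using sensed feedback until
    every one of the six Cartesian coordinates (x, y, z, roll, pitch, yaw)
    is within its tolerance of the target."""
    cmd = np.asarray(target, dtype=float).copy()
    pose = None
    for i in range(1, max_iters + 1):
        command_pose(cmd)            # move the robot to the commanded pose
        pose = sense_pose()          # sense the pose actually reached
        error = target - pose        # 6-vector residual error
        if np.all(np.abs(error) <= tol):
            return pose, i           # all six coordinates within tolerance
        cmd = cmd + error            # fold the residual back into the command
    return pose, max_iters
```

As the abstract notes, the tighter the tolerance vector `tol`, the more iterations this loop takes before every coordinate converges.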


2018 ◽ Vol 173 ◽ pp. 4-10 ◽ Author(s): Oron Nir, Yisrael Parmet, Daniel Werner, Gaby Adin, Ilan Halachmi

PLoS ONE ◽ 2013 ◽ Vol 8 (6) ◽ pp. e67640 ◽ Author(s): Giuseppe Bianco, Vincenzo Botte, Laurent Dubroca, Maurizio Ribera d’Alcalà, Maria Grazia Mazzocchi

2021 ◽ pp. 105084 ◽ Author(s): Bojana Milovanovic, Ilija Djekic, Jelena Miocinovic, Bartosz G. Solowiej, Jose M. Lorenzo, ...

Sensors ◽ 2021 ◽ Vol 21 (2) ◽ pp. 343 ◽ Author(s): Kim Bjerge, Jakob Bonde Nielsen, Martin Videbæk Sepstrup, Flemming Helsing-Nielsen, Toke Thomas Høye

Insect monitoring methods are typically very time-consuming and involve substantial investment in species identification following manual trapping in the field. Insect traps are often only serviced weekly, resulting in low temporal resolution of the monitoring data, which hampers the ecological interpretation. This paper presents a portable computer vision system capable of attracting and detecting live insects. More specifically, the paper proposes detection and classification of species by recording images of live individuals attracted to a light trap. An Automated Moth Trap (AMT) with multiple light sources and a camera was designed to attract and monitor live insects during twilight and night hours. A computer vision algorithm referred to as Moth Classification and Counting (MCC), based on deep learning analysis of the captured images, tracked and counted the number of insects and identified moth species. Observations over 48 nights resulted in the capture of more than 250,000 images with an average of 5675 images per night. A customized convolutional neural network was trained on 2000 labeled images of live moths represented by eight different classes, achieving a high validation F1-score of 0.93. The algorithm measured an average classification and tracking F1-score of 0.71 and a tracking detection rate of 0.79. Overall, the proposed computer vision system and algorithm showed promising results as a low-cost solution for non-destructive and automatic monitoring of moths.
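The F1-scores quoted above (0.93 for validation, 0.71 for classification and tracking) combine precision and recall into a single metric. A minimal sketch of the computation from true-positive, false-positive, and false-negative counts (the counts in the example are illustrative, not the paper's data):

```python
def f1_score(tp, fp, fn):
    """Harmonic mean of precision and recall from detection counts."""
    precision = tp / (tp + fp)  # fraction of predictions that are correct
    recall = tp / (tp + fn)     # fraction of true instances that are found
    return 2 * precision * recall / (precision + recall)

# Illustrative counts: 8 correct detections, 2 false alarms, 2 missed moths
print(f1_score(8, 2, 2))  # ≈ 0.8
```

A single F1 value summarizes the trade-off the paper faces: a trap that counts every passing insect as a moth inflates recall at the cost of precision, and F1 penalizes both failure modes.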

