Methodology for Olive Pruning Windrow Assessment Using 3D Time-of-Flight Camera

Agronomy ◽  
2021 ◽  
Vol 11 (6) ◽  
pp. 1209
Author(s):  
Francisco J. Castillo-Ruiz ◽  
Jose T. Colmenero-Martinez ◽  
Sergio Bayano-Tejero ◽  
Emilio J. Gonzalez-Sanchez ◽  
Francisco M. Lara ◽  
...  

The management of olive pruning residue has shifted from burning to shredding, laying residues on the soil, or harvesting residues for use as a derivative. The objective of this research is to develop, test, and validate a methodology to measure the dimensions, outline, and bulk volume of pruning residue windrows in olive orchards using both manual measurements and a 3D Time-of-Flight (ToF) camera. Trees were pruned using trunk-shaker-targeted pruning, from which two different branch sizes were selected to build two separate windrow treatments with the same pruning residue dose. Four windrows were built for each treatment, and four sampling points were selected along each windrow to take both manual and 3D ToF measurements. The windrow section outline could be defined using a polynomial or a triangular function, although manual measurements required processing with a polynomial function, especially for high windrow volumes. Different branch sizes produced significant differences in the polynomial function coefficients, while no significant differences were found for windrow width. Bigger branches yielded less bulk volume, which implied that these branches formed less porous windrows than smaller ones. Finally, manual and 3D ToF camera measurements were validated, showing adequate performance for in-field assessment of olive pruning residue windrows.
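As an illustration of the outline-fitting step described above, the following minimal sketch (hypothetical measurement values, NumPy only; not the authors' code) fits a second-order polynomial to manually measured cross-section points and integrates it to obtain a section area, which multiplied by windrow length would give bulk volume:

```python
import numpy as np

# Hypothetical manually measured windrow cross-section points:
# x = lateral position (m), z = windrow height (m).
x = np.array([-0.8, -0.6, -0.4, -0.2, 0.0, 0.2, 0.4, 0.6, 0.8])
z = np.array([0.00, 0.15, 0.38, 0.55, 0.62, 0.57, 0.40, 0.18, 0.02])

# Fit a second-order polynomial describing the section outline.
coeffs = np.polyfit(x, z, deg=2)
outline = np.poly1d(coeffs)

# Integrate the fitted outline between its roots (trapezoid rule)
# to estimate the cross-section area.
roots = np.sort(outline.roots)
xs = np.linspace(roots[0], roots[1], 200)
zs = np.clip(outline(xs), 0.0, None)
area = np.sum((zs[1:] + zs[:-1]) / 2 * np.diff(xs))
print(f"fitted coefficients: {coeffs}, section area ~ {area:.3f} m^2")
```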

Proceedings ◽  
2018 ◽  
Vol 2 (13) ◽  
pp. 1056
Author(s):  
Marcus Baumgart ◽  
Norbert Druml ◽  
Markus Dielacher ◽  
Cristina Consani

Robust, fast and reliable examination of the surroundings is essential for further advancements in autonomous driving and robotics. Time-of-Flight (ToF) camera sensors are a key technology to measure surrounding objects and their distances on a pixel basis in real-time. Environmental effects, like rain in front of the sensor, can influence the distance accuracy of the sensor. Here we use an optical ray-tracing based procedure to examine the rain effect on the ToF image. Simulation results are presented for experimental rain droplet distributions, characteristic of intense rainfall at rates of 25 mm/h and 100 mm/h. The ray-tracing based simulation data and results serve as an input for developing and testing rain signal suppression strategies.
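For context, droplet populations for such simulations are often generated from an empirical drop-size model; the paper itself uses experimental distributions, so the sketch below is only an assumption, sampling diameters from the classic Marshall-Palmer distribution for the two rain rates mentioned:

```python
import numpy as np

def sample_droplet_diameters(rain_rate_mm_h, n_droplets, rng=None):
    """Sample raindrop diameters (mm) from the Marshall-Palmer
    distribution N(D) = N0 * exp(-lam * D), with
    lam = 4.1 * R**-0.21 (1/mm) for rain rate R in mm/h."""
    rng = rng or np.random.default_rng()
    lam = 4.1 * rain_rate_mm_h ** -0.21
    # An exponential density in D corresponds directly to this model.
    return rng.exponential(scale=1.0 / lam, size=n_droplets)

# Droplet populations for the two rain rates used in the paper.
for rate in (25.0, 100.0):
    d = sample_droplet_diameters(rate, n_droplets=10_000)
    print(f"{rate:5.0f} mm/h: mean diameter {d.mean():.2f} mm")
```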


2020 ◽  
Vol 17 (4) ◽  
pp. 172988142094237
Author(s):  
Yu He ◽  
Shengyong Chen

The developing time-of-flight (TOF) camera is an attractive device for robot vision systems to capture real-time three-dimensional (3D) images, but the sensor suffers from low image resolution and precision. This article proposes an approach for the automatic generation of an imaging model in 3D space for error correction. From observation data, an initial coarse model of the depth image can be obtained for each TOF camera. Its accuracy is then improved by an optimization method. Experiments were carried out using three TOF cameras. The results show that accuracy is dramatically improved by the spatial correction model.
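A minimal sketch of the general idea (hypothetical per-pixel polynomial model fitted to reference data; not the authors' correction model) is shown below: a coarse correction is fitted per pixel from observed versus reference depths and then applied to new frames:

```python
import numpy as np

def fit_depth_correction(measured, reference, deg=2):
    """Fit a per-pixel polynomial mapping measured TOF depth to
    reference (ground-truth) depth.
    measured, reference: arrays of shape (n_samples, H, W).
    The pixel loop is slow but keeps the sketch explicit."""
    n, h, w = measured.shape
    coeffs = np.empty((deg + 1, h, w))
    for i in range(h):
        for j in range(w):
            coeffs[:, i, j] = np.polyfit(measured[:, i, j],
                                         reference[:, i, j], deg)
    return coeffs

def correct_depth(frame, coeffs):
    """Apply the fitted per-pixel correction to one depth frame."""
    deg = coeffs.shape[0] - 1
    # np.polyfit orders coefficients from the highest power down.
    powers = np.stack([frame ** (deg - k) for k in range(deg + 1)])
    return np.sum(coeffs * powers, axis=0)
```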


Sensors ◽  
2021 ◽  
Vol 21 (2) ◽  
pp. 664
Author(s):  
Zhihong Ma ◽  
Dawei Sun ◽  
Haixia Xu ◽  
Yueming Zhu ◽  
Yong He ◽  
...  

Three-dimensional (3D) structure is an important morphological trait of plants for describing their growth and biotic/abiotic stress responses. Various methods have been developed for obtaining 3D plant data, but data quality and equipment costs are the main factors limiting their development. Here, we propose a method to improve the quality of 3D plant data acquired with the time-of-flight (TOF) camera Kinect V2. A k-dimensional (k-d) tree was applied to establish spatial topological relationships for point searching. Background noise points were then removed with a minimum oriented bounding box (MOBB) combined with a pass-through filter, while outliers and flying-pixel points were removed based on viewpoints and surface normals. After smoothing with a bilateral filter, the 3D plant data were registered and meshed. We adjusted the mesh patches to eliminate layered points. The results showed that the adjusted patches lay closer together: the average distance between patches was 1.88 × 10⁻³ m and the average angle was 17.64°, which were 54.97% and 48.33% of the values before optimization, respectively. The proposed method performed better in reducing noise and the local layered-points phenomenon, and it can help to determine 3D structure parameters more accurately from point clouds and mesh models.
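One step of such a pipeline, removing flying pixels by comparing surface normals against the viewing ray, can be sketched with Open3D as below (parameter values such as the search radius and angle threshold are assumptions, not the authors' settings):

```python
import numpy as np
import open3d as o3d

def remove_flying_pixels(pcd, camera=np.zeros(3), max_angle_deg=80.0):
    """Discard likely ToF flying pixels: points whose surface normal is
    nearly perpendicular to the viewing ray from the camera.
    A sketch of one pipeline step, not the authors' code."""
    pcd.estimate_normals(
        search_param=o3d.geometry.KDTreeSearchParamHybrid(radius=0.02,
                                                          max_nn=30))
    pcd.orient_normals_towards_camera_location(camera)
    pts = np.asarray(pcd.points)
    normals = np.asarray(pcd.normals)
    rays = camera - pts
    rays /= np.linalg.norm(rays, axis=1, keepdims=True)
    cos_angle = np.abs(np.sum(rays * normals, axis=1))
    keep = cos_angle > np.cos(np.deg2rad(max_angle_deg))
    return pcd.select_by_index(np.where(keep)[0])

# Typical use on a Kinect V2 frame already converted to a point cloud:
# pcd, _ = pcd.remove_statistical_outlier(nb_neighbors=20, std_ratio=2.0)
# pcd = remove_flying_pixels(pcd)
```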


Author(s):  
C. Altuntas ◽  
F. Turkmen ◽  
A. Ucar ◽  
Y. A. Akgul

Biomedical applications generally require measuring human body parts in motion. Analysis of human motion, in turn, involves mobile measurements. Mobile measurement is a complicated task because it needs a combination of two or more sensors, specific measurement techniques, and heavy computation. It is therefore an active research topic in the photogrammetry and computer science communities. A time-of-flight (ToF) camera can measure moving objects and can be used for robotics and simultaneous localization and mapping applications. Human motion capture is a recent application area for ToF cameras. In this study, body motion was analysed with a time-of-flight camera. We measured a runner on a treadmill and analysed the motion by computing the angles between body parts.
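A minimal sketch of the angle computation between body segments from 3D joint coordinates (hypothetical joint positions, NumPy only):

```python
import numpy as np

def segment_angle(joint_a, joint_b, joint_c):
    """Angle (degrees) at joint_b between segments b->a and b->c,
    e.g. the knee angle from hip, knee, and ankle 3D coordinates."""
    u = np.asarray(joint_a) - np.asarray(joint_b)
    v = np.asarray(joint_c) - np.asarray(joint_b)
    cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

# Hypothetical ToF-derived joint positions (m) for one frame of a runner:
hip, knee, ankle = [0.0, 1.0, 2.5], [0.05, 0.55, 2.45], [0.02, 0.10, 2.55]
print(f"knee angle: {segment_angle(hip, knee, ankle):.1f} deg")
```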


Author(s):  
C. Heinkelé ◽  
M. Labbé ◽  
V. Muzet ◽  
P. Charbonnier

3D cameras based on Time-of-Flight (ToF) technology have recently reached a commercial level of development. In this contribution, we investigate the outdoor calibration and measurement capabilities of the SR4500 ToF camera. The proposed calibration method combines up-to-date techniques with robust estimation. First, the intrinsic camera parameters are estimated, which allows converting radial distances into orthogonal ones. The latter are then calibrated using successive acquisitions of a plane at different camera positions, measured by tacheometric techniques. This distance calibration step estimates two coefficient matrices for each pixel using linear regression. Experimental assessments carried out with a 3D laser point cloud, after converting all the data into a common reference frame, show that the obtained precision is twice as good as with the manufacturer's default calibration, with a full-frame accuracy of about 4 cm. Moreover, estimating the internal calibration in sunny and warm outdoor conditions yields almost the same coefficients as indoors. Finally, a test shows the feasibility of dynamic outdoor acquisitions and measurements.
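The two ideas described above can be sketched as follows, under assumed pinhole intrinsics (this is an illustration, not the authors' implementation): converting per-pixel radial distances to orthogonal ones, then applying a per-pixel linear correction whose coefficient matrices were fitted by regression:

```python
import numpy as np

def radial_to_orthogonal(radial, fx, fy, cx, cy):
    """Convert per-pixel radial ToF distances into orthogonal (z)
    distances using pinhole intrinsics:
    z = r / |(x', y', 1)| with x' = (u - cx) / fx, y' = (v - cy) / fy."""
    h, w = radial.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    ray_norm = np.sqrt(((u - cx) / fx) ** 2 + ((v - cy) / fy) ** 2 + 1.0)
    return radial / ray_norm

def apply_linear_calibration(z, a, b):
    """Per-pixel linear distance correction z_cal = a*z + b, where the
    coefficient matrices a and b (one value per pixel) were fitted by
    linear regression against tacheometrically measured plane positions."""
    return a * z + b
```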


Author(s):  
L. Hoegner ◽  
A. Hanel ◽  
M. Weinmann ◽  
B. Jutzi ◽  
S. Hinz ◽  
...  

Obtaining accurate 3D descriptions in the thermal infrared (TIR) is a challenging task due to the low geometric resolution of TIR cameras and the low number of strong features in TIR images. Combining the radiometric information of the thermal infrared with 3D data from another sensor can overcome most of these limitations in 3D geometric accuracy. For dynamic scenes with moving objects or a moving sensor system, a combination with RGB cameras or Time-of-Flight (TOF) cameras is suitable. Because a TOF camera is an active sensor in the near infrared (NIR) and the thermal infrared camera captures the radiation emitted by the objects in the observed scene, the combination of these two sensors for close-range applications is independent of external illumination or textures in the scene. This article focuses on the fusion of data acquired with both a time-of-flight (TOF) camera and a thermal infrared (TIR) camera. As the radiometric behaviour of many objects differs between the near infrared used by the TOF camera and the thermal infrared spectrum, a direct co-registration with feature points in both intensity images leads to a high number of outliers. A fully automatic workflow is presented for the geometric calibration of both cameras and the relative orientation of the camera system, using one calibration pattern usable in both spectral bands. Based on the relative orientation, a fusion of the TOF depth image and the TIR image is used for scene segmentation and people detection. An adaptive histogram-based depth-level segmentation of the 3D point cloud is combined with a thermal-intensity-based segmentation. The feasibility of the proposed method is demonstrated in an experimental setup with different geometric and radiometric influences that shows the benefit of combining TOF intensity and depth images with thermal infrared images.
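A minimal histogram-based depth-level segmentation in the spirit of the one described can be sketched as below (the bin width and valley-detection approach are assumptions, not the authors' adaptive scheme):

```python
import numpy as np
from scipy.signal import find_peaks

def depth_level_segmentation(depth, bin_width=0.05):
    """Split a ToF depth image into depth levels by finding valleys in
    the depth histogram and labelling each pixel with its level index."""
    valid = depth[depth > 0]
    bins = np.arange(valid.min(), valid.max() + bin_width, bin_width)
    hist, edges = np.histogram(valid, bins=bins)
    # Local minima of the histogram separate the depth levels.
    valleys, _ = find_peaks(-hist)
    boundaries = edges[valleys + 1]
    labels = np.digitize(depth, boundaries)
    labels[depth <= 0] = -1  # mark invalid pixels
    return labels
```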


Author(s):  
Sarah Morris ◽  
Ari Goldman ◽  
Brian Thurow

Time-of-Flight (ToF) cameras are a type of range-imaging camera that provides three-dimensional scene information from a single camera. This paper assesses the suitability of ToF technology for three-dimensional particle tracking velocimetry (3D-PTV). Using a commercially available ToF camera, various aspects of 3D-PTV are considered, including the minimum resolvable particle size, environmental factors (reflections and refractive index changes), and time resolution. Although an off-the-shelf ToF camera is found not to be a viable alternative to traditional 3D-PTV measurement systems, basic 3D-PTV measurements are demonstrated with large (6 mm) particles in both air and water to show the potential of this technology as it develops. A summary of the necessary technological advances is also given.
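For reference, the core matching step of a basic 3D-PTV scheme can be sketched as a nearest-neighbour link between particle positions in consecutive frames (a common baseline, not necessarily the authors' method; the displacement threshold is an assumption):

```python
import numpy as np
from scipy.spatial import cKDTree

def match_particles(frame_a, frame_b, max_disp=0.01):
    """Link 3D particle positions (n, 3 arrays, metres) between two
    consecutive frames by nearest-neighbour search, rejecting matches
    whose displacement exceeds max_disp. Returns index pairs (i_a, i_b)."""
    tree = cKDTree(frame_b)
    dist, idx = tree.query(frame_a, k=1)
    ok = dist <= max_disp
    return np.column_stack([np.where(ok)[0], idx[ok]])

# Velocity vectors follow from matched positions and the frame interval dt:
# pairs = match_particles(p0, p1)
# v = (p1[pairs[:, 1]] - p0[pairs[:, 0]]) / dt
```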


Sensors ◽  
2021 ◽  
Vol 21 (9) ◽  
pp. 2959
Author(s):  
Johanna Gleichauf ◽  
Sven Herrmann ◽  
Lukas Hennemann ◽  
Hannes Krauss ◽  
Janina Nitschke ◽  
...  

This paper introduces an automatic non-contact monitoring method, based on the synchronous evaluation of a 3D time-of-flight (ToF) camera and a microwave interferometric radar sensor, for measuring the respiratory rate of neonates. Current monitoring in the Neonatal Intensive Care Unit (NICU) has several issues that can cause pressure marks, skin irritation, and eczema. To minimize these risks, a non-contact system composed of a 3D time-of-flight camera and a microwave interferometric radar sensor is presented. The 3D time-of-flight camera delivers 3D point clouds from which the change in distance of the moving chest, and from it the respiratory rate, can be calculated. The disadvantage of the ToF camera is that the heartbeat cannot be determined. The microwave interferometric radar sensor determines the change in displacement caused by respiration and is even capable of measuring the small superimposed movements due to the heartbeat. However, the radar sensor is very sensitive to movement artifacts caused by, e.g., the baby moving its arms. To allow robust vital-parameter detection, the data of both sensors were evaluated synchronously. In this publication, we focus on the first step: determining the respiratory rate. After all processing steps, the respiratory rate determined by the radar sensor was compared to the value obtained from the 3D time-of-flight camera. The method was validated against our gold standard, a self-developed neonatal simulation system that can simulate different breathing patterns. We show that we are the first to determine the respiratory rate by synchronously evaluating the data of an interferometric microwave radar sensor and a ToF camera. Our system delivers very precise breaths-per-minute (BPM) values within the norm range of 20–60 BPM, with a maximum difference of 3 BPM (for the ToF camera itself at 30 BPM in normal mode). Especially in lower respiratory-rate regions, i.e., 5 and 10 BPM, the synchronous evaluation is required to compensate for the drawbacks of the ToF camera. In the norm range, the ToF camera performs slightly better than the radar sensor.
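The final rate-estimation step can be sketched as follows (the sampling rate, band limits, and signal model are assumptions, not the authors' pipeline): reduce the chest region to a mean distance per frame, then read the respiratory rate off the dominant spectral peak within a plausible respiration band:

```python
import numpy as np

def respiratory_rate_bpm(chest_distance, fs, band=(0.05, 1.5)):
    """Estimate breaths per minute from a chest-distance signal (m)
    sampled at fs Hz, by locating the dominant frequency within a
    plausible respiration band (here 3-90 BPM)."""
    x = chest_distance - np.mean(chest_distance)
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return 60.0 * freqs[mask][np.argmax(spectrum[mask])]

# Hypothetical 30 s recording at 30 fps with a 40 BPM breathing pattern:
fs = 30.0
t = np.arange(0, 30, 1 / fs)
signal = 0.003 * np.sin(2 * np.pi * (40 / 60) * t) + 0.4
print(f"{respiratory_rate_bpm(signal, fs):.1f} BPM")
```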

