Neuro-inspired system for real-time vision sensor tilt correction

Author(s):  
A. Jimenez-Fernandez ◽  
J.L. Fuentes-del-Bosh ◽  
R. Paz-Vicente ◽  
A. Linares-Barranco ◽  
G. Jimenez

Sensors ◽
2019 ◽
Vol 19 (7) ◽
pp. 1584


Author(s):
Yushan Li ◽  
Wenbo Zhang ◽  
Xuewu Ji ◽  
Chuanxiang Ren ◽  
Jian Wu

The lane curvature output by the vision sensor can jump abruptly over short periods because of shadows, changes in lighting, and broken lane lines, which causes serious problems for unmanned driving control. It is therefore particularly important to predict or compensate the real lane in real time while the sensor output jumps. This paper presents a lane compensation method based on multi-sensor fusion of a global positioning system (GPS), an inertial measurement unit (IMU), and vision sensors. A cubic polynomial in the longitudinal distance is selected as the lane model. A Kalman filter estimates vehicle velocity and yaw angle from GPS and IMU measurements, and a vehicle kinematics model is established to describe the vehicle motion. The geometric relationship between the vehicle and the relative lane motion at the current moment is then used to solve for the coefficients of the lane polynomial at the next moment. Simulation and vehicle test results show that the prediction can compensate for failures of the vision sensor with good real-time performance, robustness, and accuracy.
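To make the propagation step concrete, the sketch below carries cubic lane-polynomial coefficients one time step forward using the ego motion estimated from GPS/IMU. The planar kinematics, sampling range, and function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def propagate_lane(coeffs, v, yaw_rate, dt, x_range=(0.0, 50.0), n=50):
    """Carry cubic lane coefficients (c0, c1, c2, c3) one step forward.

    Lane model in the vehicle frame: y(x) = c0 + c1*x + c2*x^2 + c3*x^3,
    with x the longitudinal distance. v (m/s) and yaw_rate (rad/s) are the
    Kalman-filtered ego-motion estimates; planar motion is assumed.
    """
    c0, c1, c2, c3 = coeffs
    x = np.linspace(x_range[0], x_range[1], n)
    y = c0 + c1 * x + c2 * x**2 + c3 * x**3

    # Ego displacement and heading change over one step (straight-line approx.).
    dx, dpsi = v * dt, yaw_rate * dt

    # Re-express the sampled lane points in the next vehicle frame:
    # translate by the ego motion, then rotate by the yaw increment.
    xs, ys = x - dx, y
    c, s = np.cos(dpsi), np.sin(dpsi)
    x_new = c * xs + s * ys
    y_new = -s * xs + c * ys

    # Refit a cubic; np.polyfit returns the highest-degree coefficient first.
    c3n, c2n, c1n, c0n = np.polyfit(x_new, y_new, 3)
    return c0n, c1n, c2n, c3n
```

Calling this once per control cycle with the latest filtered velocity and yaw rate yields a predicted lane that can stand in for the vision output while it jumps.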


2018 ◽  
Vol 10 (12) ◽  
pp. 2068 ◽  
Author(s):  
Juha Suomalainen ◽  
Teemu Hakala ◽  
Raquel Alves de Oliveira ◽  
Lauri Markelin ◽  
Niko Viljanen ◽  
...  

In unstable atmospheric conditions, on-board irradiance sensors are one of the few robust ways to convert unmanned aerial vehicle (UAV)-based optical remote sensing data to reflectance factors. Such sensors normally suffer significant errors due to tilting of the UAV unless they are installed on a stabilizing gimbal. Unfortunately, gimbals of sufficient accuracy are heavy, cumbersome, and cannot be installed on all UAV platforms. In this paper, we present the FGI Aerial Image Reference System (FGI AIRS), developed at the Finnish Geospatial Research Institute (FGI), and a novel method for optical and mathematical tilt correction of the irradiance measurements. The FGI AIRS is a sensor unit for UAVs that provides the irradiance spectrum, Real Time Kinematic (RTK)/Post Processed Kinematic (PPK) GNSS position, and orientation for the attached cameras. The FGI AIRS processes the reference data in real time for each acquired image and can send it to an on-board or on-cloud processing unit. The novel correction method is based on three RGB photodiodes that are tilted 10° in opposite directions. These photodiodes sample the irradiance at different sensor tilts, from which the reading of a virtual horizontal irradiance sensor is calculated. The FGI AIRS was tested, and the method was shown to allow on-board measurement of irradiance with an accuracy better than ±0.8% at UAV tilts up to 10° and ±1.2% at tilts up to 15°. In addition, the accuracy of the FGI AIRS in producing reflectance-factor-calibrated aerial images was compared against traditional methods. In the unstable weather conditions of the experiment, both the FGI AIRS and the on-ground spectrometer produced radiometrically accurate and visually pleasing orthomosaics, while the reflectance reference panels and the on-board irradiance sensor without stabilization or tilt correction both failed to do so. The authors recommend implementing the proposed tilt correction method in all future UAV irradiance sensors that are not installed on a gimbal.
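The core of the tilt correction can be illustrated with a small sketch. Assuming a simplified two-component sky model (direct beam plus isotropic diffuse irradiance), which is an assumption of this example rather than the paper's exact formulation, the three tilted photodiode readings over-determine the two components, and the horizontal reading follows:

```python
import numpy as np

def virtual_horizontal_irradiance(readings, incidence_angles, sun_zenith):
    """Estimate what a horizontal irradiance sensor would read.

    Model (an assumption of this sketch): each tilted photodiode sees
    reading_i = E_dir * cos(incidence_i) + E_diff, where E_dir is the
    direct-beam irradiance, E_diff the isotropic diffuse irradiance, and
    incidence_i the sun incidence angle on photodiode i, computed from
    the UAV attitude and sun position. All angles are in radians.
    """
    r = np.asarray(readings, dtype=float)
    cos_inc = np.clip(np.cos(np.asarray(incidence_angles, dtype=float)), 0.0, None)

    # Three readings over-determine the two unknowns; solve by least squares.
    A = np.column_stack([cos_inc, np.ones_like(cos_inc)])
    (e_dir, e_diff), *_ = np.linalg.lstsq(A, r, rcond=None)

    # Virtual horizontal sensor: direct beam at the sun zenith angle + diffuse.
    return e_dir * np.cos(sun_zenith) + e_diff
```

Because the three photodiodes are tilted 10° in opposite directions, their incidence angles differ enough for the least-squares fit to stay well conditioned across normal UAV attitudes.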


2016 ◽  
Vol 2 (4) ◽  
pp. 187-195
Author(s):  
Ehab H. El-Shazly ◽  
Moataz M. Abdelwahab ◽  
Atsushi Shimada ◽  
Rin-ichiro Taniguchi

1993 ◽  
Vol 5 (2) ◽  
pp. 112-116
Author(s):  
Osamu Ozeki ◽
Kouichi Kogure ◽
Hiroyuki Onouchi ◽
Hideo Abe ◽
Kazunori Higuchi ◽
...  

A 3-D vision sensor, and a visual inspection system for welded beads of automotive panels using this sensor, have been developed. The items to be inspected are the depth, width, flushness, and inclination angle of the welded point. The 3-D vision sensor detects the section line of the welded bead in real time using the slit light method, and the inspection system calculates the four inspection items from the section line. The measurement accuracy of the system is ±0.02 mm for depth and flushness, ±0.01 mm for width, and ±7 deg. for the inclination angle. Moreover, the system can inspect a single welded bead in 150 ms. The system has been put into practical use in an automotive factory.
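A minimal sketch of the second stage, deriving the four inspection items from one slit-light section profile, is given below. The concrete item definitions (parent surface fitted from the profile ends, width measured at half depth) are illustrative assumptions, not the paper's exact criteria.

```python
import numpy as np

def inspect_bead(x, z, edge_pts=10):
    """Derive depth, width, flushness, and inclination from a section line.

    x, z: lateral position and height (mm) along the detected profile.
    edge_pts: number of samples at each end treated as the parent panel.
    """
    x, z = np.asarray(x, dtype=float), np.asarray(z, dtype=float)

    # Fit the parent panel surface from the profile ends; its slope gives
    # the inclination angle of the welded point.
    xe = np.concatenate([x[:edge_pts], x[-edge_pts:]])
    ze = np.concatenate([z[:edge_pts], z[-edge_pts:]])
    slope, intercept = np.polyfit(xe, ze, 1)
    inclination_deg = np.degrees(np.arctan(slope))

    # Profile height relative to the fitted surface.
    dev = z - (slope * x + intercept)
    depth = -dev.min()        # deepest indentation below the surface
    flushness = dev.max()     # highest protrusion above the surface

    # Width: lateral extent where indentation exceeds half the peak depth.
    inside = dev < -0.5 * depth
    width = float(np.ptp(x[inside])) if inside.any() else 0.0

    return depth, width, flushness, inclination_deg
```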


2008 ◽  
Vol 20 (1) ◽  
pp. 68-74 ◽  
Author(s):  
Hirotsugu Okuno ◽  
Tetsuya Yagi

A mixed analog-digital integrated vision sensor was designed to detect an approaching object in real time. To respond selectively to approaching stimuli, the sensor employed an algorithm inspired by the visual nervous system of the locust, which avoids collisions robustly by using visual information. An electronic circuit model was designed to mimic the architecture of the locust nervous system. Computer simulations showed that the model provided appropriate responses for collision avoidance. We implemented the model in a compact hardware system consisting of a silicon retina and field-programmable gate array (FPGA) circuits; the system was confirmed to respond selectively to approaching stimuli that constituted a collision threat.
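The abstract points at the locust's LGMD-style looming pathway; the toy sketch below illustrates that general model class (delayed, blurred lateral inhibition subtracted from frame-difference excitation) and is not the authors' circuit. Class name, layer sizes, and threshold are assumptions.

```python
import numpy as np
from scipy.ndimage import uniform_filter

class LoomingDetector:
    """Toy locust-inspired (LGMD-style) looming detector.

    Excitation is the frame difference; inhibition is a one-frame-delayed,
    spatially blurred copy; their rectified difference is summed and
    thresholded to flag an approaching object.
    """

    def __init__(self, shape, threshold=0.1, inhibition_size=5):
        self.prev_frame = np.zeros(shape)
        self.prev_excitation = np.zeros(shape)
        self.threshold = threshold
        self.inhibition_size = inhibition_size

    def step(self, frame):
        frame = frame.astype(float)
        excitation = np.abs(frame - self.prev_frame)             # P/E layers
        inhibition = uniform_filter(self.prev_excitation,
                                    size=self.inhibition_size)   # delayed I layer
        summation = np.clip(excitation - inhibition, 0.0, None)  # S layer
        activity = summation.mean()                              # LGMD potential
        self.prev_frame, self.prev_excitation = frame, excitation
        return activity > self.threshold, activity
```

Receding or laterally translating stimuli are largely cancelled by the delayed inhibition, while an expanding edge keeps outrunning it, which is what makes the response selective to approach.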


1995 ◽  
Vol 117 (1) ◽  
pp. 94-101
Author(s):  
J. Bassett ◽  
G. Walker

A vision sensor has been developed that uses only two lenses, a split prism, and a detector to acquire an image. This system uses the split prism to create a split image such that the displacement of the image is proportional to its range from the sensor. Prototype sensors have been examined both theoretically and experimentally, and have been found to measure object ranges with less than ±2 percent error. Acquisition of a single-point depth measurement is sufficiently fast for real-time use, and the optical components needed to build the sensor are inexpensive. The effect that each optical component has on the performance of the sensor is also discussed, and an optimal system design procedure is developed.
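Since the abstract states that the image displacement is proportional to range, a calibrated linear map plus a displacement estimate is enough for a single-point reading. The sketch below is hedged: the cross-correlation matching and the calibration constants are assumptions of this example, as the abstract does not prescribe them.

```python
import numpy as np

def split_displacement(upper, lower):
    """Lateral shift (pixels) between the two halves of the split image,
    found by cross-correlating their mean row profiles (an assumed
    matching method, not one specified by the paper)."""
    a = upper.mean(axis=0) - upper.mean()
    b = lower.mean(axis=0) - lower.mean()
    corr = np.correlate(a, b, mode="full")
    return int(corr.argmax()) - (len(b) - 1)

def displacement_to_range(disp, gain, offset):
    """Per the abstract, displacement is proportional to range, so a linear
    map suffices; gain and offset would come from imaging targets at known
    distances (hypothetical calibration constants)."""
    return gain * disp + offset
```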

