optical flow sensor
Recently Published Documents


TOTAL DOCUMENTS: 52 (five years: 14)

H-INDEX: 9 (five years: 1)

2021 ◽  
Vol 11 (22) ◽  
pp. 10607
Author(s):  
Andrea Motroni ◽  
Alice Buffi ◽  
Paolo Nepa

This article addresses the problem of determining the location of pallets carried by forklifts inside a warehouse, where pallets are recognized by an onboard Radio Frequency IDentification (RFID) system operating in the ultra-high-frequency (UHF) band. By reconstructing the forklift trajectory and orientation, the location of each pallet can be associated with the forklift position at the time of the unloading event. The localization task is accomplished by combining easy-to-deploy onboard sensors, i.e., an inertial measurement unit (IMU) and an optical flow sensor (OFS), with a commercial ultra-wideband (UWB) system through an Unscented Kalman Filter (UKF) algorithm, which estimates the forklift pose over time. The proposed sensor fusion approach mitigates localization error by preventing drift in the trajectory reconstruction. The designed method was first evaluated in a simulation framework and then through an experimental analysis conducted in a large warehouse of about 4000 m².
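The abstract does not give the filter's motion model, but a planar dead-reckoning model of the kind such a UKF would propagate can be sketched as follows. This is an illustrative assumption, not the paper's implementation: the IMU gyro supplies the yaw rate, the OFS supplies forward speed, and a real UKF would push sigma points through this model and correct the pose with UWB fixes.

```python
import numpy as np

def motion_model(state, omega_z, v_body, dt):
    """Planar forklift motion model: state = [x, y, theta].
    omega_z: yaw rate from the IMU gyro (rad/s).
    v_body:  forward speed from the optical flow sensor (m/s).
    Returns the state propagated by dt seconds."""
    x, y, theta = state
    theta_new = theta + omega_z * dt            # integrate yaw rate
    x_new = x + v_body * dt * np.cos(theta_new) # project speed onto x
    y_new = y + v_body * dt * np.sin(theta_new) # project speed onto y
    return np.array([x_new, y_new, theta_new])

# Dead-reckon a quarter turn at 1 m/s over 1 s (100 steps of 10 ms).
state = np.array([0.0, 0.0, 0.0])
for _ in range(100):
    state = motion_model(state, omega_z=np.pi / 2, v_body=1.0, dt=0.01)
```

Without the UWB correction step, any gyro or OFS bias accumulates in this integration, which is exactly the drift the paper's fusion approach is designed to prevent.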


2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Xiaochun Guan ◽  
Sheng Lou ◽  
Han Li ◽  
Tinglong Tang

Purpose
Deployment of deep neural networks on embedded devices is becoming increasingly popular because it can reduce the latency and energy consumption of data communication. This paper aims to present a method for deploying deep neural networks on a quad-rotor aircraft to further expand its application scope.

Design/methodology/approach
In this paper, a design scheme is proposed to implement the flight mission of the quad-rotor aircraft based on multi-sensor fusion. It integrates an attitude acquisition module, a global positioning system (GPS) position acquisition module, an optical flow sensor, an ultrasonic sensor, and a Bluetooth communication module. A 32-bit microcontroller is adopted as the main controller of the quad-rotor aircraft. To make the aircraft more intelligent, the study also proposes a method to deploy a pre-trained deep neural network model on the microcontroller based on the software packages of the RT-Thread internet of things operating system.

Findings
This design provides a simple and efficient scheme for further integrating artificial intelligence (AI) algorithms into the control system design of quad-rotor aircraft.

Originality/value
This method provides an application example and a design reference for the implementation of AI algorithms on unmanned aerial vehicles or terminal robots.


2021 ◽  
Vol 17 (4) ◽  
pp. 155014772110098
Author(s):  
Xiaoqin Liu ◽  
Xiang Li ◽  
Qi Shi ◽  
Chuanpei Xu ◽  
Yanmei Tang

Three-dimensional attitude estimation for unmanned aerial vehicles (UAVs) is usually based on the combination of magnetometer, accelerometer, and gyroscope (MARG). However, MARG sensors are easily affected by various disturbances, for example, vibration, external magnetic interference, and gyro drift. An optical flow sensor can extract motion information from an image sequence and thus has the potential to augment three-dimensional attitude estimation for UAVs. The major problem is that optical flow can be caused by both translational and rotational movements, which are difficult to distinguish from each other. To solve these problems, this article uses a gated recurrent unit (GRU) neural network to fuse data from the MARG and optical flow sensors, so as to enhance the accuracy of three-dimensional attitude estimation for UAVs. The proposed algorithm can effectively exploit the attitude information contained in the optical flow measurements and achieves multi-sensor fusion for attitude estimation without an explicit mathematical model. Compared with the commonly used extended Kalman filter algorithm for attitude estimation, the proposed algorithm shows higher accuracy in flight tests of quad-rotor UAVs.
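The core building block of such a fusion network is the GRU cell itself. A minimal NumPy sketch follows; the weights here are random placeholders (in the paper they would be learned from flight data), and the 9 MARG + 2 optical-flow input channels and 16 hidden units are illustrative assumptions, not the authors' architecture.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUCell:
    """Minimal GRU cell. Input x: concatenated MARG and optical-flow
    features at one time step; hidden state h carries the fused
    attitude information forward in time."""
    def __init__(self, n_in, n_hidden, seed=0):
        rng = np.random.default_rng(seed)
        s = 1.0 / np.sqrt(n_hidden)
        self.Wz = rng.uniform(-s, s, (n_hidden, n_in))
        self.Uz = rng.uniform(-s, s, (n_hidden, n_hidden))
        self.Wr = rng.uniform(-s, s, (n_hidden, n_in))
        self.Ur = rng.uniform(-s, s, (n_hidden, n_hidden))
        self.Wh = rng.uniform(-s, s, (n_hidden, n_in))
        self.Uh = rng.uniform(-s, s, (n_hidden, n_hidden))

    def step(self, x, h):
        z = sigmoid(self.Wz @ x + self.Uz @ h)        # update gate
        r = sigmoid(self.Wr @ x + self.Ur @ h)        # reset gate
        h_tilde = np.tanh(self.Wh @ x + self.Uh @ (r * h))  # candidate
        return (1.0 - z) * h + z * h_tilde            # blended state

# 9 MARG channels + 2 optical-flow channels -> 16 hidden units
cell = GRUCell(n_in=11, n_hidden=16)
h = np.zeros(16)
for x in np.random.default_rng(1).normal(size=(50, 11)):  # 50 time steps
    h = cell.step(x, h)
```

A regression head mapping `h` to the attitude (e.g. roll, pitch, yaw) would complete the estimator; the recurrence is what lets the network learn the translation/rotation disentanglement without an explicit model.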


Sensors ◽  
2021 ◽  
Vol 21 (4) ◽  
pp. 1390
Author(s):  
Tomasz Ursel ◽  
Michał Olinski

This article aims to develop a system capable of estimating the displacement of a moving object using relatively cheap and easy-to-apply sensors. There is a growing need for such systems, not only for robots but also, for instance, for pedestrian navigation. In this paper, the theory behind this idea, including data post-processing algorithms for a MEMS accelerometer and an optical flow sensor (OFS), as well as the complementary filter developed for sensor fusion, is presented. In addition, a vital part of the accelerometer's algorithm, zero-velocity state detection, is implemented. It is based on analysis of the acceleration signal and subsequent acceleration symmetrization, greatly improving the obtained displacement. A test stand with a linear guide and a motor enabling a specified linear motion was built. The results of testing both sensors suggest that the displacement estimated by each of them is highly accurate. Fusing the sensors' data gives even better outcomes, especially in cases with external disturbance of the OFS. The comparative evaluation of the estimated linear displacements, in each case against encoder data, confirms the correctness of the algorithms and proves the chosen sensors' usefulness in the development of a linear displacement measuring system.
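The two ingredients named here, complementary fusion and zero-velocity detection, can be sketched in heavily simplified form. This is an assumption-laden illustration, not the paper's filter: the blend is a fixed per-sample weighting rather than a true frequency-domain complementary filter, the weight 0.8 and threshold 0.05 m/s² are arbitrary, and the signals are synthetic.

```python
import numpy as np

def complementary_fuse(d_acc, d_ofs, alpha=0.8):
    """Blend two displacement estimates: accelerometer-derived d_acc
    (smooth but drifting) and optical-flow-derived d_ofs (drift-free
    but noisy). alpha weights the accelerometer path."""
    return alpha * np.asarray(d_acc) + (1.0 - alpha) * np.asarray(d_ofs)

def zero_velocity_mask(accel, thresh=0.05):
    """Flag samples where |acceleration| stays below thresh -- the
    kind of zero-velocity detection used to reset integrated velocity
    between motions."""
    return np.abs(np.asarray(accel)) < thresh

# Synthetic 1-D example: true displacement ramps 0 -> 1 m over 100 samples.
true_d = np.linspace(0.0, 1.0, 100)
d_acc = true_d + 0.02 * np.arange(100) / 100                    # slow drift
d_ofs = true_d + np.random.default_rng(0).normal(0, 0.01, 100)  # noise
fused = complementary_fuse(d_acc, d_ofs)
mask = zero_velocity_mask(np.array([0.0, 0.1, 0.01]))
```

Resetting the integrated velocity wherever the mask is true is what keeps the accelerometer path from accumulating drift during stationary intervals.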


Sensors ◽  
2020 ◽  
Vol 20 (16) ◽  
pp. 4411 ◽  
Author(s):  
Hongyan Tang ◽  
Dan Zhang ◽  
Zhongxue Gan

Vertical take-off and landing (VTOL) unmanned aerial vehicles (UAVs) are widely used in various fields because of their stable flight, easy operation, and low requirements for take-off and landing environments. To further extend the UAV's take-off and landing envelope to unstructured, complex environments, this study developed a landing gear robot for VTOL vehicles. This article mainly introduces the adaptive landing control of the landing gear robot in an unstructured environment. Based on a depth (time-of-flight) camera, an IMU, and an optical flow sensor, the control system performs multi-sensor data fusion and uses a robot kinematic model to achieve adaptive landing. Finally, the study verifies the feasibility and effectiveness of the adaptive landing through experiments.

