Structuring of Merged Robot Vision and Automated Visual Inspection Tasks

2000 ◽  
Vol 33 (27) ◽  
pp. 219-224
Author(s):  
Theodor Borangiu ◽  
Nick Ivanescu
Electronics ◽  
2021 ◽  
Vol 10 (13) ◽  
pp. 1511
Author(s):  
Taylor Simons ◽  
Dah-Jye Lee

There has been a recent surge in publications on binarized neural networks (BNNs), which use binary values to represent both the weights and activations in deep neural networks (DNNs). Because of the bitwise nature of BNNs, there have been many efforts to implement them on ASICs and FPGAs. While BNNs are excellent candidates for such resource-limited systems, most implementations still require very large FPGAs or CPU-FPGA co-processing systems. Our work focuses on further reducing the computational cost of BNNs, making them more efficient to implement on FPGAs. We target embedded visual inspection tasks, such as quality-inspection sorting of manufactured parts and agricultural produce sorting. We propose a new binarized convolutional layer, called the neural jet features layer, that learns well-known classic computer vision kernels that are efficient to compute as a group. We show that on visual inspection tasks, neural jet features perform comparably to standard BNN convolutional layers while using fewer computational resources. We also show that neural jet features tend to be more stable than BNN convolutional layers when training small models.
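The binarized-convolution idea this abstract builds on can be sketched in plain NumPy. This is an illustrative toy, not the paper's neural jet features layer: the function names and the simple sign binarization are assumptions for demonstration, showing only why BNN arithmetic reduces to cheap ±1 products.

```python
import numpy as np

def binarize(x):
    # Sign binarization used in BNNs: every value becomes -1 or +1.
    return np.where(x >= 0, 1.0, -1.0)

def binarized_conv2d(image, kernel):
    """Valid-mode 2-D convolution where both operands are binarized first,
    so each multiply is a sign product (XNOR-like in hardware)."""
    img_b = binarize(image)
    ker_b = binarize(kernel)
    kh, kw = ker_b.shape
    oh = img_b.shape[0] - kh + 1
    ow = img_b.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # Sum of +/-1 products: implementable as XNOR + popcount on FPGA.
            out[i, j] = np.sum(img_b[i:i + kh, j:j + kw] * ker_b)
    return out
```

On hardware, the ±1 products collapse to XNOR gates and the accumulation to a popcount, which is the source of the resource savings the abstract discusses.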


Sensors ◽  
2021 ◽  
Vol 21 (4) ◽  
pp. 1385
Author(s):  
Yurong Feng ◽  
Kwaiwa Tse ◽  
Shengyang Chen ◽  
Chih-Yung Wen ◽  
Boyang Li

The inspection of electrical and mechanical (E&M) devices using unmanned aerial vehicles (UAVs) has become an increasingly popular choice over the last decade due to their flexibility and mobility. UAVs have the potential to reduce human involvement in visual inspection tasks, which could increase efficiency and reduce risk. This paper presents a UAV system for autonomous E&M device inspection. The proposed system relies on learning-based detection for perception, multi-sensor fusion for localization, and path planning for fully autonomous inspection. The perception method utilizes semantic and spatial information generated by a 2-D object detector; this information is then fused with depth measurements for object state estimation. No prior knowledge of the location or category of the target device is needed. The system design is validated by flight experiments on a quadrotor platform. The results show that the proposed UAV system can perform the inspection mission autonomously while maintaining stable, collision-free flight.
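One common way to fuse a 2-D detection with a depth measurement for object state estimation, as this abstract outlines, is pinhole back-projection of the detection centre into the camera frame. The paper does not specify its exact fusion method, so this is a hedged sketch; the function name, bounding-box format, and intrinsic parameters are hypothetical.

```python
import numpy as np

def backproject_detection(bbox, depth, fx, fy, cx, cy):
    """Back-project the centre of a 2-D detection (x1, y1, x2, y2) into a
    3-D point in the camera frame, given a depth measurement in metres and
    pinhole intrinsics (focal lengths fx, fy; principal point cx, cy)."""
    u = (bbox[0] + bbox[2]) / 2.0   # pixel column of the box centre
    v = (bbox[1] + bbox[3]) / 2.0   # pixel row of the box centre
    x = (u - cx) * depth / fx       # standard pinhole inversion
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])
```

A full system would transform this camera-frame point into the world frame using the fused UAV pose and filter it over time, but the geometric core is this back-projection step.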


2015 ◽  
Author(s):  
Igor Jovančević ◽  
Jean-José Orteu ◽  
Thierry Sentenac ◽  
Rémi Gilblas

2020 ◽  
Vol 9 (1) ◽  
pp. 121-128
Author(s):  
Nur Dalila Abdullah ◽  
Ummi Raba'ah Hashim ◽  
Sabrina Ahmad ◽  
Lizawati Salahuddin

Selecting important features for classifying wood defects remains a challenging issue in the automated visual inspection domain. This study addresses the extraction and analysis of statistical texture features from images of wood defects. A series of procedures, including feature extraction using the Grey Level Dependence Matrix (GLDM) and feature analysis, was executed to investigate the displacement and quantisation parameters that best discriminate wood defects. Samples were taken from the Kembang Semangkuk (KSK), Meranti and Merbau wood species. Findings from visual analysis and classification accuracy measures suggest that the feature set with displacement parameter d = 2 and quantisation level q = 128 yields the highest classification accuracy. However, for lower computational cost, the feature set with quantisation level q = 32 still shows acceptable classification accuracy.
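The pipeline this abstract describes (quantise the image to q grey levels, count grey-level co-occurrences at displacement d, then derive texture statistics) can be sketched roughly as below. This is a generic grey-level co-occurrence sketch, not the authors' exact GLDM implementation; the horizontal-only displacement and the particular statistics chosen are assumptions for illustration.

```python
import numpy as np

def cooccurrence_matrix(image, d=2, levels=32):
    """Normalised grey-level co-occurrence counts for a horizontal
    displacement d, after quantising an 8-bit image to `levels` levels."""
    q = (image.astype(float) / 256.0 * levels).astype(int)
    q = np.clip(q, 0, levels - 1)
    mat = np.zeros((levels, levels))
    rows, cols = q.shape
    for r in range(rows):
        for c in range(cols - d):
            mat[q[r, c], q[r, c + d]] += 1   # pixel pair (i, j) at offset d
    return mat / mat.sum()

def texture_features(mat):
    """A few classic Haralick-style statistics from a co-occurrence matrix."""
    i, j = np.indices(mat.shape)
    return {
        "contrast": np.sum((i - j) ** 2 * mat),
        "energy": np.sum(mat ** 2),
        "homogeneity": np.sum(mat / (1.0 + np.abs(i - j))),
    }
```

Increasing `levels` (the quantisation q) gives a finer matrix at higher cost, which is exactly the accuracy-versus-computation trade-off between q = 128 and q = 32 discussed in the abstract.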


1990 ◽  
Author(s):  
P. COLEMAN ◽  
S. NELSON ◽  
J. MARAM ◽  
A. NORMAN
