Comparative Analysis Between Recurrent Convolutional and Convolutional Neural Networks for Horizon Detection

2020 ◽  
Vol 17 (9) ◽  
pp. 4364-4367
Author(s):  
Shreya Srinarasi ◽  
Seema Jahagirdar ◽  
Charan Renganathan ◽  
H. Mallika

The preliminary step in the navigation of unmanned vehicles is to detect and identify the horizon line. One method of locating the horizon and obstacles in an image is a supervised semantic-segmentation algorithm using neural networks. Unmanned Aerial Vehicles (UAVs) are rapidly gaining prominence in military, commercial, and civilian applications, and their safe navigation requires accurate and efficient obstacle detection and avoidance. The position of the horizon and obstacles can also be used to adjust flight parameters and estimate altitude. It can likewise aid the navigation of Unmanned Ground Vehicles (UGVs), where the part of the image above the horizon can be discarded to reduce processing time. Locating the horizon and identifying the various obstacles in an image can help minimize collisions and the high costs incurred when UAVs and UGVs fail. To build a robust and accurate system to aid the navigation of autonomous vehicles, the efficiency and accuracy of Convolutional Neural Networks (CNNs) and Recurrent CNNs (RCNNs) are analysed. Experiments show that the RCNN model classifies test images with higher accuracy.
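The post-processing step implied by the abstract can be sketched simply: once a network labels each pixel as sky or not, the horizon is the first non-sky row in each column. The binary-mask representation and this extraction routine are illustrative assumptions, not the paper's implementation.

```python
# Sketch: recovering a horizon line from a per-pixel segmentation mask.
# Assumes the network outputs a binary mask (1 = sky, 0 = ground/obstacle).

def horizon_rows(mask):
    """For each column, return the row index of the first non-sky pixel.

    mask: list of rows, each a list of 0/1 values (1 = sky).
    Returns one horizon row per column (len(mask) if the column is all sky).
    """
    height, width = len(mask), len(mask[0])
    horizon = []
    for col in range(width):
        row = 0
        while row < height and mask[row][col] == 1:
            row += 1
        horizon.append(row)
    return horizon

# Toy 4x4 mask: sky reaches row 2 on the left, row 1 on the right.
mask = [
    [1, 1, 1, 1],
    [1, 1, 0, 0],
    [0, 0, 0, 0],
    [0, 0, 0, 0],
]
print(horizon_rows(mask))  # [2, 2, 1, 1]
```

For a UGV, everything above `horizon_rows` per column could then be cropped before further processing, which is the speed-up the abstract alludes to.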

2021 ◽  
Vol 10 (6) ◽  
pp. 377
Author(s):  
Chiao-Ling Kuo ◽  
Ming-Hua Tsai

The importance of road characteristics has been highlighted, as they are fundamental structures that support many transportation-related services. However, there is still considerable room for improvement in both the types of road characteristics detected and detection performance. Taking advantage of geographically tiled maps, with their high update rates, remarkable accessibility, and increasing availability, this paper proposes a simple, novel deep-learning-based approach: joint convolutional neural networks (CNNs) adopting adaptive squares with combination rules to detect road characteristics from roadmap tiles. The joint CNNs first classify foreground versus background images and then classify the type of road characteristic in the foreground images, raising detection accuracy. The adaptive squares with combination rules efficiently localize road characteristics, improving the ability to detect them and yielding optimal detection results. Five types of road characteristics are considered: crossroads, T-junctions, Y-junctions, corners, and curves; experimental results demonstrate strong performance on real-world data. The detected road characteristics, with their locations and types, are thus converted from human-readable to machine-readable form, and the results will benefit applications such as feature-point reminders, road-condition reports, and alert detection for users, drivers, and even autonomous vehicles. We believe this approach also opens a new path for object detection and geospatial information extraction from valuable map tiles.
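The "adaptive squares" idea can be read as a coarse-to-fine subdivision: keep only squares a foreground classifier accepts, and split them until a minimum scale is reached. The dummy classifier, the tile representation, and the minimum square size below are assumptions for illustration, not the paper's actual pipeline.

```python
# Sketch of adaptive squares: recursively subdivide a map tile and keep
# only squares a (here, dummy) foreground classifier accepts.

def adaptive_squares(tile, x, y, size, is_foreground, min_size):
    """Return (x, y, size) squares that still contain foreground at min_size."""
    if not is_foreground(tile, x, y, size):
        return []                      # background: prune this square
    if size <= min_size:
        return [(x, y, size)]          # smallest scale: report the square
    half = size // 2
    out = []
    for dx, dy in ((0, 0), (half, 0), (0, half), (half, half)):
        out += adaptive_squares(tile, x + dx, y + dy, half,
                                is_foreground, min_size)
    return out

# Toy 8x8 tile with a single "road characteristic" pixel at column 5, row 6.
tile = [[0] * 8 for _ in range(8)]
tile[6][5] = 1

def has_road(tile, x, y, size):
    """Stand-in for the foreground CNN: any nonzero pixel in the square."""
    return any(tile[r][c]
               for r in range(y, y + size)
               for c in range(x, x + size))

print(adaptive_squares(tile, 0, 0, 8, has_road, 2))  # [(4, 6, 2)]
```

In the paper's setting, `has_road` would be the foreground CNN, and the surviving squares would be passed to the second CNN for type classification.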


IEEE Access ◽  
2021 ◽  
pp. 1-1
Author(s):  
Vishal Singh ◽  
Pradeeba Sridar ◽  
Jinman Kim ◽  
Ralph Nanan ◽  
N. Poornima ◽  
...  

2021 ◽  
Vol 40 (1) ◽  
Author(s):  
David Müller ◽  
Andreas Ehlen ◽  
Bernd Valeske

Convolutional neural networks were used for multiclass segmentation in thermal infrared face analysis. The principle is based on existing image-to-image translation approaches, in which each pixel in an image is assigned a class label. We show that established network architectures can be trained for the task of multiclass face analysis in thermal infrared. The class annotations we created consist of pixel-accurate locations of the different face classes. The trained network can then segment a newly acquired, unseen infrared face image into the defined classes. Furthermore, face classification during live image acquisition is demonstrated, so that the relative temperature of the learned areas can be displayed in real time. This allows pixel-accurate temperature analysis of the face, e.g., for infection screening for diseases such as Covid-19. At the same time, our approach offers the advantage of concentrating on the relevant areas of the face: areas irrelevant to the relative temperature calculation, and accessories such as glasses, masks, and jewelry, are not considered. A custom database was created to train the network, and the results were quantitatively evaluated with the intersection-over-union (IoU) metric. The methodology can be transferred to similar quantitative thermography problems, such as materials characterization or quality control in production.
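The IoU metric used for the evaluation is standard and worth stating concretely: for one class, it is the ratio of pixels where prediction and ground truth agree on that class to pixels where either assigns it. Representing masks as flat label lists is an illustrative simplification.

```python
# Minimal sketch of the per-class intersection-over-union (IoU) metric.

def iou(pred, truth, cls):
    """IoU for one class between predicted and ground-truth label lists."""
    inter = sum(1 for p, t in zip(pred, truth) if p == cls and t == cls)
    union = sum(1 for p, t in zip(pred, truth) if p == cls or t == cls)
    return inter / union if union else 1.0  # class absent in both: define as 1

pred  = [0, 1, 1, 2, 2, 2]
truth = [0, 1, 2, 2, 2, 2]
print(iou(pred, truth, 2))  # 0.75
```

Averaging `iou` over all face classes gives a single mean-IoU score, the usual summary figure for multiclass segmentation.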


2018 ◽  
Vol 15 (2) ◽  
pp. 173-177 ◽  
Author(s):  
Kaiqiang Chen ◽  
Kun Fu ◽  
Menglong Yan ◽  
Xin Gao ◽  
Xian Sun ◽  
...  

Author(s):  
O.N. Korsun ◽  
V.N. Yurko

We analysed two approaches to estimating the state of a human operator from video of the face. Both approaches use deep convolutional neural networks: 1) automated emotion recognition; 2) analysis of blinking characteristics. The study assessed changes in the functional state of a human operator performing a manual landing in a flight simulator. During this process, flight parameters were recorded and the operator's face was filmed. We then used our custom software to perform automated recognition of emotions (blinking), synchronising the recognised emotions (blinks) with the recorded flight parameters. As a result, we detected persistent patterns linking the operator's fatigue level to the number of emotions recognised by the neural network. The type of emotion depends on the unique psychological characteristics of the operator. Our experiments make these links easy to trace when analysing the emotions "Sadness", "Fear" and "Anger". The study also revealed a correlation between blinking properties and piloting accuracy: higher piloting accuracy coincided with more recorded blinks, which may be explained by a stable psycho-physiological state leading to confident piloting.
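The blink-counting half of the analysis can be sketched as run detection over an eye-openness time series: a blink is each contiguous run where openness drops below a threshold. The openness signal and the threshold value are assumed for illustration; the study's own pipeline recognises blinks with a deep CNN on face video.

```python
# Illustrative sketch: counting blinks in an eye-openness time series.
# Threshold and signal values are assumptions, not from the study.

def count_blinks(openness, threshold=0.2):
    """Count contiguous runs where eye openness drops below threshold."""
    blinks, closed = 0, False
    for value in openness:
        if value < threshold and not closed:
            blinks += 1        # a new closure run starts: one blink
            closed = True
        elif value >= threshold:
            closed = False     # eye reopened
    return blinks

signal = [0.35, 0.30, 0.10, 0.05, 0.32, 0.31, 0.08, 0.30]
print(count_blinks(signal))  # 2
```

Timestamping each detected blink and aligning it with the recorded flight parameters is then what allows blink rate to be correlated with piloting accuracy, as the abstract describes.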

