Comprehensive Overview of Neural Networks and Its Applications in Autonomous Vehicles

Author(s):  
Jay Rodge ◽  
Swati Jaiswal

Deep learning and artificial intelligence (AI) have been attracting growing attention owing to the capability and state-of-the-art results they provide. Neural-network-powered AI, also known as deep learning, has already replaced some highly skilled professionals. Deep learning is built primarily on neural networks. This chapter discusses the working of a neuron, the unit component of a neural network. Numerous techniques can be incorporated while designing a neural network, such as activation functions and training strategies, to improve its performance; these are explained in detail. Deep learning also faces challenges such as overfitting, which are difficult to avoid but can be overcome using the proper techniques and steps discussed here. The chapter will help academicians, researchers, and practitioners to further investigate deep learning and its applications in the autonomous vehicle industry.
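
As a minimal illustration of the neuron described in the chapter (the weights, inputs, and function names below are our own toy choices, not the chapter's): a single unit computes a weighted sum of its inputs plus a bias and passes the result through an activation function such as the sigmoid.

```python
import math

def sigmoid(z):
    """Logistic activation: squashes any real input into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def neuron(inputs, weights, bias, activation=sigmoid):
    """One unit: weighted sum of inputs plus bias, passed through an activation."""
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return activation(z)

# Example: a neuron with two inputs; z = 0.5*0.8 + (-1.0)*0.2 + 0.1 = 0.3
out = neuron([0.5, -1.0], [0.8, 0.2], bias=0.1)
print(round(out, 4))  # → 0.5744
```

Stacking many such units in layers, and choosing the activation and training procedure, is precisely the design space the chapter surveys.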

2020 ◽  
Vol 2 ◽  
pp. 58-61 ◽  
Author(s):  
Syed Junaid ◽  
Asad Saeed ◽  
Zeili Yang ◽  
Thomas Micic ◽  
Rajesh Botchu

The advances in deep learning algorithms, exponential computing power, and the availability of digital patient data like never before have led to a wave of interest and investment in artificial intelligence in health care. No radiology conference is complete without a substantial dedication to AI. Many radiology departments are keen to get involved but are unsure of where and how to begin. This short article provides a simple road map to help departments get involved with the technology, demystify key concepts, and pique interest in the field. We have broken the journey down into seven steps: problem, team, data, kit, neural network, validation, and governance.


2021 ◽  
Vol 336 ◽  
pp. 07004
Author(s):  
Ruoyu Fang ◽  
Cheng Cai

Obstacle detection and target tracking are two major issues for intelligent autonomous vehicles. This paper proposes a new scheme that achieves target tracking and real-time obstacle detection based on computer vision. A ResNet-18 deep neural network is utilized for obstacle detection, and a Yolo-v3 deep neural network is employed for real-time target tracking. The two trained models can be deployed on an autonomous vehicle equipped with an NVIDIA Jetson Nano board. The vehicle moves to avoid obstacles and follows tracked targets using its camera. During movement, the steering and speed of the vehicle are adjusted according to a PID algorithm, which helps the proposed vehicle achieve stable and precise tracking.
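
The PID adjustment step can be sketched as follows; the gains, the control period, and the toy "plant" linking steering to the target's pixel offset are illustrative assumptions, not values from the paper.

```python
class PID:
    """Discrete PID controller: u = Kp*e + Ki*sum(e)*dt + Kd*(de/dt)."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Toy closed loop: the tracked target starts 40 px right of the image centre,
# and each steering command proportionally reduces that offset.
pid = PID(kp=0.5, ki=0.02, kd=0.1, dt=0.1)
offset = 40.0
for _ in range(200):
    offset -= 0.2 * pid.update(offset)
print(round(offset, 3))  # the offset is driven close to zero
```

In the vehicle, the error would be the detected target's horizontal offset in the camera frame, and the controller output would drive the steering servo.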


Author(s):  
M Venkata Krishna Reddy* ◽  
Pradeep S.



Author(s):  
Rui Li ◽  
Weitian Wang ◽  
Yi Chen ◽  
Srivatsan Srinivasan ◽  
Venkat N. Krovi

Fully automatic parking (FAP) is a key step towards the age of autonomous vehicles. Motivated by the contribution of human vision to human parking, in this paper we propose a computer-vision-based FAP method for autonomous vehicles. Based on input images from a rear camera on the vehicle, a convolutional neural network (CNN) is trained to automatically output the steering and velocity commands for vehicle control. The CNN is trained with the Caffe deep learning framework. A 1/10th-scale autonomous vehicle research platform (1/10-SAVRP), configured with a vehicle controller unit, an automated driving processor, and a rear camera, is used to demonstrate the parking maneuver. The experimental results suggest that the proposed approach enables the vehicle to park independently, without human input, in different driving settings.
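
The paper's Caffe network is not reproduced here, but the core operation any such CNN applies to the rear-camera pixels, sliding a small filter over the image, can be sketched in plain Python (the frame and the Sobel-style filter below are our toy examples):

```python
def conv2d(image, kernel):
    """Valid 2-D convolution (no padding, stride 1) in plain Python."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = []
    for i in range(out_h):
        row = []
        for j in range(out_w):
            s = sum(image[i + a][j + b] * kernel[a][b]
                    for a in range(kh) for b in range(kw))
            row.append(s)
        out.append(row)
    return out

# A vertical-edge filter on a toy 4x4 "camera frame": dark left, bright right.
frame = [[0, 0, 9, 9] for _ in range(4)]
sobel_x = [[-1, 0, 1],
           [-2, 0, 2],
           [-1, 0, 1]]
print(conv2d(frame, sobel_x))  # → [[36, 36], [36, 36]]
```

Stacks of such filters, with learned weights, feed fully connected layers that regress the steering and velocity commands.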


2019 ◽  
Vol 1 (1) ◽  
pp. p8
Author(s):  
Jamilu Auwalu Adamu

One of the objectives of this paper is to incorporate fat-tail effects into activation functions such as the sigmoid, in order to introduce transparency and stability into the existing stochastic activation functions. Secondly, according to the available literature reviewed, the existing activation functions entered deep learning artificial neural networks through the "window" rather than the "legitimate door", since they rest on trial and error and arbitrary assumptions; the author therefore proposes scientific facts, definite rules (Jameel's Stochastic ANNAF Criterion), and a lemma to supplement, though not necessarily replace, the existing stochastic activation functions such as the sigmoid. This research is expected to open the "black box" of deep learning artificial neural networks. The author proposes a new set of advanced, optimized, fat-tailed stochastic activation functions derived from AI-ML-purified stock data, namely the Log-Logistic (3P) (1st), Cauchy (2nd), Pearson 5 (3P) (3rd), Burr (4P) (4th), Fatigue Life (3P) (5th), Inverse Gaussian (3P) (6th), Dagum (4P) (7th), and Lognormal (3P) (8th) probability distributions, for conducting both forward and backward propagation in deep learning artificial neural networks. This paper did not, however, check the monotone differentiability of the proposed distributions. Appendices A, B, and C present and test the performance of the stressed sigmoid and the optimized activation functions using stock data (1991-2014) of Microsoft Corporation (MSFT), Exxon Mobil (XOM), Chevron Corporation (CVX), Honda Motor Corporation (HMC), and General Electric (GE), together with U.S. fundamental macroeconomic parameters; the results were found fascinating.
Thus the first three distributions are excellent activation functions for any stock deep learning artificial neural network, and distributions 4 to 8 are also good advanced optimized activation functions. Generally, this research revealed that whether the advanced optimized activation functions satisfy Jameel's ANNAF Stochastic Criterion depends on the referenced purified AI data set, the time of change, and the area of application, in contrast to the trial-and-error and arbitrary assumptions behind the existing sigmoid, tanh, softmax, ReLU, and Leaky ReLU.
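
The paper's fitted parameters and Jameel's criterion itself are not reproduced here; purely to illustrate the underlying idea, the CDF of a fat-tailed distribution (the Cauchy, the paper's second candidate) can serve as a sigmoid-shaped activation whose tails decay polynomially rather than exponentially:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def cauchy_cdf(z, x0=0.0, gamma=1.0):
    # CDF of the Cauchy distribution: sigmoid-shaped, maps R into (0, 1),
    # but its tails decay polynomially (fat tails) instead of exponentially.
    return 0.5 + math.atan((z - x0) / gamma) / math.pi

# Far from the origin the Cauchy activation is far less saturated than the
# logistic sigmoid, so gradients vanish more slowly in the tails.
for z in (0.0, -5.0, -10.0):
    print(z, sigmoid(z), cauchy_cdf(z))
```

The slower tail saturation is what makes such CDFs candidates for modeling heavy-tailed financial inputs without the vanishing gradients of the standard sigmoid.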


2021 ◽  
pp. 385-399
Author(s):  
Wilson Guasti Junior ◽  
Isaac P. Santos

In this work we explore the use of deep learning models based on deep feedforward neural networks to solve ordinary and partial differential equations. The methodology is illustrated by solving a variety of initial and boundary value problems. The numerical results, obtained with different feedforward network structures, activation functions, and minimization methods, were compared to each other and to the exact solutions. The neural networks were implemented in the Python language using the TensorFlow library.
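
A minimal sketch of one common trial-solution approach (the paper's own architectures and TensorFlow code are not reproduced): to solve y' = -y with y(0) = 1, whose exact solution is e^(-x), a small network N(x) is embedded in the trial solution y(x) = 1 + x·N(x) and trained to minimize the squared residual at collocation points. Dependency-free but deliberately crude, with numerical gradients:

```python
import math
import random

H = 4  # hidden tanh units; p packs [input weights, biases, output weights]

def net(x, p):
    return sum(p[2 * H + i] * math.tanh(p[i] * x + p[H + i]) for i in range(H))

def trial(x, p):
    # y(x) = 1 + x*N(x) satisfies the initial condition y(0) = 1 by construction.
    return 1.0 + x * net(x, p)

def loss(p, xs, h=1e-4):
    # Mean squared residual of y' + y = 0, with y' by central differences.
    total = 0.0
    for x in xs:
        dy = (trial(x + h, p) - trial(x - h, p)) / (2 * h)
        total += (dy + trial(x, p)) ** 2
    return total / len(xs)

random.seed(0)
p = [random.uniform(-0.5, 0.5) for _ in range(3 * H)]
xs = [i / 5 for i in range(6)]  # collocation points on [0, 1]

start = loss(p, xs)
for _ in range(800):  # crude gradient descent with numerical gradients
    base = loss(p, xs)
    grad = [(loss(p[:j] + [p[j] + 1e-5] + p[j + 1:], xs) - base) / 1e-5
            for j in range(len(p))]
    p = [pj - 0.05 * gj for pj, gj in zip(p, grad)]

print(start, "->", loss(p, xs))
```

A real implementation would use automatic differentiation (as TensorFlow provides) both for y' and for the parameter gradients; the structure of the loss, however, is the same.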


2022 ◽  
Author(s):  
Isaac Ronald Ward ◽  
Jack Joyner ◽  
Casey Lickfold ◽  
Yulan Guo ◽  
Mohammed Bennamoun

Graph neural networks (GNNs) have recently grown in popularity in the field of artificial intelligence (AI) due to their unique ability to ingest relatively unstructured data types as input. Although some elements of the GNN architecture are conceptually similar in operation to traditional neural networks (and neural network variants), other elements represent a departure from traditional deep learning techniques. This tutorial exposes the power and novelty of GNNs to AI practitioners by collating and presenting details regarding the motivations, concepts, mathematics, and applications of the most common and performant variants of GNNs. Importantly, we present this material concisely, alongside practical examples, thus providing an accessible tutorial on the topic of GNNs.
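
One widely used variant covered by such tutorials is the graph convolutional layer. A dependency-free sketch of a single mean-aggregation message-passing step follows (the graph, features, and weights are our toy choices, not the tutorial's):

```python
def gcn_layer(features, adj, weight):
    """One simplified graph-convolution step: each node averages its own and
    its neighbours' feature vectors, then applies a shared linear map + ReLU."""
    new_feats = []
    for v, fv in enumerate(features):
        nbrs = [features[u] for u in adj[v]] + [fv]  # include a self-loop
        mean = [sum(col) / len(nbrs) for col in zip(*nbrs)]
        out = [max(0.0, sum(m * w for m, w in zip(mean, wrow)))
               for wrow in weight]
        new_feats.append(out)
    return new_feats

# Toy graph: a path 0-1-2, 2-dim features, identity weight matrix.
adj = {0: [1], 1: [0, 2], 2: [1]}
feats = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
W = [[1.0, 0.0], [0.0, 1.0]]
print(gcn_layer(feats, adj, W))
```

Stacking such layers lets information propagate across multi-hop neighbourhoods, which is the sense in which GNNs "ingest" graph-structured input directly.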


2020 ◽  
Vol 69 (1) ◽  
pp. 378-383
Author(s):  
T.A. Nurmukhanov ◽  
B.S. Daribayev
Neural networks can perform many variations of object classification and are used in many areas of recognition; a large subfield is text recognition. The paper considers the optimal way to build a network for text recognition and the optimal choice of activation functions and optimizers. The article also checks the correctness of text recognition under different optimization methods. The article is devoted to the analysis of convolutional neural networks: a convolutional neural network model is trained with supervised learning. Supervised learning ("training with a teacher") is a type of neural network training in which both the input data and the desired result are provided, so that the model, looking at the input data, learns to strive for the result supplied to it.
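
The "training with a teacher" idea can be shown on the smallest possible example; the perceptron and the AND task below are our illustration, not the article's CNN. Each training pair carries the desired output, and the weights move only when the prediction disagrees with it:

```python
# Logical AND learned by a single perceptron with supervised updates.
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]

w, b = [0, 0], 0           # integer weights keep this toy example exact
for _ in range(20):        # epochs over the training set
    for x, target in data:
        pred = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
        err = target - pred   # the "teacher" signal: desired minus predicted
        w = [wi + err * xi for wi, xi in zip(w, x)]
        b += err

print([1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
       for x, _ in data])  # → [0, 0, 0, 1]
```

A CNN trained for text recognition follows the same loop at scale: labelled character images play the role of the teacher, and an optimizer such as SGD or Adam replaces the simple perceptron update.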


Author(s):  
Sindhu P. Menon

In the last couple of years, artificial neural networks have gained considerable momentum. Their results could be enhanced if the networks were made deeper. Of late, a lot of data has been generated, which has led to big data; this brings many challenges, of which quality is one of the most important. Deep learning models can improve the quality of data. In this chapter, an attempt has been made to review deep supervised and deep unsupervised learning algorithms and the various activation functions used. Challenges in deep learning are also discussed.


Artnodes ◽  
2020 ◽  
Author(s):  
Bruno Caldas Vianna

This article uses the exhibition "Infinite Skulls", held in Paris at the beginning of 2019, as a starting point to discuss art created by artificial intelligence and, by extension, unique pieces of art generated by algorithms. We detail the development of DCGAN, the deep learning neural network used in the show, from its origins in cybernetics. The show and its creation process are described, identifying elements of creativity and technique, as well as questions about the authorship of the works. The works are then framed in the context of generative art, pointing out affinities and differences, and the issues of representing through procedures and abstractions. The article describes the major breakthrough of neural networks for technical images as the ability to represent categories through an abstraction, rather than through the images themselves. Finally, it seeks to understand neural networks as a tool for artists rather than as an autonomous creator of art.

