Bayesian Neural Networks of Probabilistic Back Propagation for Scalable Learning on Hyper-Parameters

Author(s):  
K. Thirupal Reddy ◽  
T. Swarnalatha

Author(s):  
Sherif S. Ishak ◽  
Haitham M. Al-Deek

Pattern recognition techniques such as artificial neural networks continue to offer potential solutions to many of the existing problems associated with freeway incident-detection algorithms. This study focuses on the application of Fuzzy ART neural networks to incident detection on freeways. Unlike back-propagation models, Fuzzy ART is capable of fast, stable learning of recognition categories. It is an incremental approach that has the potential for on-line implementation. Fuzzy ART is trained with traffic patterns that are represented by 30-s loop-detector data of occupancy, speed, or a combination of both. Traffic patterns observed at the incident time and location are mapped to a group of categories. Each incident category maps incidents with similar traffic pattern characteristics, which are affected by the type and severity of the incident and the prevailing traffic conditions. Detection rate and false alarm rate are used to measure the performance of the Fuzzy ART algorithm. To reduce the false alarm rate that results from occasional misclassification of traffic patterns, a persistence time period of 3 min was arbitrarily selected. The algorithm performance improves when the temporal size of traffic patterns increases from one to two 30-s periods for all traffic parameters. An interesting finding is that the speed patterns produced better results than did the occupancy patterns. However, when combined, occupancy–speed patterns produced the best results. When compared with California algorithms 7 and 8, the Fuzzy ART model produced better performance.
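The category-learning mechanism described above can be sketched in code. The following is a minimal, illustrative Fuzzy ART implementation (not the authors' exact model): inputs such as normalized occupancy/speed patterns are complement-coded, candidate categories are ranked by the standard choice function, a vigilance test decides whether a pattern resonates with an existing category, and a new category is committed otherwise. The parameter names (`rho` for vigilance, `alpha` for the choice parameter, `beta` for the learning rate) follow the standard Fuzzy ART formulation, not this paper.

```python
import numpy as np

class FuzzyART:
    """Minimal Fuzzy ART sketch (illustrative; parameters follow the
    standard formulation: rho = vigilance, alpha = choice parameter,
    beta = learning rate)."""

    def __init__(self, rho=0.75, alpha=0.001, beta=1.0):
        self.rho, self.alpha, self.beta = rho, alpha, beta
        self.w = []  # one weight vector per committed category

    def _complement_code(self, a):
        # Complement coding (a, 1 - a) normalizes inputs and
        # prevents category proliferation.
        a = np.asarray(a, dtype=float)
        return np.concatenate([a, 1.0 - a])

    def train(self, a):
        I = self._complement_code(a)
        # Choice function T_j = |I ^ w_j| / (alpha + |w_j|), where ^ is the
        # element-wise minimum (fuzzy AND) and |.| the L1 norm.
        scores = [np.minimum(I, w).sum() / (self.alpha + w.sum())
                  for w in self.w]
        for j in np.argsort(scores)[::-1]:
            match = np.minimum(I, self.w[j]).sum() / I.sum()
            if match >= self.rho:  # vigilance test passed: resonance
                self.w[j] = (self.beta * np.minimum(I, self.w[j])
                             + (1 - self.beta) * self.w[j])
                return j
        # No existing category passed vigilance: commit a new one.
        self.w.append(I.copy())
        return len(self.w) - 1
```

In this sketch, two similar traffic patterns map to the same category while a dissimilar one commits a new category, mirroring how incidents with similar traffic-pattern characteristics share a recognition category. Learning is incremental, which is what makes the approach suitable for on-line use.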


2021 ◽  
Vol 12 (1) ◽  
Author(s):  
Florian Stelzer ◽  
André Röhm ◽  
Raul Vicente ◽  
Ingo Fischer ◽  
Serhiy Yanchuk

Deep neural networks are among the most widely applied machine learning tools, showing outstanding performance in a broad range of tasks. We present a method for folding a deep neural network of arbitrary size into a single neuron with multiple time-delayed feedback loops. This single-neuron deep neural network comprises only a single nonlinearity and appropriately adjusted modulations of the feedback signals. The network states emerge in time as a temporal unfolding of the neuron’s dynamics. By adjusting the feedback modulation within the loops, we adapt the network’s connection weights. These connection weights are determined via a back-propagation algorithm, where both the delay-induced and local network connections must be taken into account. Our approach can fully represent standard deep neural networks (DNNs), encompasses sparse DNNs, and extends the DNN concept toward dynamical-systems implementations. The new method, which we call Folded-in-time DNN (Fit-DNN), exhibits promising performance in a set of benchmark tasks.
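The core idea of temporal unfolding can be illustrated with a simplified sketch (this abstracts away the continuous-time delay dynamics of the actual Fit-DNN): a single shared nonlinearity is applied sequentially, one virtual node at a time, while modulated delayed-feedback terms supply each node with the weighted outputs of the previous layer's virtual nodes. The result matches a standard feedforward pass. All function and variable names here are illustrative, not from the paper.

```python
import numpy as np

def feedforward_pass(x, weights):
    """Reference dense network: each layer applies a weight matrix
    followed by a tanh nonlinearity."""
    for W in weights:
        x = np.tanh(W @ x)
    return x

def folded_in_time_pass(x, weights):
    """Simplified Fit-DNN-style unfolding: one shared tanh nonlinearity
    produces the network's activations sequentially in time. Each virtual
    node i receives the sum of modulated delayed-feedback signals
    W[i, j] * state[j] from the previous layer's virtual nodes."""
    state = np.asarray(x, dtype=float)
    for W in weights:
        next_state = np.empty(W.shape[0])
        for i in range(W.shape[0]):
            # The single neuron fires once per virtual node; the feedback
            # modulations play the role of the connection weights.
            next_state[i] = np.tanh(W[i] @ state)
        state = next_state
    return state
```

Because the two passes are algebraically identical, training the feedback modulations (e.g., via the delay-aware back-propagation the paper describes) is equivalent to training the weights of the unfolded network.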


Author(s):  
Cosimo Magazzino ◽  
Marco Mele

This paper shows that the co-movement of public revenues in the European Monetary Union (EMU) is driven by an unobserved common factor. Our empirical analysis uses yearly data covering the period 1970–2014 for 12 selected EMU member countries. We find that this common component has a significant impact on public revenues in the majority of the countries. We highlight this common pattern in a dynamic factor model (DFM). Since this factor is unobservable, it is difficult to agree on what it represents. We argue that the latent factor that emerges from the two different empirical approaches used might have a composite nature, reflecting both the broader convergence of the economic cycles of the countries in the area and the increasingly harmonized tax structures. The original aspect of our paper, however, is the use of a back-propagation neural network (BPNN)-DF model to test the time-series results. On the computational side, the results obtained represent the first empirical demonstration of the latent factor’s presence.
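The one-factor structure underlying a DFM can be sketched with a standard principal-components estimator: in the model y_it = λ_i f_t + e_it, the leading eigenvector of the standardized panel's covariance matrix estimates the loadings λ_i, and projecting the panel onto it recovers the common factor f_t. This is a minimal illustration of the general technique, not the paper's DFM or BPNN-DF specification.

```python
import numpy as np

def common_factor(panel):
    """Estimate a single latent common factor from a T x N panel
    (e.g., years x countries) by principal components, a standard
    estimator for the one-factor DFM y_it = lambda_i * f_t + e_it.
    Illustrative sketch only."""
    X = np.asarray(panel, dtype=float)
    X = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize each series
    cov = X.T @ X / X.shape[0]                 # sample covariance
    eigvals, eigvecs = np.linalg.eigh(cov)     # eigh: ascending eigenvalues
    loadings = eigvecs[:, -1]                  # loadings = top eigenvector
    factor = X @ loadings                      # estimated common factor f_t
    share = eigvals[-1] / eigvals.sum()        # variance the factor explains
    return factor, loadings, share
```

A high explained-variance share is what signals a strong common component of the kind the paper reports for EMU public revenues; note the factor's sign is only identified up to a flip.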

