An autonomous competitive learning algorithm using quantum Hamming neural networks

Author(s):  
Mohammed Zidan ◽  
Alaa Sagheer ◽  
Nasser Metwally
2020 ◽  
Vol 14 (1) ◽  
pp. 48-54
Author(s):  
D. Ostrenko ◽  

Emergency modes in electrical networks, which arise for various reasons, interrupt the transmission of electrical energy on its way from the generating facility to the consumer. In most cases such interruptions are unacceptable (the degree of severity depends on the class of consumer). An effective solution is therefore twofold: dealing with the consequences, for example by emergency switching to a reserve supply, and preventing such emergencies by predicting events in the electric network. Analysis of source [1] shows that several methods exist for forecasting emergency situations in electric networks: technical analysis, operational data processing (online analytical processing), and nonlinear regression methods. However, it is neural networks that have found the widest application for these tasks. In this paper we analyze existing neural networks used to predict processes in electrical systems, examine their learning algorithms, and propose a new method for applying neural networks to forecasting in electrical networks. Forecasting in electrical engineering plays a key role in shaping the electricity balance of the grid, since it influences the choice of mode parameters and the estimated electrical loads. The generation balance is the basis of the technological stability of the energy system; its violation degrades power quality (frequency and voltage jumps appear in the network), which reduces the efficiency of the equipment. A correct forecast also makes it possible to ensure an optimal load distribution among the objects of the grid. According to the experience of [2], forecasting electricity consumption and building customer profiles usually rely on analysis of the time dynamics of consumption and its driving factors, identification of statistical relationships between features, and construction of models.
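
The consumption-forecasting approach mentioned above can be illustrated with a minimal sketch (not the method of the cited works): a small feed-forward network trained on a sliding window of past load values to predict the next value. The synthetic load series, window length, and network sizes below are illustrative assumptions.

```python
# Minimal sketch of short-term load forecasting with a small feed-forward
# network; the synthetic "load" series, window length and layer sizes are
# illustrative assumptions, not values from the cited papers.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic hourly load: daily cycle plus noise (stand-in for metered data).
hours = np.arange(24 * 60)
load = 100 + 30 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 3, hours.size)

# Sliding-window samples: predict the next hour from the previous 24 hours.
window = 24
X = np.array([load[i:i + window] for i in range(load.size - window)])
y = load[window:]

# Normalise inputs and targets for stable training.
x_mean, x_std = X.mean(), X.std()
X = (X - x_mean) / x_std
y = (y - x_mean) / x_std

# One hidden layer of tanh units, trained by plain batch gradient descent.
n_hidden = 16
W1 = rng.normal(0, 0.1, (window, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.1, n_hidden)
b2 = 0.0
lr = 0.05

for epoch in range(500):
    h = np.tanh(X @ W1 + b1)      # hidden activations
    pred = h @ W2 + b2            # network output
    err = pred - y                # prediction error
    # Backpropagate the mean-squared-error gradient.
    gW2 = h.T @ err / len(y)
    gb2 = err.mean()
    dh = np.outer(err, W2) * (1 - h ** 2)
    gW1 = X.T @ dh / len(y)
    gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

print("final training MSE:", np.mean((pred - y) ** 2))
```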


2012 ◽  
Vol 3 (3) ◽  
pp. 179-188 ◽  
Author(s):  
Sevil Ahmed ◽  
Nikola Shakev ◽  
Andon Topalov ◽  
Kostadin Shiev ◽  
Okyay Kaynak

2021 ◽  
pp. 1-13
Author(s):  
Qiugang Zhan ◽  
Guisong Liu ◽  
Xiurui Xie ◽  
Guolin Sun ◽  
Huajin Tang

1991 ◽  
Vol 3 (4) ◽  
pp. 579-588 ◽  
Author(s):  
Chris Bishop

An important feature of radial basis function neural networks is the existence of a fast, linear learning algorithm in a network capable of representing complex nonlinear mappings. Satisfactory generalization in these networks requires that the network mapping be sufficiently smooth. We show that a modification to the error functional allows smoothing to be introduced explicitly without significantly affecting the speed of training. A simple example is used to demonstrate the resulting improvement in the generalization properties of the network.
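
The combination of a fast linear solve with an explicit smoothing term can be illustrated with a minimal sketch. The smoothing here is a plain weight-decay (ridge) penalty used as a stand-in; the paper's actual modification to the error functional penalises curvature of the network mapping, which this sketch does not reproduce. The target function, number of centres, basis width, and regularisation strength are illustrative assumptions.

```python
# Minimal sketch of an RBF network with fixed Gaussian centres whose output
# weights are found by a fast, regularised linear solve. The ridge penalty
# below is a simplified stand-in for the paper's curvature-based smoothing.
import numpy as np

rng = np.random.default_rng(1)

# Noisy samples of a smooth 1-D target function.
x = np.linspace(0, 1, 40)
t = np.sin(2 * np.pi * x) + rng.normal(0, 0.2, x.size)

# Fixed Gaussian basis functions (centres and width chosen by hand).
centres = np.linspace(0, 1, 15)
width = 0.08

def design_matrix(x):
    # Phi[i, j] = exp(-(x_i - c_j)^2 / (2 * width^2)), plus a bias column.
    phi = np.exp(-(x[:, None] - centres[None, :]) ** 2 / (2 * width ** 2))
    return np.hstack([phi, np.ones((x.size, 1))])

Phi = design_matrix(x)

# Linear learning: minimise ||Phi w - t||^2 + lam * ||w||^2 in closed form.
lam = 1e-2   # smoothing strength (illustrative value)
w = np.linalg.solve(Phi.T @ Phi + lam * np.eye(Phi.shape[1]), Phi.T @ t)

# Evaluate the smoothed fit on a dense grid.
x_test = np.linspace(0, 1, 200)
y_test = design_matrix(x_test) @ w
print("training RMS error:", np.sqrt(np.mean((Phi @ w - t) ** 2)))
```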

