Performance improvement of backpropagation algorithm by automatic activation function gain tuning using fuzzy logic

2003 ◽  
Vol 50 ◽  
pp. 439-460 ◽  
Author(s):  
Kihwan Eom ◽  
Kyungkwon Jung ◽  
Harsha Sirisena

2021 ◽
Author(s):  
Alshimaa Hamdy ◽  
Tarek Abed Soliman ◽  
Mohamed Rihan ◽  
Moawad I. Dessouky

Abstract Beamforming design is a crucial stage in millimeter-wave systems with massive antenna arrays. We propose a deep learning network for the design of the precoder and combiner in hybrid architectures. The proposed network employs the parametric rectified linear unit (PReLU) activation function, which improves model accuracy at almost no additional complexity compared to other activation functions. The network accepts practical channel estimates as input and can be trained to enhance spectral efficiency under the hardware limitations of the hybrid design. Simulations show that the proposed network achieves a small performance improvement over the same network with the ReLU activation function.
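For context, PReLU generalizes ReLU by learning the slope of the negative half-axis instead of zeroing it out. Below is a minimal NumPy sketch, purely illustrative and not the authors' implementation; the function names and example values are ours:

```python
import numpy as np

def prelu(x, alpha):
    """Parametric ReLU: identity for positive inputs, a learned slope
    `alpha` for negative ones (alpha is trained jointly with the network
    weights, unlike Leaky ReLU's fixed constant)."""
    return np.where(x >= 0, x, alpha * x)

def relu(x):
    """Standard ReLU for comparison: negative inputs are zeroed,
    so no gradient flows through them."""
    return np.maximum(0.0, x)

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(prelu(x, alpha=0.25))  # [-0.5   -0.125  0.     1.5  ]
print(relu(x))               # [0.  0.  0.  1.5]
```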


2021 ◽  
pp. 1-20
Author(s):  
Shao-Qun Zhang ◽  
Zhi-Hua Zhou

Abstract Current neural networks are mostly built on the MP model, which formulates a neuron as applying an activation function to the real-valued weighted aggregation of signals received from other neurons. This letter proposes the flexible transmitter (FT) model, a novel biologically plausible neuron model with flexible synaptic plasticity. The FT model employs a pair of parameters to model the neurotransmitters between neurons and introduces a neuron-exclusive variable to record the regulated neurotrophin density. The FT model can thus be formulated as a two-variable, two-valued function, with the commonly used MP neuron model as a special case. This formulation makes the FT model biologically more realistic and capable of handling complicated data, even spatiotemporal data. To exhibit its power and potential, we present the flexible transmitter network (FTNet), built on the common fully connected feedforward architecture with the FT model as its basic building block. FTNet allows gradient calculation and can be trained by an improved backpropagation algorithm in the complex-valued domain. Experiments on a broad range of tasks show that FTNet is effective at processing spatiotemporal data. This study provides an alternative basic building block for neural networks and demonstrates the feasibility of developing artificial neural networks with neuronal plasticity.
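The abstract's "two-variable, two-valued function" admits a compact complex-valued reading: the weighted input supplies the real part, the neuron's private memory variable the imaginary part, and the activation's real and imaginary parts are read off as the output signal and the updated memory. The sketch below is a loose, hypothetical rendering under that assumption (the names `ft_neuron`, `u`, and `v_prev` are ours), not the letter's exact formulation:

```python
import numpy as np

def ft_neuron(x, v_prev, w, u):
    """Hypothetical flexible-transmitter-style neuron.

    Combines the weighted input (real part) with the neuron's own memory
    variable v_prev scaled by u (imaginary part), applies a complex tanh,
    and returns (output signal, updated memory)."""
    z = np.tanh(np.dot(w, x) + 1j * u * v_prev)  # complex-valued activation
    return z.real, z.imag

# Toy usage. With u = 0 and zero initial memory, the unit collapses to a
# standard MP neuron with a tanh activation, matching the claim that the
# MP model is a special case.
rng = np.random.default_rng(0)
w = rng.normal(size=3)
x = rng.normal(size=3)
y, v = ft_neuron(x, v_prev=0.0, w=w, u=1.0)
print(y, v)
```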


Sensors ◽  
2021 ◽  
Vol 21 (20) ◽  
pp. 6747
Author(s):  
Yang Liu ◽  
Jie Jiang ◽  
Jiahao Sun ◽  
Xianghan Wang

Hand pose estimation from RGB images has always been a difficult task, owing to the lack of depth information. Moon et al. improved the accuracy of hand pose estimation with InterNet, a network of their own design, yet it still has room for improvement. Based on the architectures of MobileNet v3 and MoGA, we redesigned a feature extractor that incorporates recent advances in computer vision, such as the ACON activation function and a new attention mechanism module. Used effectively, these modules enable our architecture to better extract global features from an RGB image of the hand, yielding a greater performance improvement than InterNet and other similar networks.
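ACON, mentioned above, is a family of activations from Ma et al.'s "Activate or Not" work that learns to switch each unit between activating and not activating. Here is a small NumPy sketch of the ACON-C variant as we understand it; in a real network the scalars p1, p2, and beta would be learnable per-channel parameters, not the constants used here:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def acon_c(x, p1=1.0, p2=0.0, beta=1.0):
    """ACON-C: a smooth interpolation between the linear maps p1*x and
    p2*x, gated by a switching factor beta. With p1=1, p2=0 it reduces
    to Swish; as beta -> inf it approaches max(p1*x, p2*x)."""
    d = (p1 - p2) * x
    return d * sigmoid(beta * d) + p2 * x

x = np.linspace(-3, 3, 7)
print(acon_c(x))                             # Swish-like behaviour
print(acon_c(x, p1=1.0, p2=0.1, beta=10.0))  # closer to a leaky-ReLU shape
```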


Author(s):  
Anjar Wanto ◽  
Agus Perdana Windarto ◽  
Dedy Hartama ◽  
Iin Parlina

Artificial neural networks (ANNs) are often used to solve forecasting problems, as in this study, which employs an ANN trained with the backpropagation algorithm. The study focuses on forecasting population density by district in Simalungun Regency, Indonesia, over 2010-2015, with data sourced from the Central Bureau of Statistics of Simalungun Regency. The population density forecast is computed using backpropagation with the binary sigmoid function (logsig) and the linear identity function (purelin), across five network architectures: 3-5-1, 3-10-1, 3-5-10-1, 3-5-15-1, and 3-10-15-1. Results from the five architectures vary greatly, but the best is the 3-5-1 model, with an accuracy of 94%, an MSE of 0.0025448, and 6843 training epochs. Thus, the binary sigmoid activation function (logsig) combined with the identity function (purelin) performs very well in backpropagation neural networks for forecasting population density, as evidenced by the high accuracy achieved.
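As a rough illustration of this setup, the sketch below trains a 3-5-1 network by plain gradient-descent backpropagation with a logsig hidden layer and a purelin output. It is a toy on synthetic data: the actual study used BPS population figures, and every name, initializer, and hyperparameter here is our own assumption rather than the authors' configuration:

```python
import numpy as np

def logsig(x):
    """Binary sigmoid (MATLAB's logsig)."""
    return 1.0 / (1.0 + np.exp(-x))

# Toy 3-5-1 network: 3 inputs, 5 hidden logsig units, 1 purelin output.
rng = np.random.default_rng(42)
W1, b1 = rng.normal(scale=0.5, size=(5, 3)), np.zeros(5)
W2, b2 = rng.normal(scale=0.5, size=(1, 5)), np.zeros(1)

# Synthetic stand-in for a normalized density series: 3 lagged inputs
# predicting the next value.
X = rng.uniform(size=(40, 3))
y = X.sum(axis=1, keepdims=True) / 3.0

lr = 0.5
for epoch in range(2000):
    h = logsig(X @ W1.T + b1)   # hidden layer (logsig)
    out = h @ W2.T + b2         # output layer (purelin)
    err = out - y               # MSE gradient, up to a factor of 2
    # Backpropagate: linear output layer, then through the sigmoid.
    dW2 = err.T @ h / len(X)
    db2 = err.mean(axis=0)
    dh = (err @ W2) * h * (1.0 - h)
    dW1 = dh.T @ X / len(X)
    db1 = dh.mean(axis=0)
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

print("final MSE:", float(np.mean((out - y) ** 2)))
```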

