Spark Parallel Optimization Algorithm Based on Improved BP Neural Network

2020 ◽ Vol 1550 ◽ pp. 032044
Author(s): Yaru Ge
Author(s): Chunzhi Wang ◽ Min Li ◽ Ruoxi Wang ◽ Han Yu ◽ Shuping Wang

Abstract
As an important part of smart city construction, traffic image denoising has been widely studied. Image denoising can enhance the performance of segmentation and recognition models and improve the accuracy of their results. However, because noise differs in type and in degree of pollution, traditional image denoising methods tend to blur edges and details and to lose image information. This paper presents an image denoising method based on a BP neural network optimized by an improved whale optimization algorithm. First, a nonlinear convergence factor and an adaptive weight coefficient are introduced to improve the search ability and convergence behavior of the standard whale optimization algorithm. Then, the improved whale optimization algorithm is used to optimize the initial weights and thresholds of the BP neural network, reducing the dependence on initialization during network construction and shortening the training time of the network. Finally, the optimized BP neural network is applied to benchmark image denoising and traffic image denoising. Experimental results show that, compared with traditional denoising methods such as median filtering, neighborhood average filtering, and Wiener filtering, the proposed method achieves a higher peak signal-to-noise ratio.
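
The improvements described above amount to replacing the linear decay of the whale optimization algorithm's convergence factor with a nonlinear schedule and weighting the leader position adaptively, with the best solution found then used to seed the BP network's initial weights and thresholds. The following is a minimal sketch of that idea; the quadratic decay, the weight schedule, and the function name improved_woa are illustrative assumptions, not the paper's exact formulas.

```python
import numpy as np

def improved_woa(objective, dim, n_whales=30, max_iter=200, lb=-1.0, ub=1.0):
    """Whale optimization with a nonlinear convergence factor and an adaptive
    weight on the leader position (illustrative schedules, not the paper's)."""
    X = np.random.uniform(lb, ub, (n_whales, dim))
    fitness = np.apply_along_axis(objective, 1, X)
    best = X[fitness.argmin()].copy()
    best_fit = fitness.min()

    for t in range(max_iter):
        a = 2.0 * (1.0 - (t / max_iter) ** 2)      # nonlinear convergence factor (assumed quadratic decay)
        w = 0.9 - 0.5 * t / max_iter               # adaptive weight coefficient (assumed linear schedule)
        for i in range(n_whales):
            r1, r2 = np.random.rand(), np.random.rand()
            A, C = 2 * a * r1 - a, 2 * r2
            if np.random.rand() < 0.5:
                if abs(A) < 1:                     # exploitation: encircle the weighted leader
                    D = np.abs(C * best - X[i])
                    X[i] = w * best - A * D
                else:                              # exploration: move relative to a random whale
                    rand_whale = X[np.random.randint(n_whales)]
                    D = np.abs(C * rand_whale - X[i])
                    X[i] = rand_whale - A * D
            else:                                  # spiral (bubble-net) update around the leader
                l = np.random.uniform(-1, 1)
                D = np.abs(best - X[i])
                X[i] = D * np.exp(l) * np.cos(2 * np.pi * l) + w * best
            X[i] = np.clip(X[i], lb, ub)
            f = objective(X[i])
            if f < best_fit:
                best, best_fit = X[i].copy(), f
    return best, best_fit
```

In the denoising setting, `objective` would be the BP network's training error evaluated at a flattened vector of weights and thresholds, and the returned `best` vector would initialize the network before ordinary back-propagation training.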


2013 ◽ Vol 765-767 ◽ pp. 2805-2808
Author(s): Guo Wen Wang ◽ Shi Xin Luo ◽ Li He ◽ Gang Yin

To address the problems that the BP neural network converges slowly and easily falls into local minima, chaos theory is incorporated into particle swarm optimization (PSO). The resulting chaos particle swarm optimization algorithm, which improves the ability of PSO to escape local extreme points, is applied to training the BP network, so that both the accuracy and the convergence speed of the BP network are increased. When the method is used to train the BP network for speaker recognition, the recognition rate and the training speed are greatly improved, enabling speaker recognition based on the BP neural network to achieve better results.
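
The abstract does not specify which chaotic map or escape rule is used, so the sketch below illustrates one common realization of a chaos PSO: logistic-map initialization of the swarm plus a chaotic local search around the global best to help particles leave local extreme points. The constants and helper names (logistic_map, chaos_pso, chaos_steps) are assumptions for illustration only.

```python
import numpy as np

def logistic_map(x, mu=4.0):
    """One step of the logistic map; mu = 4 gives fully chaotic behavior on (0, 1)."""
    return mu * x * (1.0 - x)

def chaos_pso(objective, dim, n_particles=30, max_iter=200, lb=-1.0, ub=1.0,
              w=0.7, c1=1.5, c2=1.5, chaos_steps=20):
    """PSO with chaotic initialization and a chaotic local search around the
    global best (run every iteration here for simplicity)."""
    # chaotic initialization: iterate the logistic map to spread particles over the search space
    z = np.random.rand(n_particles, dim)
    for _ in range(50):
        z = logistic_map(z)
    X = lb + (ub - lb) * z
    V = np.zeros((n_particles, dim))
    pbest = X.copy()
    pbest_fit = np.apply_along_axis(objective, 1, X)
    g = pbest_fit.argmin()
    gbest, gbest_fit = pbest[g].copy(), pbest_fit[g]

    for t in range(max_iter):
        r1, r2 = np.random.rand(n_particles, dim), np.random.rand(n_particles, dim)
        V = w * V + c1 * r1 * (pbest - X) + c2 * r2 * (gbest - X)
        X = np.clip(X + V, lb, ub)
        fit = np.apply_along_axis(objective, 1, X)
        improved = fit < pbest_fit
        pbest[improved], pbest_fit[improved] = X[improved], fit[improved]
        if pbest_fit.min() < gbest_fit:
            g = pbest_fit.argmin()
            gbest, gbest_fit = pbest[g].copy(), pbest_fit[g]
        # chaotic local search: perturb the global best with a chaotic sequence
        z = np.random.rand(dim)
        for _ in range(chaos_steps):
            z = logistic_map(z)
            candidate = np.clip(gbest + 0.1 * (ub - lb) * (2 * z - 1), lb, ub)
            f = objective(candidate)
            if f < gbest_fit:
                gbest, gbest_fit = candidate.copy(), f
    return gbest, gbest_fit
```

For BP training, `objective` would be the network's error (e.g. mean squared error on the training set) evaluated at a flattened weight vector of length `dim`.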


2021 ◽ Vol 11 (1)
Author(s): Peng Qin ◽ Hongping Hu ◽ Zhengmin Yang

Abstract
The grasshopper optimization algorithm (GOA), proposed in 2017, mimics the behavior of grasshopper swarms in nature to solve optimization problems. The basic GOA does not consider the influence of the gravity force on the updated position of each grasshopper, which may slow its convergence. To address this, an improved GOA (IGOA) is obtained in this paper by providing two ways of updating the position of each grasshopper. In the first, the gravity force is introduced into the position update of the basic GOA; in the second, a velocity is introduced and the new position is the sum of the current position and the velocity. Each grasshopper then adopts the more suitable update mode according to a probability. IGOA is first evaluated on 23 classical benchmark functions and is then combined with a BP neural network, whose parameters it optimizes, to establish the prediction model IGOA-BPNN for predicting the closing prices of the Shanghai Stock Exchange Index and the air quality index (AQI) of Taiyuan, Shanxi Province. The experimental results show that IGOA outperforms the compared algorithms in terms of average values and that IGOA-BPNN yields the smallest prediction errors. The proposed IGOA is therefore an effective and efficient optimization algorithm.
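
A rough sketch of the two update modes is given below, assuming the gravity term acts as a pull toward the current best position, the velocity in the second mode is built from the social term plus the attraction to the best, and the mode is chosen with a fixed probability prob; the paper's exact gravity formulation, velocity definition, and selection probability may differ.

```python
import numpy as np

def s_func(r, f=0.5, l=1.5):
    """GOA social-interaction function."""
    return f * np.exp(-r / l) - np.exp(-r)

def igoa(objective, dim, n=30, max_iter=200, lb=-1.0, ub=1.0,
         prob=0.5, g_const=0.01, c_max=1.0, c_min=1e-5):
    """Improved GOA sketch with two per-grasshopper update modes (assumed forms)."""
    X = np.random.uniform(lb, ub, (n, dim))
    V = np.zeros((n, dim))
    fit = np.apply_along_axis(objective, 1, X)
    best = X[fit.argmin()].copy()
    best_fit = fit.min()

    for t in range(max_iter):
        c = c_max - t * (c_max - c_min) / max_iter     # linearly decreasing comfort-zone coefficient
        for i in range(n):
            social = np.zeros(dim)
            for j in range(n):
                if j == i:
                    continue
                d = np.linalg.norm(X[j] - X[i]) + 1e-12
                social += c * (ub - lb) / 2.0 * s_func(d) * (X[j] - X[i]) / d
            if np.random.rand() < prob:
                # mode 1: basic GOA step plus an assumed gravity term pulling toward the best
                gravity = g_const * (best - X[i])
                X[i] = c * social + best + gravity
            else:
                # mode 2: velocity-style update; new position = current position + velocity
                V[i] = c * social + (best - X[i])
                X[i] = X[i] + V[i]
            X[i] = np.clip(X[i], lb, ub)
            f_i = objective(X[i])
            if f_i < best_fit:
                best, best_fit = X[i].copy(), f_i
    return best, best_fit
```

In IGOA-BPNN, `objective` would again be the BP network's prediction error as a function of its flattened weights and thresholds, with the optimized vector used to initialize the predictor.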


2021 ◽ Vol 2021 ◽ pp. 1-11
Author(s): Bingsheng Chen ◽ Huijie Chen ◽ Mengshan Li

Feature selection removes irrelevant features from data and improves the accuracy of data classification in pattern classification. The back propagation (BP) neural network and the particle swarm optimization algorithm can be combined effectively for feature selection. On this basis, this paper adds interference factors to the BP neural network and the particle swarm optimization algorithm to improve the accuracy and practicability of feature selection. The paper summarizes the basic methods and requirements of feature selection and combines the global optimization ability of particle swarm optimization with the feedback mechanism of the BP neural network into a feature selection model based on back propagation and particle swarm optimization (BP-PSO). First, a chaotic model is introduced to increase the diversity of particles during the initialization of particle swarm optimization, and an adaptive factor is introduced to enhance the global search ability of the algorithm. Then, the number of selected features is reduced while the accuracy of feature selection is maintained. Finally, different data sets are used to test the accuracy of feature selection, and wrapper-based and filter-based evaluation mechanisms are used to verify the practicability of the model. The results show that the average accuracy of BP-PSO is 8.65% higher than that of the second-best model, NDFS, across the data sets, and that the performance of BP-PSO is 2.31% to 18.62% higher than that of the benchmark methods on all data sets. BP-PSO therefore selects more discriminative feature subsets, which verifies the accuracy and practicability of the model.
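
As a concrete illustration of the BP-PSO idea, the sketch below encodes each particle as a binary feature mask, scores it by a weighted combination of classifier error and subset size, initializes the swarm with a logistic (chaotic) map, and decays the inertia weight adaptively. scikit-learn's MLPClassifier stands in for a hand-written BP network, and the trade-off weight alpha, the schedules, and the function name bp_pso_feature_selection are assumptions, not the paper's settings.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

def bp_pso_feature_selection(X, y, n_particles=20, max_iter=50, alpha=0.9):
    """Binary PSO feature selection scored by a BP-style (MLP) classifier.
    Fitness = alpha * (1 - accuracy) + (1 - alpha) * fraction of features kept."""
    n_features = X.shape[1]

    def fitness(mask):
        if mask.sum() == 0:
            return 1.0                                   # penalize empty feature subsets
        clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=300)
        acc = cross_val_score(clf, X[:, mask.astype(bool)], y, cv=3).mean()
        return alpha * (1 - acc) + (1 - alpha) * mask.sum() / n_features

    # chaotic (logistic-map) initialization to diversify the initial swarm
    z = np.random.rand(n_particles, n_features)
    for _ in range(30):
        z = 4.0 * z * (1.0 - z)
    pos = (z > 0.5).astype(float)
    vel = np.zeros_like(pos)
    pbest, pbest_fit = pos.copy(), np.array([fitness(p) for p in pos])
    g = pbest_fit.argmin()
    gbest, gbest_fit = pbest[g].copy(), pbest_fit[g]

    for t in range(max_iter):
        w = 0.9 - 0.5 * t / max_iter                     # adaptive inertia factor (assumed linear decay)
        r1, r2 = np.random.rand(*pos.shape), np.random.rand(*pos.shape)
        vel = w * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        prob = 1.0 / (1.0 + np.exp(-vel))                # sigmoid transfer for binary positions
        pos = (np.random.rand(*pos.shape) < prob).astype(float)
        fit = np.array([fitness(p) for p in pos])
        better = fit < pbest_fit
        pbest[better], pbest_fit[better] = pos[better], fit[better]
        if pbest_fit.min() < gbest_fit:
            g = pbest_fit.argmin()
            gbest, gbest_fit = pbest[g].copy(), pbest_fit[g]
    return gbest.astype(bool), gbest_fit
```

A call such as mask, _ = bp_pso_feature_selection(X, y) then yields a boolean selector over the columns of X; evaluating each particle with cross-validation is the wrapper-mode check mentioned in the abstract, while a filter-mode criterion could replace the classifier score for a cheaper fitness.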

