Some Insights into the Geometry and Training of Neural Networks

2018 ◽  
Vol 2018 ◽  
pp. 1-16 ◽  
Author(s):  
Aref M. al-Swaidani ◽  
Waed T. Khwies

Numerous volcanic scoria (VS) cones are found in many places worldwide. Many of them have not yet been investigated, although a few have long been used as supplementary cementitious materials (SCMs). The use of natural pozzolans as a cement replacement is common practice in the construction industry because of the related economic, ecological, and performance benefits. In the current paper, the effect of VS on the properties of concrete was investigated. Twenty-one concrete mixes with three w/b ratios (0.5, 0.6, and 0.7) and seven VS replacement levels (0%, 10%, 15%, 20%, 25%, 30%, and 35%) were produced. The investigated concrete properties were compressive strength, water permeability, and porosity. Feed-forward backpropagation artificial neural networks (ANNs) were used to predict these properties. The ANN models were established by incorporating the laboratory experimental data and by properly choosing the network architecture and training process. The study shows that the ANN models provide an accurate tool for capturing the effects of five parameters (cement content, volcanic scoria content, water content, superplasticizer content, and curing time) on the investigated properties. This prediction makes it possible to design VS-based concretes for a desired strength, water impermeability, and porosity at any given age and replacement level. Some correlations between the investigated properties were derived from the analysed data. Furthermore, the sensitivity analysis showed that all studied parameters have a strong effect on the investigated properties. Modification of the microstructure of VS-based cement paste was observed as well.
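The modelling setup the abstract describes — a feed-forward network trained by backpropagation that maps the five mix parameters (cement, scoria, water, and superplasticizer contents, and curing time) to a predicted property — can be sketched as follows. This is a minimal illustration on synthetic data, not the paper's model: the inputs and the toy target below are placeholders, not the laboratory measurements.

```python
import numpy as np

rng = np.random.default_rng(0)
# 64 synthetic "mixes", 5 inputs standing in for the mix parameters
X = rng.random((64, 5))
# Toy target standing in for a measured property (e.g. strength)
y = X @ np.array([0.5, -0.3, -0.4, 0.2, 0.6]) + 0.1

# One hidden tanh layer, linear output for regression
W1 = rng.normal(0, 0.5, (5, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.1

def forward(X):
    h = np.tanh(X @ W1 + b1)       # hidden activations
    return h, h @ W2 + b2          # linear regression output

for _ in range(3000):
    h, pred = forward(X)
    err = pred - y[:, None]                        # (64, 1) residuals
    # Backpropagate the mean-squared-error gradient
    gW2 = h.T @ err / len(X); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)               # tanh derivative
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, pred = forward(X)
mse = float(np.mean((pred[:, 0] - y) ** 2))
```

With the model trained, `forward` can be queried at any input combination, which is the sense in which such a model lets one design a mix for a desired target value.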


Electronics ◽  
2021 ◽  
Vol 10 (22) ◽  
pp. 2761
Author(s):  
Vaios Ampelakiotis ◽  
Isidoros Perikos ◽  
Ioannis Hatzilygeroudis ◽  
George Tsihrintzis

In this paper, we present a handwritten character recognition (HCR) system that aims to recognize first-order logic handwritten formulas and create editable text files of the recognized formulas. Dense feedforward neural networks (NNs) are utilized, and their performance is examined under various training conditions and methods. More specifically, after three training algorithms (backpropagation, resilient propagation, and stochastic gradient descent) had been tested, we created and trained an NN with the stochastic gradient descent algorithm, optimized by the Adam update rule, which proved to be the best, using a training set of 16,750 handwritten image samples of 28 × 28 pixels each and a test set of 7947 samples. The final accuracy achieved is 90.13%. The general methodology consists of two stages: image processing, and NN design and training. Finally, an application has been created that implements the methodology and automatically recognizes handwritten logic formulas. An interesting feature of the application is that it allows users to create new, user-oriented training sets and parameter settings, and thus new NN models.
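The training recipe named in the abstract — stochastic gradient descent with the Adam update rule over flattened 28 × 28 inputs — can be sketched as below. This is an illustrative toy, not the paper's network: the data are random placeholders with an injected class signal, and a single linear softmax layer stands in for the dense network.

```python
import numpy as np

rng = np.random.default_rng(1)
n, d, k = 256, 28 * 28, 10          # samples, flattened 28x28 pixels, classes
X = rng.normal(0.0, 1.0, (n, d))    # synthetic stand-in for symbol images
y = rng.integers(0, k, n)
X[np.arange(n), y] += 3.0           # inject a separable class signal (toy data)

W = np.zeros((d, k)); b = np.zeros(k)
mW = np.zeros_like(W); vW = np.zeros_like(W)
mb = np.zeros_like(b); vb = np.zeros_like(b)
lr, beta1, beta2, eps = 1e-2, 0.9, 0.999, 1e-8

for t in range(1, 401):
    idx = rng.choice(n, 32, replace=False)          # stochastic mini-batch
    logits = X[idx] @ W + b
    p = np.exp(logits - logits.max(1, keepdims=True))
    p /= p.sum(1, keepdims=True)                    # softmax probabilities
    p[np.arange(32), y[idx]] -= 1.0                 # cross-entropy gradient wrt logits
    gW = X[idx].T @ p / 32; gb = p.mean(0)
    for g, theta, m, v in ((gW, W, mW, vW), (gb, b, mb, vb)):
        m[:] = beta1 * m + (1 - beta1) * g          # first-moment estimate
        v[:] = beta2 * v + (1 - beta2) * g * g      # second-moment estimate
        mhat = m / (1 - beta1 ** t)                 # bias correction
        vhat = v / (1 - beta2 ** t)
        theta -= lr * mhat / (np.sqrt(vhat) + eps)  # Adam update rule

acc = float((np.argmax(X @ W + b, axis=1) == y).mean())  # training accuracy
```

The Adam bias-corrected moment estimates are what distinguish this loop from plain SGD; swapping the inner update for `theta -= lr * g` recovers the vanilla algorithm the paper compared against.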


1997 ◽  
Vol 9 (1) ◽  
pp. 1-42 ◽  
Author(s):  
Sepp Hochreiter ◽  
Jürgen Schmidhuber

We present a new algorithm for finding low-complexity neural networks with high generalization capability. The algorithm searches for a “flat” minimum of the error function. A flat minimum is a large connected region in weight space where the error remains approximately constant. An MDL-based, Bayesian argument suggests that flat minima correspond to “simple” networks and low expected overfitting. The argument is based on a Gibbs algorithm variant and a novel way of splitting generalization error into underfitting and overfitting error. Unlike many previous approaches, ours does not require Gaussian assumptions and does not depend on a “good” weight prior. Instead we have a prior over input-output functions, thus taking into account net architecture and training set. Although our algorithm requires the computation of second-order derivatives, it has backpropagation's order of complexity. It automatically and effectively prunes units, weights, and input lines. Various experiments with feedforward and recurrent nets are described. In an application to stock market prediction, flat minimum search outperforms conventional backprop, weight decay, and “optimal brain surgeon/optimal brain damage.”
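The defining notion here — a flat minimum as a large connected low-error region in weight space — can be made concrete on a one-dimensional toy loss. This sketch only illustrates the definition by probing how much the error rises under small weight perturbations around each minimum; it is not the paper's flat-minimum-search algorithm.

```python
import numpy as np

def loss(w):
    # Toy error surface with a sharp minimum at w = 0 and a flat one at w = 4
    return np.minimum(50.0 * w ** 2, 0.5 * (w - 4.0) ** 2)

def flatness(w0, radius=0.3, n=101):
    # Maximum loss increase inside a box around the minimum:
    # the smaller this value, the flatter the minimum
    # (the larger the connected region of nearly constant error).
    ws = np.linspace(w0 - radius, w0 + radius, n)
    return float(np.max(loss(ws) - loss(w0)))

sharp = flatness(0.0)   # around the sharp minimum: loss rises steeply
flat = flatness(4.0)    # around the flat minimum: loss stays nearly constant
```

Both minima have zero error, yet the box around `w = 4` keeps the error near zero while the box around `w = 0` does not; the MDL argument in the abstract is that the flatter region needs fewer bits to specify a working weight vector.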


2018 ◽  
Vol 11 (2) ◽  
pp. 290-314 ◽  
Author(s):  
Joseph Awoamim Yacim ◽  
Douw Gert Brand Boshoff

Purpose
The paper aims to investigate the application of particle swarm optimisation and back propagation in weights optimisation and training of artificial neural networks within the mass appraisal industry, and to compare the performance with standalone back propagation, genetic algorithm with back propagation, and regression models.

Design/methodology/approach
The study utilised linear regression modelling before the semi-log and log-log models, with a sample of 3,242 single-family dwellings. This was followed by the hybrid systems in the selection of optimal attribute weights and training of the artificial neural networks. The standalone back propagation algorithm was also used for the network training, and finally the performance of each model was evaluated using accuracy test statistics.

Findings
The study found that combining particle swarm optimisation with back propagation in the global and local search for attribute weights enhances the predictive accuracy of artificial neural networks. It also enhances the transparency of the process, because it shows the relative importance of attributes.

Research limitations/implications
A robust assessment of the models’ predictive accuracy was inhibited by the few accuracy test statistics available in the software. The research demonstrates the efficacy of combining two models in the assessment of property values.

Originality/value
This work demonstrated the practicability of combining particle swarm optimisation with back propagation algorithms in finding optimal weights and training artificial neural networks within the mass appraisal environment.
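The global-search half of the hybrid — a particle swarm exploring weight space, with each particle pulled toward its own best position and the swarm's best — can be sketched as below. The objective is a simple quadratic standing in for a network's error surface, not the study's mass-appraisal model; in the paper's hybrid, back propagation then refines the swarm's result locally.

```python
import numpy as np

rng = np.random.default_rng(2)

def error(w):
    # Stand-in for a network's training error; minimum at w = 1.5 everywhere
    return np.sum((w - 1.5) ** 2, axis=-1)

n_particles, dim = 20, 4
pos = rng.uniform(-5.0, 5.0, (n_particles, dim))   # candidate weight vectors
vel = np.zeros_like(pos)
pbest = pos.copy()                                 # each particle's best position
gbest = pos[np.argmin(error(pos))].copy()          # swarm's best position
w_in, c1, c2 = 0.7, 1.5, 1.5                       # inertia, cognitive, social terms

for _ in range(300):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    # Velocity update: inertia + pull toward personal best + pull toward global best
    vel = w_in * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    better = error(pos) < error(pbest)             # update personal bests
    pbest[better] = pos[better]
    gbest = pbest[np.argmin(error(pbest))].copy()  # update global best

best_err = float(error(gbest))
```

To turn this into weight training, `error` would evaluate the network's prediction error with the particle's entries used as connection weights, which is the substitution the hybrid approach makes.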


2020 ◽  
Vol 386 ◽  
pp. 8-17
Author(s):  
Gege Zhang ◽  
Gangwei Li ◽  
Weining Shen ◽  
Weidong Zhang
