Supervised Learning with Artificial Neural Networks

Author(s): Darryl Charles, Colin Fyfe, Daniel Livingstone, Stephen McGlinchey

In this chapter we will look at supervised learning in more detail, beginning with one of the simplest (and earliest) supervised neural learning algorithms – the Delta Rule. The objectives of this chapter are to provide a solid grounding in the theory and practice of problem solving with artificial neural networks – and an appreciation of some of the challenges and practicalities involved in their use.
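The Delta Rule mentioned above can be sketched for a single linear unit: the weights move a small step against the error gradient, w ← w + η(t − y)x. The toy data, learning rate, and target function below are illustrative assumptions, not taken from the chapter:

```python
import numpy as np

# Delta rule (Widrow-Hoff) for one linear unit: w <- w + eta * (t - y) * x
rng = np.random.default_rng(0)

# Hypothetical noiseless data: learn y = 2*x1 - x2 (plus a bias input)
X = rng.uniform(-1, 1, size=(200, 2))
t = 2 * X[:, 0] - X[:, 1]
X_b = np.hstack([X, np.ones((200, 1))])  # append constant bias input

w = np.zeros(3)
eta = 0.1
for epoch in range(50):
    for x, target in zip(X_b, t):
        y = w @ x                     # linear output
        w += eta * (target - y) * x   # delta-rule update

print(np.round(w, 2))  # weights approach [2, -1, 0]
```

Because the error is quadratic in the weights, this stochastic descent converges to the least-squares solution for a suitably small learning rate.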

Author(s):  
Lluís A. Belanche Muñoz

The view of artificial neural networks as adaptive systems has led to the development of ad-hoc generic procedures known as learning rules. The first of these is the Perceptron Rule (Rosenblatt, 1962), useful for single-layer feed-forward networks and linearly separable problems. Its simplicity and beauty, and the existence of a convergence theorem, made it a basic point of departure for neural learning algorithms. This algorithm is a particular case of the Widrow-Hoff or delta rule (Widrow & Hoff, 1960), which applies to continuous networks with no hidden layers and an error function that is quadratic in the parameters.
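The Perceptron Rule can be sketched on a linearly separable toy problem; the convergence theorem guarantees a finite number of weight updates in this setting. The choice of the AND function and the learning rate here are illustrative assumptions:

```python
import numpy as np

# Perceptron rule (Rosenblatt): w <- w + eta * (t - y) * x, with y = step(w . x)
# Hypothetical data: the logical AND function, which is linearly separable.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([0, 0, 0, 1])
X_b = np.hstack([X, np.ones((4, 1))])  # constant bias input

w = np.zeros(3)
eta = 1.0
for epoch in range(20):
    errors = 0
    for x, target in zip(X_b, t):
        y = int(w @ x > 0)                # threshold (step) output
        if y != target:
            w += eta * (target - y) * x   # update only on mistakes
            errors += 1
    if errors == 0:  # a full error-free pass: the rule has converged
        break

print([int(w @ x > 0) for x in X_b])  # -> [0, 0, 0, 1]
```

Unlike the delta rule, the perceptron updates only on misclassifications and uses a hard threshold, so it finds some separating hyperplane rather than a least-squares fit.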


2018, Vol. 41(1), pp. 233-253
Author(s): Jennifer L. Raymond, Javier F. Medina

Supervised learning plays a key role in the operation of many biological and artificial neural networks. Analysis of the computations underlying supervised learning is facilitated by the relatively simple and uniform architecture of the cerebellum, a brain area that supports numerous motor, sensory, and cognitive functions. We highlight recent discoveries indicating that the cerebellum implements supervised learning using the following organizational principles: (a) extensive preprocessing of input representations (i.e., feature engineering), (b) massively recurrent circuit architecture, (c) linear input–output computations, (d) sophisticated instructive signals that can be regulated and are predictive, (e) adaptive mechanisms of plasticity with multiple timescales, and (f) task-specific hardware specializations. The principles emerging from studies of the cerebellum have striking parallels with those in other brain areas and in artificial neural networks, as well as some notable differences, which can inform future research on supervised learning and inspire next-generation machine-based algorithms.


2011, Vol. 17(3), pp. 340-347
Author(s): S. Umit Dikmen, Murat Sonmez

Artificial neural networks (ANNs) are a problem-solving technique that imitates the basic working principles of the human brain. Formwork labour constitutes an important part of the cost of reinforced concrete frame buildings. This study proposes a method based on artificial neural networks for estimating the man-hours required for the formwork activity of such buildings. The method was verified against two case studies, and in all cases the model produced results reasonably close to actual field measurements. The model is a simple and quick tool to aid estimators and planners in their work.
