Computational Principles of Supervised Learning in the Cerebellum

2018
Vol. 41 (1)
pp. 233-253
Author(s):  
Jennifer L. Raymond ◽  
Javier F. Medina

Supervised learning plays a key role in the operation of many biological and artificial neural networks. Analysis of the computations underlying supervised learning is facilitated by the relatively simple and uniform architecture of the cerebellum, a brain area that supports numerous motor, sensory, and cognitive functions. We highlight recent discoveries indicating that the cerebellum implements supervised learning using the following organizational principles: (a) extensive preprocessing of input representations (i.e., feature engineering), (b) massively recurrent circuit architecture, (c) linear input–output computations, (d) sophisticated instructive signals that can be regulated and are predictive, (e) adaptive mechanisms of plasticity with multiple timescales, and (f) task-specific hardware specializations. The principles emerging from studies of the cerebellum have striking parallels with those in other brain areas and in artificial neural networks, as well as some notable differences, which can inform future research on supervised learning and inspire next-generation machine-based algorithms.
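To make a few of these principles concrete, the following Python sketch, which is not taken from the article and whose dimensions and toy task are invented for illustration, pairs an expansion recoding of the inputs with a linear readout trained by an error-driven update, loosely in the spirit of principles (a), (c), and (d).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task: map 10-dimensional "mossy fiber" inputs to a scalar motor command.
n_inputs, n_expanded, n_samples = 10, 200, 500
X = rng.normal(size=(n_samples, n_inputs))
teacher_w = rng.normal(size=n_inputs)
y = X @ teacher_w                       # desired outputs (teacher signal)

# (a) Preprocessing / expansion recoding: a fixed random projection with a
#     rectifying nonlinearity, loosely analogous to a granule-cell layer.
W_expand = rng.normal(size=(n_inputs, n_expanded))
G = np.maximum(X @ W_expand, 0.0)

# (c) Linear readout trained by an error-driven update: on each trial the
#     instructive signal (d) is simply the prediction error.
w = np.zeros(n_expanded)
lr = 1e-4
for epoch in range(50):
    for g, target in zip(G, y):
        error = target - g @ w          # instructive / error signal
        w += lr * error * g             # delta-rule-style weight change

print("final mean squared error:", np.mean((G @ w - y) ** 2))
```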

Author(s):  
Suraphan Thawornwong ◽  
David Enke

In recent years there has been a growing literature on applications of artificial neural networks to business and financial domains, with a great deal of attention devoted to stock return forecasting, because successful applications promise substantial monetary rewards. Many studies have reported promising results from applying various types of artificial neural network architectures to predicting stock returns. This chapter reviews and discusses the neural network research methodologies used in 45 journal articles that attempted to forecast stock returns. Modeling techniques and suggestions from the literature are also compiled and addressed. The review shows that artificial neural networks are an emerging and promising computational technology and will remain a challenging tool for future research.


Author(s):  
WEI HUANG ◽  
K. K. LAI ◽  
Y. NAKAMORI ◽  
SHOUYANG WANG

Forecasting exchange rates is an important financial problem that has received increasing attention because of its difficulty and its practical applications. Artificial neural networks (ANNs) have been widely used as a promising alternative for this forecasting task because of several distinctive features. In this paper, we survey the considerable body of research on ANNs for exchange rate forecasting. Several design factors significantly affect the accuracy of neural network forecasts, including the selection of input variables, data preparation, and network architecture; there is no consensus on these factors, and different choices prove effective in different cases. We also describe the integration of ANNs with other methods and compare the performance of ANNs with that of other forecasting methods, finding mixed results. Finally, future research directions in this area are discussed.
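To make the design factors above concrete, here is a minimal Python sketch of a neural exchange-rate forecaster; the synthetic data, lag structure, scaling, chronological split, and single hidden layer of eight units are illustrative assumptions, not choices drawn from the surveyed studies.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)

# Synthetic daily exchange-rate series, standing in for real data.
rate = 1.2 * np.exp(np.cumsum(rng.normal(scale=0.005, size=1000)))

# Input selection: predict the next log return from the previous n_lags returns.
returns = np.diff(np.log(rate))
n_lags = 5
X = np.column_stack([returns[i:len(returns) - n_lags + i] for i in range(n_lags)])
y = returns[n_lags:]

# Data preparation: chronological train/test split and feature scaling.
split = int(0.8 * len(y))
scaler = StandardScaler().fit(X[:split])
X_train, X_test = scaler.transform(X[:split]), scaler.transform(X[split:])

# Network architecture: a single small hidden layer (an arbitrary choice here).
model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
model.fit(X_train, y[:split])

print("out-of-sample MSE:", np.mean((model.predict(X_test) - y[split:]) ** 2))
```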


Author(s):  
Darryl Charles ◽  
Colin Fyfe ◽  
Daniel Livingstone ◽  
Stephen McGlinchey

In this chapter we will look at supervised learning in more detail, beginning with one of the simplest (and earliest) supervised neural learning algorithms – the Delta Rule. The objectives of this chapter are to provide a solid grounding in the theory and practice of problem solving with artificial neural networks – and an appreciation of some of the challenges and practicalities involved in their use.
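As a companion to that chapter, here is a minimal Python sketch of the Delta Rule as it is usually stated, with the weight change proportional to the error times the input activation; the toy data and learning rate are placeholders rather than examples from the chapter itself.

```python
import numpy as np

rng = np.random.default_rng(42)

# Invented supervised problem: learn a linear mapping from 3 inputs to 1 output.
X = rng.normal(size=(200, 3))
true_w, true_b = np.array([0.5, -1.0, 2.0]), 0.3
y = X @ true_w + true_b

# Delta Rule: for each training pattern, change each weight in proportion to
# the error (target minus output) times the corresponding input activation.
w, b, lr = np.zeros(3), 0.0, 0.05
for epoch in range(100):
    for x, target in zip(X, y):
        output = x @ w + b
        error = target - output
        w += lr * error * x
        b += lr * error

print("learned weights:", np.round(w, 3), "learned bias:", round(b, 3))
```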


2021
pp. 1-12
Author(s):  
Salvador Moral-Cuadra ◽  
Miguel Á. Solano-Sánchez ◽  
Antonio Menor-Campos ◽  
Tomás López-Guzmán
