On-Line Learning Theory of Soft Committee Machines with Correlated Hidden Units –Steepest Gradient Descent and Natural Gradient Descent–

2003 ◽  
Vol 72 (4) ◽  
pp. 805-810 ◽  
Author(s):  
Masato Inoue ◽  
Hyeyoung Park ◽  
Masato Okada

1998 ◽  
Vol 81 (24) ◽  
pp. 5461-5464 ◽  
Author(s):  
Magnus Rattray ◽  
David Saad ◽  
Shun-ichi Amari

2000 ◽  
Vol 12 (4) ◽  
pp. 881-901 ◽  
Author(s):  
Tom Heskes

Several studies have shown that natural gradient descent for on-line learning is much more efficient than standard gradient descent. In this article, we derive natural gradients in a slightly different manner and discuss implications for batch-mode learning and pruning, linking them to existing algorithms such as Levenberg-Marquardt optimization and optimal brain surgeon. The Fisher matrix plays an important role in all these algorithms. The second half of the article discusses a layered approximation of the Fisher matrix specific to multilayered perceptrons. Using this approximation rather than the exact Fisher matrix, we arrive at much faster “natural” learning algorithms and more robust pruning procedures.
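The natural-gradient update described in the abstract can be sketched for a model where the Fisher matrix has a closed form. The toy logistic-regression example below is hypothetical (not from the paper): it takes the step w ← w − η F⁻¹∇L, where F = E[p(1 − p) x xᵀ] is the Fisher information of the conditional model, with a small ridge term for numerical stability.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data for logistic regression (illustration only)
n, d = 200, 3
X = rng.normal(size=(n, d))
w_true = np.array([1.5, -2.0, 0.5])
y = (rng.random(n) < 1 / (1 + np.exp(-X @ w_true))).astype(float)

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

w = np.zeros(d)
eta = 0.5
for _ in range(100):
    p = sigmoid(X @ w)
    grad = X.T @ (p - y) / n               # gradient of the mean NLL
    # Fisher matrix E[p(1-p) x x^T]; for this model it coincides
    # with the Hessian of the negative log-likelihood
    F = (X * (p * (1 - p))[:, None]).T @ X / n
    # Natural-gradient step, with a small ridge for stability
    w -= eta * np.linalg.solve(F + 1e-6 * np.eye(d), grad)

print(np.round(w, 3))
```

Because the Fisher matrix equals the Hessian for this model, the natural-gradient step here is a damped Newton step, which converges far faster than plain gradient descent.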


2002 ◽  
Vol 14 (7) ◽  
pp. 1723-1738 ◽  
Author(s):  
Nicol N. Schraudolph

We propose a generic method for iteratively approximating various second-order gradient steps (Newton, Gauss-Newton, Levenberg-Marquardt, and natural gradient) in linear time per iteration, using special curvature matrix-vector products that can be computed in O(n). Two recent acceleration techniques for on-line learning, matrix momentum and stochastic meta-descent (SMD), implement this approach. Since both were originally derived by very different routes, this offers fresh insight into their operation, resulting in further improvements to SMD.
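A curvature matrix-vector product of the kind the abstract mentions can be obtained without ever forming the matrix: a Hessian-vector product Hv costs only two gradient evaluations. The sketch below uses a central finite difference of the gradient on a hypothetical toy objective; Pearlmutter's R-operator gives the exact automatic-differentiation analogue at the same O(n) cost.

```python
import numpy as np

# Hypothetical toy objective: 0.5*||w||^2 + sum(cos(w))
def grad(w):
    return w - np.sin(w)

def hess_vec(w, v, eps=1e-5):
    # Hv via a central finite difference of the gradient:
    # two gradient evaluations, O(n) memory and time, no matrix formed
    return (grad(w + eps * v) - grad(w - eps * v)) / (2 * eps)

w = np.array([0.3, -1.2, 2.0])
v = np.array([1.0, 0.0, -1.0])
Hv = hess_vec(w, v)

# For this objective the exact Hessian is diag(1 - cos(w)),
# so we can check the approximation directly
H_exact = np.diag(1 - np.cos(w))
print(Hv, H_exact @ v)
```

Products with the Gauss-Newton or Fisher matrix can be built the same way from Jacobian-vector products, which is what makes matrix-free second-order steps feasible for large models.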


2005 ◽  
Vol 36 (12) ◽  
pp. 63-74 ◽  
Author(s):  
Seiji Miyoshi ◽  
Kazuyuki Hara ◽  
Masato Okada

2010 ◽  
Vol 24 (2) ◽  
pp. 91-101 ◽  
Author(s):  
Juliana Yordanova ◽  
Rolf Verleger ◽  
Ullrich Wagner ◽  
Vasil Kolev

The objective of the present study was to evaluate patterns of implicit processing in a task where the acquisition of explicit and implicit knowledge occurs simultaneously. The number reduction task (NRT) was used as having two levels of organization, overt and covert, where the covert level of processing is associated with implicit associative and implicit procedural learning. One aim was to compare these two types of implicit processes in the NRT when sleep was or was not introduced between initial formation of task representations and subsequent NRT processing. To assess the effects of different sleep stages, two sleep groups (early- and late-night groups) were used where initial training of the task was separated from subsequent retest by 3 h of sleep consisting predominantly of slow wave sleep (SWS) or of rapid eye movement (REM) sleep. In two no-sleep groups, no interval was introduced between initial and subsequent NRT performance. A second aim was to evaluate the interaction between procedural and associative implicit learning in the NRT. Implicit associative learning was measured by the difference between the speed of responses that could or could not be predicted by the covert abstract regularity of the task. Implicit procedural on-line learning was measured by the practice-based increased speed of performance with time on task. Major results indicated that late-night sleep produced a substantial facilitation of implicit associations without modifying individual ability for explicit knowledge generation or for procedural on-line learning. This was evidenced by the higher rate of subjects who gained implicit knowledge of abstract task structure in the late-night group relative to the early-night and no-sleep groups. Independently of sleep, gain of implicit associative knowledge was accompanied by a relative slowing of responses to unpredictable items, suggesting reciprocal interactions between associative and motor procedural processes within the implicit system. These observations provide evidence for the separability and interactions of different patterns of processing within implicit memory.

