Beyond gradients: Noise correlations control Hebbian plasticity to shape credit assignment

2021
Author(s): Daniel Nelson Scott, Michael J Frank

Two key problems that span biological and industrial neural network research are how networks can be trained to generalize well and to minimize destructive interference between tasks. Both hinge on credit assignment, the targeting of specific network weights for change. In artificial networks, credit assignment is typically governed by gradient descent. Biological learning is thus often analyzed as a means to approximate gradients. We take the complementary perspective that biological learning rules likely confer advantages when they aren't gradient approximations. Further, we hypothesized that noise correlations, often considered detrimental, could usefully shape this learning. Indeed, we show that noise and three-factor plasticity interact to compute directional derivatives of reward, which can improve generalization, robustness to interference, and multi-task learning. This interaction also provides a method for routing learning quasi-independently of activity and connectivity, and demonstrates how biologically inspired inductive biases can be fruitfully embedded in learning algorithms.
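The core mechanism can be illustrated numerically. The sketch below is a hypothetical toy setup (a quadratic reward with a known gradient, not the paper's network model): perturbing parameters with correlated noise of covariance $\Sigma = uu^T$ and correlating the resulting reward fluctuations with the noise yields an update proportional to $\Sigma \nabla R$, i.e. a directional derivative of reward along $u$, so learning is routed along the noise correlations rather than along the full gradient.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical quadratic "reward" R(w) = -||w - w*||^2 with known gradient,
# so the noise-based estimate can be checked against a closed form.
w_star = np.array([1.0, -2.0, 0.5])

def reward(w):
    return -np.sum((w - w_star) ** 2)

def grad_reward(w):
    return -2.0 * (w - w_star)

# Correlated noise confined to direction u (rank-1 covariance Sigma = u u^T).
# Correlating reward change with the noise gives
#   E[(R(w + xi) - R(w)) xi] / sigma^2  ~=  Sigma @ grad R(w),
# a directional derivative: learning is routed along u, not the full gradient.
u = np.array([1.0, 1.0, 0.0]) / np.sqrt(2.0)
sigma, n_samples = 0.05, 200_000

w = np.zeros(3)
z = rng.standard_normal(n_samples)           # one shared scalar noise source
xi = sigma * z[:, None] * u                  # correlated perturbations, (n, 3)
delta_R = -np.sum((w + xi - w_star) ** 2, axis=1) - reward(w)
estimate = (delta_R[:, None] * xi).mean(axis=0) / sigma ** 2

target = np.outer(u, u) @ grad_reward(w)     # Sigma @ grad R(w)
print(estimate, target)                      # both close to [-1, -1, 0]
```

Note that the full gradient at $w = 0$ is $[2, -4, 1]$, but the realized update lies entirely along $u$: the noise covariance, not the gradient, decides which weights receive credit.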

1998, Vol 10 (5), pp. 1157-1178
Author(s): Javier R. Movellan

This article analyzes learning in continuous stochastic neural networks defined by stochastic differential equations (SDE). In particular, it studies gradient descent learning rules to train the equilibrium solutions of these networks. A theorem is given that specifies sufficient conditions for the gradient descent learning rules to be local covariance statistics between two random variables: (1) an evaluator that is the same for all the network parameters and (2) a system variable that is independent of the learning objective. While this article focuses on continuous stochastic neural networks, the theorem applies to any other system with Boltzmann-like equilibrium distributions. The generality of the theorem suggests that instead of suppressing noise present in physical devices, a natural alternative is to use it to simplify the credit assignment problem. In deterministic networks, credit assignment requires an evaluation signal that is different for each node in the network. Surprisingly, when noise is not suppressed, all that is needed is an evaluator that is the same for the entire network and a local Hebbian signal. This modularization of signals greatly simplifies hardware and software implementations. The article shows how the theorem applies to four different learning objectives that span supervised, reinforcement, and unsupervised problems: (1) regression, (2) density estimation, (3) risk minimization, and (4) information maximization. Simulations, implementation issues, and implications for computational neuroscience are discussed.
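The modularization the theorem describes can be sketched in a few lines. This is an illustrative discrete-time toy, not Movellan's SDE network: every weight is updated by the product of one global scalar evaluator $R$ (identical for the whole network) and a purely local Hebbian co-fluctuation term $x_i(s_j - p_j)$, so the expected update is the covariance between the two.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stochastic binary output units with firing probabilities p = sigmoid(W x).
# Update rule: Delta W ∝ (R - baseline) * outer(s - p, x), i.e. a single
# network-wide evaluator times a synapse-local Hebbian term. The expected
# update is Cov(R, x_i (s_j - p_j)) -- no per-node error signal is needed.
n_in, n_out = 8, 4
x = np.array([1., 0., 1., 0., 1., 0., 1., 0.])   # fixed input pattern
t = np.array([1., 0., 0., 1.])                   # target output pattern
W = np.zeros((n_out, n_in))
eta, baseline = 0.5, 0.0

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for step in range(3000):
    p = sigmoid(W @ x)                              # firing probabilities
    s = (rng.random(n_out) < p).astype(float)       # stochastic binary outputs
    R = -np.sum((s - t) ** 2)                       # one scalar evaluator
    baseline += 0.1 * (R - baseline)                # running-average baseline
    W += eta * (R - baseline) * np.outer(s - p, x)  # global x local Hebbian

final_error = np.sum((sigmoid(W @ x).round() - t) ** 2)
print(final_error)
```

The evaluator $R$ never specifies *which* weight should change; the unsuppressed unit noise, through its co-fluctuation with $R$, does the credit assignment.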


Optimization, 2013, Vol 64 (2), pp. 389-407
Author(s): L. Minchenko, A. Tarakanov

1992, Vol 35 (3), pp. 371-375
Author(s): Nezam Iraniparast

A method will be introduced to solve the problem $u_{tt} - u_{ss} = h(s,t)$, $u_{e_1}(t,t) = u_{e_2}(1+t, 1-t)$, $u(s,0) = g(s)$, $u(1,1) = 0$ for $(s,t)$ in the characteristic triangle $R = \{(s,t) : t \le s \le 2-t,\ 0 \le t \le 1\}$. Here $u_{e_1}$ and $u_{e_2}$ represent the directional derivatives of $u$ in the characteristic directions $e_1 = (-1,-1)$ and $e_2 = (1,-1)$, respectively. The method produces the symmetric Green's function of Kreith [1] in both cases.
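For orientation, a standard change of variables (not taken from the paper) shows why the characteristic directions are the natural ones here: in characteristic coordinates

$$\xi = s + t, \qquad \eta = s - t, \qquad \partial_s = \partial_\xi + \partial_\eta, \qquad \partial_t = \partial_\xi - \partial_\eta,$$

the wave operator factors as

$$u_{tt} - u_{ss} = (\partial_\xi - \partial_\eta)^2 u - (\partial_\xi + \partial_\eta)^2 u = -4\,u_{\xi\eta} = h,$$

while the (unnormalized) derivatives along $e_1 = (-1,-1)$ and $e_2 = (1,-1)$ reduce to single-coordinate derivatives, $-u_s - u_t = -2\,\partial_\xi u$ and $u_s - u_t = 2\,\partial_\eta u$, which is what makes boundary conditions posed along the characteristics tractable.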


Author(s): Sonia Carvalho, Pedro Freitas

In recent papers, S. Carvalho and P. J. Freitas obtained formulas for directional derivatives, of all orders, of the immanant and of the m-th $\xi$-symmetric tensor power of an operator and a matrix, when $\xi$ is a character of the full symmetric group. The operator norm of these derivatives was also calculated. In this paper, similar results are established for generalized matrix functions and for every symmetric tensor power.
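The simplest generalized matrix function is the determinant, whose first-order directional derivative has the classical closed form $D\det(A)(X) = \operatorname{tr}(\operatorname{adj}(A)X)$. The sketch below checks that formula against a central finite difference; it is an illustrative special case, not the paper's higher-order immanant formulas.

```python
import numpy as np

# Classical first-order directional derivative of the determinant:
#   D det(A)(X) = tr(adj(A) X),
# verified numerically against a central finite difference.
rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))
X = rng.standard_normal((4, 4))

adjugate = np.linalg.det(A) * np.linalg.inv(A)   # adj(A), valid for invertible A
closed_form = np.trace(adjugate @ X)

eps = 1e-6
finite_diff = (np.linalg.det(A + eps * X) - np.linalg.det(A - eps * X)) / (2 * eps)
print(closed_form, finite_diff)
```

The immanant and symmetric tensor power results of the paper generalize this pattern to other characters of the symmetric group and to higher-order derivatives.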


2013, Vol 43 (2), pp. 121-136
Author(s): LiWei ZHANG, XianTao XIAO, Ning ZHANG
