Neurons learn by predicting future activity
Abstract

Plasticity mechanisms in the brain are still not well understood. Here we demonstrate that the ability of a neuron to predict its future activity may provide an effective mechanism for learning in the brain. We show that comparing a neuron's predicted activity with its actual activity provides a useful learning signal for modifying synaptic weights. Interestingly, this predictive learning rule can be derived from a metabolic principle: neurons need to minimize their own synaptic activity (cost) while maximizing their impact on local blood supply by recruiting other neurons. This reveals an unexpected connection: learning in neural networks could result simply from each neuron maximizing its energy balance. We validated this predictive learning rule in neural network simulations and in data recorded from awake animals. We found that in the sensory cortex it is indeed possible to predict a neuron's activity ∼10–20 ms into the future. Moreover, in response to stimuli, cortical neurons changed their firing rate to minimize surprise, i.e. the difference between actual and expected activity, as predicted by our model. Our results also suggest that spontaneous brain activity provides "training data" for neurons to learn to predict cortical dynamics. Thus, this work demonstrates that the ability of a neuron to predict its future inputs could be an important missing element in understanding computation in the brain.

Significance statement

Understanding how the brain learns may lead to machines with human-like intellectual capacities. Donald Hebb proposed the influential idea that the brain's learning algorithm is based on correlated firing, often summarized as "cells that fire together wire together". However, Hebb's rule and other biologically inspired learning algorithms are likely still missing important components needed to replicate the brain's learning mechanisms.
Here we provide evidence for a predictive learning rule: a neuron predicts its future activity and adjusts its incoming weights to minimize surprise, i.e. the difference between actual and expected activity. Interestingly, we show that such a rule is equivalent to maximizing the neuron's energy balance, which can be paraphrased as: cells adjust their weights to achieve maximum impact with minimum activity.
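The text above does not state the exact update equation, so the following is only an illustrative sketch: it models "surprise" as the gap between a neuron's actual activity and an internal prediction of it, and performs gradient descent on the squared surprise. The linear forms, the variable names (`w` for incoming weights, `v` for a hypothetical internal predictor), and the learning rate are all assumptions made for this toy example, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

n_inputs = 5
w = rng.normal(scale=0.1, size=n_inputs)  # incoming synaptic weights (assumed linear)
v = rng.normal(scale=0.1, size=n_inputs)  # hypothetical internal predictor weights
eta = 0.01                                # learning rate (arbitrary choice)

def predictive_step(x_now, x_next):
    """One toy predictive-learning update.

    The neuron predicts its future activity from the current input (v @ x_now),
    compares that with its actual future activity (w @ x_next), and nudges both
    weight vectors down the gradient of the squared prediction error.
    Returns the surprise measured before the update.
    """
    global w, v
    predicted = v @ x_now            # neuron's prediction of its future activity
    actual = w @ x_next              # actual activity driven by the next input
    surprise = actual - predicted    # prediction error ("surprise")
    # gradient descent on surprise**2 with respect to w and v
    w -= eta * surprise * x_next
    v += eta * surprise * x_now
    return surprise
```

Repeating the update on a fixed input pair shrinks the surprise geometrically, which is the minimal sanity check that the rule does what the prose claims:

```python
x0, x1 = rng.normal(size=n_inputs), rng.normal(size=n_inputs)
errors = [abs(predictive_step(x0, x1)) for _ in range(200)]
print(errors[0] > errors[-1])  # surprise decreases over repeated updates
```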