Hebbian Plasticity
Recently Published Documents

TOTAL DOCUMENTS: 151 (FIVE YEARS: 61)
H-INDEX: 24 (FIVE YEARS: 5)

2022 · Vol 5 (1)
Author(s): Takuya Isomura, Hideaki Shimazaki, Karl J. Friston

Abstract: This work considers a class of canonical neural networks comprising rate coding models, wherein neural activity and plasticity minimise a common cost function—and plasticity is modulated with a certain delay. We show that such neural networks implicitly perform active inference and learning to minimise the risk associated with future outcomes. Mathematical analyses demonstrate that this biological optimisation can be cast as maximisation of model evidence, or equivalently minimisation of variational free energy, under the well-known form of a partially observed Markov decision process model. This equivalence indicates that the delayed modulation of Hebbian plasticity—accompanied by adaptation of firing thresholds—is a sufficient neuronal substrate to attain Bayes optimal inference and control. We corroborated this proposition using numerical analyses of maze tasks. This theory offers a universal characterisation of canonical neural networks in terms of Bayesian belief updating and provides insight into the neuronal mechanisms underlying planning and adaptive behavioural control.
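The ingredients named in the abstract (a rate-coding unit, Hebbian updates gated by a delayed modulator, and an adaptive firing threshold) can be sketched in a few lines. The toy modulator signal and every constant below are illustrative assumptions, not the authors' derivation:

```python
import math
import random

def sigmoid(u):
    return 1.0 / (1.0 + math.exp(-u))

def delayed_hebbian(steps=200, n_in=4, eta=0.05, delay=3, seed=0):
    rng = random.Random(seed)
    w = [0.0] * n_in          # afferent weights
    h = 0.0                   # adaptive firing threshold
    trace = []                # eligibility buffer holding (pre, post) pairs
    for _ in range(steps):
        x = [rng.random() for _ in range(n_in)]
        y = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) - h)
        trace.append((x, y))
        if len(trace) > delay:
            xp, yp = trace.pop(0)                    # activity from `delay` steps ago
            m = 1.0 if sum(xp) > n_in / 2.0 else -1.0  # placeholder delayed modulator
            for i in range(n_in):
                w[i] += eta * m * yp * xp[i]         # modulated Hebbian update
            h += eta * (yp - 0.5)                    # firing-threshold adaptation
    return w, h
```

In the authors' framework the modulator would carry a risk-related quantity; here it is a placeholder so that only the mechanics of delayed gating are on display.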


2022
Author(s): Alberto Lazari, Piergiorgio Salvan, Michiel Cottaar, Daniel Papp, Matthew FS Rushworth, ...

Synaptic plasticity is required for learning and follows Hebb's Rule, the computational principle underpinning associative learning. In recent years, a complementary type of brain plasticity has been identified in myelinated axons, which make up the majority of the brain's white matter. Like synaptic plasticity, myelin plasticity is required for learning, but it is unclear whether it is Hebbian or whether it follows different rules. Here, we provide evidence that white matter plasticity operates following Hebb's Rule in humans. Across two experiments, we find that co-stimulating cortical areas to induce Hebbian plasticity leads to relative increases in cortical excitability and associated increases in a myelin marker within the stimulated fiber bundle. We conclude that Hebbian plasticity extends beyond synaptic changes, and can be observed in human white matter fibers.


2021 · Vol 17 (12) · pp. e1009681
Author(s): Michiel W. H. Remme, Urs Bergmann, Denis Alevi, Susanne Schreiber, Henning Sprekeler, ...

Systems memory consolidation involves the transfer of memories across brain regions and the transformation of memory content. For example, declarative memories that transiently depend on the hippocampal formation are transformed into long-term memory traces in neocortical networks, and procedural memories are transformed within cortico-striatal networks. These consolidation processes are thought to rely on replay and repetition of recently acquired memories, but the cellular and network mechanisms that mediate the changes of memories are poorly understood. Here, we suggest that systems memory consolidation could arise from Hebbian plasticity in networks with parallel synaptic pathways—two ubiquitous features of neural circuits in the brain. We explore this hypothesis in the context of hippocampus-dependent memories. Using computational models and mathematical analyses, we illustrate how memories are transferred across circuits and discuss why their representations could change. The analyses suggest that Hebbian plasticity mediates consolidation by transferring a linear approximation of a previously acquired memory into a parallel pathway. Our modelling results are also in quantitative agreement with lesion studies in rodents. Moreover, a hierarchical iteration of the mechanism yields power-law forgetting—as observed in psychophysical studies in humans. The predicted circuit mechanism thus bridges spatial scales from single cells to cortical areas and time scales from milliseconds to years.
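A minimal sketch of the proposed transfer: a fast ("hippocampal") pathway teaches a parallel ("neocortical") pathway through a plain Hebbian rule with decay, and the slow pathway converges to a linear approximation of the stored mapping. The binary replay patterns and all constants are assumptions for illustration, not the authors' model:

```python
import random

def consolidate(v, steps=2000, eta=0.05, seed=1):
    """Transfer the linear mapping v (fast pathway) into a parallel pathway w."""
    rng = random.Random(seed)
    n = len(v)
    w = [0.0] * n  # parallel pathway, initially carries no memory
    for _ in range(steps):
        x = [rng.choice([-1.0, 1.0]) for _ in range(n)]    # replayed input pattern
        y = sum(vi * xi for vi, xi in zip(v, x))           # output taught by fast pathway
        for i in range(n):
            w[i] += eta * (y * x[i] - w[i])                # Hebbian term with decay
    return w
```

Because these replayed inputs are uncorrelated with unit variance, the Hebbian average E[y·x] equals v exactly; with correlated inputs the slow pathway would instead inherit only a linear approximation of the memory, as the abstract notes.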


2021
Author(s): Daniel Nelson Scott, Michael J Frank

Two key problems that span biological and industrial neural network research are how networks can be trained to generalize well and to minimize destructive interference between tasks. Both hinge on credit assignment, the targeting of specific network weights for change. In artificial networks, credit assignment is typically governed by gradient descent. Biological learning is thus often analyzed as a means to approximate gradients. We take the complementary perspective that biological learning rules likely confer advantages when they aren't gradient approximations. Further, we hypothesized that noise correlations, often considered detrimental, could usefully shape this learning. Indeed, we show that noise and three-factor plasticity interact to compute directional derivatives of reward, which can improve generalization, robustness to interference, and multi-task learning. This interaction also provides a method for routing learning quasi-independently of activity and connectivity, and demonstrates how biologically inspired inductive biases can be fruitfully embedded in learning algorithms.
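The noise-times-reward interaction can be illustrated with the simplest perturbation-based rule: a random weight perturbation (the noise), multiplied by the resulting change in reward (the third factor), gives a stochastic estimate of the directional derivative of reward. The quadratic reward function and all constants below are illustrative assumptions, not the paper's network:

```python
import random

def reward(w, target=(1.0, -1.0)):
    # toy reward surface: negative squared distance to a target weight vector
    return -sum((wi - ti) ** 2 for wi, ti in zip(w, target))

def learn_by_perturbation(steps=500, eta=0.5, sigma=0.1, seed=2):
    rng = random.Random(seed)
    w = [0.0, 0.0]
    for _ in range(steps):
        noise = [rng.gauss(0.0, sigma) for _ in w]
        # directional derivative estimate: reward change along the noise direction
        delta_r = reward([wi + ni for wi, ni in zip(w, noise)]) - reward(w)
        for i in range(len(w)):
            w[i] += eta * delta_r * noise[i]   # noise x reward-change update
    return w
```

On average the update is proportional to the reward gradient, so the weights climb toward the target; correlating the noise across synapses (as the abstract proposes) would restrict which weight directions this learning explores.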


Nature · 2021
Author(s): Jason J. Moore, Jesse D. Cushman, Lavanya Acharya, Briana Popeney, Mayank R. Mehta

2021
Author(s): Elena Kutsarova, Anne Schohl, Martin Munz, Alex Wang, Yuan Yuan Zhang, ...

During development, patterned neural activity in input neurons innervating their target instructs topographic map refinement. Axons from adjacent neurons, firing with similar patterns of neural activity, converge onto target neurons and stabilize their synapses with these postsynaptic partners (Hebbian plasticity). On the other hand, non-correlated firing of inputs promotes synaptic weakening and exploratory axonal growth (Stentian plasticity). We used visual stimulation to control the visually-evoked correlation structure of neural activity in ectopic ipsilaterally projecting (ipsi) retinal ganglion cell axons with respect to their neighboring contralateral eye inputs in the optic tectum of albino Xenopus laevis tadpoles. Multiphoton imaging of the ipsi axons in the live tadpole, combined with manipulation of brain-derived neurotrophic factor (BDNF) signaling, revealed that presynaptic p75NTR and TrkB both promoted axonal branch addition during Stentian plasticity, whereas predominantly postsynaptic BDNF signaling mediated activity-dependent Hebbian suppression of axon branch addition. Additionally, we found that BDNF signaling is required for local suppression of branch loss induced by correlated firing.


2021
Author(s): Wujie Zhang, Jacqueline Gottlieb, Kenneth D Miller

When monkeys learn to group visual stimuli into arbitrary categories, lateral intraparietal area (LIP) neurons become category-selective. Surprisingly, the representations of learned categories are overwhelmingly biased: nearly all LIP neurons in a given animal prefer the same category over other behaviorally equivalent categories. We propose a model where such biased representations develop through the interplay between Hebbian plasticity and the recurrent connectivity of LIP. In this model, two separable processes of positive feedback unfold in parallel: in one, category selectivity emerges from competition between prefrontal inputs; in the other, bias develops due to lateral interactions among LIP neurons. This model reproduces the levels of category selectivity and bias observed under a variety of conditions, as well as the redevelopment of bias after monkeys learn redefined categories. It predicts that LIP receptive fields would spatially cluster by preferred category, which we experimentally confirm. In summary, our model reveals a mechanism by which LIP learns abstract representations and assigns meaning to sensory inputs.
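The two parallel positive-feedback processes can be caricatured with two mutually excitatory units and normalized Hebbian learning. In this toy version (every parameter is assumed for illustration, and the dynamics are far simpler than the paper's model), the lateral coupling makes the units' initially random category preferences collapse onto the same category, i.e. a biased representation:

```python
import random

def develop_bias(c=0.6, eta=0.2, steps=300, seed=3):
    rng = random.Random(seed)
    # two recurrently coupled units, two category input channels, tiny random weights
    w = [[rng.uniform(0.0, 0.1) for _ in range(2)] for _ in range(2)]
    for _ in range(steps):
        cat = rng.randrange(2)                        # present a random category
        x = [1.0 if i == cat else 0.0 for i in range(2)]
        f = [sum(wk[i] * x[i] for i in range(2)) for wk in w]   # feedforward drive
        # steady-state rates under symmetric lateral excitation of strength c
        y = [(f[0] + c * f[1]) / (1.0 - c * c),
             (f[1] + c * f[0]) / (1.0 - c * c)]
        for k in range(2):
            w[k][cat] += eta * y[k]                   # Hebbian growth on the active channel
            norm = sum(v * v for v in w[k]) ** 0.5
            w[k] = [v / norm for v in w[k]]           # normalization enforces competition
    prefs = [0 if wk[0] > wk[1] else 1 for wk in w]   # each unit's preferred category
    return prefs, w
```

The shared lateral term means the category mode common to both units grows fastest, so after normalization both units end up preferring the same category regardless of their private initial preferences.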


2021 · Vol 28 (10) · pp. 371-389
Author(s): Felippe E. Amorim, Renata L. Chapot, Thiago C. Moulin, Jonathan L.C. Lee, Olavo B. Amaral

Remembering is not a static process: When retrieved, a memory can be destabilized and become prone to modifications. This phenomenon has been demonstrated in a number of brain regions, but the neuronal mechanisms that rule memory destabilization and its boundary conditions remain elusive. Using two distinct computational models that combine Hebbian plasticity and synaptic downscaling, we show that homeostatic plasticity can function as a destabilization mechanism, accounting for behavioral results of protein synthesis inhibition upon reactivation with different re-exposure times. Furthermore, by performing systematic reviews, we identify a series of overlapping molecular mechanisms between memory destabilization and synaptic downscaling, although direct experimental links between both phenomena remain scarce. In light of these results, we propose a theoretical framework where memory destabilization can emerge as an epiphenomenon of homeostatic adaptations prompted by memory retrieval.
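The proposed interplay can be sketched with one synaptic vector subject to both rules during retrieval: homeostatic downscaling always acts, while the restabilizing Hebbian term can be switched off to mimic protein synthesis inhibition. The constants and the on/off switch are illustrative assumptions, not either of the authors' two models:

```python
def reactivate(w, pattern, hebb_on=True, eta=0.05, scale=0.05, steps=20):
    """Retrieval drives a restabilizing Hebbian term and a destabilizing
    homeostatic downscaling term on the same synapses."""
    for _ in range(steps):
        y = sum(wi * xi for wi, xi in zip(w, pattern))      # retrieval activity
        for i in range(len(w)):
            hebb = eta * y * pattern[i] if hebb_on else 0.0  # blocked by PSI when off
            w[i] += hebb - scale * w[i]                      # downscaling always acts
    return w
```

With the Hebbian term intact the memory trace survives reactivation; with it blocked, downscaling alone erodes the trace, mirroring the behavioral effect of protein synthesis inhibition after retrieval.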


PLoS Biology · 2021 · Vol 19 (6) · pp. e3001324
Author(s): Markus K. Klose, Paul J. Shaw

Circadian rhythms help animals synchronize motivated behaviors to match environmental demands. Recent evidence indicates that clock neurons influence the timing of behavior by differentially altering the activity of a distributed network of downstream neurons. Downstream circuits can be remodeled by Hebbian plasticity, synaptic scaling, and, under some circumstances, activity-dependent addition of cell surface receptors; the role of this receptor respecification phenomenon is not well studied. We demonstrate that high sleep pressure quickly reprograms the wake-promoting large ventrolateral clock neurons to express the pigment dispersing factor receptor (PDFR). The addition of this signaling input into the circuit is associated with increased waking and early mating success. The respecification of PDFR in both young and adult large ventrolateral neurons requires 2 dopamine (DA) receptors and activation of the transcriptional regulator nejire (cAMP response element-binding protein [CREB]). These data identify receptor respecification as an important mechanism to sculpt circuit function to match sleep levels with demand.


2021 · pp. 1-44
Author(s): David Lipshutz, Yanis Bahroun, Siavash Golkar, Anirvan M. Sengupta, Dmitri B. Chklovskii

Abstract: Cortical pyramidal neurons receive inputs from multiple distinct neural populations and integrate these inputs in separate dendritic compartments. We explore the possibility that cortical microcircuits implement canonical correlation analysis (CCA), an unsupervised learning method that projects the inputs onto a common subspace so as to maximize the correlations between the projections. To this end, we seek a multichannel CCA algorithm that can be implemented in a biologically plausible neural network. For biological plausibility, we require that the network operates in the online setting and its synaptic update rules are local. Starting from a novel CCA objective function, we derive an online optimization algorithm whose optimization steps can be implemented in a single-layer neural network with multicompartmental neurons and local non-Hebbian learning rules. We also derive an extension of our online CCA algorithm with adaptive output rank and output whitening. Interestingly, the extension maps onto a neural network whose neural architecture and synaptic updates resemble neural circuitry and non-Hebbian plasticity observed in the cortex.
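As a much-simplified illustration of the CCA objective (not the authors' multicompartmental algorithm), a single pair of projection vectors can be learned online: each view's weights are updated by its own input paired with the other view's projection, a cross-view and hence non-Hebbian signal, then normalized. The synthetic two-view data and all constants are assumptions:

```python
import random

def make_data(n=400, seed=4):
    rng = random.Random(seed)
    pairs = []
    for _ in range(n):
        s = rng.gauss(0, 1)  # shared latent driving both views
        x = [s + 0.3 * rng.gauss(0, 1), rng.gauss(0, 1)]   # view 1: latent in channel 0
        y = [rng.gauss(0, 1), s + 0.3 * rng.gauss(0, 1)]   # view 2: latent in channel 1
        pairs.append((x, y))
    return pairs

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def normalize(v):
    n = dot(v, v) ** 0.5
    return [vi / n for vi in v]

def online_cca(pairs, eta=0.02, epochs=5):
    wa, wb = [1.0, 1.0], [1.0, 1.0]
    for _ in range(epochs):
        for x, y in pairs:
            za, zb = dot(wa, x), dot(wb, y)
            wa = normalize([wi + eta * zb * xi for wi, xi in zip(wa, x)])  # uses other view's output
            wb = normalize([wi + eta * za * yi for wi, yi in zip(wb, y)])
    return wa, wb
```

For roughly whitened views this stochastic cross-update performs power iteration on the cross-covariance between the views, so the learned projections pick out the shared latent and become maximally correlated, which is the CCA objective in the rank-one case.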

