Self-Organization of Local Cortical Circuits and Cortical Orientation Maps: A Nonlinear Hebbian Model of the Visual Cortex with Adaptive Lateral Couplings

2001 ◽  
Vol 56 (5-6) ◽  
pp. 464-478 ◽  
Author(s):  
Thomas Burger ◽  
Wolfgang Lang

A nonlinear, recurrent neural network model of the visual cortex is presented. Orientation maps emerge from adaptable afferent couplings as well as plastic local intracortical circuits driven by random input stimuli. Lateral coupling structures self-organize into difference-of-Gaussians (DOG) profiles under the influence of pronounced emerging cortical activity blobs. The model's simplified architecture and features are designed to closely mimic neurobiological findings.
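The DOG (difference-of-Gaussians) profile mentioned above combines short-range excitation with broader inhibition. A minimal sketch of such a lateral coupling kernel, with illustrative amplitudes and widths (the paper's actual parameters are not given here):

```python
import numpy as np

def dog_profile(d, sigma_e=1.0, sigma_i=3.0, a_e=1.0, a_i=0.5):
    """Difference-of-Gaussians lateral coupling as a function of
    cortical distance d: narrow excitatory Gaussian minus a wider,
    weaker inhibitory Gaussian ("Mexican hat" shape)."""
    excitation = a_e * np.exp(-d**2 / (2 * sigma_e**2))
    inhibition = a_i * np.exp(-d**2 / (2 * sigma_i**2))
    return excitation - inhibition

d = np.linspace(0.0, 10.0, 101)
w = dog_profile(d)
# nearby units receive net excitation; intermediate distances net inhibition
```

With these example parameters, the coupling is positive at zero distance and turns negative at intermediate distances, which is the qualitative structure the emerging activity blobs imprint on the lateral weights.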

2000 ◽  
Vol 55 (3-4) ◽  
pp. 282-291
Author(s):  
Christoph Bauer ◽  
Thomas Burger ◽  
Martin Stetter ◽  
Elmar W. Lang

Abstract A neural network model with incremental Hebbian learning of afferent and lateral synaptic couplings is proposed, which simulates the activity-dependent self-organization of grating cells in the upper layers of striate cortex. These cells, found in areas V1 and V2 of the visual cortex of monkeys, respond vigorously and exclusively to bar gratings of a preferred orientation and periodicity. Their response behavior to varying contrast and to an increasing number of bars in the grating shows threshold and saturation effects. Their location with respect to the underlying orientation map and their nonlinear response behavior are investigated. The number of emerging grating cells is controlled in the model by the range and strength of the lateral coupling structure.
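The incremental Hebbian learning the abstract refers to strengthens couplings in proportion to correlated pre- and postsynaptic activity. A minimal sketch of one such update with multiplicative normalization (a common stabilization; the paper's exact rule, activity dynamics, and parameters may differ):

```python
import numpy as np

rng = np.random.default_rng(0)

def hebbian_step(W, x, eta=0.01):
    """One incremental Hebbian update of afferent weights W given an
    input pattern x, followed by row normalization to keep weights bounded."""
    y = W @ x                          # postsynaptic activity
    W = W + eta * np.outer(y, x)       # Hebbian correlation term
    W = W / np.linalg.norm(W, axis=1, keepdims=True)
    return W

# 4 cortical units, 16 afferent inputs, driven by random stimuli
W = rng.standard_normal((4, 16))
W = W / np.linalg.norm(W, axis=1, keepdims=True)
for _ in range(100):
    x = rng.standard_normal(16)
    W = hebbian_step(W, x)
```

Without the normalization step, pure Hebbian growth diverges; bounding the weight norm is what lets structure (here, selectivity to recurring input correlations) emerge instead.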


2017 ◽  
Author(s):  
Michelle J Wu ◽  
Johan OL Andreasson ◽  
Wipapat Kladwang ◽  
William J Greenleaf ◽  
Rhiju Das ◽  
...  

Abstract RNA is a functionally versatile molecule that plays key roles in genetic regulation and in emerging technologies to control biological processes. Computational models of RNA secondary structure are well developed but often fall short in making quantitative predictions of the behavior of multi-RNA complexes. Recently, large datasets characterizing hundreds of thousands of individual RNA complexes have emerged as rich sources of information about RNA energetics. Meanwhile, advances in machine learning have enabled the training of complex neural networks on large datasets. Here, we assess whether a recurrent neural network model, Ribonet, can learn from high-throughput binding data, using simulation and experimental studies to test model accuracy and to determine whether it learned meaningful information about the biophysics of RNA folding. We began by evaluating the model on energetic values predicted by the Turner model to assess whether the neural network could learn a representation that recovered known biophysical principles. First, we trained Ribonet to predict the simulated free energy of an RNA in complex with multiple input RNAs. Our model accurately predicts free energies of new sequences and also shows evidence of having learned base-pairing information, as assessed by in silico double mutant analysis. Next, we extended this model to predict the simulated affinity between an arbitrary RNA sequence and a reporter RNA. While these more indirect measurements precluded the learning of basic principles of RNA biophysics, the resulting model achieved sub-kcal/mol accuracy and enabled the design of simple RNA-input-responsive riboswitches with high activation ratios predicted by the Turner model from which the training data were generated. Finally, we compiled and trained on an experimental dataset comprising over 600,000 affinity measurements published on the Eterna open laboratory.
Though our tests revealed that the model likely did not learn a physically realistic representation of RNA interactions, it nevertheless achieved good test-set performance of 0.76 kcal/mol with the application of transfer learning and novel sequence-specific data augmentation strategies. These results suggest that recurrent neural network architectures, despite being naïve to the physics of RNA folding, have the potential to capture complex biophysical information. However, more diverse datasets, ideally involving more direct free energy measurements, may be necessary to train de novo predictive models that are consistent with the fundamentals of RNA biophysics.

Author Summary The precise design of RNA interactions is essential to gaining greater control over RNA-based biotechnology tools, including designer riboswitches and CRISPR-Cas9 gene editing. However, the classic model for the energetics governing these interactions fails to quantitatively predict the behavior of RNA molecules. We developed a recurrent neural network model, Ribonet, to quantitatively predict these values from sequence alone. Using simulated data, we show that this model is able to learn simple base-pairing rules, despite having no a priori knowledge about RNA folding encoded in the network architecture. This model also enables the design of new switching RNAs that are predicted to be effective by the "ground truth" simulated model. We applied transfer learning to retrain Ribonet using hundreds of thousands of RNA-RNA affinity measurements and demonstrate simple data augmentation techniques that improve model performance. At the same time, the limited diversity of currently available data constrains Ribonet's accuracy. Recurrent neural networks are a promising tool for modeling nucleic acid biophysics and may enable the design of complex RNAs for novel applications.
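The core idea behind a model like Ribonet, a recurrent network that reads an RNA sequence nucleotide by nucleotide and regresses a scalar energy, can be sketched as follows. The architecture, sizes, and random weights here are purely illustrative and not the published model:

```python
import numpy as np

rng = np.random.default_rng(1)
NUC = {"A": 0, "C": 1, "G": 2, "U": 3}

def one_hot(seq):
    """One-hot encode an RNA sequence as a (length, 4) array."""
    x = np.zeros((len(seq), 4))
    for i, n in enumerate(seq):
        x[i, NUC[n]] = 1.0
    return x

class TinyRNN:
    """Toy recurrent regressor: consume the sequence step by step,
    then map the final hidden state to a scalar prediction."""
    def __init__(self, hidden=8):
        self.Wx = rng.standard_normal((hidden, 4)) * 0.1
        self.Wh = rng.standard_normal((hidden, hidden)) * 0.1
        self.w_out = rng.standard_normal(hidden) * 0.1

    def predict(self, seq):
        h = np.zeros(self.Wx.shape[0])
        for x_t in one_hot(seq):
            h = np.tanh(self.Wx @ x_t + self.Wh @ h)
        return float(self.w_out @ h)   # predicted energy (arbitrary units)

model = TinyRNN()
dg = model.predict("GGGAAACCC")
```

An untrained network like this outputs arbitrary values; the point of the abstract is that fitting such a sequence-to-scalar architecture to large free-energy or affinity datasets can recover base-pairing structure without any folding rules being built in.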


2021 ◽  
Vol 193 (12) ◽  
Author(s):  
Salar Valizadeh Moghadam ◽  
Ahmad Sharafati ◽  
Hajar Feizi ◽  
Seyed Mohammad Saeid Marjaie ◽  
Seyed Babak Haji Seyed Asadollah ◽  
...  

Author(s):  
C. Fernando Mugarra Gonzalez ◽  
Stanisław Jankowski ◽  
Jacek J. Dusza ◽  
Vicente Carrilero López ◽  
Javier M. Duart Clemente
