Young and Aged Neuronal Tissue Dynamics With a Simplified Neuronal Patch Cellular Automata Model

2022 ◽  
Vol 15 ◽  
Author(s):  
Reinier Xander A. Ramos ◽  
Jacqueline C. Dominguez ◽  
Johnrob Y. Bantang

Realistic single-cell neuronal dynamics are typically obtained by solving a set of differential equations similar to the Hodgkin-Huxley (HH) system. However, realistic simulations of neuronal tissue dynamics, especially at the organ level of the brain, can become intractable due to an explosion in the number of equations to be solved simultaneously. Consequently, modeling tissue- or organ-level systems requires long computation times and large computational resources. Here, we propose a cellular automata (CA) model as an efficient way of modeling a large number of neurons, reducing both the computation time and the memory requirement. First, a first-order approximation of the response function of each HH neuron is obtained and used as the response-curve automaton rule. We then consider a system in which an external input is applied to a few cells. We utilize a Moore neighborhood (both totalistic and outer-totalistic rules) for the CA system. The resulting steady-state dynamics of a two-dimensional (2D) neuronal patch of 1,024 × 1,024 cells can be classified into three classes: (1) Class 0 (inactive), (2) Class 1 (spiking), and (3) Class 2 (oscillatory). We also present results for different quasi-3D configurations built from the 2D lattice and show that this classification is robust. The numerical modeling approach can find applications in the analysis of neuronal dynamics at mesoscopic scales in the brain (patch or regional). The method is applied to compare the dynamical properties of young and aged populations of neurons. The aged population shows higher average steady-state activity 〈a(t → ∞)〉 than the younger population, and this activity is significantly amplified when the aged population is subjected to external input.
The result conforms to empirical data, with aged neurons exhibiting higher firing rates as well as firing activity when stimulated with lower external currents.
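The paper's actual response-curve rule is fitted from HH simulations, which are not reproduced here. A minimal sketch of the general scheme, assuming a binary totalistic Moore-neighborhood rule with a hypothetical activation threshold on a small periodic patch:

```python
import numpy as np

def step_totalistic(grid, threshold=3, active_state=1):
    """One update of a totalistic Moore-neighborhood CA.
    A cell activates when the summed state of its 3x3 Moore
    neighborhood (including itself) meets an assumed threshold."""
    padded = np.pad(grid, 1, mode="wrap")  # periodic boundaries
    # sum over the 3x3 neighborhood of every cell via shifted views
    neigh = sum(padded[i:i + grid.shape[0], j:j + grid.shape[1]]
                for i in range(3) for j in range(3))
    return (neigh >= threshold).astype(grid.dtype) * active_state

rng = np.random.default_rng(0)
patch = (rng.random((64, 64)) < 0.1).astype(int)  # sparse external input
for _ in range(50):
    patch = step_totalistic(patch)
activity = patch.mean()  # proxy for the steady-state activity <a(t -> inf)>
```

The 64 × 64 patch, the 10% input density, and the threshold are all illustrative; the paper's 1,024 × 1,024 lattice and multi-state (inactive/spiking/oscillatory) classification would require the fitted response curve.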

Entropy ◽  
2020 ◽  
Vol 22 (5) ◽  
pp. 552 ◽  
Author(s):  
Thomas Parr ◽  
Noor Sajid ◽  
Karl J. Friston

The segregation of neural processing into distinct streams has been interpreted by some as evidence in favour of a modular view of brain function. This implies a set of specialised ‘modules’, each of which performs a specific kind of computation in isolation from other brain systems, before sharing the result of this operation with other modules. In light of a modern understanding of stochastic non-equilibrium systems, like the brain, a simpler and more parsimonious explanation presents itself. Formulating the evolution of a non-equilibrium steady state system in terms of its density dynamics reveals that such systems appear on average to perform a gradient ascent on their steady state density. If this steady state implies a sufficiently sparse conditional independence structure, this endorses a mean-field dynamical formulation, which decomposes the density over all states in a system into the product of marginal probabilities for those states. This factorisation lends the system a modular appearance, in the sense that we can interpret the dynamics of each factor independently. However, the argument here is that it is factorisation, as opposed to modularisation, that gives rise to the functional anatomy of the brain or, indeed, any sentient system. In the following, we briefly overview mean-field theory and its applications to stochastic dynamical systems. We then unpack the consequences of this factorisation through simple numerical simulations and highlight the implications for neuronal message passing and the computational architecture of sentience.
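The factorisation at the heart of this argument can be made concrete in a few lines. The joint density below is hypothetical; the KL divergence quantifies exactly what the product-of-marginals (mean-field) approximation discards:

```python
import numpy as np

# Mean-field factorisation: approximate a joint density p(x, y) by the
# product of its marginals q(x)q(y).
p = np.array([[0.30, 0.10],
              [0.15, 0.45]])   # hypothetical joint over two binary states
qx = p.sum(axis=1)             # marginal over x
qy = p.sum(axis=0)             # marginal over y
q = np.outer(qx, qy)           # factorised (mean-field) approximation

# KL divergence D(p || q) measures what the factorisation throws away;
# it is zero exactly when x and y are independent under p.
kl = np.sum(p * np.log(p / q))
```

Here the two variables are correlated, so `kl` is strictly positive: treating the factors as independent "modules" is an approximation whose cost is the mutual information between the streams.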


1989 ◽  
Vol 1 (3) ◽  
pp. 201-222 ◽  
Author(s):  
Adam N. Mamelak ◽  
J. Allan Hobson

Bizarreness is a cognitive feature common to REM sleep dreams, which can be easily measured. Because bizarreness is highly specific to dreaming, we propose that it is most likely brought about by changes in neuronal activity that are specific to REM sleep. At the level of the dream plot, bizarreness can be defined as either discontinuity or incongruity. In addition, the dreamer's thoughts about the plot may be logically deficient. We propose that dream bizarreness is the cognitive concomitant of two kinds of changes in neuronal dynamics during REM sleep. One is the disinhibition of forebrain networks caused by the withdrawal of the modulatory influences of norepinephrine (NE) and serotonin (5HT) in REM sleep, secondary to cessation of firing of locus coeruleus and dorsal raphe neurons. This aminergic demodulation can be mathematically modeled as a shift toward increased error at the outputs from neural networks, and these errors might be represented cognitively as incongruities and/or discontinuities. We also consider the possibility that discontinuities are the cognitive concomitant of sudden bifurcations or “jumps” in the responses of forebrain neuronal networks. These bifurcations are caused by phasic discharge of pontogeniculooccipital (PGO) neurons during REM sleep, providing a source of cholinergic modulation to the forebrain which could evoke unpredictable network responses. When phasic PGO activity stops, the resultant activity in the brain may be wholly unrelated to patterns of activity dominant before such phasic stimulation began. Mathematically, such a sudden shift from one pattern of activity to a second, unrelated one is called a bifurcation. We propose that the neuronal bifurcations brought about by PGO activity might be represented cognitively as bizarre discontinuities of dream plot. We regard these proposals as preliminary attempts to model the relationship between dream cognition and REM sleep neurophysiology.
This neurophysiological model of dream bizarreness may also prove useful in understanding the contributions of REM sleep to the developmental and experiential plasticity of the cerebral cortex.
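The mathematical sense of "bifurcation" invoked above can be illustrated with the simplest textbook example, which has nothing to do with PGO activity specifically: in the pitchfork system dx/dt = rx - x³, a small change in the control parameter r qualitatively changes which states the system settles into.

```python
# Minimal bifurcation sketch (illustrative only, not a forebrain model):
# in dx/dt = r*x - x**3, x = 0 is the only stable equilibrium for r < 0,
# but for r > 0 it loses stability and new equilibria at +/-sqrt(r) appear,
# so a small parameter shift sends the system to a qualitatively new state.
def settle(r, x0=0.1, dt=0.01, steps=20000):
    """Integrate dx/dt = r*x - x**3 forward until it reaches its attractor."""
    x = x0
    for _ in range(steps):
        x += dt * (r * x - x**3)
    return x

below = settle(r=-0.5)   # relaxes to the single equilibrium x = 0
above = settle(r=+0.5)   # relaxes to the new equilibrium x = sqrt(0.5)
```

The analogy in the text is that phasic cholinergic modulation plays the role of the parameter r: crossing a critical value leaves the network in an activity pattern unrelated to the one before.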


1989 ◽  
Vol 479 (1) ◽  
pp. 162-166 ◽  
Author(s):  
Catherine A. Sei ◽  
Robert Richard ◽  
Robert M. Dores

2010 ◽  
Vol 3 (6) ◽  
pp. 1555-1568 ◽  
Author(s):  
B. Mijling ◽  
O. N. E. Tuinder ◽  
R. F. van Oss ◽  
R. J. van der A

Abstract. The Ozone Profile Algorithm (OPERA), developed at KNMI, retrieves the vertical ozone distribution from nadir spectral satellite measurements of backscattered sunlight in the ultraviolet and visible wavelength range. To produce consistent global datasets the algorithm needs good global performance, while a short computation time facilitates its use in near-real-time applications. To test the global performance of the algorithm we use convergence behaviour as a diagnostic tool for the ozone profile retrievals from the GOME instrument (on board ERS-2) for February and October 1998. In this way, we uncover different classes of retrieval problems, related to the South Atlantic Anomaly, low cloud fractions over deserts, desert dust outflow over the ocean, and the intertropical convergence zone. The influence of the first guess and the external input data, including the ozone cross-sections and the ozone climatologies, on the retrieval performance is also investigated. By using a priori ozone profiles selected on the expected total ozone column, retrieval problems due to anomalous ozone distributions (such as in the ozone hole) can be avoided. By applying these algorithm adaptations the convergence statistics improve considerably, not only increasing the number of successful retrievals but also reducing the average computation time, owing to fewer iteration steps per retrieval. For February 1998, non-convergence was brought down from 10.7% to 2.1%, while the mean number of iteration steps (which dominates the computation time) dropped 26%, from 5.11 to 3.79.
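The quoted improvement figures are self-consistent, which a two-line check confirms:

```python
# Checking the convergence statistics quoted for February 1998.
nonconv_before, nonconv_after = 10.7, 2.1   # % of retrievals not converging
iters_before, iters_after = 5.11, 3.79      # mean iteration steps

drop = (iters_before - iters_after) / iters_before   # fractional reduction
improvement = nonconv_before / nonconv_after         # ratio of failure rates
print(f"iteration steps dropped by {drop:.0%}; "
      f"non-convergence fell by a factor of {improvement:.1f}")
```

Since per-retrieval cost is dominated by the iteration count, the 26% drop in mean iterations translates directly into the reported computation-time saving.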


2019 ◽  
Vol 3 (1) ◽  
pp. 26 ◽  
Author(s):  
Vishnu Sidaarth Suresh

Load flow studies are carried out to find a steady-state solution of a power system network. They are used to continuously monitor the system and to decide upon its future expansion. The parameters monitored are voltage magnitude, voltage angle, and active and reactive power. This paper presents techniques for obtaining these parameters for the standard IEEE 30-bus and IEEE 57-bus networks and compares the solvers with regard to computational time and effectiveness.
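The paper does not list its solvers here, but the structure of any load flow calculation can be sketched with the classic Gauss-Seidel iteration on a toy two-bus system (one slack bus, one PQ load bus); all impedance and load values below are assumed for illustration:

```python
import numpy as np

# Gauss-Seidel load flow for a hypothetical 2-bus system:
# bus 1 is the slack bus (1.0 p.u., 0 deg), bus 2 is a PQ load bus.
z_line = 0.02 + 0.06j                  # line impedance in p.u. (assumed)
y = 1.0 / z_line
Y = np.array([[y, -y], [-y, y]])       # bus admittance matrix
S2 = -(0.5 + 0.2j)                     # injected power at bus 2 (a load), p.u.

V = np.array([1.0 + 0j, 1.0 + 0j])     # flat start
for _ in range(100):
    # V_i = (conj(S_i / V_i) - sum_{j != i} Y_ij V_j) / Y_ii
    V[1] = (np.conj(S2 / V[1]) - Y[1, 0] * V[0]) / Y[1, 1]

v_mag = abs(V[1])                      # voltage magnitude at the load bus
v_ang = np.degrees(np.angle(V[1]))     # voltage angle in degrees
```

For networks the size of the IEEE 30- and 57-bus cases, Newton-Raphson variants are usually preferred over Gauss-Seidel because of their quadratic convergence; the iteration above is chosen only for brevity.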


2021 ◽  
Author(s):  
Ge Zhang ◽  
Yan Cui ◽  
Yangsong Zhang ◽  
Hefei Cao ◽  
Guanyu Zhou ◽  
...  

Abstract. Periodic visual stimulation can induce stable steady-state visual evoked potentials (SSVEPs) distributed in multiple brain regions and has potential applications in both neural engineering and cognitive neuroscience. However, the underlying dynamic mechanisms of SSVEPs at the whole-brain level are still not completely understood. Here, we addressed this issue by simulating the rich dynamics of SSVEPs with a large-scale brain model designed with constraints of neuroimaging data acquired from the human brain. By eliciting activity of the occipital areas using an external periodic stimulus, our model was capable of replicating both the spatial distributions and response features of SSVEPs that were observed in experiments. In particular, we confirmed that alpha-band (8-12 Hz) stimulation could evoke stronger SSVEP responses; this frequency sensitivity was due to nonlinear resonance and could be modulated by endogenous factors in the brain. Interestingly, the stimulus-evoked brain networks also exhibited significant superiority in topological properties near this frequency-sensitivity range, and stronger SSVEP responses were demonstrated to be supported by more efficient functional connectivity at the neural activity level. These findings not only provide insights into the mechanistic understanding of SSVEPs at the whole-brain level but also indicate a bright future for large-scale brain modeling in characterizing the complicated dynamics and functions of the brain.
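The resonance intuition behind the alpha-band sensitivity can be conveyed by a much simpler system than the authors' neuroimaging-constrained model: a single driven, damped linear oscillator whose intrinsic frequency is set near 10 Hz. All parameters here are illustrative, not fitted to brain data:

```python
import numpy as np

# Toy resonance sketch: the steady-state amplitude of a driven, damped
# oscillator x'' + 2*zeta*w0*x' + w0^2*x = cos(w*t) peaks when the drive
# frequency approaches the intrinsic (alpha-like) frequency f0.
f0, zeta = 10.0, 0.2                   # natural frequency (Hz), damping ratio

def steady_state_amplitude(f_drive):
    """Closed-form response amplitude at drive frequency f_drive (Hz)."""
    w0, w = 2 * np.pi * f0, 2 * np.pi * f_drive
    return 1.0 / np.sqrt((w0**2 - w**2)**2 + (2 * zeta * w0 * w)**2)

freqs = np.arange(4, 30, 0.5)          # sweep of stimulation frequencies
amps = np.array([steady_state_amplitude(f) for f in freqs])
peak_f = freqs[np.argmax(amps)]        # strongest response near 8-12 Hz
```

The paper's point is stronger than this linear analogue (the brain model is nonlinear and the sensitivity is modulated by endogenous factors), but the frequency sweep and amplitude peak are the same experimental logic.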


1997 ◽  
Vol 7 (6) ◽  
pp. 441-451
Author(s):  
J. Kröller ◽  
F. Behrens ◽  
V.V. Marlinsky

Experiments in two awake untrained squirrel monkeys were performed to study the velocity storage mechanism during fast rise of OKN slow phase velocity. This was done by testing the monkeys’ capability to perform OKN in response to a stationary-appearing, stroboscopically illuminated stripe pattern of a horizontally rotating drum. Nystagmus was initially elicited during constant illumination lasting between 0.6 and 25 s. The periodicity of the stripe pattern was 2.37°. When after the constant light the flash illumination was switched on again, two types of behavior could occur, depending on the length of the constant light interval (CLI): 1) When the CLI was shorter than a threshold value of 6.2 seconds, the OKN ceased under the flash stimulation. Then a “post-OKN” occurred that increased with the length of the CLIs, indicating that the intermittently illuminated pattern did not provoke fixation suppression of OKN aftereffects. 2) When the CLI was above threshold, the OKN continued under the flash light: it will be called “apparent movement OKN.” The threshold CLI between the type 1 and the type 2 response did not depend on drum velocities between 21.5°/s and 71.3°/s. The average gain of the apparent movement OKN was 0.83 ± 0.04; gain and stability of slow phase eye movement velocity did not deviate systematically from the usually elicited OKN. OKAN after apparent movement OKN did not deviate from OKAN after constantly illuminated moving patterns. In response to OKN initiation by a constantly illuminated pattern up to pattern velocities of 100°/s, the OKN steady-state gain was reached within the first 2 or 3 nystagmus beats. We ascribe the increase of the post-OKN with CLI and the existence of a threshold constant light interval to activity accumulation in the common velocity-to-position integrator (velocity storage) of the brain stem. Loading of the velocity storage takes place after the OKN gain has already reached the steady-state value.
Apparent movement OKN could also be elicited in guinea pigs that lack an effective smooth pursuit system. We suggest that apparent movement OKN is produced by mechanisms located in the brain stem.


1974 ◽  
Vol 12 (2) ◽  
pp. 6-8

Techniques are now available for estimating the plasma concentration of several drugs used in psychiatry. These techniques are clearly important for research but they can hardly be expected to improve the clinical management of patients unless the estimation is sensitive, reliable and reasonably quick; the method should be specific for the particular drug but should also specifically estimate any active metabolites. Even when reliable figures have been obtained, much more information is needed before they can be interpreted. The relationship between plasma (or plasma water) concentration and relevant tissue concentration (e.g., in the brain) must be known. Plasma samples should be taken at appropriate times, e.g., after the attainment of ‘steady-state’ conditions: plasma and tissue levels will then be in equilibrium. Diagnoses must be soundly based if inferences are to be drawn. Reliable methods of assessing clinical response must be available. These requirements pose difficult problems in psychiatry.
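The ‘steady-state’ condition mentioned above has a simple kinetic basis: under repeated dosing with first-order elimination, trough concentrations accumulate toward a plateau over roughly four to five half-lives. The one-compartment sketch below uses purely illustrative, drug-nonspecific numbers:

```python
import numpy as np

# Toy one-compartment kinetics: repeated dosing with first-order elimination.
half_life = 24.0                       # elimination half-life in hours (assumed)
tau = 12.0                             # dosing interval in hours (assumed)
dose = 1.0                             # concentration added per dose (arbitrary units)

k = np.log(2) / half_life              # first-order elimination rate constant
level, troughs = 0.0, []
for _ in range(20):                    # 20 successive doses
    level = (level + dose) * np.exp(-k * tau)   # decay until the next dose
    troughs.append(level)

# Analytic steady-state trough from the fixed point of the recurrence:
c_ss = dose * np.exp(-k * tau) / (1 - np.exp(-k * tau))
```

Sampling before this plateau is reached gives figures that understate the eventual exposure, which is one reason the timing caveat in the passage matters.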


Author(s):  
Subrata Dasgupta

At first blush, computing and biology seem an odd couple, yet they formed a liaison of sorts from the very first years of the electronic digital computer. Following a seminal paper published in 1943 by neurophysiologist Warren McCulloch and mathematical logician Walter Pitts on a mathematical model of neuronal activity, John von Neumann of the Institute for Advanced Study, Princeton, presented at a symposium in 1948 a paper that compared the behaviors of computer circuits and neuronal circuits in the brain. The resulting publication was the fountainhead of what came to be called cellular automata in the 1960s. Von Neumann’s insight was the parallel between the abstraction of biological neurons (nerve cells) as natural binary (on–off) switches and the abstraction of physical computer circuit elements (at the time, relays and vacuum tubes) as artificial binary switches. His ambition was to unify the two and construct a formal universal theory. One remarkable aspect of von Neumann’s program was inspired by the biology: His universal automata must be able to self-reproduce. So his neuron-like automata must be both computational and constructive. In 1955, invited by Yale University to deliver the Silliman Lectures for 1956, von Neumann chose as his topic the relationship between the computer and the brain. He died before being able to deliver the lectures, but the unfinished manuscript was published by Yale University Press under the title The Computer and the Brain (1958). Von Neumann’s definitive writings on self-reproducing cellular automata, edited by his one-time collaborator Arthur Burks of the University of Michigan, were eventually published in 1966 as the book Theory of Self-Reproducing Automata. A possible structure of a von Neumann–style cellular automaton is depicted in Figure 7.1. It comprises a (finite or infinite) configuration of cells in which a cell can be in one of a finite set of states.
The state of a cell at any time t is determined by its own state and those of its immediate neighbors at the preceding point in time, t − 1, according to a state transition rule.
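The most familiar concrete instance of such a transition rule is Conway's Game of Life, a two-state Moore-neighborhood automaton. It is far simpler than von Neumann's own self-reproducing construction (which used 29 states), but it follows exactly the scheme just described: each cell's state at time t is a function of its own state and its eight neighbors' states at t − 1.

```python
import numpy as np

def life_step(grid):
    """One transition of Conway's Game of Life on a 0/1 integer grid.
    Cells outside the array are treated as dead (zero padding)."""
    padded = np.pad(grid, 1)
    # count the eight live neighbours of every cell via shifted views
    neighbours = sum(padded[i:i + grid.shape[0], j:j + grid.shape[1]]
                     for i in range(3) for j in range(3)) - grid
    survive = grid.astype(bool) & ((neighbours == 2) | (neighbours == 3))
    born = (~grid.astype(bool)) & (neighbours == 3)
    return (survive | born).astype(int)

# A 'blinker' (three live cells in a row) oscillates with period 2.
blinker = np.zeros((5, 5), dtype=int)
blinker[2, 1:4] = 1
```

Despite the rule's simplicity, the Game of Life is computation-universal, which echoes von Neumann's point that neuron-like automata can be both computational and constructive.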

