Predicting Synaptic Connectivity for Large-Scale Microcircuit Simulations Using Snudda

2021 ◽  
Author(s):  
J. J. Johannes Hjorth ◽  
Jeanette Hellgren Kotaleski ◽  
Alexander Kozlov

Abstract: Simulation of large-scale networks of neurons is an important approach to understanding and interpreting experimental data from healthy and diseased brains. Owing to the rapid development of simulation software and the accumulation of quantitative data on different neuronal types, it is possible to predict both computational and dynamical properties of local microcircuits in a ‘bottom-up’ manner. Simulated data from these models can be compared with experiments and with ‘top-down’ modelling approaches, successively bridging the scales. Here we describe an open-source pipeline, built around the software Snudda, for predicting microcircuit connectivity and for setting up simulations in the NEURON simulation environment in a reproducible way. We also illustrate how to further ‘curate’ single-neuron morphologies acquired from public databases. This model-building pipeline was used to set up a first version of a full-scale cellular-level model of the mouse dorsal striatum. Model components from that work are used here to illustrate the steps needed when modelling subcortical nuclei such as the basal ganglia.
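To make the pipeline concrete, the sketch below drives the stages described above through the Snudda command-line interface; the subcommand names follow the workflow in the paper, but the `--size` flag, the network path, and the exact invocation details are assumptions to be checked against the Snudda documentation.

```python
# Minimal sketch of the Snudda model-building pipeline, assuming the
# command-line interface described in the paper. The --size flag and the
# network path are assumptions; consult the Snudda documentation.
import subprocess

network_path = "networks/striatum_demo"  # hypothetical output directory

steps = [
    ["snudda", "init", network_path, "--size", "1000"],  # define volume and neuron composition
    ["snudda", "place", network_path],                   # place somas and morphologies in the volume
    ["snudda", "detect", network_path],                  # detect putative touch-based synapses
    ["snudda", "prune", network_path],                   # prune touches to match pairwise connectivity data
    ["snudda", "input", network_path],                   # generate external synaptic input
    ["snudda", "simulate", network_path],                # run the network in NEURON
]

for step in steps:
    subprocess.run(step, check=True)  # stop the pipeline if any stage fails
```

Each stage writes its output into the network directory, so individual stages can be rerun while connectivity rules are tuned.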


2018 ◽  
Vol 170 ◽  
pp. 08003
Author(s):  
L. Berge ◽  
N. Estre ◽  
D. Tisseur ◽  
E. Payan ◽  
D. Eck ◽  
...  

The future PLINIUS-2 platform of CEA Cadarache will be dedicated to the study of corium interactions in severe nuclear accidents and will host innovative large-scale experiments. The Nuclear Measurement Laboratory of CEA Cadarache is in charge of the real-time high-energy X-ray imaging set-ups for the study of the corium-water and corium-sodium interactions and of the corium stratification process. Imaging such large, high-density objects requires a 15 MeV linear electron accelerator coupled to a tungsten target creating a high-energy Bremsstrahlung X-ray flux, with a corresponding dose rate of about 100 Gy/min at 1 m. The signal is detected by phosphor screens coupled to high-framerate scientific CMOS cameras. The imaging set-up is designed using experimentally validated in-house simulation software (MODHERATO). The code computes quantitative radiographic signals from a description of the source, the object geometry and composition, the detector, and the geometrical configuration (magnification factor, etc.). It accounts for several noise sources (photonic and electronic noise, Swank noise, and readout noise) and for image blur due to the source spot size and the detector unsharpness. With a view to PLINIUS-2, the simulation has been improved to account for the scattered flux, which is expected to be significant. The paper presents the scattered-flux calculation using the MCNP transport code and its integration into the MODHERATO simulation. The improved simulation is then validated against measured images taken on a small-scale equivalent set-up on the PLINIUS platform, with excellent agreement. The improved simulation is therefore being used to design the PLINIUS-2 imaging set-ups (source, detectors, cameras, etc.).
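To illustrate the kind of forward model such a code evaluates, a toy sketch follows: a Beer-Lambert attenuated primary signal, a flat scattered-flux background, and Poisson photon noise. All numerical values are placeholders for illustration, not MODHERATO or MCNP parameters.

```python
# Toy radiographic forward model: attenuated primary flux + scattered
# background + photonic (Poisson) noise. All values are illustrative
# placeholders, not MODHERATO or MCNP parameters.
import numpy as np

rng = np.random.default_rng(0)

thickness_cm = np.linspace(0.0, 20.0, 256)   # object thickness along one detector row
mu = 0.5                                     # effective attenuation coefficient (1/cm), assumed
i0 = 1e4                                     # incident photons per pixel, assumed
scatter_fraction = 0.2                       # detected-flux fraction due to scatter, assumed

primary = i0 * np.exp(-mu * thickness_cm)    # Beer-Lambert attenuation of the direct beam
scatter = scatter_fraction * primary.mean()  # crude flat scatter background (MCNP would refine this)
expected = primary + scatter

signal = rng.poisson(expected).astype(float)  # photonic noise on the detected counts
snr = expected / np.sqrt(expected)            # Poisson-limited signal-to-noise ratio
print(f"minimum SNR across the profile: {snr.min():.1f}")
```

In the real set-up the scattered term is a spatially varying flux computed with MCNP rather than a flat background, with electronic, Swank, and readout noise and blur terms added on top.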


Agriculture ◽  
2020 ◽  
Vol 10 (12) ◽  
pp. 614
Author(s):  
Jiuliang Xu ◽  
Zhihua Zhang ◽  
Xian Zhang ◽  
Muhammad Ishfaq ◽  
Jiahui Zhong ◽  
...  

China feeds approximately 22% of the global population with only 7% of the global arable land, thanks to its remarkable success in intensive agriculture. This outstanding achievement is partially overshadowed by agriculture-related large-scale environmental pollution across the nation. To ensure nutrition security and environmental sustainability, China proposed the Green Food Strategy in the 1990s and set up a specialized management agency, the China Green Food Development Center, with a monitoring network for policy and standard creation, brand authorization, and product inspection. Following its 140 environmental and operational standards, 15,984 green food companies provided 36,345 kinds of products in 2019. The cultivation area and annual domestic sales (CNY 465.7 billion) of green food accounted for 8.2% of the total farmland area and 9.7% of the gross domestic product (GDP) from agriculture in China, respectively. Herein, we systematically reviewed the regulation, standards, and authorization system of green food and its current advances in China, and then outlined its environmental benefits, challenges, and probable strategies for future optimization and upscaling. The rapid development of the green food industry in China suggests an applicable triple-win strategy for protecting the environment, promoting agroeconomic development, and improving human nutrition and health in other developing countries or regions.


Entropy ◽  
2020 ◽  
Vol 22 (12) ◽  
pp. 1383
Author(s):  
Jinfang Sheng ◽  
Cheng Liu ◽  
Long Chen ◽  
Bin Wang ◽  
Junkai Zhang

With the rapid development of computer technology, research on complex networks has attracted increasing attention. Highly active research directions such as cloud computing, big data, the Internet of Vehicles, and distributed systems are all grounded in complex networks. Community detection is an important and meaningful research hotspot in complex networks, and dividing a network into communities quickly and accurately, especially at large scale, is a difficult task. In this paper, we put forward a new community detection approach based on internode attraction, named IACD. Starting from the important nodes of the network, the algorithm borrows the gravitational relationship between two objects in physics to represent the forces between nodes in the network dataset, and then performs community detection. Experiments on a large number of real-world datasets and synthetic networks show that the IACD algorithm can divide the community structure quickly and accurately, and that it is superior to some classic algorithms and recently proposed algorithms.
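A toy version of the gravitational analogy is sketched below: each non-hub node joins the high-degree node that attracts it most, with attraction defined as the degree product divided by the squared shortest-path distance. Both the formula and the greedy assignment are illustrative assumptions, not the published IACD algorithm.

```python
# Toy community detection by internode "gravity":
# attraction(u, v) = deg(u) * deg(v) / d(u, v)^2.
# The formula and the greedy assignment are illustrative assumptions,
# not the published IACD algorithm.
import networkx as nx

G = nx.karate_club_graph()
deg = dict(G.degree())
hubs = sorted(deg, key=deg.get, reverse=True)[:2]      # two highest-degree nodes as seeds
dist = dict(nx.all_pairs_shortest_path_length(G))      # shortest-path distances

def attraction(u, v):
    d = dist[u].get(v)
    return 0.0 if not d else deg[u] * deg[v] / d ** 2  # unreachable nodes exert no pull

communities = {h: {h} for h in hubs}
for node in G.nodes:
    if node not in hubs:
        best = max(hubs, key=lambda h: attraction(node, h))  # join the hub that pulls hardest
        communities[best].add(node)

for hub, members in communities.items():
    print(f"hub {hub}: {sorted(members)}")
```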


2020 ◽  
Vol 8 (12) ◽  
pp. 1889
Author(s):  
Annie Vera Hunnestad ◽  
Anne Ilse Maria Vogel ◽  
Evelyn Armstrong ◽  
Maria Guadalupe Digernes ◽  
Murat Van Ardelan ◽  
...  

Iron is an essential, yet scarce, nutrient in marine environments. Phytoplankton, and especially cyanobacteria, have developed a wide range of mechanisms to acquire iron and maintain their iron-rich photosynthetic machinery. Iron limitation studies often utilize either oceanographic methods to understand large-scale processes, or laboratory-based molecular experiments to identify the underlying molecular mechanisms at the cellular level. Here, we aim to highlight the benefits of both approaches to encourage an interdisciplinary understanding of the effects of iron limitation on cyanobacteria, with a focus on avoiding pitfalls in the initial phases of collaboration. In particular, we discuss the use of trace-metal-clean methods in combination with sterile techniques, and the challenges faced when a new collaboration is set up to combine interdisciplinary techniques. Methods necessary for producing reliable data, such as High Resolution Inductively Coupled Plasma Mass Spectrometry (HR-ICP-MS), Flow Injection Analysis Chemiluminescence (FIA-CL), and 77 K fluorescence emission spectroscopy, are discussed and evaluated, and a technical manual is included, covering the preparation of the artificial seawater medium Aquil, cleaning procedures, and a sampling scheme for an iron limitation experiment. This paper provides a reference point for researchers implementing different techniques in interdisciplinary iron studies that span cyanobacteria physiology, molecular biology, and biogeochemistry.


PLoS ONE ◽  
2020 ◽  
Vol 15 (12) ◽  
pp. e0244820
Author(s):  
Thibault Chabin ◽  
Damien Gabriel ◽  
Emmanuel Haffen ◽  
Thierry Moulin ◽  
Lionel Pazart

Since the beginning of the 20th century, electroencephalography (EEG) has been used in a wide variety of applications, both for medical needs and for the study of various cerebral processes. With the rapid development of the technique, more and more precise and advanced tools have emerged for research purposes. However, the main constraints of these devices have often been their high price and, for some devices, low transportability and long set-up times. A broad range of wireless EEG devices without these constraints, but with lower signal quality, have nevertheless emerged on the market. The development of EEG recording on multiple participants simultaneously, together with new technological solutions, provides further possibilities for understanding the cerebral emotional dynamics of a group. A great number of studies have compared and tested many mobile devices but have provided contradictory results. It is therefore important to test the reliability of a specific wireless device in a specific research context before developing a large-scale study. The aim of this study was to assess the reliability of two wireless devices (g.tec Nautilus SAHARA electrodes and Emotiv Epoc +) for the detection of musical emotions, in comparison with a gold-standard EEG device. Sixteen participants reported feeling emotional pleasure (from low pleasure up to musical chills) when listening to their favorite chill-inducing musical excerpts. In terms of emotion detection, our results show statistically significant concordance between the Epoc + and the gold-standard device in the left prefrontal and left temporal areas in the alpha frequency band. We thus validated the use of the Emotiv Epoc + for research into musical emotion. We did not find any significant concordance between the g.tec device and the gold standard. This suggests that the Emotiv Epoc + is more appropriate for investigating musical emotion in natural settings.
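As a sketch of the kind of analysis such a comparison involves, the snippet below computes alpha-band power per epoch for two simultaneously recorded signals and correlates them across epochs; the synthetic data, sampling rate, band limits, and plain Pearson correlation are illustrative assumptions, not the study's actual recording or statistical procedure.

```python
# Sketch of a device-concordance check: per-epoch alpha-band (8-13 Hz)
# power from two devices, then their correlation across epochs. The
# synthetic data, sampling rate, and plain Pearson correlation are
# illustrative assumptions, not the study's actual procedure.
import numpy as np
from scipy.signal import welch
from scipy.stats import pearsonr

fs = 256                                  # sampling rate in Hz (assumed)
rng = np.random.default_rng(1)
n_epochs, epoch_len = 40, 4 * fs          # forty 4-second epochs

def alpha_power(epochs, fs):
    """Mean spectral power in the 8-13 Hz band, one value per epoch."""
    f, psd = welch(epochs, fs=fs, nperseg=fs, axis=-1)
    band = (f >= 8) & (f <= 13)
    return psd[:, band].mean(axis=-1)

# Synthetic stand-ins for epoched recordings from the two devices
reference = rng.standard_normal((n_epochs, epoch_len))
wireless = reference + 0.5 * rng.standard_normal((n_epochs, epoch_len))  # noisier copy

r, p = pearsonr(alpha_power(reference, fs), alpha_power(wireless, fs))
print(f"alpha-power concordance across epochs: r = {r:.2f}, p = {p:.3f}")
```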


2014 ◽  
Vol 926-930 ◽  
pp. 1361-1364
Author(s):  
Ye Tian ◽  
Li Chen Gu

As heavy machinery grows in scale, the performance requirements on hydraulic systems keep rising. Against the background of a mechatronic hydraulic test platform, this article takes a variable-pump-controlled motor speed-regulation system as its research object, describes the control principle of the system, and establishes its mathematical model. A corresponding dynamic simulation model is set up in the simulation software AMESim, and the motor speed response is simulated while varying the input pump rotation speed and the system load. The simulation results agree with the expected trends, indicating that the pump-controlled motor speed-regulation system is feasible as a speed and power transfer mechanism between the power source and the load.
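A minimal lumped-parameter stand-in for such a model is sketched below: the motor speed follows the commanded pump speed as a first-order lag, and a load step pulls it down. The gain, time constant, and load effect are invented for illustration and do not reproduce the AMESim model.

```python
# First-order lumped-parameter stand-in for the pump-controlled motor
# speed loop: motor speed lags the commanded pump speed, and a load step
# pulls it down. Gain, time constant, and load effect are invented for
# illustration; they do not reproduce the AMESim model.
import numpy as np

dt, t_end = 0.001, 2.0                 # 1 ms step, 2 s of simulated time
tau, gain = 0.15, 1.0                  # assumed lag time constant (s) and steady-state gain
t = np.arange(0.0, t_end, dt)

pump_speed = np.where(t >= 0.1, 1500.0, 0.0)  # rpm step command at t = 0.1 s
load_drop = np.where(t >= 1.0, 100.0, 0.0)    # rpm-equivalent load torque step at t = 1 s

motor_speed = np.zeros_like(t)
for k in range(1, len(t)):
    target = gain * pump_speed[k] - load_drop[k]
    # Euler integration of d(speed)/dt = (target - speed) / tau
    motor_speed[k] = motor_speed[k - 1] + dt * (target - motor_speed[k - 1]) / tau

print(f"speed before load step: {motor_speed[t < 1.0][-1]:.0f} rpm")
print(f"speed at end:           {motor_speed[-1]:.0f} rpm")
```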


2022 ◽  
Vol 19 (3) ◽  
pp. 2700-2719
Author(s):  
Siyuan Yin ◽  
Yanmei Hu ◽  
Yuchun Ren

Many real-world systems can be represented as networks, and network analysis can help us understand these systems. Node centrality is an important problem that has attracted much attention in the field of network analysis. With the rapid development of information technology, the scale of network data is rapidly increasing; however, computing node centrality in large-scale networks is time-consuming. Parallel computing is an alternative way to speed up this computation. The GPU, which has become a core component of modern computers, can run a large number of core tasks in parallel, is capable of big-data processing, and has been widely used to accelerate computing. Therefore, exploiting the parallel characteristics of the GPU, we design parallel algorithms to compute three widely used node centralities: closeness centrality, betweenness centrality, and PageRank centrality. Firstly, we classify the three centralities into two groups according to their definitions; secondly, we design the parallel algorithms by mapping the centrality computation of different nodes onto different blocks or threads of the GPU; thirdly, benefiting from the designed parallel algorithms, we analyse the correlations between different centralities in several networks. Experimental results show that the parallel algorithms designed in this paper speed up the computation of node centrality in large-scale networks, and that closeness centrality and betweenness centrality are only weakly correlated, although both are based on shortest paths.
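The per-node parallelism the paper exploits is visible in a single PageRank power-iteration step, where every node's rank is updated at once; on a GPU each of these updates would map onto a block or thread. The CPU sketch below uses the usual damping factor and a dense toy graph, which are illustrative choices rather than values from the paper.

```python
# CPU sketch of the data-parallel structure of PageRank: one power-iteration
# step updates every node simultaneously, which is the per-node work a GPU
# would map onto blocks or threads. Damping factor and toy graph are
# illustrative choices, not values from the paper.
import numpy as np

def pagerank(adj, damping=0.85, tol=1e-8, max_iter=100):
    """Power iteration on a dense adjacency matrix (adj[i, j] = edge i -> j)."""
    n = adj.shape[0]
    out_deg = adj.sum(axis=1, keepdims=True)
    # Column-stochastic transition matrix; dangling nodes spread rank uniformly
    trans = np.where(out_deg > 0, adj / np.maximum(out_deg, 1.0), 1.0 / n).T
    rank = np.full(n, 1.0 / n)
    for _ in range(max_iter):
        new_rank = (1.0 - damping) / n + damping * trans @ rank  # update all nodes at once
        if np.abs(new_rank - rank).sum() < tol:                  # L1 convergence check
            return new_rank
        rank = new_rank
    return rank

adj = np.array([[0, 1, 1, 0],
                [0, 0, 1, 0],
                [1, 0, 0, 1],
                [0, 0, 1, 0]], dtype=float)
print(pagerank(adj).round(3))
```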


Author(s):  
Stéphane Coulondre

Nowadays, terrorists master technology. They often use electronic devices that allow them to act without being physically exposed. As a consequence, their attacks are quicker, more precise, and even more disastrous. As cyber-terrorism relies on computers, the evidence is distributed on large-scale networks. Internet providers as well as government agencies around the world have set up several advanced logging techniques. However, this kind of information alone is not always sufficient. It is sometimes paramount to also analyse the target and source computers, if available, as well as some networking elements. This step is called cyber-forensics, and allows for precisely reconstructing and understanding the attack, and sometimes for identifying the intruders. In this paper, we present the basics and well-known issues, and we give some related perspectives.


2020 ◽  
Author(s):  
Ian Bennett ◽  
Philip L. Bulterys ◽  
Melody Chang ◽  
Joseph M. DeSimone ◽  
Jennifer Fralick ◽  
...  

Abstract: The novel coronavirus disease (COVID-19) has caused a pandemic that has disrupted supply chains globally. This black swan event is challenging industries from all sectors of the economy, including those directly needed to produce items that safeguard us from the disease itself, especially personal protective equipment (N95 masks, face shields) and much-needed consumables associated with testing and vaccine delivery (swabs, vials, and viral transport medium). Digital manufacturing, especially 3D printing, has been promoted as an important approach for the rapid development of new products, as well as a replacement for many traditional manufacturing techniques, including injection molding, when supply chains are disrupted. Herein we report the use of Digital Light Synthesis (DLS) for the design and large-scale deployment of nasopharyngeal (NP) swabs for testing for SARS-CoV-2 coronavirus infections in humans. NP swabs have been one of the essential products hardest hit by the supply chain disruptions caused by COVID-19. A lattice-tipped NP swab was designed and fabricated by DLS from a liquid resin previously developed and approved for use in dental night-guard devices. These latticed NP swabs demonstrated non-inferiority in a human clinical study of patients suspected of being infected with SARS-CoV-2.

