Bioscience Computing and the Role of Computational Simulation in Biology

Author(s):
Christopher D. Clack
2021

Author(s):
Patrick McNamara,
Wesley J Wildman,
George Hodulik,
David Rohr

Abstract
Study Objectives: To test and extend Levin & Nielsen's (2007) Affective Network Dysfunction (AND) model with nightmare disorder (ND) image characteristics, and then to implement the extension as a computational simulation, the Disturbed Dreaming Model (DDM).
Methods: We used AnyLogic V7.2 to computationally implement an extended AND model incorporating quantitative effects of image characteristics, including valence, dominance, and arousal. We explored the DDM parameter space with approximately one million runs, each covering one month of model time, varying pathway bifurcation thresholds, image characteristics, and individual-difference variables to quantitatively evaluate their combined effects on nightmare symptomology.
Results: The DDM shows that the AND model, extended with pathway bifurcations and image properties, is computationally coherent. Varying the levels of image properties, we found that when nightmare images exhibit lower dominance and arousal levels, the ND agent chooses to sleep but then has a traumatic nightmare, whereas when images exhibit greater-than-average dominance and arousal levels, the nightmares trigger sleep-avoidant behavior, which lowers overall nightmare distress at the price of exacerbating nightmare effects during waking hours.
Conclusions: Computational simulation of nightmare symptomology within the AND framework suggests that nightmare image properties significantly influence nightmare symptomology. Computational models for sleep and dream studies are powerful tools for testing the quantitative effects of variables affecting nightmare symptomology, and our results confirm the value of extending the Levin & Nielsen AND model of disturbed dreaming/ND.
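The pathway bifurcation described in the Results can be sketched in miniature. The decision rule, the threshold value, and the grid of image-property levels below are illustrative assumptions for exposition only, not the published DDM or its AnyLogic implementation:

```python
import itertools

def nightmare_outcome(dominance, arousal, threshold=0.5):
    """Toy pathway bifurcation: if nightmare imagery exceeds the
    dominance and arousal thresholds, the agent becomes sleep-avoidant
    (lower nightmare distress, worse waking symptoms); otherwise it
    chooses to sleep and experiences a traumatic nightmare."""
    if dominance > threshold and arousal > threshold:
        return "sleep_avoidant"
    return "traumatic_nightmare"

def sweep(levels, threshold=0.5):
    """Sweep a grid of image-property levels and tally the outcomes."""
    counts = {"sleep_avoidant": 0, "traumatic_nightmare": 0}
    for dominance, arousal in itertools.product(levels, repeat=2):
        counts[nightmare_outcome(dominance, arousal, threshold)] += 1
    return counts

levels = [i / 10 for i in range(11)]  # 0.0, 0.1, ..., 1.0
print(sweep(levels))  # → {'sleep_avoidant': 25, 'traumatic_nightmare': 96}
```

The real model sweeps many more dimensions (valence, individual differences, bifurcation thresholds), but the tallying-over-a-parameter-grid structure is the same.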


2019
Vol 8
pp. 1-26
Author(s):
Eli Nomes,
André Grow,
Jan Van Bavel

Around the middle of the 20th century, most Western countries experienced a surge in birth rates, called the Baby Boom. This boom was unexpected at the time and the underlying mechanisms are still not entirely clear. It was characterized by high levels of inter- and intra-country variability in fertility, as some regions even experienced fertility decline during the Boom. In this paper, we suggest that social influence processes, propelling a shift towards two-child families, might have played an important role in the observed changes in fertility. Interactions in social networks can lead new types of childbearing behaviour to diffuse widely and thereby induce changes in fertility at the macro level. The emergence and diffusion of a two-child norm resulted in homogenization of fertility behaviour across regions. Overall, this led to a reduction of childlessness and thus an increase of fertility, as more people aspired to have at least two children. Yet, in those regions where larger family sizes were still common, the two-child norm contributed to a fertility decline. To explore the role of social influence with analytical rigor, we make use of agent-based computational modelling. We explicate the underlying behavioural assumptions in a formal model and assess their implications by submitting this model to computational simulation experiments. We use Belgium as a case study, since it exhibited large variability in fertility in a relatively small population during the Baby Boom years. We use census data to generate realistic starting conditions and to empirically validate the outcomes that our model generates. Our results show that the proposed mechanism could explain an important part of the variability of fertility trends during the Baby Boom era.
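The social-influence mechanism proposed above can be illustrated with a deliberately small sketch. The random-contact network, the adoption rule, and all parameters below are invented for exposition and are far simpler than the census-calibrated agent-based model in the paper:

```python
import random
import statistics

def simulate_norm_diffusion(n=200, steps=10000, seed=42):
    """Toy social-influence model: agents hold an intended family size
    (0-6 children). Each step, a random agent consults two random
    contacts; if both intend exactly two children, the agent adopts
    the two-child norm as well."""
    rng = random.Random(seed)
    intentions = [rng.randint(0, 6) for _ in range(n)]
    variance_before = statistics.pvariance(intentions)
    for _ in range(steps):
        agent = rng.randrange(n)
        a, b = rng.randrange(n), rng.randrange(n)
        if intentions[a] == 2 and intentions[b] == 2:
            intentions[agent] = 2
    variance_after = statistics.pvariance(intentions)
    share_two_child = intentions.count(2) / n
    return variance_before, variance_after, share_two_child

before, after, share = simulate_norm_diffusion()
# Social influence homogenizes intentions: variance falls as the
# two-child norm spreads, raising fertility where childlessness and
# one-child intentions were common, lowering it where large families were.
```

The positive feedback is the point of the sketch: the more agents hold the norm, the more likely any contact pair is to transmit it.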


Author(s):
Mary Spyropoulos,
Alisa Andrasek

Abstract
This paper examines the role of computational simulation of material processes in robotic fabrication and its implications for architectural design and construction. Simulation techniques have been adopted in the automotive industry, among others, advancing design and manufacturing outputs. At present, architecture has yet to explore the full potential of these technologies and techniques. The need for simulation is evident in exploring the behaviours of materials and their relative properties. Currently, there is a distinct disconnect between the virtual model and its fabricated counterpart. Through research in simulation, we can begin to understand and clearly visualize the relationship between material behaviours and properties, leading to a closer correlation between the digital design and its fabricated outcome. In this first phase of investigation, the material of clay is used due to the volatile qualities embedded within its behaviour. The input geometry is constrained to rudimentary extruded forms so as not to obscure the behaviour of the material, but rather to allow it to drive the machine-making process.


2016
Vol 58 (2)
Author(s):
Juan A. Barceló

Abstract
As practitioners of an historical discipline, archaeologists are, in general, interested in inferring how ancient objects, buildings, and/or landscapes were produced and used in the past from an object's visual appearance and material properties. In pursuit of this general goal, archaeologists act as detectives looking for material cues (mostly visual) that may allow them to discover social actions that happened in the past. This investigation can be made intuitively, but also formally, in which case we need computational tools and techniques to 1) extract the necessary information from visual and non-visual data, 2) process the data to evaluate relevant relationships between different items at different spatial and temporal scales, and 3) build appropriate models that can provide explanatory hypotheses about the human past. In this paper I review the nature of archaeological problems and suggest computational techniques and methods that can be useful in solving these kinds of questions. Emphasis is placed on modern classification and clustering techniques (neural networks, for instance) and computational simulation approaches.
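As one example of the clustering techniques the abstract mentions, a minimal k-means sketch is shown below. The two-dimensional "artefact measurements" are invented for illustration; real archaeological data would be higher-dimensional and far noisier:

```python
import math
import random

def kmeans(points, k, iters=20, seed=0):
    """Minimal k-means: assign each point to its nearest centroid,
    then move each centroid to the mean of its assigned cluster."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)  # initialize from the data
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda c: math.dist(p, centroids[c]))
            clusters[nearest].append(p)
        for j, cluster in enumerate(clusters):
            if cluster:  # keep old centroid if a cluster empties out
                centroids[j] = tuple(sum(x) / len(cluster) for x in zip(*cluster))
    return centroids, clusters

# Invented artefact measurements (length, width in cm): two visual 'types'.
points = [(2.1, 1.0), (2.0, 1.2), (2.3, 0.9),
          (6.0, 4.1), (6.2, 3.9), (5.8, 4.0)]
centroids, clusters = kmeans(points, 2)
```

The archaeological use-case would be grouping artefacts by measured attributes without pre-imposed typologies; neural-network classifiers, also mentioned in the abstract, would instead require labelled training data.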


2017
Vol 55 (4)
pp. 730-744
Author(s):
Joohyun Kim,
Ohsung Kwon,
Duk Hee Lee

Purpose: The purpose of this paper is to explore how hubs' social influence on decisions in social networks can trigger information cascades in a market.
Design/methodology/approach: The authors build an understanding of the fundamental mechanism of information cascades through a computational simulation approach.
Findings: Eigenvector centrality, betweenness centrality, and PageRank are statistically correlated with the occurrence of information cascades among agents; incorrect decisions by hubs in the early diffusion stage can trigger significant misled-shift cascades; and the bridge role of hubs is more influential than their pivotal-position role in the process of misled-shift cascades.
Originality/value: These implications extend to the fields of marketing, sequential voting, and technology or innovation adoption.
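The misled-shift mechanism can be sketched in a few lines: each follower combines a private signal with the observed decisions of earlier agents, and the hub's early decision is over-weighted to stand in for its network centrality. The decision rule and weights are illustrative assumptions, not the authors' simulation:

```python
def cascade(hub_choice, private_signals, hub_weight=2):
    """Toy sequential-decision model of an information cascade.
    hub_choice: the hub's (possibly wrong) early decision, +1 or -1.
    private_signals: each follower's own noisy evidence, +1 or -1.
    Followers decide in order, weighing the hub more than peers."""
    follower_decisions = []
    for signal in private_signals:
        social = hub_weight * hub_choice + sum(follower_decisions)
        follower_decisions.append(1 if signal + social > 0 else -1)
    return follower_decisions

# A wrong hub decision (-1) overrides five correct private signals (+1):
print(cascade(-1, [1, 1, 1, 1, 1]))  # → [-1, -1, -1, -1, -1]
```

Because each misled follower adds to the social term seen by the next, one early wrong decision by a central agent is self-reinforcing, which is the cascade behaviour the Findings describe.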


1990
Vol 5 (5)
pp. 998-1002
Author(s):
J.A. Knapp,
L.R. Thompson,
G.J. Collins

Under circumstances in Zone-Melt-Recrystallization (ZMR) of Si-on-Insulator (SOI) structures where radiative heat loss is significant, the ∼50% decrease in emissivity when Si melts destabilizes the molten Si zone. We have demonstrated this both experimentally, using a slowly scanned e-beam line source, and numerically, with a finite-element computational simulation. The resulting instability narrows the process window and tightens the requirements on beam control and background-heating uniformity, both for e-beam ZMR systems and for optically coupled systems such as a graphite strip heater.
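The destabilizing feedback can be seen from a simple radiative balance: at steady state the absorbed power equals εσAT⁴, so halving the emissivity raises the balance temperature by a factor of 2^(1/4) ≈ 1.19. The power and area figures below are illustrative only, not values from the experiment or the finite-element model:

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def steady_state_temp(absorbed_w, emissivity, area_m2):
    """Temperature at which radiative loss (eps * sigma * A * T^4)
    balances the absorbed power."""
    return (absorbed_w / (emissivity * SIGMA * area_m2)) ** 0.25

area = 1e-4                                     # 1 cm^2 zone (assumed)
t_solid = steady_state_temp(10.0, 0.70, area)   # solid-Si emissivity (assumed)
t_molten = steady_state_temp(10.0, 0.35, area)  # ~50% lower on melting
# At fixed absorbed power, the molten region runs hotter by 2**0.25 ≈ 1.19,
# so melting promotes further melting: a positive feedback on zone width.
```

This is only the zeroth-order argument; the paper's finite-element simulation additionally accounts for conduction and the scanned heat source.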


2019
Author(s):
Yao Yao,
Lorenzo Carretero-Paulet,
Yves Van de Peer

Abstract
The potential role of whole genome duplication (WGD) in evolution is controversial. Whereas some view WGD mainly as detrimental and an evolutionary 'dead end', there is growing evidence that the long-term establishment of polyploidy might be linked to environmental change, stressful conditions, or periods of extinction. However, despite much research, the mechanistic underpinnings of why and how polyploids might be able to outcompete non-polyploids at times of environmental upheaval remain elusive. Here, we improved our recently developed bio-inspired framework, which combines an artificial genome with an agent-based system to form a population of so-called Digital Organisms (DOs), to examine the impact of WGD on evolution under different environmental scenarios mimicking extinction events of varying strength and frequency. We found that, under stable environments, DOs with non-duplicated genomes formed the majority, if not all, of the population, whereas the numbers of DOs with duplicated genomes increased under dramatically challenging environments. After tracking the evolutionary trajectories of individual artificial genomes in terms of sequence and encoded gene regulatory networks (GRNs), we propose that the increased complexity, modularity, and redundancy of duplicated GRNs might provide DOs with increased adaptive potential under extinction events, while ensuring mutational robustness of the whole GRN. Our results confirm the usefulness of computational simulation for studying the role of WGD in evolution and adaptation, helping to overcome the traditional limitations of evolution experiments with model organisms, and provide additional insights into how genome duplication might help organisms compete for novel niches and survive ecological turmoil.
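A toy sketch of the population dynamics reported above: "duplicated" genomes pay a reproduction cost but buffer environmental shocks better. All rates below are invented for illustration and bear no relation to the DO framework's actual genomes, GRNs, or parameters:

```python
def wgd_share(generations=200, shock_every=None):
    """Toy two-type population model. 'dup' (duplicated-genome)
    organisms reproduce more slowly than 'nondup' but survive
    environmental shocks at a higher rate. Returns the final
    population share of 'dup'."""
    pop = {"nondup": 500.0, "dup": 500.0}
    for gen in range(1, generations + 1):
        pop["nondup"] *= 1.10      # smaller genome: faster reproduction
        pop["dup"] *= 1.05         # duplication cost: slower reproduction
        if shock_every and gen % shock_every == 0:
            pop["nondup"] *= 0.10  # shock is lethal to most nondup
            pop["dup"] *= 0.50     # redundant GRNs buffer the shock
        total = pop["nondup"] + pop["dup"]
        pop = {k: v / total * 1000.0 for k, v in pop.items()}  # carrying capacity
    return pop["dup"] / 1000.0

stable = wgd_share()                   # no extinction events
turbulent = wgd_share(shock_every=5)   # periodic extinction events
# A stable world favours non-duplicated genomes; recurring shocks
# favour the duplicated genomes, mirroring the paper's finding.
```

The qualitative outcome depends only on the shock survival advantage outweighing the per-generation reproduction cost over a shock cycle, which is the trade-off the abstract attributes to GRN redundancy.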

