Synthetic neural-like computing in microbial consortia for pattern recognition

2021 ◽  
Vol 12 (1) ◽  
Author(s):  
Ximing Li ◽  
Luna Rizik ◽  
Valeriia Kravchik ◽  
Maria Khoury ◽  
Netanel Korin ◽  
...  

Abstract Complex biological systems in nature comprise cells that act collectively to solve sophisticated tasks. Synthetic biological systems, in contrast, are designed for specific tasks, following computational principles including logic gates and analog design. Yet such approaches cannot be easily adapted for multiple tasks in biological contexts. Alternatively, artificial neural networks, composed of flexible interactions for computation, support adaptive designs and are adopted for diverse applications. Here, motivated by the structural similarity between artificial neural networks and cellular networks, we implement neural-like computing in bacterial consortia for recognizing patterns. Specifically, receiver bacteria collectively interact with sender bacteria for decision-making through quorum sensing. Input patterns formed by chemical inducers activate senders to produce signaling molecules at varying levels. These levels, which act as weights, are programmed by tuning the sender promoter strength. Furthermore, a gradient-descent-based algorithm was developed to optimize the weights. The weights were experimentally examined for recognizing 3 × 3-bit patterns.
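The weight-optimization idea in this abstract can be sketched numerically: a single "receiver" unit weighs nine sender signals (a flattened 3 × 3-bit pattern) and fires when the target pattern is present, with the weights playing the role of tunable promoter strengths. The patterns, learning rate, and loss below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

patterns = {                          # hypothetical 3x3-bit inputs
    "T":     [1,1,1, 0,1,0, 0,1,0],
    "L":     [1,0,0, 1,0,0, 1,1,1],
    "cross": [0,1,0, 1,1,1, 0,1,0],
    "blank": [0,0,0, 0,0,0, 0,0,0],
}
X = np.array(list(patterns.values()), dtype=float)
y = np.array([1.0, 0.0, 0.0, 0.0])    # recognize "T" only

rng = np.random.default_rng(0)
w = rng.normal(0, 0.1, 9)             # weights ~ promoter strengths
b = 0.0
lr = 0.5
for _ in range(2000):                 # plain batch gradient descent
    p = sigmoid(X @ w + b)
    w -= lr * X.T @ (p - y) / len(y)  # gradient of the cross-entropy loss
    b -= lr * np.mean(p - y)

print(sigmoid(X @ w + b).round(2))    # high only for the "T" pattern
```

In the wet-lab setting the optimized weights would then be mapped back onto a discrete library of promoters of different strengths; here the weights are simply continuous parameters.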

2020 ◽  
Author(s):  
Ximing Li ◽  
Luna Rizik ◽  
Ramez Daniel

Abstract Complex biological systems in nature comprise cells that act collectively to solve sophisticated tasks. Synthetic biological systems, in contrast, are designed for specific tasks, largely following computational principles including logic gates, analog design, and control theory. Yet such approaches cannot be easily adapted for multiple tasks in biological contexts. Alternatively, artificial neural networks (ANNs), composed of flexible interactions for processing and decision-making, are widely adopted for numerous applications and support adaptive designs. Motivated by the structural similarity between ANNs and cellular networks, here we implemented ANN-like computing in bacterial consortia for recognizing patterns. In cellular ANNs, receiver bacteria collectively interact through quorum sensing (QS) with sender bacteria for decision-making. Input patterns formed by chemical inducers activate sender circuits to produce QS signaling molecules at varying levels. These levels are programmed by tuning the promoter strengths, which act as weights. We also developed an algorithm based on gradient descent, which is well established in artificial intelligence, to optimize the weights, and experimentally examined them using 3 × 3-bit patterns.


2021 ◽  
Author(s):  
Ruthvik Vaila

Spiking neural networks are biologically plausible counterparts of artificial neural networks. Artificial neural networks are usually trained with stochastic gradient descent (SGD), whereas spiking neural networks are trained with bio-inspired spike-timing-dependent plasticity (STDP). Spiking networks could potentially help reduce power usage owing to their binary activations. In this work, we use unsupervised STDP in the feature-extraction layers of a neural network with instantaneous neurons to extract meaningful features. The extracted binary feature vectors are then classified using classification layers containing neurons with binary activations. Gradient descent (backpropagation) is used only on the output layer to perform training for classification. Surrogate gradients are proposed to perform backpropagation with binary activations. The accuracies obtained for MNIST and the balanced EMNIST dataset compare favorably with other approaches. The effect of the SGD approximations on the learning capabilities of our network is also explored. We also study catastrophic forgetting and its effect on spiking neural networks (SNNs). For the catastrophic-forgetting experiments, in the classification sections of the network we use a modified synaptic-intelligence measure, which we refer to as the cost-per-synapse metric, as a regularizer to immunize the network against catastrophic forgetting in a Single-Incremental-Task (SIT) scenario. In these experiments, we use the MNIST and EMNIST handwritten-digit datasets, divided into five and ten incremental subtasks, respectively. We also examine the behavior of the spiking neural network and empirically study the effect of various hyperparameters on its learning capabilities using SPYKEFLOW, a software tool that we developed. We employ the MNIST, EMNIST, and NMNIST datasets to produce our results.
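The surrogate-gradient trick this abstract relies on can be illustrated in a few lines: the forward pass uses a binary (Heaviside) activation, while the backward pass substitutes a "boxcar" surrogate derivative so that gradient descent on the output layer remains possible. The toy data, single-layer shape, and surrogate width are illustrative assumptions, not the paper's architecture.

```python
import numpy as np

def heaviside(z):
    return (z > 0).astype(float)               # binary activation (forward)

def surrogate_grad(z, width=1.0):
    return (np.abs(z) < width).astype(float)   # boxcar surrogate (backward)

rng = np.random.default_rng(1)
# Toy binary feature vectors, standing in for STDP-extracted features.
X = rng.integers(0, 2, (64, 16)).astype(float)
y = X[:, 0]                                    # hypothetical label rule

W = rng.normal(0, 0.5, 16)
lr = 0.1
for _ in range(200):
    z = X @ W
    err = heaviside(z) - y                     # squared-error residual
    # chain rule, with the surrogate in place of d(heaviside)/dz
    W -= lr * X.T @ (err * surrogate_grad(z)) / len(y)

acc = np.mean(heaviside(X @ W) == y)
print(acc)
```

The key point is that the true derivative of the step function is zero almost everywhere, so without the surrogate no weight would ever be updated.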


2021 ◽  
Vol 7 ◽  
pp. e429
Author(s):  
Yuri Antonacci ◽  
Ludovico Minati ◽  
Luca Faes ◽  
Riccardo Pernice ◽  
Giandomenico Nollo ◽  
...  

One of the most challenging problems in the study of complex dynamical systems is to find the statistical interdependencies among the system components. Granger causality (GC) is one of the most widely employed approaches, based on modeling the system dynamics with a linear vector autoregressive (VAR) model and evaluating the information flow between two processes in terms of prediction-error variances. In its most advanced setting, GC analysis is performed through a state-space (SS) representation of the VAR model, which allows both conditional and unconditional forms of GC to be computed by solving only one regression problem. While this problem is typically solved through Ordinary Least Squares (OLS) estimation, a viable alternative is to use Artificial Neural Networks (ANNs) implemented in a simple structure with one input and one output layer, trained such that the weight matrix corresponds to the matrix of VAR parameters. In this work, we introduce an ANN combined with SS models for the computation of GC. The ANN is trained with the Stochastic Gradient Descent L1 (SGD-L1) algorithm, and a cumulative penalty inspired by penalized regression is applied to the network weights to encourage sparsity. Simulating networks of coupled Gaussian systems, we show how the combination of ANNs and SGD-L1 mitigates the strong reduction in accuracy of OLS identification in settings with a low ratio between the number of time-series points and the number of VAR parameters. We also report how the performance of GC estimation is influenced by the number of gradient-descent iterations and by the learning rate used for training the ANN, and we recommend specific combinations of these parameters to optimize the performance of GC estimation.
The performance of the ANN and OLS approaches is then compared in terms of GC magnitude and statistical significance, highlighting the potential of the new approach to reconstruct causal coupling strength and network topology even under challenging conditions of data paucity. The results highlight the importance of a proper selection of the regularization parameter, which determines the degree of sparsity in the estimated network. Furthermore, we apply the two approaches to real-data scenarios: the physiological network of brain and peripheral interactions in humans under different conditions of rest and mental stress, and the effects of the newly emerged concept of remote synchronization on the information exchanged in a ring of electronic oscillators. The results highlight how ANNs provide a mesoscopic description of the information exchanged in networks of multiple interacting physiological systems, preserving the most active causal interactions between the cardiovascular, respiratory, and brain systems. Moreover, ANNs can reconstruct the flow of directed information in a ring of oscillators whose statistical properties can be related to those of physiological networks.
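The core identification step described above, estimating sparse VAR coefficients by gradient descent with an L1 penalty, can be sketched as follows. A plain soft-thresholding (proximal) step stands in for the cumulative-penalty SGD-L1 scheme the abstract refers to, and the coupling matrix, series length, and hyperparameters are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(2)
A_true = np.array([[0.5, 0.0, 0.0],
                   [0.4, 0.3, 0.0],
                   [0.0, 0.0, 0.2]])     # sparse directed couplings

# Simulate a 3-variate VAR(1) process driven by Gaussian noise.
T = 500
Y = np.zeros((T, 3))
for t in range(1, T):
    Y[t] = Y[t-1] @ A_true.T + rng.normal(0, 1, 3)

X, Z = Y[:-1], Y[1:]                     # regressors and one-step targets
A = np.zeros((3, 3))                     # network weights = VAR parameters
lr, lam = 0.01, 0.01
for _ in range(1000):
    grad = (X @ A.T - Z).T @ X / len(X)  # least-squares gradient
    A -= lr * grad
    # L1 proximal step: shrink weights toward zero to encourage sparsity
    A = np.sign(A) * np.maximum(np.abs(A) - lr * lam, 0.0)

print(np.round(A, 2))                    # nonzero entries match A_true
```

The recovered weight matrix would then feed the state-space GC computation; that step (solving the discrete-time Riccati equation for the SS innovations) is omitted here.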


Author(s):  
Santosh Giri ◽  
Basanta Joshi

An ANN is a computational model composed of several processing elements (neurons) that together try to solve a specific problem. Like the human brain, it provides the ability to learn from experience without being explicitly programmed. This article is based on the implementation of artificial neural networks for logic gates. First, a three-layer artificial neural network is designed with 2 input neurons, 2 hidden neurons, and 1 output neuron. The model is then trained with the backpropagation algorithm until it satisfies the predefined error criterion (e), set to 0.01 in this experiment. The learning rate (α) used was 0.01. The model produces correct output at iteration (p) = 20000 for the AND, NAND, and NOR gates. For OR and XOR, the correct output is predicted at iterations (p) = 15000 and 80000, respectively.
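The 2-2-1 network described above can be sketched directly; here it is trained with plain backpropagation on the XOR truth table, the hardest of the listed gates. The weight initialization and the learning rate (larger than the article's 0.01, purely so the sketch converges quickly) are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

X = np.array([[0,0],[0,1],[1,0],[1,1]], dtype=float)
y = np.array([[0],[1],[1],[0]], dtype=float)   # XOR truth table

rng = np.random.default_rng(42)
W1 = rng.normal(0, 1, (2, 2)); b1 = np.zeros(2)  # 2 input -> 2 hidden
W2 = rng.normal(0, 1, (2, 1)); b2 = np.zeros(1)  # 2 hidden -> 1 output
lr = 0.5

for _ in range(20000):
    h = sigmoid(X @ W1 + b1)             # hidden layer (2 neurons)
    out = sigmoid(h @ W2 + b2)           # output neuron
    # backpropagate squared-error gradients through both layers
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;    b1 -= lr * d_h.sum(axis=0)

print((out > 0.5).astype(int).ravel())
```

The linearly separable gates (AND, NAND, OR, NOR) train in far fewer iterations with the same code by swapping the target vector `y`, which matches the iteration counts the article reports.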


Entropy ◽  
2020 ◽  
Vol 23 (1) ◽  
pp. 35
Author(s):  
Oleg Kuzenkov ◽  
Andrew Morozov ◽  
Galina Kuzenkova

Here, we propose a computational approach to explore evolutionary fitness in complex biological systems based on empirical data using artificial neural networks. The essence of our approach is the following. We first introduce a ranking order of inherited elements (behavioral strategies and/or life-history traits) in the considered self-reproducing systems, using available empirical information on the selective advantages of such elements. Next, we introduce evolutionary fitness, formally described as a certain function reflecting the introduced ranking order. Then, we approximate fitness in the space of key parameters using a Taylor expansion. To estimate the coefficients in the Taylor expansion, we utilize artificial neural networks: we construct a surface to separate the domains of superior and inferior ranking of paired inherited elements in the space of parameters. Finally, we use the obtained approximation of the fitness surface to find the evolutionarily stable (optimal) strategy which maximizes fitness. As an ecologically important study case, we apply our approach to explore the evolutionarily stable diel vertical migration of zooplankton in marine and freshwater ecosystems. Using machine learning we reconstruct the fitness function of herbivorous zooplankton from empirical data and predict the daily trajectory of a dominant species in the northeastern Black Sea.
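The pairwise-ranking step can be illustrated with a minimal sketch: an unknown fitness function is approximated by a truncated Taylor form whose coefficients are learned from ordered pairs "strategy a outranks strategy b", and the fitted form is then maximized. The hidden fitness function, the one-dimensional trait space, and all hyperparameters are illustrative assumptions, not the authors' setup.

```python
import numpy as np

def monomials(x):
    return np.array([x, x**2])          # Taylor basis up to 2nd order

true_fitness = lambda x: -(x - 0.3)**2  # hidden optimum at x = 0.3

rng = np.random.default_rng(3)
# Empirical ranking data: pairs (a, b) with a ranked above b.
xs = rng.uniform(-1, 1, 200)
pairs = [(a, b) if true_fitness(a) > true_fitness(b) else (b, a)
         for a, b in zip(xs[::2], xs[1::2])]
D = np.array([monomials(a) - monomials(b) for a, b in pairs])

# Logistic regression on monomial differences: learn w so f(a) - f(b) > 0.
w = np.zeros(2)
lr = 0.5
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-D @ w))
    w -= lr * D.T @ (p - 1.0) / len(D)  # every pair is labeled "a wins"

x_opt = -w[0] / (2 * w[1])              # argmax of w[0]*x + w[1]*x**2
print(round(x_opt, 2))                  # close to the hidden optimum 0.3
```

Because only the ranking enters the training data, the learned coefficients are determined up to a positive scale factor, which cancels when locating the maximizer.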

