Context Dependent Information Processing Entails Scale-Free Dynamics

SmartData ◽  
2013 ◽  
pp. 39-45
Author(s):  
Donald Borrett


2020 ◽  
Vol 17 (163) ◽  
pp. 20190845
Author(s):  
Pablo Villegas ◽  
Miguel A. Muñoz ◽  
Juan A. Bonachela

Biological networks exhibit intricate architectures deemed to be crucial for their functionality. In particular, gene regulatory networks, which play a key role in information processing in the cell, display non-trivial architectural features such as scale-free degree distributions, high modularity and low average distance between connected genes. Such networks result from complex evolutionary and adaptive processes difficult to track down empirically. On the other hand, there exists detailed information on the developmental (or evolutionary) stages of open-software networks that result from self-organized growth across versions. Here, we study the evolution of the Debian GNU/Linux software network, focusing on the changes of key structural and statistical features over time. Our results show that evolution has led to a network structure in which the out-degree distribution is scale-free and the in-degree distribution is a stretched exponential. In addition, while modularity, directionality of information flow, and average distance between elements grew, vulnerability decreased over time. These features resemble closely those currently shown by gene regulatory networks, suggesting the existence of common adaptive pathways for the architectural design of information-processing networks. Differences in other hierarchical aspects point to system-specific solutions to similar evolutionary challenges.
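The degree statistics discussed above (scale-free out-degree, stretched-exponential in-degree) are empirical distributions computed from the directed dependency graph. As a minimal illustrative sketch, here is how such distributions are tallied from an edge list; the package names are invented stand-ins, and the actual study uses the full Debian graph:

```python
from collections import Counter

# Hypothetical toy edge list standing in for package dependencies:
# an edge (a, b) means package a depends on package b.
edges = [
    ("app1", "libc"), ("app2", "libc"), ("app3", "libc"),
    ("app1", "libssl"), ("app2", "libssl"),
    ("libssl", "libc"), ("app3", "zlib"),
]

out_deg = Counter(src for src, _ in edges)  # dependencies a package declares
in_deg = Counter(dst for _, dst in edges)   # packages that depend on it

def degree_distribution(deg_counts, n_nodes):
    """Empirical P(k): fraction of nodes with degree k."""
    hist = Counter(deg_counts.values())
    hist[0] += n_nodes - len(deg_counts)  # nodes with no edges of this kind
    return {k: c / n_nodes for k, c in sorted(hist.items())}

nodes = {n for edge in edges for n in edge}
print(degree_distribution(out_deg, len(nodes)))
print(degree_distribution(in_deg, len(nodes)))
```

On the real graph one would then compare the tail of these histograms against a power law (scale-free) or a stretched exponential on log-log axes.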


Author(s):  
Sara Imari Walker ◽  
Hyunju Kim ◽  
Paul C. W. Davies

We compare the informational architecture of biological and random networks to identify informational features that may distinguish biological networks from random ones. The study presented here focuses on the Boolean network model for regulation of the cell cycle of the fission yeast Schizosaccharomyces pombe. We compare calculated values of local and global information measures for the fission yeast cell cycle to the same measures as applied to two different classes of random networks: Erdős–Rényi and scale-free. We report patterns in local information processing and storage that do indeed distinguish biological networks from random ones, associated with control nodes that regulate the function of the fission yeast cell-cycle network. Conversely, we find that integrated information, which serves as a global measure of ‘emergent’ information processing, does not differ from random for the case presented. We discuss implications for our understanding of the informational architecture of the fission yeast cell-cycle network in particular, and more generally for illuminating any distinctive physics that may be operative in life.
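Boolean network models of this kind update each gene's on/off state from a weighted sum of its regulators. A minimal sketch of a threshold update rule of the sort used in cell-cycle Boolean models, with an invented three-node wiring rather than the paper's actual fission yeast network:

```python
import numpy as np

def step(state, W, theta=0):
    """One synchronous threshold update of a Boolean network.

    W[i, j] is the weight of the edge from node j to node i
    (+1 activating, -1 inhibiting, 0 absent); state is a 0/1 vector.
    A node whose input field equals the threshold keeps its state.
    """
    field = W @ state
    nxt = state.copy()
    nxt[field > theta] = 1
    nxt[field < theta] = 0
    return nxt

# Toy 3-node wiring: node 0 activates node 1, node 1 inhibits node 2.
W = np.array([
    [0, 0, 0],
    [1, 0, 0],
    [0, -1, 0],
])
s = np.array([1, 0, 1])
print(step(s, W))  # node 1 switches on; nodes 0 and 2 hold their state
```

Iterating `step` from every initial state maps out the attractor landscape on which local information measures such as transfer entropy are then computed.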


2000 ◽  
Vol 45 (1-2) ◽  
pp. 93-101 ◽  
Author(s):  
Nadine Bazin ◽  
Pierre Perruchet ◽  
Marie Christine Hardy-Bayle ◽  
André Feline

2021 ◽  
Author(s):  
Arthur-Ervin Avramiea ◽  
Anas Masood ◽  
Huibert D Mansvelder ◽  
Klaus Linkenkaer-Hansen

Brain function depends on segregation and integration of information processing in brain networks often separated by long-range anatomical connections. Neuronal oscillations orchestrate such distributed processing through transient amplitude and phase coupling; however, little is known about local network properties facilitating these functional connections. Here, we test whether criticality—a dynamical state characterized by scale-free oscillations—optimizes the capacity of neuronal networks to couple through amplitude or phase, and transfer information. We coupled in silico networks with varying excitatory and inhibitory connectivity, and found that phase coupling emerges at criticality, and that amplitude coupling, as well as information transfer, are maximal when networks are critical. Our data support the idea that criticality is important for local and global information processing and may help explain why brain disorders characterized by local alterations in criticality also exhibit impaired long-range synchrony, even prior to degeneration of physical connections.
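Phase coupling between two oscillatory signals is commonly quantified by the phase-locking value (PLV) over instantaneous Hilbert phases. Below is a numpy-only sketch of this standard measure applied to synthetic signals; it illustrates the quantity being maximized at criticality, not the authors' in silico network model:

```python
import numpy as np

def analytic_signal(x):
    """Hilbert-transform-based analytic signal via FFT (numpy only)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1
    if n % 2 == 0:
        h[n // 2] = 1
        h[1:n // 2] = 2
    else:
        h[1:(n + 1) // 2] = 2
    return np.fft.ifft(X * h)

def plv(x, y):
    """Phase-locking value: |mean of exp(i * phase difference)|, in [0, 1]."""
    dphi = np.angle(analytic_signal(x)) - np.angle(analytic_signal(y))
    return np.abs(np.mean(np.exp(1j * dphi)))

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 1000, endpoint=False)
x = np.sin(2 * np.pi * 10 * t)
y = np.sin(2 * np.pi * 10 * t + 0.5)  # same frequency, fixed phase lag
z = np.sin(2 * np.pi * 10 * t + rng.uniform(0, 2 * np.pi, t.shape))  # jittered phase
print(plv(x, y))  # near 1: consistent phase relationship
print(plv(x, z))  # much lower: phase relationship destroyed
```

Amplitude coupling is measured analogously, by correlating the envelopes `np.abs(analytic_signal(x))` of the two signals.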


2017 ◽  
Vol 16 (04) ◽  
pp. 1750031 ◽  
Author(s):  
Huijuan Xie ◽  
Yubing Gong

In this paper, we study the effect of channel noise on the temporal coherence of scale-free Hodgkin–Huxley neuronal networks with time delay. We find that the temporal coherence changes with channel noise intensity in ways that depend on the intensity range. For very small or very large channel noise intensities, the temporal coherence decreases monotonically as the intensity grows. For intermediate intensities, however, it alternates rapidly between high and low values as the intensity is varied, exhibiting temporal coherence transitions. Moreover, this phenomenon depends on coupling strength and network average degree and is strongest when both are optimal. These results show that channel noise regulates the temporal coherence of the delayed neuronal networks by inducing temporal coherence transitions, providing new insight into the role of channel noise in information processing and transmission in neural systems.
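Channel noise in Hodgkin–Huxley-type models is often approximated by a Langevin term whose variance scales inversely with the number of ion channels, so noise intensity is controlled by channel count (or membrane patch area). A minimal single-gating-variable sketch of this standard approximation, with illustrative rate constants, not the authors' full delayed scale-free network:

```python
import numpy as np

def simulate_gate(N_K, alpha=0.1, beta=0.125, dt=0.01, steps=20000, seed=0):
    """Langevin dynamics of one K+ gating variable n with channel noise.

    The stochastic term has variance (alpha*(1-n) + beta*n) / N_K,
    so fluctuations shrink as the channel number N_K grows.
    """
    rng = np.random.default_rng(seed)
    n = alpha / (alpha + beta)  # start at the deterministic steady state
    trace = np.empty(steps)
    for i in range(steps):
        drift = alpha * (1 - n) - beta * n
        noise_sd = np.sqrt(max(alpha * (1 - n) + beta * n, 0.0) / N_K)
        n += drift * dt + noise_sd * np.sqrt(dt) * rng.standard_normal()
        n = min(max(n, 0.0), 1.0)  # n is a channel-open fraction in [0, 1]
        trace[i] = n
    return trace

# Fewer channels -> stronger fluctuations (noise intensity ~ 1/sqrt(N_K)).
print(simulate_gate(100).std(), simulate_gate(10000).std())
```

In the full model this gating noise enters every neuron's membrane equation, and varying the effective channel number sweeps the noise intensity studied in the paper.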


2008 ◽  
Vol 31 (1) ◽  
pp. 38-39 ◽  
Author(s):  
Iris van Rooij ◽  
Willem Haselager ◽  
Harold Bekkering

Abstract People cannot understand intentions behind observed actions by direct simulation, because goal inference is highly context dependent. Context dependency is a major source of computational intractability in traditional information-processing models. An embodied embedded view of cognition may be able to overcome this problem, but then the problem needs recognition and explication within the context of the new, layered cognitive architecture.


2016 ◽  
Vol 39 ◽  
Author(s):  
Giosuè Baggio ◽  
Carmelo M. Vicario

Abstract We agree with Christiansen & Chater (C&C) that language processing and acquisition are tightly constrained by the limits of sensory and memory systems. However, the human brain supports a range of cognitive functions that mitigate the effects of information processing bottlenecks. The language system is partly organised around these moderating factors, not just around restrictions on storage and computation.


2020 ◽  
Vol 43 ◽  
Author(s):  
Chris Fields ◽  
James F. Glazebrook

Abstract Gilead et al. propose an ontology of abstract representations based on folk-psychological conceptions of cognitive architecture. There is, however, no evidence that the experience of cognition reveals the architecture of cognition. Scale-free architectural models propose that cognition has the same computational architecture from sub-cellular to whole-organism scales. This scale-free architecture supports representations with diverse functions and levels of abstraction.

