causal states
Recently Published Documents

TOTAL DOCUMENTS: 10 (FIVE YEARS 2)
H-INDEX: 4 (FIVE YEARS 1)

2020 ◽  
Vol 62 (4) ◽  
pp. 111-126
Author(s):  
Philip Clapson

Abstract
It may seem obvious that we are conscious, for we are certain we see, feel and think, but there is no accepted scientific account of these mental states as a brain condition. And since most neuroscientists assume consciousness and its supposed powers without explaining it, science is brought into question. That consciousness does not exist is here explained. The alternative, the theory of brain-sign, is outlined. It eliminates the quasi-divine knowledge properties of seeing, feeling and thinking. Brain-sign is a means/mechanism enabling collective action between organisms. Brain-sign signifies the shared world of that action. Signs are intrinsically physical and biologically ubiquitous. Brain-signs are derived moment-by-moment from the causal orientation of each brain towards others and the world. Interactive behaviour which is not predetermined (as in passing a cup of coffee) is characteristic of vertebrate species. Causality lies in the electrochemical operation of the brain. But identifying the changing world by brain-signs binds the causal states of those interacting into one unified operation. Brain-signing creatures, including humans, have no ‘sense’ they function this way. The world appears as seen. The ‘sense of seeing’, however, is the brain’s communicative activity in joint behaviour. Similarly for ‘feeling’. Language causality results from the transmission of compression waves or electromagnetic radiation from one brain to another, altering the other’s causal orientation. The ‘sense of understanding’ words is the communicative state. The brain understands nothing, knows nothing, believes nothing. By replacing the prescientific notion of consciousness, brain-sign can enable a scientific path for brain science.


Entropy ◽  
2019 ◽  
Vol 21 (7) ◽  
pp. 640 ◽  
Author(s):  
Michael Hahn ◽  
Richard Futrell

The Predictive Rate–Distortion curve quantifies the trade-off between compressing information about the past of a stochastic process and predicting its future accurately. Existing estimation methods for this curve work by clustering finite sequences of observations or by utilizing analytically known causal states. Neither type of approach scales to processes such as natural languages, which have large alphabets and long dependencies, and where the causal states are not known analytically. We describe Neural Predictive Rate–Distortion (NPRD), an estimation method that scales to such processes, leveraging the universal approximation capabilities of neural networks. Taking only time series data as input, the method computes a variational bound on the Predictive Rate–Distortion curve. We validate the method on processes where Predictive Rate–Distortion is analytically known. As an application, we provide bounds on the Predictive Rate–Distortion of natural language, improving on bounds provided by clustering sequences. Based on the results, we argue that the Predictive Rate–Distortion curve is more useful than the usual notion of statistical complexity for characterizing highly complex processes such as natural language.
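The variational objective NPRD optimizes generalizes the classical information-bottleneck self-consistent equations, which can be solved exactly for tiny processes. Below is a minimal pure-Python sketch of that classical iteration on a toy binary Markov chain, where the past X and future Y are single symbols; the transition matrix, the value of beta (the slope parameter along the Predictive Rate–Distortion curve), and the initialization are illustrative assumptions, not taken from the paper:

```python
import math

T = [[0.9, 0.1], [0.2, 0.8]]   # hypothetical p(future y | past x)
px = [2/3, 1/3]                # stationary distribution over pasts

def kl(p, q):
    # Kullback-Leibler divergence in nats
    return sum(a * math.log(a / b) for a, b in zip(p, q) if a > 0)

def marginals(qz_x):
    # q(z) and q(y|z) induced by the current encoder q(z|x)
    qz = [sum(px[x] * qz_x[x][z] for x in range(2)) for z in range(2)]
    qy_z = [[sum(px[x] * qz_x[x][z] * T[x][y] for x in range(2)) / qz[z]
             for y in range(2)] for z in range(2)]
    return qz, qy_z

def predictive_ib(beta, iters=200):
    qz_x = [[0.9, 0.1], [0.1, 0.9]]  # q(z|x), symmetry-broken initialization
    for _ in range(iters):
        qz, qy_z = marginals(qz_x)
        # Self-consistent update: q(z|x) proportional to q(z) exp(-beta * KL)
        for x in range(2):
            w = [qz[z] * math.exp(-beta * kl(T[x], qy_z[z])) for z in range(2)]
            s = sum(w)
            qz_x[x] = [wi / s for wi in w]
    qz, qy_z = marginals(qz_x)
    # Rate I(X;Z) in bits; distortion is the expected predictive KL in nats
    rate = sum(px[x] * qz_x[x][z] * math.log2(qz_x[x][z] / qz[z])
               for x in range(2) for z in range(2) if qz_x[x][z] > 0)
    distortion = sum(px[x] * qz_x[x][z] * kl(T[x], qy_z[z])
                     for x in range(2) for z in range(2))
    return rate, distortion

rate, dist = predictive_ib(beta=10.0)
print(round(rate, 3), round(dist, 4))
```

At large beta the encoder becomes nearly deterministic, so the rate approaches H(X) and the distortion approaches zero; sweeping beta traces out points on the Predictive Rate–Distortion curve.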


Entropy ◽  
2018 ◽  
Vol 20 (8) ◽  
pp. 599
Author(s):  
Sarah Marzen

Causal states are minimal sufficient statistics of prediction of a stochastic process, their coding cost is called statistical complexity, and the implied causal structure yields a sense of the process’ “intrinsic computation”. We discuss how statistical complexity changes with slight changes to the underlying model, in this case a biologically-motivated dynamical model, that of a Monod-Wyman-Changeux molecule. Perturbations to kinetic rates cause statistical complexity to jump from finite to infinite. The same is not true for excess entropy, the mutual information between past and future, or for the molecule’s transfer function. We discuss the implications of this for the relationship between intrinsic and functional computation of biological sensory systems.
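To make "coding cost of causal states" concrete, here is a minimal sketch computing the statistical complexity of the Golden Mean Process, a standard two-causal-state example used as an illustrative stand-in (it is not the Monod-Wyman-Changeux model analyzed in the paper):

```python
import math

# Golden Mean Process: no two consecutive 1s. Causal state A (unconstrained)
# emits 1 with probability p and moves to state B; state B must emit 0 and
# return to A. Both the process and p = 0.5 are illustrative assumptions.

def stationary(p):
    # Solve pi = pi * T for the two-state causal-state chain:
    # pi_A = pi_A * (1 - p) + pi_B, with pi_A + pi_B = 1.
    pi_a = 1.0 / (1.0 + p)
    return pi_a, 1.0 - pi_a

def statistical_complexity(p):
    # C_mu = Shannon entropy (in bits) of the stationary distribution
    # over causal states -- the coding cost described in the abstract.
    return -sum(q * math.log2(q) for q in stationary(p) if q > 0)

print(round(statistical_complexity(0.5), 4))  # ~0.9183 bits
```

For this process the complexity stays finite for every p; the abstract's point is that for richer models such as the MWC molecule, small perturbations to rates can push this quantity to infinity while the excess entropy stays finite.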


2018 ◽  
Vol 28 (7) ◽  
pp. 075312 ◽  
Author(s):  
Adam Rupe ◽  
James P. Crutchfield

2013 ◽  
Vol 34 (5) ◽  
pp. 587-594 ◽  
Author(s):  
Gustav Eje Henter ◽  
W. Bastiaan Kleijn

2012 ◽  
Vol 19 (01) ◽  
pp. 1250007
Author(s):  
Wolfgang Löhr ◽  
Arleta Szkoła ◽  
Nihat Ay

We treat observable operator models (OOM) and their non-commutative generalisation, which we call NC-OOMs. A natural characteristic of a stochastic process in the context of classical OOM theory is the process dimension. We investigate its properties within the more general formulation, which allows one to consider process dimension as a measure of complexity of non-commutative processes: We prove lower semi-continuity, and derive an ergodic decomposition formula. Further, we obtain results on the close relationship between the canonical OOM and the concept of causal states which underlies the definition of statistical complexity. In particular, the topological statistical complexity, i.e. the logarithm of the number of causal states, turns out to be an upper bound to the logarithm of process dimension.
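The bound in the final sentence can be checked numerically on a small example. The sketch below (an illustration, not from the paper) uses the Golden Mean Process, which has 2 causal states, so its topological statistical complexity is log 2; its process dimension, recoverable as the rank of a matrix of word probabilities P(uv), turns out to be 2 as well, saturating the bound:

```python
import itertools

P_EMIT = 0.5  # assumed emission probability p

def word_prob(w, p=P_EMIT):
    """Probability of word w, starting from the stationary distribution.

    Causal state A: unconstrained; causal state B: just emitted a 1,
    so the next symbol must be 0 (Golden Mean Process)."""
    pi = {'A': 1 / (1 + p), 'B': p / (1 + p)}
    total = 0.0
    for start, prob in pi.items():
        state = start
        for sym in w:
            if state == 'A':
                prob *= p if sym == '1' else 1 - p
                state = 'B' if sym == '1' else 'A'
            elif sym == '1':   # state B cannot emit a second 1
                prob = 0.0
                break
            else:
                state = 'A'
        total += prob
    return total

def matrix_rank(M, tol=1e-9):
    # Row-reduction rank of a matrix given as a list of row lists
    M = [row[:] for row in M]
    rank = 0
    for c in range(len(M[0])):
        pivot = max(range(rank, len(M)), key=lambda r: abs(M[r][c]), default=None)
        if pivot is None or abs(M[pivot][c]) < tol:
            continue
        M[rank], M[pivot] = M[pivot], M[rank]
        for r in range(len(M)):
            if r != rank:
                f = M[r][c] / M[rank][c]
                M[r] = [a - f * b for a, b in zip(M[r], M[rank])]
        rank += 1
    return rank

# All binary words up to length 3 index both prefixes u and suffixes v
words = [''.join(t) for n in range(4) for t in itertools.product('01', repeat=n)]
hankel = [[word_prob(u + v) for v in words] for u in words]
print(matrix_rank(hankel))  # 2: process dimension <= number of causal states
```

Because P(uv) factorizes through the operator products of an OOM, the rank of this matrix cannot exceed the minimal OOM dimension, which in turn cannot exceed the number of causal states.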


2011 ◽  
Vol 14 (05) ◽  
pp. 761-794 ◽  
Author(s):  
NICOLAS BRODU

This article introduces both a new algorithm for reconstructing epsilon-machines from data and the decisional states. These are defined as the internal states of a system that lead to the same decision, based on a user-provided utility or pay-off function. The utility function encodes some a priori knowledge external to the system; it quantifies how bad it is to make mistakes. The intrinsic underlying structure of the system is modeled by an epsilon-machine and its causal states. The decisional states form a partition of the lower-level causal states that is defined according to the higher-level user's knowledge. In a complex systems perspective, the decisional states are thus the "emerging" patterns corresponding to the utility function. The transitions between these decisional states correspond to events that lead to a change of decision. The new REMAPF algorithm estimates both the epsilon-machine and the decisional states from data. Application examples are given for hidden model reconstruction, cellular automata filtering, and edge detection in images.
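The coarsening step can be sketched in a few lines. The example below (an illustration of the idea, not the REMAPF algorithm itself, with hypothetical states and a hypothetical pay-off table) merges causal states whose predictive distributions induce the same optimal action:

```python
# Three hypothetical causal states, each with a predictive distribution
# over the next symbol, and a hypothetical utility table scoring each
# action against the symbol that actually occurs.

causal_states = {
    'A': {'0': 0.3, '1': 0.7},
    'B': {'0': 1.0, '1': 0.0},
    'C': {'0': 0.9, '1': 0.1},
}
utility = {  # pay-off for guessing the next symbol; mistakes cost -2
    'guess0': {'0': 1.0, '1': -2.0},
    'guess1': {'0': -2.0, '1': 1.0},
}

def best_action(pred):
    # Action maximizing expected utility under the predictive distribution
    return max(utility, key=lambda a: sum(pred[y] * utility[a][y] for y in pred))

# Causal states sharing an optimal action fall into one decisional state
decisional = {}
for state, pred in causal_states.items():
    decisional.setdefault(best_action(pred), []).append(state)

print(decisional)  # B and C merge into one decisional state; A stands alone
```

Distinct causal states B and C predict differently but lead to the same decision, so the decisional-state partition is coarser than the causal-state partition, as the abstract describes.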


2002 ◽  
Vol 05 (01) ◽  
pp. 91-95 ◽  
Author(s):  
COSMA ROHILLA SHALIZI ◽  
JAMES P. CRUTCHFIELD

Discovering relevant, but possibly hidden, variables is a key step in constructing useful and predictive theories about the natural world. This brief note explains the connections between three approaches to this problem: the recently introduced information-bottleneck method, the computational mechanics approach to inferring optimal models, and Salmon's statistical relevance basis.


1999 ◽  
Vol 59 (1) ◽  
pp. 275-283 ◽  
Author(s):  
James P. Crutchfield ◽  
Cosma Rohilla Shalizi
