A Partial Information Decomposition Based on Causal Tensors

Author(s):  
David Sigtermans

We propose a partial information decomposition based on the newly introduced framework of causal tensors, i.e., multilinear stochastic maps that transform source data into destination data. The innovation of causal tensors is that the framework allows an indirect association to be expressed exactly in terms of its constituent direct associations. This is not possible when associations are expressed only in measures like mutual information or transfer entropy. Expressing associations in these terms a posteriori, rather than a priori, results in an intuitive definition of a nonnegative and left monotonic redundancy, which also meets the identity property. Our proposed redundancy satisfies the three axioms introduced by Williams and Beer. The symmetry and self-redundancy axioms follow directly from our definition, and the data processing inequality ensures that the monotonicity axiom is satisfied. Because causal tensors can describe both mutual information and transfer entropy, the partial information decomposition applies to both measures. Results show that our decomposition closely resembles that of another approach which also expresses associations in terms of mutual information a posteriori. A negative synergistic term could indicate an unobserved common cause.
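The Williams-Beer axioms invoked above (symmetry, self-redundancy, monotonicity) can be made concrete with a small worked example. The sketch below computes Williams and Beer's original I_min redundancy for a binary AND gate; note that this is a generic illustration of the partial information decomposition framework, not Sigtermans's causal-tensor redundancy, whose formula the abstract does not give.

```python
# Williams-Beer I_min redundancy for Y = X1 AND X2 with uniform inputs.
# A generic PID illustration, NOT the causal-tensor redundancy of the paper.
import numpy as np
from collections import defaultdict

# Joint distribution p(x1, x2, y).
p = defaultdict(float)
for x1 in (0, 1):
    for x2 in (0, 1):
        p[(x1, x2, x1 & x2)] += 0.25

def marginal(dist, axes):
    """Marginalize the joint onto the given coordinate positions."""
    out = defaultdict(float)
    for k, v in dist.items():
        out[tuple(k[a] for a in axes)] += v
    return out

def specific_info(y, src_axis):
    """Specific information I(Y=y; X_src) in bits."""
    p_y = marginal(p, (2,))
    p_x = marginal(p, (src_axis,))
    p_xy = marginal(p, (src_axis, 2))
    total = 0.0
    for (x,), px in p_x.items():
        pxy = p_xy[(x, y)]
        if pxy > 0:
            # p(x|y) * log2( p(y|x) / p(y) )
            total += (pxy / p_y[(y,)]) * np.log2((pxy / px) / p_y[(y,)])
    return total

# I_min: expected minimum specific information over the two sources.
p_y = marginal(p, (2,))
redundancy = sum(py * min(specific_info(y, 0), specific_info(y, 1))
                 for (y,), py in p_y.items())
print(f"I_min redundancy for AND: {redundancy:.3f} bits")  # ~0.311
```

For the AND gate this yields roughly 0.311 bits of redundancy, leaving about 0.5 bits of synergy once the joint mutual information (about 0.811 bits) is decomposed; a nonnegative redundancy of this kind is exactly what the axioms require.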


Author(s):  
Igor Agostini

In this chapter I argue the following theses: 1) Descartes’s Meditations never formulate the problem of God’s existence as required by the precepts of order; in particular, the only problem of existence posed by Descartes after the classification of thoughts in the Third Meditation does not concern God directly, but generally aliqua res. 2) Though Descartes qualifies the two proofs of the Third Meditation as a posteriori, they cannot be considered homologous in their structure to the traditional a posteriori proofs: both proofs, and the second in particular, contain components that are truly a priori. 3) The proof of the Fifth Meditation, as it starts from the true definition of God and God’s essence, does not constitute a quoad nos version of the a priori demonstration belonging to mathematics, but is, in a strict sense, a potissima demonstration that is at least as evident as those of mathematics.


1999 ◽  
Vol 21 (4) ◽  
pp. 369-397 ◽  
Author(s):  
Samuel Hollander ◽  
Sandra Peart

Our concern is John Stuart Mill's methodological pronouncements, his actual practice, and the relationship between them. We argue that verification played a key role in Mill's method, both in principle and in practice. Our starting point is the celebrated declaration regarding verification in the essay On the Definition of Political Economy; and on the Method of Investigation Proper to It (1836/1967; hereafter Essay): “By the method à priori we mean … reasoning from an assumed hypothesis; which … is the essence of all science which admits of general reasoning at all. To verify the hypothesis itself à posteriori, that is, to examine whether the facts of any actual case are in accordance with it, is no part of the business of science at all, but of the application of science” (Mill 1836/1967, p. 325). The apparent position that the basic economic theory is impervious to predictive failure emerges also in a sharp criticism of the à posteriori method:


2013 ◽  
Author(s):  
Justin B. Kinney ◽  
Gurinder S. Atwal

Motivated by data-rich experiments in transcriptional regulation and sensory neuroscience, we consider the following general problem in statistical inference. When exposed to a high-dimensional signal S, a system of interest computes a representation R of that signal which is then observed through a noisy measurement M. From a large number of signals and measurements, we wish to infer the "filter" that maps S to R. However, the standard method for solving such problems, likelihood-based inference, requires perfect a priori knowledge of the "noise function" mapping R to M. In practice such noise functions are usually known only approximately, if at all, and using an incorrect noise function will typically bias the inferred filter. Here we show that, in the large data limit, this need for a pre-characterized noise function can be circumvented by searching for filters that instead maximize the mutual information I[M;R] between observed measurements and predicted representations. Moreover, if the correct filter lies within the space of filters being explored, maximizing mutual information becomes equivalent to simultaneously maximizing every dependence measure that satisfies the Data Processing Inequality. It is important to note that maximizing mutual information will typically leave a small number of directions in parameter space unconstrained. We term these directions "diffeomorphic modes" and present an equation that allows these modes to be derived systematically. The presence of diffeomorphic modes reflects a fundamental and nontrivial substructure within parameter space, one that is obscured by standard likelihood-based inference.


OCL ◽  
2018 ◽  
Vol 25 (6) ◽  
pp. D604 ◽  
Author(s):  
Judith Burstin ◽  
Catherine Rameau ◽  
Virginie Bourion ◽  
Nadim Tayeh

Pea is the most widely cultivated grain legume crop in Europe. In the French research project PeaMUST, a large public and private sector partnership has been set up to pursue complementary strategies towards the development of high and stable yielding cultivars. These strategies will contribute to the definition of a pea ideotype based on both a priori and a posteriori approaches. On the one hand, genomic selection will identify promising genotypes, which may display new phenotypic ideotypes. On the other hand, marker-assisted selection will make it possible to cumulate resistances to a given stress or to several different stresses, yielding more durably stable phenotypes. Moreover, mutations identified in candidate genes controlling aerial and root architecture will be tested for their effects on stress tolerance.


2014 ◽  
Vol 26 (4) ◽  
pp. 637-653 ◽  
Author(s):  
Justin B. Kinney ◽  
Gurinder S. Atwal

Motivated by data-rich experiments in transcriptional regulation and sensory neuroscience, we consider the following general problem in statistical inference: when exposed to a high-dimensional signal S, a system of interest computes a representation R of that signal, which is then observed through a noisy measurement M. From a large number of signals and measurements, we wish to infer the “filter” that maps S to R. However, the standard method for solving such problems, likelihood-based inference, requires perfect a priori knowledge of the “noise function” mapping R to M. In practice such noise functions are usually known only approximately, if at all, and using an incorrect noise function will typically bias the inferred filter. Here we show that in the large data limit, this need for a precharacterized noise function can be circumvented by searching for filters that instead maximize the mutual information I[M; R] between observed measurements and predicted representations. Moreover, if the correct filter lies within the space of filters being explored, maximizing mutual information becomes equivalent to simultaneously maximizing every dependence measure that satisfies the data processing inequality. It is important to note that maximizing mutual information will typically leave a small number of directions in parameter space unconstrained. We term these directions diffeomorphic modes and present an equation that allows these modes to be derived systematically. The presence of diffeomorphic modes reflects a fundamental and nontrivial substructure within parameter space, one that is obscured by standard likelihood-based inference.
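As a concrete illustration of this inference strategy, the sketch below, an assumption-laden toy (2-D signal, tanh readout, Gaussian noise, plug-in histogram MI estimator) rather than the authors' implementation, recovers a linear filter by maximizing I[M; R] and exhibits a diffeomorphic mode: the filter's overall scale drops out of the objective, so only its direction is identifiable.

```python
# Toy sketch of noise-blind filter inference by maximizing I[M; R].
# All settings here (readout, noise level, estimator) are illustrative
# assumptions; this is not the authors' code.
import numpy as np

rng = np.random.default_rng(0)
n = 50_000
S = rng.normal(size=(n, 2))                         # signal (2-D for brevity)
w_true = np.array([0.8, 0.6])                       # true unit-norm filter
M = np.tanh(S @ w_true) + 0.3 * rng.normal(size=n)  # unknown noisy readout

def mutual_info(x, y, bins=30):
    """Plug-in mutual information estimate (bits) from a 2-D histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

# MI is invariant under invertible maps of R, so the filter's scale (and
# sign) is a "diffeomorphic mode": only its direction is constrained.
# Hence we search over unit vectors parameterized by an angle in [0, pi).
angles = np.linspace(0, np.pi, 181)
scores = [mutual_info(S @ np.array([np.cos(a), np.sin(a)]), M)
          for a in angles]
best = angles[int(np.argmax(scores))]
print(f"true angle = {np.arctan2(0.6, 0.8):.3f} rad")
print(f"best angle = {best:.3f} rad")  # should land near the true angle
```

With enough data the maximizing direction converges on the true filter direction even though the tanh-plus-noise readout was never modeled, which is the central claim of the abstract.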


Author(s):  
Priyedarshi Jetli

I argue for the possibility of knowledge by invention which is neither a priori nor a posteriori. My conception of knowledge by invention evolves from Poincaré’s conventionalism, but unlike Poincaré’s conventions, propositions known by invention have a truth value. An individuating criterion for this type of knowledge is conjectured. The proposition known through invention is: grounded historically in the discipline to which it belongs; a result of the careful, sincere and objective quest and effort of the knower; chosen freely by the inventor or knower; and private in its invention but public once invented. I extend knowledge by invention to include the knowledge of the invented proposition by those who do not invent it but accept it as a convention for good reasons. Finally, knowledge by invention combined with a revisionist, Platonist definition of knowledge as actively justified true belief provides a pedagogical model reviving the proactive spirit of the Socratic method, with an emphasis on invention and activity and a de-emphasis on information gathering and passivity.


Author(s):  
Heinrich Schepers ◽  
Giorgio Tonelli ◽  
Rudolf Eisler
Keyword(s):  
A Priori ◽  
