de Finetti
Recently Published Documents

TOTAL DOCUMENTS: 199 (five years: 20)
H-INDEX: 21 (five years: 1)

2022, pp. 1–37
Author(s): Mikayla Kelley

Abstract There is a well-known equivalence between avoiding accuracy dominance and having probabilistically coherent credences (see, e.g., de Finetti 1974; Joyce 2009; Predd et al. 2009; Pettigrew 2016). However, this equivalence has been established only when the set of propositions on which credence functions are defined is finite. In this paper, I establish connections between accuracy dominance and coherence when credence functions are defined on an infinite set of propositions. In particular, I establish the results needed to extend the classic accuracy argument for probabilism to certain classes of infinite sets of propositions, including countably infinite partitions.
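
A minimal numerical sketch of the finite-case equivalence, assuming the Brier (squared-error) score and a three-cell partition chosen purely for illustration: an incoherent credence function is accuracy-dominated at every world by its Euclidean projection onto the probability simplex, which is the geometric core of de Finetti's original observation.

```python
import numpy as np

def brier(credence, world):
    """Brier (squared-error) inaccuracy of a credence vector at a world,
    where `world` is the 0/1 truth-value vector of that world."""
    return np.sum((credence - world) ** 2)

def project_to_simplex(c):
    """Euclidean projection onto the probability simplex (sort-based)."""
    u = np.sort(c)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u + (1 - css) / np.arange(1, len(c) + 1) > 0)[0][-1]
    return np.maximum(c + (1 - css[rho]) / (rho + 1), 0)

# Illustrative incoherent credences on a three-cell partition: they sum
# to 1.2, so no probability function agrees with them.
c = np.array([0.5, 0.4, 0.3])
p = project_to_simplex(c)  # a coherent credence function

for i in range(3):
    world = np.eye(3)[i]  # truth-value vector: cell i is the true one
    assert brier(p, world) < brier(c, world)  # p dominates c at every world
    print(f"world {i}: Brier(c) = {brier(c, world):.3f}, "
          f"Brier(p) = {brier(p, world):.3f}")
```

The dominance is strict at every world because projecting a point from outside a convex set brings it strictly closer to every member of that set, and each world's truth-value vector lies in the simplex.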


Erkenntnis, 2021
Author(s): Daniel Rothschild

Abstract On the Lockean thesis, one ought to believe a proposition if and only if one assigns it a credence at or above a threshold (Foley in Am Philos Q 29(2):111–124, 1992). The Lockean thesis thus provides a way of characterizing sets of all-or-nothing beliefs. Here we give two independent characterizations of the sets of beliefs satisfying the Lockean thesis: one in terms of the betting dispositions associated with full beliefs, and one in terms of an accuracy scoring system for full beliefs. These characterizations are parallel to, but not merely derivative from, the more familiar Dutch Book (de Finetti in Theory of probability, vol 1, Wiley, London, 1974) and accuracy (Joyce in Philos Sci 65(4):575–603, 1998) arguments for probabilism.
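
A toy sketch of the Lockean thesis itself, assuming a hypothetical uniform credence over three worlds and a threshold of 2/3 (the paper's two characterization results are not implemented here); it shows the familiar point that Lockean belief sets need not be closed under conjunction:

```python
from itertools import chain, combinations

worlds = {0, 1, 2}  # three equiprobable worlds; propositions are sets of worlds

def cr(prop):
    """Uniform credence: the proportion of worlds at which `prop` is true."""
    return len(prop) / len(worlds)

def lockean_beliefs(threshold):
    """The all-or-nothing belief set induced by the Lockean thesis:
    believe exactly those propositions with credence >= threshold."""
    props = chain.from_iterable(combinations(sorted(worlds), r)
                                for r in range(len(worlds) + 1))
    return {frozenset(p) for p in props if cr(frozenset(p)) >= threshold}

beliefs = lockean_beliefs(2 / 3)
print(sorted(map(sorted, beliefs)))
# [[0, 1], [0, 1, 2], [0, 2], [1, 2]] -- every two-world proposition is
# believed, but the conjunction of any two of them (a singleton, with
# credence 1/3) is not: a lottery-style failure of conjunctive closure.
```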


Author(s): David Gross, Sepehr Nezami, Michael Walter

Abstract Schur–Weyl duality is a ubiquitous tool in quantum information. At its heart is the statement that the space of operators commuting with the t-fold tensor powers $U^{\otimes t}$ of all unitaries $U \in U(d)$ is spanned by the permutations of the t tensor factors. In this work, we describe a similar duality theory for tensor powers of Clifford unitaries. The Clifford group is a central object in many subfields of quantum information, most prominently in the theory of fault tolerance. The duality theory has a simple and clean description in terms of finite geometries. We demonstrate its effectiveness in several applications. We resolve an open problem in quantum property testing by showing that "stabilizerness" is efficiently testable: there is a protocol that, given access to six copies of an unknown state, can determine whether it is a stabilizer state or far away from the set of stabilizer states. We give a related membership test for the Clifford group. We find that tensor powers of stabilizer states have an increased symmetry group. Conversely, we provide corresponding de Finetti theorems, showing that the reductions of arbitrary states with this symmetry are well approximated by mixtures of stabilizer tensor powers (in some cases, exponentially well). We show that the distance of a pure state to the set of stabilizer states can be lower-bounded in terms of the sum-negativity of its Wigner function. This gives a new quantitative meaning to the sum-negativity (and the related mana), a measure relevant to fault-tolerant quantum computation, and constitutes a robust generalization of the discrete Hudson theorem. Finally, we show that complex projective designs of arbitrary order can be obtained from a finite number (independent of the number of qudits) of Clifford orbits. To prove this result, we give explicit formulas for arbitrary moments of random stabilizer states.
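
A quick numerical check of the classical t = 2 statement for the unitary group (the paper's Clifford-group variant replaces the permutations with a larger commutant indexed by finite-geometry data); the dimension and seed below are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 3  # local dimension; t = 2 tensor copies

def haar_unitary(d):
    """Haar-random unitary via QR decomposition of a complex Gaussian matrix."""
    z = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))  # normalize the phases

# The swap operator on (C^d) ⊗ (C^d): exchanges the two tensor factors.
swap = np.zeros((d * d, d * d))
for i in range(d):
    for j in range(d):
        swap[i * d + j, j * d + i] = 1

U = haar_unitary(d)
UU = np.kron(U, U)

# Schur-Weyl for t = 2: identity and swap commute with every U ⊗ U ...
assert np.allclose(UU @ swap, swap @ UU)
# ... whereas a generic operator does not.
X = rng.normal(size=(d * d, d * d))
assert not np.allclose(UU @ X, X @ UU)
```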


Author(s): Mario Berta, Francesco Borderi, Omar Fawzi, Volkher B. Scholz

Abstract We give asymptotically converging semidefinite programming hierarchies of outer bounds on bilinear programs of the form $\mathrm{Tr}\big[H(D \otimes E)\big]$, maximized with respect to semidefinite constraints on D and E. Applied to the problem of approximate error correction in quantum information theory, this gives hierarchies of efficiently computable outer bounds on the success probability of approximate quantum error correction codes in any dimension. The first level of our hierarchies corresponds to a previously studied relaxation (Leung and Matthews in IEEE Trans Inf Theory 61(8):4486, 2015), and positive-partial-transpose constraints can be added to give a sufficient criterion for exact convergence at a given level of the hierarchy. To quantify the worst-case convergence speed of our sum-of-squares hierarchies, we derive novel quantum de Finetti theorems that allow imposing linear constraints on the approximating state. In particular, we give finite de Finetti theorems for quantum channels, quantifying closeness to the convex hull of product channels as well as closeness to channels assisted by local operations and classical forward communication. As a special case, this constitutes a finite version of the Fuchs–Schack–Scudo asymptotic de Finetti theorem for quantum channels. Finally, our proof methods answer a question of Brandão and Harrow (Proceedings of the forty-fourth annual ACM symposium on theory of computing, STOC '12, p 307, 2012) by improving the approximation factor of de Finetti theorems with no symmetry from $O(d^{k/2})$ to $\mathrm{poly}(d,k)$, where d denotes the local dimension and k the number of copies.
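
A schematic of the hierarchy's overall shape, in our own illustrative notation rather than the paper's (the actual constraint sets for error correction are more specific): the product D ⊗ E is relaxed to a single positive semidefinite variable that is permutation-symmetric over k copies of the E-register, and the de Finetti theorems bound how far each level can overshoot.

```latex
% Schematic only; constraint sets and notation are illustrative.
\[
  \omega \;=\; \max_{D,\,E \,\succeq\, 0}\;
    \mathrm{Tr}\!\bigl[\,H\,(D \otimes E)\,\bigr]
  \quad\text{s.t. linear constraints on } D \text{ and } E,
\]
\[
  \omega_k \;=\; \max_{W \,\succeq\, 0}\;
    \mathrm{Tr}\!\bigl[\,(H \otimes \mathbb{1}^{\otimes (k-1)})\,W\,\bigr]
  \quad\text{s.t. } W \text{ invariant under permuting the } k\ E\text{-registers},
\]
\[
  \omega \;\le\; \omega_{k+1} \;\le\; \omega_k ,
  \qquad
  \omega_k \longrightarrow \omega \ \text{ as } k \to \infty ,
\]
% where a quantum de Finetti theorem with linear constraints quantifies
% the gap \omega_k - \omega, i.e., the worst-case convergence speed.
```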


Author(s):  
Carlo Zappia

This paper explores archival material concerning the reception of Leonard J. Savage's foundational work on rational choice theory in its subjective-Bayesian form. The focus is on the criticism raised in the early 1960s by Daniel Ellsberg, William Fellner, and Cedric Smith, who supported the newly developed subjective approach but could not understand Savage's insistence on the strict version he shared with Bruno de Finetti. The episode is well known, thanks to the so-called Ellsberg Paradox and the extensive reference made to it in current decision theory, but Savage's reaction to his critics has never been examined. Although Savage never really engaged with the issue in his published writings, his private exchanges with Ellsberg and Fellner, and with de Finetti about how to deal with Smith, show that Savage gave substantive attention to the generalization advocated by his correspondents. In particular, Savage's defense of the normative value of rational choice theory against counterexamples such as Ellsberg's did not prevent him from admitting that he would give careful consideration to a more realistic axiomatic system, should his critics be able to provide one.


2021

The volume collects nine essays on Italian politics, economics, and law under Fascism. Some are dedicated to the ideal objectives of corporatism, which aimed at the renewal of politics, institutions, and culture, and to the objectively dismal results of the policies actually implemented. The economic essays analyze the debated abolition of the inheritance tax in 1923 and the various policies proposed by Italian economists to counter the disastrous effects of the Great Depression; specific attention is also given to the problem of developing Italy's southern regions. A further essay is dedicated to the influence of corporatism and idealism on the mathematical economist Bruno de Finetti. In the field of law, the authors investigate the long-lasting features impressed by Fascism on Italian administrative law and, more generally, the persistence of a typically Fascist magniloquent style in Italian jurisdictional language. Lastly, as suggested by the title of the volume, a chapter analyzes the social and political thinking of Carlo Rosselli, a leading anti-fascist intellectual who paid dearly for his dissent.


2021, pp. 197–212
Author(s): Mario Pomini

Bruno de Finetti (1906–1985) is well known as the founder of the subjective theory of probability. Less known, with a few exceptions, is his contribution to economic theory during the early stage of his scientific career. In the second half of the 1930s, de Finetti was passionately involved in the field of welfare economics. To provide a theoretical framework for evaluating social welfare and to help design public policies, he developed a new mathematical tool: the theory of simultaneous maxima. Using this analytical approach, he also advanced the idea of a social welfare function, albeit one quite different from that introduced in 1938 by Abram Bergson, reflecting the debate on economic planning among Italian economists.

