A Mechanized Proof of the Max-Flow Min-Cut Theorem for Countable Networks with Applications to Probability Theory

Author(s):  
Andreas Lochbihler
Author(s):  
Haiqi WANG, Sheqin DONG, Tao LIN, Song CHEN, Satoshi GOTO

Author(s):  
Renáta Bartková, Beloslav Riečan, Anna Tirpáková

This reference considers probability theory in two main domains: fuzzy set theory and quantum models. Readers will learn about Kolmogorov probability theory and its implications in these two areas. Other topics covered include limit theorems for intuitionistic fuzzy sets (IF-sets), the individual ergodic theorem, and relevant statistical applications (examples from correlation theory and factor analysis in Atanassov intuitionistic fuzzy set systems, the individual ergodic theorem, and the Poincaré recurrence theorem). This book is a useful resource for mathematics students and researchers seeking information about fuzzy sets in quantum spaces.


Author(s):  
Timothy McGrew

The mid-20th century consensus regarding Hume’s critique of reported miracles has broken down dramatically in recent years thanks to the application of probabilistic analysis to the issue and the rediscovery of its history. Progress from this point forward is likely to be made along one or more of three fronts. There is wide room for interdisciplinary collaboration, work that will bring together scholars with expertise in religion, psychology, philosophy, and empirical science. There is a great deal of work still to be done in formal analysis, making use of the tools of modern probability theory to model questions about testimony and inference. And the recovery and study of earlier works on the subject—works that should never have been forgotten—can significantly enrich our understanding of the underlying issues.


Author(s):  
Margaret Morrison

After reviewing some of the recent literature on non-causal and mathematical explanation, this chapter develops an argument as to why renormalization group (RG) methods should be seen as providing non-causal, yet physical, information about certain kinds of systems/phenomena. The argument centres on the structural character of RG explanations and the relationship between RG and probability theory. These features are crucial for the claim that the non-causal status of RG explanations involves something different from simply ignoring or “averaging over” microphysical details—the kind of explanation common to statistical mechanics. The chapter concludes with a discussion of the role of RG in treating dynamical systems and how that role exemplifies the structural aspects of RG explanations, which in turn exemplify their non-causal features.


Author(s):  
Paul Humphreys

Paul Humphreys pioneered philosophical investigations into the methodological revolution begun by computer simulations. He has also made important contributions to the contemporary literature on emergence by developing the fusion account of diachronic emergence and its generalization, transformational emergence. He is the discoverer of what has come to be called Humphreys’ Paradox in probability theory and has also made influential contributions to the literature on probabilistic causality and scientific explanation. This collection contains fourteen of his previously published papers on topics ranging from numerical experiments to the status of scientific metaphysics. There is also a previously unpublished paper on social dynamics. The volume is divided into four parts on, respectively, computational science, emergence, probability, and general philosophy of science. The first part contains the seminal 1990 paper on computer simulations, with three other papers arguing that these new methods cannot be accounted for by traditional methodological approaches. The second part contains the original presentation of fusion emergence and three companion papers arguing for diachronic approaches to the topic, rather than the then dominant synchronic accounts. The third part starts with the paper that introduced the probabilistic paradox, followed by a later evaluation of attempts to solve it. A third paper argues, contra Quine, that probability theory is a purely mathematical theory. The final part includes papers on causation, explanation, metaphysics, and an agent-based model that shows how endogenous uncertainty undermines utility maximization. Each of the four parts is followed by a comprehensive postscript with retrospective assessments.


Author(s):  
Jochen Rau

Statistical mechanics concerns the transition from the microscopic to the macroscopic realm. On a macroscopic scale new phenomena arise that have no counterpart in the microscopic world. For example, macroscopic systems have a temperature; they might undergo phase transitions; and their dynamics may involve dissipation. How can such phenomena be explained? This chapter discusses the characteristic differences between the microscopic and macroscopic realms and lays out the basic challenge of statistical mechanics. It suggests how, in principle, this challenge can be tackled with the help of conservation laws and statistics. The chapter reviews some basic notions of classical probability theory. In particular, it discusses the law of large numbers and illustrates how, despite the indeterminacy of individual events, statistics can make highly accurate predictions about totals and averages.
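
As an illustration of the point about totals and averages, here is a minimal simulation sketch in Python (not drawn from the chapter; the fair-coin model and the trial counts are illustrative assumptions): each individual outcome is unpredictable, yet the sample average settles close to the expected value 0.5 as the number of trials grows, as the law of large numbers predicts.

    import random

    def sample_average(n_trials, p=0.5):
        """Average of n_trials Bernoulli(p) outcomes, e.g. fair-coin flips."""
        return sum(random.random() < p for _ in range(n_trials)) / n_trials

    random.seed(0)
    # Each flip is unpredictable, but the average over many flips is not:
    # it concentrates near p = 0.5 as n grows.
    for n in (10, 100, 10_000, 1_000_000):
        print(f"n = {n:>9}: average = {sample_average(n):.4f}")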


Author(s):  
John Oberdiek

Any normative framework of risk imposition must include at its foundation an account of the nature of risk imposition. If risk is understood as the probability of a bad event or harm, the menu of conceptions of risk would seem to be exhausted by the various accounts of probability that have been developed. The two main families of probability theory, objective and subjective, have opposite strengths and weaknesses as candidate conceptions of probability suitable for a normative framework of risk imposition. This chapter argues that objective accounts are suitably normative but insufficiently practical, while unreconstructed subjective accounts are suitably practical but insufficiently normative, and this casts doubt on the project of identifying a conception of risk that is suitable for a normative framework of risk imposition.


1981, Vol. 48 (2), pp. 317-322
Author(s):  
Hugues Leblanc
