Nomic Probability and the Foundations of Induction
Published By Oxford University Press

9780195060133, 9780197560129

Author(s):  
John L. Pollock

It was argued in Chapter 1 that various kinds of propensities make intuitive sense, but existing propensity theories do little to clarify their nature. We are now in a position to give precise definitions for several different kinds of propensities in terms of nomic probabilities. The characteristics of these propensities can then be deduced from the theory of nomic probability. The technique for defining propensities in terms of nomic probabilities is modeled on the definitions of objective and physical/epistemic definite probabilities proposed in Chapter 4. We often want to know how probable it is that P would be true if Q were true. This is a kind of counterfactual probability, and I will symbolize it as ┌prob(P/Q)┐. Counterfactual conditionals constitute the limiting case of counterfactual probability, in the sense that if (Q > P) obtains then prob(P/Q) = 1. Counterfactual prob(P/Q) is to be distinguished from the nomic probability prob(P/Q) and the definite probability PROB(P/Q), either of which can be regarded as an “indicative” probability that P is true if Q is true. Recall that (Q > P) obtains iff P obtains at every nearest world at which Q obtains. In other words, where M(Q) is the set of nearest Q-worlds, (Q > P) obtains iff M(Q) ⊆ |P|. Analogously, prob(P/Q) can be regarded as a measure of the proportion of nearest Q-worlds that are also P-worlds. This has the immediate result that if (Q > P) obtains then prob(P/Q) = 1 (but not conversely). This sort of heuristic description of counterfactual probability enables us to investigate its formal properties, but if the notion is to be of any real use we must do more than wave our hands and talk about an unspecified measure on M(Q). We are now in a position to define counterfactual probability accurately. Let CQ be the conjunction of all of the counterfactual consequences of Q, that is, the conjunction of all states of affairs R such that (Q > R) obtains.
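The heuristic reading above can be made concrete in a small finite model. This is a sketch only: the explicit `nearest` table and the names `M`, `cprob`, and `counterfactual` are illustrative assumptions, not the book's formalism.

```python
# Toy finite model: cprob(P, Q) is the proportion of nearest Q-worlds that
# are also P-worlds, and (Q > P) obtains iff M(Q) is a subset of |P|.
# Worlds are strings; states of affairs are sets of worlds.

def M(Q, nearest):
    """The set of nearest Q-worlds, read off an explicit table."""
    return nearest[frozenset(Q)]

def cprob(P, Q, nearest):
    """Proportion of nearest Q-worlds that are P-worlds."""
    m = M(Q, nearest)
    return len(m & P) / len(m)

def counterfactual(P, Q, nearest):
    """(Q > P) obtains iff every nearest Q-world is a P-world."""
    return M(Q, nearest) <= P

P = {"w1", "w2", "w3"}
Q = {"w1", "w2", "w4"}
nearest = {frozenset(Q): {"w1", "w2"}}

print(counterfactual(P, Q, nearest))  # (Q > P) obtains...
print(cprob(P, Q, nearest))           # ...so cprob(P/Q) = 1.0
```

The converse direction fails, as the text notes: shrinking P to exclude one nearest Q-world gives cprob < 1 without the conditional obtaining.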


Author(s):  
John L. Pollock

I have urged that nomic probability be analyzed in terms of its conceptual role. The conceptual role analysis of nomic probability has four parts: (1) an account of statistical induction; (2) an account of the computational principles that allow some nomic probabilities to be derived from others; (3) an account of acceptance rules; and (4) an account of direct inference. The purpose of the present chapter is to develop and defend the acceptance rules that will play a central role in the theory of nomic probability. The theories of direct inference and statistical induction will then be derived from the acceptance rules and the computational principles defended in the last chapter. Although some of the computational principles are novel, they still amount to little more than an embellishment of the classical probability calculus. The main philosophical weight of the theory of nomic probability will be borne by the acceptance rules. A simple acceptance rule will be described and defended in section 2. The epistemological framework presupposed by the rule will be discussed and refined in section 3. Sections 4 and 5 will demonstrate that more powerful rules can be derived from the simple acceptance rule described in section 2. The philosophical literature contains numerous proposals for probabilistic acceptance rules. For instance, the following “Simple Rule” has had a number of proponents: . . . Belief in P is justified iff P is probable. . . . Note, however, that this rule is formulated in terms of definite probabilities. This is true of most candidate acceptance rules. However, nomic probability is an indefinite probability. It would make no sense to propose a rule like the Simple Rule for nomic probability. Nevertheless, there is an obvious candidate for an acceptance rule formulated in terms of nomic probability. This is the Statistical Syllogism, whose traditional formulation is something like the following: . . . Most A’s are B’s. 
This is an A. / Therefore, this is a B. . . . It seems clear that we often reason in roughly this way. For instance, on what basis do I believe what I read in the newspaper?
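The statistical syllogism can be sketched as a defeasible acceptance check. This is a minimal illustration under assumed names (`PrimaFacieReason`, `statistical_syllogism`) and an assumed 0.95 newspaper reliability; it is not the book's own formulation.

```python
# Sketch of the statistical syllogism as an acceptance rule: from
# prob(B/A) = r > 0.5 and "c is an A", we get a prima facie (defeasible)
# reason of strength r to believe "c is a B".

from dataclasses import dataclass

@dataclass
class PrimaFacieReason:
    conclusion: str
    strength: float       # the value r from prob(B/A) = r
    defeated: bool = False  # prima facie reasons can later be defeated

def statistical_syllogism(prob_B_given_A, c):
    if prob_B_given_A <= 0.5:
        return None  # the rule applies only when most A's are B's
    return PrimaFacieReason(conclusion=f"{c} is a B",
                            strength=prob_B_given_A)

# Most newspaper claims are true (say 95%); this claim is in the
# newspaper; so, prima facie, this claim is true.
reason = statistical_syllogism(0.95, "this claim")
print(reason.conclusion, reason.strength)
```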


Author(s):  
John L. Pollock

Probability theorists divide into two camps: the proponents of subjective probability and the proponents of objective probability. Opinion has it that subjective probability has carried the day, but I think that such a judgment is premature. I have argued elsewhere that there are deep incoherencies in the notion of subjective probability. Accordingly, I find myself in the camp of objective probability. The consensus is, however, that the armies of objective probability are in even worse disarray. The purpose of this book is to construct a theory of objective probability that rectifies this situation. Such a theory must explain the meaning of objective probability, show how we can discover the values of objective probabilities, clarify their use in decision theory, and demonstrate how they can be used for epistemological purposes. The theory of nomic probability aims to do all that. This book has two main objectives. First, it will propose a general theory of objective probability. Second, it will, in a sense to be explained, propose a solution to the problem of induction. These two goals are intimately connected. I will argue that a solution to the problem of induction is forthcoming, ultimately, from an analysis of probabilistic reasoning. Under some circumstances, probabilistic reasoning justifies us in drawing non-probabilistic conclusions, and this kind of reasoning underlies induction. Conversely, an essential part of understanding probability consists of providing an account of how we can ascertain the values of probabilities, and the most fundamental way of doing that is by using a species of induction. In statistical induction we observe the relative frequency (the proportion) of A's in a limited sample of B's, and then infer that the probability of a B being an A is approximately the same as that relative frequency.
To provide philosophical foundations for probability we must, among other things, explain precisely how statistical induction works and what justifies it. Probability is important both in and out of philosophy. Much of the reasoning of everyday life is probabilistic. We look at the clouds and judge whether it is going to rain by considering how often clouds like that have spawned rain in the past.
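The inference pattern of statistical induction can be sketched numerically. The error margin below uses the standard normal approximation, which is an assumption added here for illustration, not part of the book's account; the cloud counts are invented.

```python
# Minimal sketch of statistical induction: estimate prob(A/B) by the
# observed relative frequency of A's among sampled B's, with a crude
# ~95% margin from the normal approximation.

import math

def estimate(count_A_and_B, count_B):
    """Relative frequency of A's among observed B's, plus error margin."""
    freq = count_A_and_B / count_B
    margin = 1.96 * math.sqrt(freq * (1 - freq) / count_B)
    return freq, margin

# Suppose 83 of 100 observed clouds of this kind spawned rain:
freq, margin = estimate(83, 100)
print(f"prob(rain/cloud) ~ {freq:.2f} +/- {margin:.2f}")
```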


Author(s):  
John L. Pollock

It is well known, since Goodman [1955], that principles of induction require a projectibility constraint. On the present account, such a constraint is inherited from the projectibility constraint on (A1)–(A3). It remains to be shown, however, that this derived constraint is the intuitively correct constraint. Let us define: (1.1) A concept B (or the corresponding property) is inductively projectible with respect to a concept A (or the corresponding property) iff ┌X is a set of A’s, and all the members of X are also B’s┐ is a prima facie reason for ┌A ⇒ B┐, and this prima facie reason would not be defeated by learning that there are non-B’s. A nomic generalization is projectible iff its consequent is inductively projectible with respect to its antecedent. What is needed is an argument to show that inductive projectibility is the same thing as projectibility. To make this plausible, I will argue that inductive projectibility has the same closure properties as those defended for projectibility in Chapter 3. Goodman introduced inductive projectibility with examples of nonprojectible concepts like grue and bleen, and the impression has remained that only a few peculiar concepts fail to be inductively projectible. That, however, is a mistake. It is not difficult to show that most concepts fail to be inductively projectible, inductive projectibility being the exception rather than the rule. This results from the fact that, just like projectibility, the set of inductively projectible concepts is not closed under most logical operations. In particular, I will argue below that although inductive projectibility is closed under conjunction, it is not closed under either disjunction or negation. That is, negations or disjunctions of inductively projectible concepts are not automatically inductively projectible. Just as for projectibility, we can argue fairly conclusively that inductive projectibility is closed under conjunction. 
More precisely, the following two principles hold: (1.2) If A and B are inductively projectible with respect to C, then (A&B) is inductively projectible with respect to C. (1.3) If A is inductively projectible with respect to both B and C, then A is inductively projectible with respect to (B&C).
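Goodman's grue construction, mentioned above, can be sketched to show why the projectibility constraint has teeth. The encoding of emeralds and the cutoff `T` are illustrative assumptions.

```python
# Sketch of Goodman's grue: before the cutoff time T, every observed
# emerald satisfies both "green" and the gerrymandered "grue" (green if
# examined before T, blue otherwise), yet the two generalizations make
# opposite predictions about unexamined cases.

T = 100  # the cutoff time in Goodman's construction

def green(emerald):
    return emerald["color"] == "green"

def grue(emerald):
    examined_early = emerald["examined_at"] < T
    return green(emerald) if examined_early else emerald["color"] == "blue"

sample = [{"color": "green", "examined_at": t} for t in range(10)]
print(all(green(e) for e in sample))  # True
print(all(grue(e) for e in sample))   # True: the sample cannot discriminate

# A green emerald examined after T confirms "green" but refutes "grue":
future = {"color": "green", "examined_at": T + 1}
print(green(future), grue(future))
```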


Author(s):  
John L. Pollock

There once was a man who wrote a book. He was very careful in his reasoning, and was confident of each claim that he made. With some display of pride, he showed the book to a friend (who happened to be a probability theorist). He was dismayed when the friend observed that any book that long and that interesting was almost certain to contain at least one falsehood. Thus it was not reasonable to believe that all of the claims made in the book were true. If it were reasonable to believe each claim then it would be reasonable to believe that the book contained no falsehoods, so it could not be reasonable to believe each claim. Furthermore, because there was no way to pick out some of the claims as being more problematic than others, there could be no reasonable way of withholding assent to some but not others. “Therefore,” concluded his friend, “you are not justified in believing anything you asserted in the book.” This is the paradox of the preface (so named because in the original version the author confesses in the preface that his book probably contains a falsehood). The paradox of the preface is more than a curiosity. It has been used by some philosophers to argue that the set of one's warranted beliefs need not be deductively consistent, and by others to argue that you should not befriend probability theorists. If (A1) is to be a correct acceptance rule it must be capable of explaining what is involved in the paradox of the preface. The lottery paradox and the paradox of the preface seem superficially similar, so it might be supposed that a resolution of one will automatically generate a resolution of the other in some trivial manner. But in fact, the opposite is true. It is the principle of collective defeat that makes possible the resolution of the lottery paradox, but it is the principle of collective defeat that is responsible for the creation of the paradox of the preface.
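The friend's observation has simple arithmetic behind it, sketched below with invented numbers and an independence assumption not made in the text.

```python
# Illustrative arithmetic for the preface paradox: even if each of n
# independent claims is individually very probable, the probability that
# every claim is true collapses as n grows.

def prob_no_falsehood(p_each, n_claims):
    return p_each ** n_claims

# A book of 300 claims, each 99% probable:
p = prob_no_falsehood(0.99, 300)
print(f"{p:.3f}")  # ~0.049: almost certainly at least one falsehood
```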


Author(s):  
John L. Pollock

The purpose of this book is to clarify probability concepts and analyze the structure of probabilistic reasoning. The intent is to give an account that is precise enough to actually be useful in philosophy, decision theory, and statistics. An ultimate objective will be to implement the theory of probabilistic reasoning in a computer program that models human probabilistic reasoning. The result will be an AI system that is capable of doing sophisticated scientific reasoning. However, that takes us beyond the scope of the present book. The purpose of this chapter is to give a brief restatement of the main points of the theory of nomic probability and provide an assessment of its accomplishments. The theory of nomic probability has a parsimonious basis. This consists of two sets of principles. First, there are the epistemic principles (A3) and (D3): (A3) If F is projectible with respect to G and r > .5, then ┌prob(F/G) > r┐ is a prima facie reason for the conditional ┌Gc ⊃ Fc┐, the strength of the reason depending upon the value of r. (D3) If F is projectible with respect to H then ┌Hc & prob(F/G&H) < prob(F/G)┐ is an undercutting defeater for ┌prob(F/G) > r┐ as a prima facie reason for ┌Gc ⊃ Fc┐. Second, there are some computational principles that generate a calculus of nomic probabilities. These principles jointly constitute the conceptual role of the concept of nomic probability and are the basic principles from which the entire theory of nomic probability follows. The epistemic principles presuppose a prior epistemological framework governing the interaction of prima facie reasons and defeaters. Certain aspects of that framework play an important role in the theory of nomic probability. For example, the principle of collective defeat is used recurrently throughout the book. The details of the epistemological framework are complicated, but they are not specific to the theory of probability. They are part of general epistemology.
The computational principles are formulated in terms of what some will regard as an extravagant ontology of sets of possible objects and possible worlds. It is important to realize that this ontology need not be taken seriously.
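The interaction of (A3) and (D3) can be sketched as follows. The function names, the numbers, and the birds/penguins example are assumptions introduced for illustration, not the book's own apparatus.

```python
# Sketch of (A3) and (D3): prob(F/G) > r gives a prima facie reason for
# "Gc implies Fc", which is undercut when c is also known to be H and the
# more specific probability prob(F/G&H) is lower (subproperty defeat).

def reason_A3(prob_F_G, r=0.5):
    """(A3): a prima facie reason of strength prob(F/G), if it exceeds r."""
    return prob_F_G if prob_F_G > r else None

def defeated_D3(prob_F_G, prob_F_GH, c_is_H):
    """(D3): undercut if Hc holds and prob(F/G&H) < prob(F/G)."""
    return c_is_H and prob_F_GH < prob_F_G

# Illustration: 98% of birds (G) fly (F), but only 2% of penguins (G&H) do.
strength = reason_A3(0.98)
print(strength is not None)                  # prima facie reason to infer Fc
print(defeated_D3(0.98, 0.02, c_is_H=True))  # defeated: c is a penguin
```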


Author(s):  
John L. Pollock

Our task is to characterize nomic probability in terms of its conceptual role—its role in rational thought. One of the most important ingredients of that role concerns the use of probability in practical decisions. To repeat Bishop Butler’s famous aphorism, “probability is the very guide of life”. In choosing a course of action we often seek to maximize expectation value, and the expectation value of an act is defined in terms of the probabilities of various possible outcomes conditional on that act. The latter are definite probabilities, so an account of the conceptual role of nomic probability must include an account of its relationship to these definite probabilities. This must include an analysis of the definite probabilities themselves and a theory of direct inference explaining how the definite probabilities can be evaluated on the basis of our knowledge of nomic probabilities. That is the topic of the present chapter. This chapter will take some rather surprising twists. First, we will find that despite its central role in the theory of nomic probability, direct inference does not constitute a primitive part of that theory. It will be possible to define definite probabilities in terms of nomic probabilities and to derive the requisite theory of direct inference from those parts of the theory of nomic probability already at our disposal. In the course of establishing this we will discover something even more surprising, and ultimately more important. Rules of direct inference warrant inferences from (indefinite) nomic probabilities to definite probabilities, but they will be derived from rules describing parallel inferences from nomic probabilities to nomic probabilities. The latter inferences are importantly similar to classical direct inference, so I take them to constitute what I call ‘nonclassical direct inference’.
The rules of nonclassical direct inference will in turn be derived from the acceptance rules and computational principles that have already been defended. Nonclassical direct inference has been entirely overlooked by probabilists, and yet it is of fundamental importance.
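The decision-theoretic role of definite probabilities mentioned above can be sketched briefly. The outcomes, utilities, and probabilities below are invented for illustration.

```python
# Minimal sketch of expectation value: the utility of each possible
# outcome weighted by its probability conditional on the act.

def expectation_value(outcomes):
    """outcomes: list of (probability_given_act, utility) pairs."""
    return sum(p * u for p, u in outcomes)

# Invented example: should I carry an umbrella, given a 30% chance of rain?
carry_umbrella = [(0.3, 5), (0.7, -1)]    # (if rain, if no rain)
leave_umbrella = [(0.3, -10), (0.7, 2)]

print(expectation_value(carry_umbrella))  # higher expectation: carry it
print(expectation_value(leave_umbrella))
```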


Author(s):  
John L. Pollock

Exotic computational principles are those not derivable from the theory of proportions constructed in Chapter 2. The most important of these principles are (PFREQ) and (AGREE). The purpose of this chapter is to show that these principles can be derived from a strengthened theory of proportions—what we might call ‘the exotic theory of proportions’. The principles of the theory of proportions constructed in Chapter 2 seem completely unproblematic, and accordingly the derivations of principles of nomic probability can reasonably be regarded as proofs of those principles. That ceases to be the case when we turn to the exotic theory of proportions and the corresponding exotic principles of nomic probability. Although quite intuitive, the exotic axioms for proportions are also very strong and correspondingly riskier. Furthermore, although the exotic axioms are intuitive, intuitions become suspect at this level. The problem is that there are a large number of intuitive candidates for exotic axioms, and although each is intuitive by itself, they are jointly inconsistent. This will be illustrated below. It means that we cannot have unqualified trust in our intuitions. In light of this, (PFREQ) and (AGREE) seem more certain than the exotic principles of proportions from which they can be derived. As such, those derivations cannot reasonably be regarded as justifications for the probability principles. Instead they are best viewed as explanations for why the probability principles are true given the characterization of nomic probabilities in terms of proportions. The derivations play an explanatory role rather than a justificatory role. The exotic principles of proportions concern proportions in relational sets. Recall that we can compare sizes of sets with the relation ┌X ⇆ Y┐, which was defined as ┌𝔭(X/X∪Y) = 𝔭(Y/X∪Y)┐. Our first exotic principle relates the size of a binary relation (a “two-dimensional set”) to the sizes of its one-dimensional segments.
If x is in the domain D(R) of R, let Rx be the R-projection of x, i.e., {y | Rxy}. Suppose D(R) = D(S) and, for each x in their domain, Rx ⇆ Sx. Then their “linear dimensions” are everywhere the same, and the principle concludes that R ⇆ S.
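For finite sets the setup can be sketched directly, since there “same size” reduces to equal cardinality; the general principle, of course, needs the full proportion function. The relation and the function names below are illustrative assumptions.

```python
# Finite sketch of the setup: a binary relation R as a set of pairs, its
# domain D(R), and the R-projection Rx = {y | Rxy}. With finite sets,
# "same size" is just equal cardinality.

def domain(R):
    return {x for x, _ in R}

def projection(R, x):
    """Rx = {y | Rxy}, the R-projection of x."""
    return {y for x2, y in R if x2 == x}

def same_size(X, Y):
    return len(X) == len(Y)

R = {(1, "a"), (1, "b"), (2, "c"), (2, "d")}
S = {(1, "c"), (1, "d"), (2, "a"), (2, "b")}

segments_match = all(same_size(projection(R, x), projection(S, x))
                     for x in domain(R))
print(segments_match)   # every one-dimensional segment is the same size
print(same_size(R, S))  # ...and so are the relations themselves
```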


Author(s):  
John L. Pollock

The objective of this book is to provide an analysis of nomic probability in terms of its conceptual role. This requires an account both of (1) what inferences can be drawn from nomic probabilities, and (2) how nomic probabilities can be evaluated on the basis of nonprobabilistic information. We have an account of (1): it consists of the acceptance rules and the theory of direct inference. The theory of nonclassical direct inference also provides a partial account of (2). Some nomic probabilities can be evaluated in terms of others by using nonclassical direct inference. But in order to get this process started in the first place, there must be some other way of evaluating nomic probabilities that does not require any prior contingent knowledge of the values of such probabilities. It seems intuitively clear that this is accomplished by some kind of statistical induction. In statistical induction, we observe a sample of B’s, determine the relative frequency of A’s in that sample, and then estimate prob(A/B) to be approximately equal to that relative frequency. A close kin to statistical induction is enumerative induction, wherein it is observed that all of the B’s in the sample are A’s, and it is concluded that any B would be an A, that is, B ⇒ A. There are two possibilities regarding statistical and enumerative induction. They could be derivable from more basic epistemic principles, or they might be irreducible constituents of the conceptual role of nomic probability and nomic generalizations. These two possibilities reflect what have come to be regarded as two different problems of induction. The traditional problem of induction was that of justifying induction. But most contemporary philosophers have forsaken that for Goodman’s “new riddle of induction”, which I am construing here as the problem of giving an accurate account of correct principles of induction.
This change in orientation reflects the view that principles of induction are basic epistemic principles, partly constitutive of rationality, and not reducible to or justifiable on the basis of anything more fundamental. I endorsed the latter view in my [1974], but now I am convinced that it is false.


Author(s):  
John L. Pollock

Much of the usefulness of probability derives from its rich logical and mathematical structure. That structure comprises the probability calculus. The classical probability calculus is familiar and well understood, but it will turn out that the calculus of nomic probabilities differs from the classical probability calculus in some interesting and important respects. The purpose of this chapter is to develop the calculus of nomic probabilities, and at the same time to investigate the logical and mathematical structure of nomic generalizations. The mathematical theory of nomic probability is formulated in terms of possible worlds. Possible worlds can be regarded as maximally specific possible ways things could have been. This notion can be filled out in various ways, but the details are not important for present purposes. I assume that a proposition is necessarily true iff it is true at all possible worlds, and I assume that the modal logic of necessary truth and necessary exemplification is a quantified version of S5. States of affairs are things like Mary’s baking pies, 2 being the square root of 4, Martha’s being smarter than John, and the like. For present purposes, a state of affairs can be identified with the set of all possible worlds at which it obtains. Thus if P is a state of affairs and w is a possible world, P obtains at w iff w∊P. Similarly, we can regard monadic properties as sets of ordered pairs ⧼w,x⧽ of possible worlds and possible objects. For example, the property of being red is the set of all pairs ⧼w,x⧽ such that w is a possible world and x is red at w. More generally, an n-place property will be taken to be a set of (n+1)-tuples ⧼w,x1,...,xn⧽. Given any n-place concept α, the corresponding property of exemplifying α is the set of (n+1)-tuples ⧼w,x1,...,xn⧽ such that x1,...,xn exemplify α at the possible world w.
States of affairs and properties can be constructed out of one another using logical operators like conjunction, negation, quantification, and so on.
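These set-theoretic constructions can be sketched in a finite model. The concrete worlds, objects, and state names are invented for illustration only.

```python
# Finite model of the chapter's constructions: states of affairs as sets
# of worlds (P obtains at w iff w is in P), monadic properties as sets of
# (world, object) pairs, and logical operators as set operations.

worlds = {"w1", "w2", "w3"}

# A state of affairs obtains at exactly the worlds it contains:
mary_bakes = {"w1", "w2"}
it_rains = {"w2", "w3"}
print("w1" in mary_bakes)  # mary_bakes obtains at w1

# Conjunction is intersection; negation is complement in the world set:
both = mary_bakes & it_rains      # obtains only at w2
not_raining = worlds - it_rains   # obtains only at w1

# A monadic property is the set of <w, x> pairs where x has it at w:
red = {("w1", "apple"), ("w2", "apple")}

def exemplifies(prop, w, x):
    return (w, x) in prop

print(both, not_raining)
print(exemplifies(red, "w1", "apple"), exemplifies(red, "w3", "apple"))
```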

