Deviance and Vice: Strength as a Theoretical Virtue in the Epistemology of Logic

2018 ◽  
Vol 99 (3) ◽  
pp. 548-563 ◽  
Author(s):  
Gillian Russell

The Monist ◽  
1989 ◽  
Vol 72 (1) ◽  
pp. 117-133 ◽  
Author(s):  
Dallas Willard


dialectica ◽  
2012 ◽  
Vol 66 (2) ◽  
pp. 221-236 ◽  
Author(s):  
Paul Boghossian


Philosophy ◽  
2020 ◽  
Author(s):  
Samuel Schindler

A theoretical virtue in science is a property of a scientific theory that is considered desirable. Standard theoretical virtues include testability, empirical accuracy, simplicity, unification, consistency, coherence, and fertility. First highlighted by Thomas S. Kuhn in a seminal paper in 1977, theoretical virtues have come to play an important role in a number of philosophical debates. A central bone of contention in many of these debates is whether theoretical virtues are epistemic, i.e., whether they are indicative of a theory’s correctness, or whether they are merely pragmatic, concerning only the convenient use of a theory. Particularly contested virtues are simplicity and unifying power. In the scientific realism debate, in which philosophers argue about whether or not scientific theories allow us to uncover the reality behind the phenomena, scientific realists have argued that virtuous theories are more likely to be correct than less virtuous ones, even when they accommodate the same data. In the closely related debate about the so-called Inference to the Best Explanation, realists have argued that not only can we determine the best explanation on the basis of its virtues, but we can also determine which explanation is the true one. In discussions about “theory choice” or “theory appraisal,” philosophers debate which virtues might be most decisive in scientists’ deliberations about which theory they should adopt. Here a theory’s successful novel predictions, or novel successes for short, have been a particular focus. Philosophers have also discussed possible trade-offs between various virtues and the difficulties these may pose for theory choice. Samir Okasha has recently argued that there cannot be any rational algorithm for theory choice. Theoretical virtues also play a role in philosophical accounts of the laws of nature.
One extremely prominent account, namely David Lewis’ Best System Analysis, appeals to simplicity and unifying power to determine what generalizations qualify as genuine laws of nature (rather than just accidentally true generalizations). Even in philosophical theorizing about science, theoretical virtues have been appealed to: Rudolf Carnap believed that simplicity and fruitfulness were important desiderata guiding the explication of scientific concepts. Finally, psychologists have started to investigate the role of theoretical virtues in picking explanations. There is work that appears to show that children and adults have preferences for simple and broad theories.



Author(s):  
Jared Warren

What is the source of logical and mathematical truth? This volume revitalizes conventionalism as an answer to this question. Conventionalism takes logical and mathematical truth to have their source in linguistic conventions. This was an extremely popular view in the early 20th century, but it was never worked out in detail and is now almost universally rejected in mainstream philosophical circles. In Shadows of Syntax, Jared Warren offers the first book-length treatment and defense of a combined conventionalist theory of logic and mathematics. He argues that our conventions, in the form of syntactic rules of language use, are perfectly suited to explain the truth, necessity, and apriority of logical and mathematical claims. In Part I, Warren explains exactly what conventionalism amounts to and what linguistic conventions are. Part II develops an unrestricted inferentialist theory of the meanings of logical constants that leads to logical conventionalism. This conventionalist theory is elaborated in discussions of logical pluralism, the epistemology of logic, and the influential objections that led to the historical demise of conventionalism. Part III aims to extend conventionalism from logic to mathematics. Unlike logic, mathematics involves both ontological commitments and a rich notion of truth that cannot be generated by any algorithmic process. To address these issues Warren develops conventionalist-friendly but independently plausible theories of both metaontology and mathematical truth. Finally, Part IV steps back to address big-picture worries and meta-worries about conventionalism. This book develops and defends a unified theory of logic and mathematics according to which logical and mathematical truths are reflections of our linguistic rules, mere shadows of syntax.



2020 ◽  
pp. 93-107
Author(s):  
Paul Boghossian ◽  
Timothy Williamson

This essay attempts to clarify the project of explaining the possibility of ‘blind reasoning’—namely, of basic logical inferences to which we are entitled without our having an explicit justification for them. The role played by inferentialism in this project is examined, and objections made to inferentialism by Paolo Casalegno and Timothy Williamson are answered. Casalegno proposes a recipe for formulating a counterexample to any proposed constitutive inferential role by imagining a subject who understands the logical constant in question but lacks the capacity to make the relevant inference; Williamson’s recipe turns on imagining an expert who continues to understand the constant in question while having developed sophisticated reasons for refusing to make the relevant inference. It is argued that neither recipe succeeds.



Author(s):  
Greg Frost-Arnold

Quine’s philosophical views did not emerge fully formed in the 1930s; rather, they changed over the seven decades he was philosophically active. This chapter investigates two episodes in Quine’s ontological development: his engagement with Pythagoreanism (an Appendix with new primary sources is included), and his conversion from nominalism to Platonism about mathematics. These two topics might seem completely distinct. However, although they could conceivably be treated separately, this chapter treats them together by considering the role clarity plays in both these episodes. Quine’s changing views about the theoretical virtue of clarity, and which particular things are clear and which are not, help explain his ontological development. In particular, the chapter offers a new hypothesis about the causes of Quine’s conversion from nominalism to realism, in which his views about clarity play an essential role.



2020 ◽  
pp. 153-170
Author(s):  
Jared Warren

This chapter shows that unrestricted inferentialism/conventionalism leads to a naturalistically satisfying account of our a priori knowledge of logical validity. The chapter first lays the groundwork by discussing the general question of what conditions arguments need to meet in order to yield knowledge of their conclusions. Following Boghossian, the chapter then argues that inferentialism/conventionalism is particularly well placed to allow rule-circular arguments to yield a priori knowledge of the validity of our basic rules. Restricted inferentialists were often forced to complicate, and sometimes abandon, their accounts of logical knowledge in the face of bad company objections. By contrast, unrestricted inferentialism has no problem at all with bad company. All told, conventionalism gives a naturalistic account of our a priori knowledge of logic.



2010 ◽  
Vol 24 (1) ◽  
pp. 437-464 ◽  
Author(s):  
Joshua Schechter




2019 ◽  
Vol 5 (1) ◽  
pp. 48-61
Author(s):  
Elliot D. Cohen

This article describes some core elements of Logic-Based Therapy and Consultation and examines some of their epistemic properties.


