Explicationist Epistemology and the Explanatory Role of Knowledge

Author(s):  
Erik J. Olsson

Abstract: It has been argued that much of contemporary epistemology can be unified under Carnap’s methodology of explication, which originated in the neighboring field of philosophy of science. However, it is unclear to what extent epistemological theories that emphasize the explanatory role of knowledge fit into this picture, Kornblith’s natural kind epistemology and Williamson’s knowledge first approach being cases in point. In this connection, I raise three questions. Can we harvest the insights of these approaches without loss in the more standard and less idiosyncratic explicationist framework? Can we do so without falling prey to prominent criticism raised against those approaches? Finally, do the approaches come out as coherent under an explicationist rendering? I argue that in Kornblith’s case the answer to all three questions is essentially in the affirmative. Much of the knowledge first approach is also translatable into explicationism. However, from that perspective, Williamson’s central argument for treating knowledge as undefinable, referring to persistent yet unsuccessful attempts to solve the Gettier problem, amounts to an overreaction to that problem. Leaving explicationism aside, I ask, in the penultimate section, what Williamson’s own philosophical method really amounts to.

This is an edited collection of twenty-three new papers on the Gettier Problem and the issues connected with it. The set of authors includes many of the major figures in contemporary epistemology who have developed some of the well-known responses to the problem, and it also contains some younger epistemologists who bring new perspectives to the issues raised in the literature. Together, they cover the state of the art on virtually every epistemological and methodological aspect of the Gettier Problem. The volume also includes some skeptical voices according to which the Gettier Problem is not deeply problematic or some of the problems it raises are not genuine philosophical problems.


Dialogue ◽  
2008 ◽  
Vol 47 (3-4) ◽  
pp. 565-582 ◽  
Author(s):  
Byeong D. Lee

ABSTRACT: Robert Brandom argues for a “pragmatic phenomenalist account” of knowledge. On this account, we should understand our notion of justification in accordance with a Sellarsian social practice model, and there is nothing more to the phenomenon of knowledge than the proprieties of takings-as-knowing. I agree with these two claims. But Brandom's proposal is so sketchy that it is unclear how it can deal with a number of much-discussed problems in contemporary epistemology. The main purpose of this article is to develop and defend a pragmatic phenomenalist account of knowledge by resolving those problems. I argue, in particular, that this account can accommodate both the lesson of the Gettier problem and the lesson of reliabilism simultaneously.


2014 ◽  
Vol 1 (1) ◽  
pp. 42
Author(s):  
Ahmad Amir Aziz

Apart from Kuhn and Popper, Lakatos has become an important figure in the field of philosophy of science for his account of scientific theories, which he calls research programmes. For Lakatos, Popper’s theoretical falsification can be immensely dangerous when applied to already established theories. On the other hand, in contrast to Kuhn, who assumed that paradigms are by their nature incommensurable, Lakatos maintains that competing scientific discoveries may in fact be compared with one another. To him, the main issues regarding the logic of discovery cannot be dealt with satisfactorily unless we do so within the framework of research programmes. The practical implication of this is that the hard core of such a framework cannot be subjected to modification, let alone rejection. This hard core must, in other words, be protected from what he terms falsification. Lakatos also maintains that what can be called scientific is a series of theories, not a single theory. This model of research programmes can in fact be used in Islamic Studies in order to develop new theoretical principles that may play the role of a convincing protective belt on the one hand, and to find new premises whose discoveries can be used universally on the other.


2020 ◽  
Author(s):  
John Turri

This chapter enhances and extends a powerful and promising research program, performance-based epistemology, which stands at the crossroads of many important currents that one can identify in contemporary epistemology, including the value problem, epistemic normativity, virtue epistemology, and the nature of knowledge. Performance-based epistemology offers at least three outstanding benefits: it explains the distinctive value that knowledge has, it places epistemic evaluation into a familiar and ubiquitous pattern of evaluation, and it solves the Gettier problem. But extant versions of performance-based epistemology have been the object of serious criticism. This chapter shows how to meet the objections without sacrificing the aforementioned benefits.


Author(s):  
Michael Hannon

This book is about knowledge and its value. At the heart of this book is a simple idea: we can answer many interesting and difficult questions in epistemology by reflecting on the role of epistemic evaluation in human life. Hannon calls this “function-first epistemology.” The core hypothesis is that the concept of knowledge is used to identify reliable informants. This practice is necessary, or at least deeply important, because it plays a vital role in human survival, cooperation, and flourishing. While this idea is quite simple, it has wide-reaching implications. Hannon uses it to cast new light on the nature and value of knowledge, the differences between knowledge and understanding, the relationship between knowledge, assertion, and practical reasoning, and the semantics of knowledge claims. This book also makes headway on some classic philosophical puzzles, including the Gettier problem, epistemic relativism, and philosophical skepticism. Hannon shows that some major issues in epistemology can be resolved by taking a function-first approach, thereby illustrating the significant role that this method can play in contemporary philosophy.


Erkenntnis ◽  
2021 ◽  
Author(s):  
Lewis Ross

AbstractThe notion of understanding occupies an increasingly prominent place in contemporary epistemology, philosophy of science, and moral theory. A central and ongoing debate about the nature of understanding is how it relates to the truth. In a series of influential contributions, Catherine Elgin has used a variety of familiar motivations for antirealism in philosophy of science to defend a non-factive theory of understanding. Key to her position are: (1) the fact that false theories can contribute to the upwards trajectory of scientific understanding, and (2) the essential role of inaccurate idealisations in scientific research. Using Elgin’s arguments as a foil, I show that a strictly factive theory of understanding has resources with which to offer a unified response to both the problem of idealisations and the role of false theories in the upwards trajectory of scientific understanding. Hence, strictly factive theories of understanding are viable notwithstanding these forceful criticisms.


1966 ◽  
Vol 15 (03/04) ◽  
pp. 519-538 ◽  
Author(s):  
J Levin ◽  
E Beck

Summary: The role of intravascular coagulation in the production of the generalized Shwartzman phenomenon has been evaluated. The administration of endotoxin to animals prepared with Thorotrast results in activation of the coagulation mechanism with the resultant deposition of fibrinoid material in the renal glomeruli. Anticoagulation prevents alterations in the state of the coagulation system and inhibits development of the renal lesions. Platelets are not primarily involved. Platelet antiserum produces similar lesions in animals prepared with Thorotrast, but appears to do so in a manner which does not significantly involve intravascular coagulation. The production of adrenal cortical hemorrhage, comparable to that seen in the Waterhouse-Friderichsen syndrome, following the administration of endotoxin to animals that had previously received ACTH does not require intravascular coagulation and may not be a manifestation of the generalized Shwartzman phenomenon.


Author(s):  
David Hoinski ◽  
Ronald Polansky

David Hoinski and Ronald Polansky’s “The Modern Aristotle: Michael Polanyi’s Search for Truth against Nihilism” shows how the general tendencies of contemporary philosophy of science disclose a return to the Aristotelian emphasis on both the formation of dispositions to know and the role of the mind in theoretical science. Focusing on a comparison of Michael Polanyi and Aristotle, Hoinski and Polansky investigate to what degree Aristotelian thought retains its purchase on reality in the face of the changes wrought by modern science. Polanyi’s approach relies on several Aristotelian assumptions, including the naturalness of the human desire to know, the institutional and personal basis for the accumulation of knowledge, and the endorsement of realism against objectivism. Hoinski and Polansky emphasize the promise of Polanyi’s neo-Aristotelian framework, which argues that science is won through reflection on reality.


Author(s):  
Liliane Campos

By decentring our reading of Hamlet, Stoppard’s tragicomedy questions the legitimacy of centres and of stable frames of reference. So Liliane Campos examines how Stoppard plays with the physical and cosmological models he finds in Hamlet, particularly those of the wheel and the compass, and gives a new scientific depth to the fear that time is ‘out of joint’. In both his play and his own film adaptation, Stoppard’s rewriting gives a 20th-century twist to these metaphors, through references to relativity, indeterminacy, and the role of the observer. When they refer to the uncontrollable wheels of their fate, his characters no longer describe the destruction of order, but uncertainty about which order is at work, whether heliocentric or geocentric, random or tragic. When they express their loss of bearings, they do so through the thought experiments of modern physics, from Galilean relativity to quantum uncertainty, drawing our attention to shifting frames of reference. Much like Schrödinger’s cat, Stoppard’s Rosencrantz and Guildenstern are both dead and alive. As we observe their predicament, Campos argues, we are placed in the paradoxical position of the observer in 20th-century physics, and constantly reminded that our time-specific relation to the canon inevitably determines our interpretation.

