Generality Problem
Recently Published Documents

Total documents: 42 (five years: 10)
H-index: 7 (five years: 1)

2021 ◽  
Author(s):  
Andreas Stephens ◽  
Trond A. Tjøstheim ◽  
Maximilian K. Roszko ◽  
Erik J. Olsson

The generality problem is commonly considered to be a critical difficulty for reliabilism. In this paper, we present a dynamical perspective on the problem in the spirit of naturalized epistemology. According to this outlook, it is worth investigating how token belief-forming processes instantiate specific types in the biological agent’s cognitive architecture (including other relevant embodied features) and background experience, consisting in the process of attractor-guided neural activation. While our discussion of the generality problem assigns “scientific types” to token processes, it represents a unified account in the sense that it incorporates contextual and common-sense features emphasized by other authors.


2021 ◽  
Vol 1 (34) ◽  
pp. 27-51
Author(s):  
Hsueh Qu

Hume's epistemological legacy is often perceived as a predominantly negative, sceptical one. His infamous problem of induction continues to perplex philosophers to this day, and many of his sceptical worries maintain their interest in contemporary eyes (e.g. with regard to reason, the senses, substance, causation). Yet Hume's positive epistemological contributions also hold significance for philosophy in this day and age. In this paper, I aim to situate Hume's epistemology in a more contemporary context, particularly with regard to the theme of reliabilism that runs throughout this epistemology. This will take the shape of examining correspondences and contrasts between Hume's epistemologies in the Treatise and Enquiry and reliabilism, as well as an examination of how Hume's framework might handle some major challenges for reliabilist epistemologies. In particular, I argue that while Hume is tempted by an epistemology that is intimately tied to truth in the Treatise, he backs away when confronted with the excesses of scepticism in the conclusion of Book 1, and winds up with an epistemology most similar to the contemporary epistemological frameworks of dogmatism and phenomenal conservatism. Yet, largely because of his reliance on the passions (a respect in which he diverges from these two contemporary frameworks), the epistemology of the Treatise remains crucially dissociated from truth. Meanwhile, in the first Enquiry, he proceeds to develop a two-tiered epistemological framework that first accords all our justification default authority, and then founds all-things-considered epistemic justification on our evidence for the reliability of our faculties. The first tier most resembles the contemporary epistemological framework of conservatism, while the second tier most closely resembles approved-list reliabilism. In this, a clear reliabilist thread runs through the epistemology of the Enquiry. I will also argue that although Hume did not appear to fully appreciate one of the most significant challenges for reliabilism, namely the generality problem, his philosophical framework nevertheless contains the beginnings of a response to it.


Author(s):  
Juan Comesaña

Experientialism is compared and contrasted with Evidentialism, Reliabilism, and Evidentialist Reliabilism. The generality problem for Reliabilism is discussed, as well as the issue of how to measure reliability. A probabilistic understanding of reliability is put forward. In particular, reliability is understood in terms of evidential probabilities, not physical probabilities. An extension to credences is explored. Experientialism is non-Evidentialist insofar as it does not take experience to be evidence, and is non-Reliabilist insofar as it appeals to a normatively loaded notion of evidential probability.


2020 ◽  
Vol 11 (2) ◽  
pp. 231-236
Author(s):  
Frederik J. Andersen ◽  
Klemens Kappel

This paper aims to show that Selim Berker’s widely discussed prime number case is merely an instance of the well-known generality problem for process reliabilism and thus arguably not as interesting a case as one might have thought. Initially, Berker’s case is introduced and interpreted. Then the most recent response to the case from the literature is presented. Finally, it is argued that Berker’s case is nothing but a straightforward consequence of the generality problem, i.e., the problematic aspect of the case for process reliabilism (if any) is already captured by the generality problem.


Diogenes ◽  
2019 ◽  
Vol 27 (2) ◽  
Author(s):  
Anna Ivanova

The article analyses the conflicting views of reliabilism and evidentialism on the following question: what is the leading condition for ascribing justification to beliefs, reliability or evaluation through evidence? The evidentialist view is defended by arguments derived from the linguistic practices of ascribing justification in complex conditions. The generality problem is interpreted as an exemplification of the complexity of cognitive situations, and it is argued that this complexity requires reference to mental states as a means of ascribing justification. Reliability is also recognized as a factor in ascribing justification to some beliefs when it serves as a type of evidence.


2019 ◽  
Vol 128 (4) ◽  
pp. 463-509 ◽  
Author(s):  
Jack C. Lyons

The paper offers a solution to the generality problem for a reliabilist epistemology by developing an “algorithm and parameters” scheme for type-individuating cognitive processes. Algorithms are detailed procedures for mapping inputs to outputs. Parameters are psychological variables that systematically affect processing. The relevant process type for a given token is given by the complete algorithmic characterization of the token, along with the values of all the causally relevant parameters. The typing that results is far removed from the typings of folk psychology, and from much of the epistemology literature. But it is principled and empirically grounded, and shows good prospects for yielding the desired epistemological verdicts. The paper articulates and elaborates the theory, drawing out some of its consequences. Toward the end, the fleshed-out theory is applied to two important case studies: hallucination and cognitive penetration of perception.


Episteme ◽  
2019 ◽  
pp. 1-15
Author(s):  
Jeffrey Tolly

The generality problem is one of the most pressing challenges for process reliabilism about justification. Thus far, one of the more promising responses is James Beebe's tri-level statistical solution. Despite the initial plausibility of Beebe's approach, the tri-level statistical solution has been shown to generate implausible justification verdicts on a variety of cases. Recently, Samuel Kampa has offered a new statistical solution to the generality problem, which he argues can overcome the challenges that undermined Beebe's original statistical solution. However, there's good reason to believe that Kampa is mistaken. In this paper, I show that Kampa's new statistical solution faces problems that are no less serious than the original objections to Beebe's solution. Depending on how we interpret Kampa's proposal, the new statistical solution either types belief-forming processes far too narrowly or fails to clarify the epistemic implications of reliabilism altogether. Either way, the new statistical solution fails to make substantive progress towards solving the generality problem.


Theoria ◽  
2019 ◽  
Vol 85 (2) ◽  
pp. 119-144
Author(s):  
Erhan Demircioglu
