The Putnamian Argument (the Argument from the Rejection of Global Skepticism) [also, (O) The Argument from Reference, and (K) The Argument from the Confluence of Proper Function and Reliability]

Author(s):  
Evan Fales

A familiar story about reference that developed in the 1970s appeared to offer a light at the end of the tunnel of positivist theories of meaning. Coherence theories of truth and justification, paradigm shifts, and incommensurability stalked the land. Causal theories of reference (Kripke 1980) promised to change all that and restore scientific realism. But another dialectic took hold that led to what Putnam called internal realism. This chapter aims to rescue Putnam from internal realism, and to breathe new life into real realism. It also aims to rescue science from Plantinga’s argument N (and arguments O and K). That will allow an answer to perhaps the strongest link in the chain of arguments that Naturalism is epistemically self-defeating. The chapter offers a diagnosis of the central difficulty that appears to wreak havoc with the realist aspirations of causal theories of reference. Finally, a cure is offered for that malady.

Author(s):  
Paul M. Pietroski

This chapter summarizes the main themes. Humans naturally acquire generative procedures that connect meanings with pronunciations. These meanings are neither concepts nor extensions. Meanings are composable instructions for how to access and assemble concepts of a special sort. In particular, phrasal meanings are instructions for how to build monadic (i.e., predicative) concepts that are massively conjunctive. Theories of meaning should not be confused with theories of truth. Lexicalization is a process of introducing concepts that can be combined via simple operations whose inputs must be monadic or dyadic. In theorizing about meanings, we can and should eschew much of the powerful typology and combinatorial operations that the founders of modern logic introduced for very different purposes.


Author(s):  
Paul M. Pietroski

This chapter and the next argue against the idea that children acquire languages whose sentences have compositionally determined truth conditions. The chapter begins by discussing Davidson’s bold conjecture: the languages that children naturally acquire support Tarski-style theories of truth, which can serve as the core components of meaning theories for the languages in question. The argument is that even if there are plausible theories of truth for these languages, formulating them as plausible theories of meaning requires assumptions about truth that are extremely implausible. Sentences like ‘My favorite sentence is not true’, which happens to be my favorite sentence, illustrate this point. But the point is not merely that “Liar Sentences” are troublesome; it is that theories of truth and theories of meaning have different subject matters.


2021, pp. 57–86
Author(s):  
Kenneth A. Taylor

The “jazz combo theory” captures the common spirit of various theories that reject reference and the “bottom up” approach to the problem of objective representational content. We can imagine the members of a jazz combo initially playing together without any shared musical norms. But they continually adjust to one another until norms emerge and are mutually endorsed. Players start holding one another to these norms, and it is this that gives the sounds they produce—what would otherwise be mere noise—determinate musical content. Similarly, on the jazz combo theory, what would otherwise be productions of meaningless strings by language users come to constitute determinate linguistic acts with determinate propositional contents, by virtue of the users adopting, and holding one another to, a shared set of linguistic and discursive norms. This chapter argues that jazz combo theorists overstate the case against reference, although they are right to stress the importance of norms and their dependence on social interaction. Jazz combo theorists tend to reject bottom-up approaches, including causal theories, because they take those approaches to be incompatible with the explanatory priority of the sentence and to fail to bridge the supposed gap between cause and norm. A number of conceptual tools are introduced to counter their arguments and to defend the consistency of the dynamic priority of the sentence, the syntactic correlativity of sentences and their constituents, and the semantic priority of constituents.


1994, Vol 24 (1), pp. 95–118
Author(s):  
Paul M. Pietroski

In a recent paper, Bar-On and Risjord (henceforth, ‘B&R’) contend that Davidson provides no good argument for his (in)famous claim that ‘there is no such thing as a language.’ And according to B&R, if Davidson had established his ‘no language’ thesis, he would thereby have provided a decisive reason for abandoning the project he has long advocated — viz., that of trying to provide theories of meaning for natural languages by providing recursive theories of truth for such languages. For he would have shown that there are no languages to provide truth (or meaning) theories of. Davidson thus seems to be in the odd position of arguing badly for a claim that would undermine his own work.


Author(s):  
Alexander Sanzhenakov

The article considers whether the methodological principles of scientific realism can be applied to the history of ancient philosophy. The author shows that, in its strong version, scientific realism is not an appropriate basis for historical research, since it involves minimizing the number of interpretations of the philosophical material of the past. Another serious drawback of applying strong versions of scientific realism in the history of philosophy is their reliance on the correspondence theory of truth. That theory does not suit the historian of philosophy, who aims not only at creating a realistic picture of the past, but also at incorporating the philosophical ideas of the past into the modern context; a coherence theory of truth is therefore more likely to meet her objectives. After a brief review of the weak versions of realism (H. Putnam’s “internal realism”, S. Blackburn’s “quasi-realism”, and “sensibility theory”), the author concludes that these kinds of realism are more suitable for the history of philosophy in general and for the history of ancient philosophy in particular. As a result, the author concludes that the historian of philosophy must take into account the objectivity and independence of the philosophical ideas of the past, while inevitably being guided by her own conceptual and terminological resources in order to incorporate those ideas into the modern philosophical context.


2016, Vol 1 (13), pp. 104–112
Author(s):  
Karen A. Ball ◽  
Luis F. Riquelme

A graduate-level course in dysphagia is an integral part of the graduate curriculum in speech-language pathology. There are many challenges to meeting the needs of current graduate student clinicians, requiring the instructor to explore alternatives. These challenges, suggested paradigm shifts, and available solutions are explored. Current trends, the lack of evidence for current methods, and the variety of approaches to teaching the dysphagia course are presented.


2010, pp. 115–132
Author(s):  
S. Agibalov ◽  
A. Kokorin

The results of the Copenhagen summit could be called a failure. This is a failure of UN climate change policy management, but it is also definitely the first step toward a new order. The article reviews the main characteristics of climate policy paradigm shifts. Russian interests in climate change policy and the main threats are analyzed. The successful development and implementation of energy-saving and energy-efficiency policy are necessary and would contribute substantially to solving the global climate change problem.

