Locating meanings

Author(s):  
Paul M. Pietroski

This chapter characterizes meanings in terms of certain generative procedures. We can begin to locate the natural phenomenon of linguistic meaning by focusing on (Chomsky-style) examples of constrained homophony. Two or more lexical items can connect distinct meanings with the same pronunciation; and phrases like ‘ready to please’ are similarly homophonous. But as ‘eager to please’ and ‘easy to please’ illustrate, phrasal homophony is constrained. Such facts provide important clues about what meanings are, and how they can(not) be combined. The details provide reasons for identifying the languages that children naturally acquire with biologically implemented procedures, and not sets of expressions. There are English procedures; but English is not a thing that speakers share and use to communicate. In this context, some initial reasons are given for doubting that the relevant procedures generate sentences that have truth conditions.

Author(s):  
John Collins

This chapter articulates and defends linguistic pragmatism, the hypothesis that language alone underdetermines truth conditions (or what is said) and does not even provide a variable licence for the truth conditions of an utterance in a context. Linguistic meaning is therefore characterized in terms of constraints upon what can be literally said with a linguistic structure, without the presumption that the linguistic properties of an utterance in a context will determine a content. The hypothesis is explained in terms of the resources language makes available to content, differentiated from related positions, and defended against numerous objections, especially those that argue for an essential role for minimal propositions in accounting for aspects of what is said.


2010, Vol 11 (2), pp. 150-167
Author(s):  
Alberto Voltolini

There is definitely a family resemblance between what contemporary contextualism maintains in philosophy of language and some of the claims about meaning put forward by the later Wittgenstein. Yet the main contextualist thesis, namely that linguistic meaning underdetermines truth-conditions, was not defended by Wittgenstein. If a claim in this regard can be retrieved in Wittgenstein despite his manifest antitheoretical attitude, it is instead that truth-conditions trivially supervene on linguistic meaning. There is, however, another Wittgensteinian claim that truly has a contextualist flavour, namely that linguistic meaning is itself wide-contextual. To be sure, this claim does not lead to the eliminativist/intentionalist conception of linguistic meaning that radical contextualists have recently developed. Rather, it goes together with a robust conception of linguistic meaning as intrinsically normative. Yet it may explain why Wittgenstein is taken to be a forerunner of contemporary contextualism.


2012, Vol 20 (1), pp. 62-87
Author(s):  
Richard Vallée

“Imported” is a member of a large family of adjectives, including “enemy”, “domestic”, “local”, “exported”, “foreign”. Call these terms contextuals. Contextuals are prima facie context-sensitive expressions in that the same contextual sentence can have different truth-values, and hence different truth-conditions, from utterance to utterance. I use Perry’s multipropositionalist framework to get a new angle on contextuals. I explore the idea that the lexical linguistic meaning of contextual adjectives introduces two conditions to the cognitive significance of an utterance. These conditions contain a variable, y, that does not correspond to any lexical component in the sentence. This is the available tool for letting the speakers’ intentions, or what the speakers have in mind, play a semantic role. My view focuses on the complex condition that linguistic meaning (as type) sometimes semantically determines.
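The two-condition idea can be given a toy rendering. The following is a minimal sketch (an illustration invented here, not Vallée's or Perry's formalism): the lexical meaning of "imported" introduces a condition containing a variable y for a reference location that no word in the sentence articulates, and what the speaker has in mind fixes its value.

```haskell
-- Toy sketch (illustrative only): the meaning of a contextual adjective
-- like "imported" carries a hidden variable y for a reference location,
-- supplied by the speaker's intentions rather than by any lexical item.
type Entity   = String
type Location = String

-- Hypothetical world knowledge: where each thing was produced.
producedIn :: Entity -> Location
producedIn "thisCar" = "Japan"
producedIn _         = "USA"

-- "imported" relates the hidden location y to an entity x:
-- x counts as imported relative to y iff x was produced outside y.
imported :: Location -> Entity -> Bool
imported y x = producedIn x /= y

-- "This car is imported": truth conditions vary with the value of y the
-- speaker has in mind, though the sentence itself never mentions y.
main :: IO ()
main = do
  print (imported "USA"   "thisCar")  -- uttered with the USA in mind
  print (imported "Japan" "thisCar")  -- uttered with Japan in mind
```

On this toy model, the same sentence is true relative to one contextually supplied value of y and false relative to another, which is the prima facie context-sensitivity the abstract describes.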


Author(s):  
John Collins

Linguistic pragmatism claims that what we literally say goes characteristically beyond what the linguistic properties themselves mandate. In this book, John Collins provides a novel defence of this doctrine, arguing that linguistic meaning alone fails to fix truth conditions. While this position is supported by a range of theorists, Collins shows that it naturally follows from a syntactic thesis concerning the relative sparseness of what language alone can provide to semantic interpretation. Language, and by extension meaning, provides constraints upon what a speaker can literally say, but does not characteristically encode any definite thing to say. Collins then defends this doctrine against a range of alternatives and objections, focusing in particular on an analysis of weather reports: ‘it is raining/snowing/sunny’. Such reporting is mostly location-sensitive in the sense that the utterance is true or not depending upon whether it is raining/snowing/sunny at the location of the utterance, rather than some other location. Collins offers a full analysis of the syntax, semantics, and pragmatics of weather reports, including many novel data. He shows that the constructions lack the linguistic resources to support the common literal locative readings. Other related phenomena are discussed, such as the Saxon genitive, colour predication, quantifier domain restriction, and object deletion.


Author(s):  
Paul M. Pietroski
Bradley Armour-Garb

This chapter argues that liar sentences reveal a fundamental problem for the project of characterizing linguistic meaning in terms of truth. It further argues that the Liar exacerbates the difficulties facing weak-logic solutions to the Foster problem for Davidsonian theories. According to the chapter, liar sentences have no truth conditions, and any theory that has the relevant instances of the T-schema among its theorems is simply false. The chapter urges that liar sentences illustrate a deep difficulty for truth-theoretic conceptions of meaning for Human Languages, and that we should find a different conception of meaning according to which expressions of Human Languages (I-Languages) are not among the truth-evaluable things.


2019, pp. 141-165
Author(s):  
Mark Richard

This chapter continues the discussion of propositions and propositional attitudes begun in Chapters 3 and 4. Section 1 sketches a view of attitudes and attitude ascription. Section 2 addresses how truth conditions and linguistic meaning do and do not help to individuate ‘the objects of the attitudes’. Section 3 returns to the last chapter’s discussion of how the reference of another’s words or concepts bears on the truth of an ascription of saying or thought to her.


10.29007/s2m4, 2018
Author(s):  
Oleg Kiselyov

We present the grammar/semantic formalism of Applicative Abstract Categorial Grammar (AACG), based on the recent techniques from functional programming: applicative functors, staged languages, and typed final language embeddings. AACG is a generalization of Abstract Categorial Grammars (ACG), retaining the benefits of ACG as a grammar formalism and making it possible and convenient to express a variety of semantic theories. We use the AACG formalism to uniformly formulate Potts' analyses of expressives, the dynamic-logic account of anaphora, and the continuation-tower treatment of quantifier strength, quantifier ambiguity, and scope islands. Carrying out these analyses in ACG required compromises and the ballooning of parsing complexity, or was not possible at all. The AACG formalism brings modularity, which comes from the compositionality of applicative functors, in contrast to monads, and the extensibility of the typed final embedding. The separately developed analyses of expressives and QNP are used as they are to compute truth conditions of sentences with both these features. AACG is implemented as a 'semantic calculator', which is the ordinary Haskell interpreter. The calculator lets us interactively write grammar derivations in a linguist-readable form and see their yields, inferred types, and computed truth conditions. We easily extend fragments with more lexical items and operators, and experiment with different semantic-mapping assemblies. The mechanization lets a semanticist test more and more complex examples, making empirical tests of a semantic theory more extensive, organized, and systematic.
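The role applicative functors play here can be illustrated with a much smaller fragment than AACG itself. Below is a minimal Haskell sketch (my toy example, not Kiselyov's implementation): meanings live in an applicative functor that pairs an at-issue value with a list of expressive comments, so expressive content composes alongside truth conditions in the spirit of Potts' analysis; all lexical entries and denotations are invented for illustration.

```haskell
-- Toy sketch: a meaning pairs an at-issue value with expressive comments,
-- and applicative composition threads the comments through automatically.
data M a = M { atIssue :: a, expressives :: [String] }

instance Functor M where
  fmap f (M a es) = M (f a) es

instance Applicative M where
  pure a            = M a []
  M f es1 <*> M a es2 = M (f a) (es1 ++ es2)

-- Hypothetical lexicon over a toy domain of strings.
damn :: M (a -> a)                 -- expressive: at-issue identity, plus a comment
damn = M id ["speaker is negatively disposed"]

dog :: M String
dog = M "dog" []

barks :: M (String -> Bool)
barks = M (== "dog") []            -- toy truth condition

-- "the damn dog barks": the expressive content is collected separately
-- from the at-issue truth value, yet both are computed compositionally.
sentence :: M Bool
sentence = barks <*> (damn <*> dog)

main :: IO ()
main = do
  print (atIssue sentence)         -- at-issue truth value
  print (expressives sentence)     -- side-channel expressive content
```

The design point mirrors the abstract's contrast with monads: `(<*>)` combines the two dimensions by a fixed, order-independent merge (here, list append), which is what makes separately developed analyses composable without change.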


Author(s):  
Pascal Engel

A sentence is a string of words formed according to the syntactic rules of a language. But a sentence has semantic as well as syntactic properties: the words and the whole sentence have meaning. Philosophers have tended to focus on the semantic properties of indicative sentences, in particular on their being true or false. They have called the meanings of such sentences ‘propositions’, and have tied the notion of proposition to the truth-conditions of the associated sentence. The term ‘proposition’ is sometimes assimilated to the sentence itself; sometimes to the linguistic meaning of a sentence; sometimes to ‘what is said’; sometimes to the contents of beliefs or other ‘propositional’ attitudes. But however propositions are defined, they must have two features: the capacity to be true or false; and compositional structure (being composed of elements which determine their semantic properties).
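The two features Engel requires of propositions can be made concrete with a toy fragment (a minimal sketch of compositional semantics invented here, not drawn from the entry): sentence meanings are truth-evaluable, and the truth value of the whole is determined by the meanings of the parts together with the structure combining them.

```haskell
-- Toy compositional fragment (illustrative denotations over a tiny domain):
-- each word has a meaning, and the sentence's truth value is computed by
-- applying those meanings as the syntactic structure dictates.
type Entity = String

socrates :: Entity
socrates = "Socrates"

isMortal :: Entity -> Bool
isMortal e = e `elem` ["Socrates", "Plato"]

negation :: Bool -> Bool
negation = not

-- "Socrates is mortal": predicate meaning applied to subject meaning.
s1 :: Bool
s1 = isMortal socrates

-- "Socrates is not mortal": same parts, extra structure, opposite truth value.
s2 :: Bool
s2 = negation (isMortal socrates)

main :: IO ()
main = print (s1, s2)
```

The contrast between s1 and s2 shows the compositional point: the elements and the way they are combined jointly determine the semantic properties of the whole.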

