Implicitness, Logical Form and Arguments

2021 ◽  
Vol 21 (63) ◽  
pp. 405-418
Author(s):  
Martina Blečić

In the paper I suggest that a loose notion of logical form can be a useful tool for the understanding or evaluation of everyday language and the explicit and implicit content of communication. Reconciling ordinary language and logic provides formal guidelines for rational communication, giving strength and order to ordinary communication and content to logical schemas. The starting point of the paper is the idea that the bearers of logical form are not natural language sentences, but what we communicate with them, that is, their content in a particular context. On the basis of that idea, I propose that we can ascribe logical properties to what is communicated using ordinary language and suggest a continuum between semantic phenomena such as explicatures and pragmatic communicational strategies such as (particularized) conversational implicatures, which challenges the idea that an implicatum is completely separate from what is said. I believe that this continuum can be best explained by the notion of logical form, taken as a property of sentences relative to particular interpretations.

Dialogue ◽  
1992 ◽  
Vol 31 (4) ◽  
pp. 677-684 ◽  
Author(s):  
Bernard Linsky

Stephen Neale defends Russell's famous theory of definite descriptions against more than 40 years' worth of criticisms beginning long before Strawson's “On Referring.” Ever since Strawson's parting shot in that paper (“… ordinary language has no exact logic”), the theory of descriptions has been a battleground for the larger issue of whether a systematic theory of the semantics of natural language is really possible. Neale provides us with a sketch of part of that project as it currently stands. All of the complexities and irregularities of the use of definite descriptions in natural language can be combined, after all, in a single theory based on an “exact logic.” Neale argues that one can give a Russellian account of “incomplete descriptions” (as in ‘The table is covered with books’), generic uses of ‘the’ (‘The whale is a mammal’), plural descriptions (‘The men carried the piano’) and, of central interest, the purportedly referential uses identified by Donnellan (as in ‘The murderer of Smith is insane’ when it is Jones the accused we have in mind). Neale follows familiar answers to these objections: incorporate demonstratives into the account (to get ‘The table over there …’), distinguish the proposition expressed from the one meant (the “referential” use is what was meant, not said), and point out that the problem is not unique to definite descriptions and so cannot be a fault of any particular theory of them (many expressions have generic, plural and “referential” uses).


2021 ◽  
Author(s):  
Roberta Colonna Dahlman

According to Grice’s analysis, conversational implicatures are carried by the saying of what is said (Grice 1989: 39). In this paper, it is argued that, whenever a speaker implicates a content by flouting one or several maxims, her implicature is not only carried by the act of saying what is said and the way of saying it, but also by the act of non-saying what should have been said according to what would have been normal to say in that particular context. Implicatures that arise without maxim violation are only built on the saying of what is said, while those that arise in violative contexts are carried by the saying of what is said in combination with the non-saying of what should have been said. This observation seems to justify two claims: (i) that conversational implicatures have different epistemic requirements depending on whether they arise in violative or non-violative contexts; (ii) that implicatures arising in non-violative contexts are more strongly tied to their generating assertion than those arising with maxim violation.


2020 ◽  
Vol 9 ◽  
Author(s):  
Paul Livingston

Working through Balaska’s deeply perceptive, elegantly written, and profoundly honest book, Wittgenstein and Lacan at the Limit, a reader steeped in the recent academic literature about either or both of its main figures may come to feel herself placed at what is, itself, a certain kind of limit. The limit I mean is the limit of a familiar type of theoretical discourse about the constitution and structure of language and subjectivity as Wittgenstein and Lacan treat them: it includes the discourses that seek, for instance, to articulate how language and sense are constituted in the Tractatus, and thus what is really meant by “logical form” and “nonsense” there; or those that aim to comprehend the true relationship of our biological nature to language, culture, and the advent of freedom in Lacan; or, again, those that try to find, in either thinker’s works (or both), the precise location of the delicate logical buttonhole that would alone permit us entry, from within everyday language and life, to the absoluteness of an ineffable beyond. These discourses all treat of language and life, but handle these phenomena (so we might say) at arm’s length, theorizing the structure of each and the form of their relationship in such a way as to establish, ultimately, their mutual convertibility to one another or their mutual absorption into a third, more inclusive term (such as “nature” or “biology”), or to adduce translations from the dense theoretical matrices of one thinker’s treatment of them to the other’s (for instance, from the terminology of logic to that of psychoanalysis, or back again).
Balaska’s book, doing none of these things, rather succeeds in bringing out how an interconnected reading of the Wittgenstein of the Tractatus and Lacan may speak to and inform our response to a certain kind of experience that is characteristic for both thinkers, and typical as well of those moments and occasions of our lives in which we may find ourselves drawn to reflect on what meaning is and how we relate to it. 


2019 ◽  
Vol 17 (2) ◽  
pp. 174-188
Author(s):  
A. V. Kosarev

After rhetoric lost its disciplinary specificity, the twentieth century saw a wave of renewed interest in it, expressed in the development of the study of argumentation as an independent field of knowledge. The rhetorical approach within the theory of argumentation was initiated by Ch. Perelman. He rejects the strict logical form of argument construction, since it does not take into account the goals, conditions, means and context of argumentation. He examines argumentation as a process of interaction between the orator and the audience, and identifies and analyzes the techniques that result in conviction. Ch. Perelman's main task is the improvement of communicative practices in society by justifying the indissoluble unity of the concepts of audience and argumentation. The specific features of the rhetorical theory of argumentation consist in the conception of argumentation as a unified network of arguments, a new understanding of the audience and its typology, a shift in the assessment of the quality of public communication from the orator to the audience, and the concepts of the starting point of the argument and the value of argument.


Author(s):  
Mary F. Scudder

Chapter 4 proposes a listening-centered alternative to empathy-based approaches to deliberation. The chapter begins by discussing how the concept of listening is used in everyday language and then introduces a more systematic “theory of listening acts.” Using the categories of speech act theory to identify corresponding categories of the listening act, the author distinguishes between auditory, perauditory, and ilauditory listening. With this listening act theory, the author shows that listening is more than simply hearing what is said (auditory listening). Similarly, listening should not be equated with the outcomes it brings about, including consensus or mutual understanding (perauditory listening outcomes). We also act in listening (ilauditory listening). In listening to our fellow citizens we enact the deliberative ideal, acknowledging that their perspectives are relevant to our collective judgements and decisions. The chapter shows that fair consideration is predicated on ilauditory listening, or what the author calls “performative democratic listening.”


Author(s):  
Oskari Kuusela

In the Introduction I made the bold claim that Wittgenstein transforms Frege’s and Russell’s logical and methodological ideas in a way that ‘can be justifiably described as a second revolution in philosophical methodology and the philosophy of logic, following Frege’s and Russell’s first revolution’. This claim was meant in a specific sense relating to the use of logical methods in philosophy, a discipline where we are often dealing with complex and messy concepts and phenomena, and having to clarify highly complicated and fluid uses of natural language. The situation is not quite the same in metamathematics, for example, and my claim was not intended to concern the employment of logical methods there, i.e. that Wittgenstein’s later philosophy of logic would constitute a revolution in this area too. For, while his later philosophy of logic has no difficulty explaining the possibility of the employment of calculi to clarify other calculi, in metamathematics there is perhaps no similarly pressing need for idealization as in philosophy, when we clarify complex concepts originating in ordinary language, since the targets of clarification in metamathematics are systems governed by strict rules themselves. Thus, this area of the employment of logical methods seems not as significantly affected. 
But I hope that my claim concerning the use of logical methods in philosophy can now be recognized as justified, or at least worth considering seriously, on the basis of what I have said about 1) the later Wittgenstein’s account of the status of logical clarificatory models, and how this explains the possibility of simple and exact logical descriptions, thus safeguarding the rigour of logic, 2) how his account of the function of logical models makes possible the recognition of the relevance of natural history for logic without compromising the non-empirical character of the discipline of logic, and 3) in the light of Wittgenstein’s introduction of new non-calculus-based logical methods for the purpose of philosophical clarification, such as his methods of grammatical rules, the method of language-games, and quasi-ethnology....


2021 ◽  
pp. 153-168
Author(s):  
Una Stojnić

This chapter develops a formal model of the context-sensitivity of modal discourse. Much like demonstrative pronouns, modals are prominence-sensitive, selecting the most prominent candidate interpretation. The prominence ranking of candidate interpretations is recorded in the conversational record, and is maintained through the effects of discourse conventions represented in the logical form of a discourse. In this way arguments are individuated as structured discourses that underwrite a particular propositional pattern. It is shown that such an account provably preserves classical logic. Further, this chapter argues that its model offers a more satisfactory account of the individuation of argument patterns in natural language discourse than the competing alternatives. Any adequate account, it is here argued, will have to take into account not just the contribution of individual sentences, but also of discourse conventions. Indeed, the contribution of discourse conventions is crucial for determining the contribution of individual sentences in the first place.


Author(s):  
Stephen Neale

Syntax (more loosely, ‘grammar’) is the study of the properties of expressions that distinguish them as members of different linguistic categories, and ‘well-formedness’, that is, the ways in which expressions belonging to these categories may be combined to form larger units. Typical syntactic categories include noun, verb and sentence. Syntactic properties have played an important role not only in the study of ‘natural’ languages (such as English or Urdu) but also in the study of logic and computation. For example, in symbolic logic, classes of well-formed formulas are specified without mentioning what formulas (or their parts) mean, or whether they are true or false; similarly, the operations of a computer can be fruitfully specified using only syntactic properties, a fact that has a bearing on the viability of computational theories of mind. The study of the syntax of natural language has taken on significance for philosophy in the twentieth century, partly because of the suspicion, voiced by Russell, Wittgenstein and the logical positivists, that philosophical problems often turned on misunderstandings of syntax (or the closely related notion of ‘logical form’). Moreover, an idea that has been fruitfully developed since the pioneering work of Frege is that a proper understanding of syntax offers an important basis for any understanding of semantics, since the meaning of a complex expression is compositional, that is, built up from the meanings of its parts as determined by syntax. In the mid-twentieth century, philosophical interest in the systematic study of the syntax of natural language was heightened by Noam Chomsky’s work on the nature of syntactic rules and on the innateness of mental structures specific to the acquisition (or growth) of grammatical knowledge. 
This work formalized traditional work on grammatical categories within an approach to the theory of computability, and also revived proposals of traditional philosophical rationalists that many twentieth-century empiricists had regarded as bankrupt. Chomskian theories of grammar have become the focus of most contemporary work on syntax.


Author(s):  
Reinhard Muskens

Type-logical semantics studies linguistic meaning with the help of the theory of types. The latter originated with Russell as an answer to the paradoxes, but has the additional virtue that it is very close to ordinary language. In fact, type theory is so much more similar to language than predicate logic is, that adopting it as a vehicle of representation can overcome the mismatches between grammatical form and predicate logical form that were observed by Frege and Russell. The grammatical forms of ordinary language sentences consequently may be taken to be much less misleading than logicians in the first half of the twentieth century often thought them to be. This was realized by Richard Montague, who used the theory of types to translate fragments of ordinary language into a logical language. Semantics is commonly divided into lexical semantics, which studies the meaning of words, and compositional semantics, which studies the way in which complex phrases obtain a meaning from their constituents. The strength of type-logical semantics lies with the latter, but type-logical theories can be combined with many competing hypotheses about lexical meaning, provided these hypotheses are expressed using the language of type theory.

