The Use and Abuse of Imagination: A Reply to Samuel A. Bleicher

1971 ◽  
Vol 25 (4) ◽  
pp. 953-957
Author(s):  
Michael D. Wallace ◽  
J. David Singer

It has often been observed that recent converts to any belief system tend to be among its most zealous adherents, and science (despite its emphasis on objectivity and detachment) has proved no exception. As the canons of scientific inquiry begin to take hold in each field of human knowledge, there have appeared those who seem, as it were, more royalist than the king. For these scholars the rules of scientific inference are not guidelines to be used with care but dogmas to be pursued unswervingly; to them science is not, as someone once expressed it, “attenuated common sense” but a totally different and rather severe regimen of thought.

Author(s):  
JOHN BELL ◽  
ZHISHENG HUANG

In this paper we present a formal common sense theory of the adoption of perception-based beliefs. We begin with a logical analysis of perception and then consider when perception should lead to belief change. Our theory is intended to apply to perception in humans and to perception in artificial agents at the level of the symbolic interface between a vision system and a belief system. In order to provide a context for our work we relate it to the emerging field of cognitive robotics, give an abstract architecture for an agent which is both embodied and capable of reasoning, and relate this to the concrete architectures of two vision-based surveillance systems.


2001 ◽  
Vol 18 (2) ◽  
pp. 177-217
Author(s):  
David Sidorsky

The search for moral objectivity has been constant throughout the history of philosophy, although interpretations of the nature and scope of objectivity have varied. One aim of the pursuit of moral objectivity has been the demonstration of what may be termed its epistemological thesis, that is, the claim that the truth of assertions of the goodness or rightness of moral acts is as legitimate, reliable, or valid as the truth of assertions involving other forms of human knowledge, such as common sense, practical expertise, science, or mathematics. Another aim of the quest for moral objectivity may be termed its pragmatic formulation; this refers to the development of a method or procedure that will mediate among conflicting moral views in order to realize a convergence or justified agreement about warranted or true moral conclusions. In the ethical theories of Aristotle, David Hume, and John Dewey, theories that represent three of the four variants of ethical naturalism (defined below) that are surveyed in this essay, the epistemological thesis and the pragmatic formulation are integrated or combined. The distinction between these two elements is significant for the present essay, however, since I want to show that linguistic naturalism, the fourth variant I shall examine, has provided a demonstration of the epistemological thesis about moral knowledge, even if the pragmatic formulation has not been successfully realized.


2016 ◽  
Vol 7 (1) ◽  
pp. 184-198
Author(s):  
Anna Malitowska ◽  
Mateusz Bonecki

This paper focuses on an analysis of the relation between the pedagogical and epistemological ideas of John Dewey. Our considerations are divided into four sections. (1) We reconstruct Dewey’s conception of culture as a body of normative and regulative common sense beliefs determining human conduct and language use. (2) Further, we compare common-sense-based inquiry with its scientific mode with regard to their respective conceptual frameworks, in order to show that the “theoretical-scientific” perspective provides more comprehensive insight into the relations constituting problem situations. (3) We identify informal education with socialization processes and argue that the educational process relies on constant reflection on cultural habits. (4) We conclude that the competences of using theoretical conceptual frameworks and of conducting scientific inquiry play a crucial role in Dewey’s educational ideology of progressivism, since they provide basic tools for the critical reconsideration and revision of common sense beliefs.


Author(s):  
Ian Tipton

George Berkeley, who was born in Ireland and who eventually became Bishop of Cloyne, is best known for three works that he published while still very young: An Essay towards a New Theory of Vision (1709), Three Dialogues between Hylas and Philonous (1713), and in particular for A Treatise concerning the Principles of Human Knowledge (1710). In the Principles he argues for the striking claim that there is no external, material world; that houses, trees and the like are simply collections of ‘ideas’; and that it is God who produces ‘ideas’ or ‘sensations’ in our minds. The New Theory of Vision had gone some way towards preparing the ground for this claim (although that work has interest and value in its own right), and the Dialogues represent Berkeley’s second attempt to defend it. Other works were to follow, including De Motu (1721), Alciphron (1732) and Siris (1744), but the three early works established Berkeley as one of the major figures in the history of modern philosophy. The basic thesis was certainly striking, and from the start many were tempted to dismiss it outright as so outrageous that even Berkeley himself could not have taken it seriously. In fact, however, Berkeley was very serious, and certainly a very able philosopher. Writing at a time when rapid developments in science appeared to be offering the key to understanding the true nature of the material world and its operations, but when scepticism about the very existence of the material world was also on the philosophical agenda, Berkeley believed that ‘immaterialism’ offered the only hope of defeating scepticism and of understanding the status of scientific explanations. Nor would he accept that his denial of ‘matter’ was outrageous. Indeed, he held that, if properly understood, he would be seen as defending the views of ‘the vulgar’ or ‘the Mob’ against other philosophers, including Locke, whose views posed a threat to much that we would ordinarily take to be common sense. 
His metaphysics cannot be understood unless we see clearly how he could put this interpretation on it; and neither will we do it justice if we simply dismiss the role he gives to God as emerging from the piety of a future bishop. Religion was under threat; Berkeley can probably be judged prescient in seeing how attractive atheism could become, given the scientific revolution of which we are the heirs; and though it could hardly be claimed that his attempts to ward off the challenge were successful, they merit respectful attention. Whether, however, we see him as the proponent of a fascinating metaphysics about which we must make up our own minds, or as representing merely one stage in the philosophical debate that takes us from Descartes to Locke and then to Hume, Kant and beyond, we must recognize Berkeley as a powerful intellect who had an important contribution to make.


2014 ◽  
Vol 4 (1) ◽  
pp. 21-35
Author(s):  
Stephen Maitzen

For at least several decades, and arguably since the time of Descartes, it has been fashionable to offer scientific or quasi-scientific arguments for skepticism about human knowledge. I critique five attempts to argue for skeptical conclusions from the findings of science and scientifically informed common sense.


2004 ◽  
Vol 20 (2) ◽  
pp. 166-187 ◽  
Author(s):  
ZhaoHong Han

The construct of the native speaker is germane to second language acquisition (SLA) research; it underlies, and permeates, a significant bulk of SLA theory construction and empirical research. Nevertheless, it is one of the least investigated (and, for that matter, least understood) concepts in the field. Even a cursory reading of the major SLA literature would not yield one readily available definition that captures the essential uses that have been made of the concept, including, but not limited to, setting the native speaker as a goal or a model for SLA, or using the native speaker as a yardstick to measure second language knowledge. As it stands, the concept remains assumed, based on common sense observation and intuition, rather than exposed to scientific inquiry. In this article I would like to draw attention to this pivotal yet much neglected concept by reviewing Davies (1991; 2003) on the native speaker. Though by no means an exhaustive account, the books outline the principal parameters for considering the native speaker concept, thereby providing a useful basis for further inquiry.


Philosophy ◽  
2002 ◽  
Vol 77 (3) ◽  
pp. 349-373 ◽  
Author(s):  
Wai-hung Wong
Insulation is a noticeable phenomenon in the case of most non-Pyrrhonian sceptics about human knowledge. A sceptic is experiencing insulation when his scepticism does not have any effect on his common sense beliefs, and his common sense beliefs do not have any effect on his scepticism. I try to show why this is a puzzling phenomenon, and how it can be explained. It is puzzling because insulation seems to require blindness to one's own epistemic irresponsibility and irrationality, while the sceptic presumably cares a lot about being epistemically responsible and rational. Insulation can be explained by means of a notion of philosophical detachment: to be detached from one's own beliefs about the world is to take an other-personal position towards those beliefs, treating them as if they are another person's beliefs. The sceptic's common sense beliefs are insulated from his scepticism because he cannot be detached from his beliefs about the world when he is engaging in everyday, practical activities. I conclude the paper with a brief discussion of the generality of the problem of insulation.


1981 ◽  
Vol 4 (2) ◽  
pp. 67-90 ◽  
Author(s):  
Haldur Õim

The paper touches upon the relations between grammar-oriented and communication-oriented ways of analysing language in the light of some recent related developments in theoretical linguistics, cognitive psychology and artificial intelligence. The integration of (linguistic) meaning, situational interpretation and background knowledge is discussed. In particular, different types of linguistically relevant common-sense knowledge and higher level cognitive procedures are considered.


Author(s):  
Joyce Appleby ◽  
Elizabeth Covington ◽  
David Hoyt ◽  
Michael Latham ◽  
Allison Sneider

I argued in my previous paper that the opinions (1) that all inference beyond the immediate data of experience is meaningless, and (2) that the whole of scientific knowledge can be established independently of experience, are both logically tenable, at a price, but that neither corresponds with ordinary scientific or common-sense belief. It was obvious that Fisher would be the first to agree with me in rejecting the second alternative; his attitude to the first was less clear. I also maintained that we need a theory of scientific inference that will agree with ordinary beliefs about its validity, and that any such theory would involve as an a priori element the notion of probability and some of the fundamental rules for its assessment. By their very nature these rules cannot be established by experience: they must be judged by their plausibility, the internal consistency of the theory based on them, and the agreement or otherwise of the results with general belief. Fisher objects to the introduction of an a priori element, and I should agree with him to the extent that a priori hypotheses should be reduced to a minimum, but that minimum must be sufficient to give a general theory. I was originally somewhat attracted by the wish to define probability in terms of frequency, but found that the existing theory of Venn failed in its objects. It avoided no a priori hypothesis, several having been used but not stated, and its results, when interpreted in terms of the definition, were not in a practically applicable form. As the arguments have already been published twice, I do not repeat them. Fisher departs from Venn by defining a probability as the ratio of two infinite numbers; but then no probability would have a definite value. Later in this paper, however, he obtains definite values for probabilities, and it is not clear how he gets them. (At this stage he generally uses the word “frequency” in place of “probability,” but I think he is treating them as synonymous.) 
There is a gap in his argument at this point; but the results, relating to the probability of a set of observations given the hypothesis, agree in all cases with those of the a priori theory, on the supposition, presumably valid, that the probability of any particular observation is determined by the constants of the assumed law of distribution alone, and is not disturbed by the previous observations.

