Philosophical Concepts that Influenced the Set-Theoretic Paradoxes Discovered by Bertrand Russell

Author(s):  
Felipe Maya Restrepo
Keyword(s):  
A Priori

In this article I examine some of Bertrand Russell's philosophical presuppositions underlying his discovery of the paradoxes in Cantor's set theory: Kant's a priori and synthetic judgments, Leibniz's logicism, the Hegelian worldview, and Saint Anselm's ontological argument.

Author(s):  
Howard Sankey

In The Problems of Philosophy, Bertrand Russell presents a justification of induction based on a principle he refers to as "the principle of induction." Owing to the ambiguity of the notion of probability, the principle of induction may be interpreted in two different ways. If interpreted in terms of the subjective interpretation of probability, the principle of induction may be known a priori to be true. But it is unclear how this should give us any confidence in our use of induction, since induction is applied to the external world outside our minds. If the principle is interpreted in terms of the objective interpretation of probability, it cannot be known to be true a priori, since it then applies to frequencies that occur in the world outside the mind, and these cannot be known without recourse to experience. Russell's principle of induction therefore fails to provide a satisfactory justification of induction.


Author(s):  
Vlatko Vedral

In our search for the ultimate law, P, that allows us to encode the whole of reality, we have come across a very fundamental obstacle. As Deutsch argued, P cannot be all-encompassing, simply because it cannot explain its own origins. We need a law more fundamental than P, from which P can be derived. But then this more fundamental law also needs to come from somewhere. This is like the metaphor of the painter in the lunatic asylum, who is trying to paint a picture of the garden he is sitting in. He can never find a way to completely include himself in the picture and gets caught in an infinite regression. Does this mean we can never understand the whole of reality? Maybe so, given that any postulate we start from needs its own explanation. Any law that underlies reality ultimately needs an a priori law. This puts us in a bit of a 'Catch-22' situation.

So, are we resigned to failure, or is there a way out? Is there some fundamental level at which events have no a priori causes, so that we can break the infinite regression? What does it mean for an event to have no a priori cause? It means that, even with all prior knowledge, we cannot infer that the event will take place. Furthermore, if there were genuinely acausal events in this Universe, this would imply a fundamentally random element of reality that cannot be reduced to anything deterministic.

This is a hugely controversial area, with proponents of religion, science, and philosophy holding sharply contrasting views. People often get very emotional over this question, as it has profound implications for us as human beings. Could it be that some events just don't have first causes? The British philosopher Bertrand Russell thought so. In Russell's famous debate with Reverend Copleston on the origin of the world, Copleston held that everything must have a cause, and therefore the world has a cause – and this cause is ultimately God himself.


Author(s):  
Christopher A. Shrock

Howard Robinson and Bertrand Russell challenge the treatment of secondary qualities as objective, causally relevant, physical properties on non-empirical grounds. Robinson argues that no combination of physical properties can account for the phenomenological aspects of secondary qualities. Russell, similarly, sees secondary qualities as knowable through acquaintance, unlike scientific properties. Again, the answer involves a sharp distinction between perceived properties and sensations.


2010
Vol 53 (2)
pp. 19-40
Author(s):  
Ivo Kara-Pesic

Alexius Meinong's Theory of Objects is undoubtedly one of the most interesting, but also one of the most contested, ontological theories. The main intent of the Austrian philosopher was to introduce an entirely new philosophical discipline which, in comparison to the traditional sciences, is not conditioned by what he called the prejudice in favor of the real (the actual, existing being). Since we are naturally oriented towards the real, in order to investigate something and attribute certain qualities to it, we must suppose that it exists. The non-real is, on this conception, a mere nothing, non-existent. Hence the need for a more encompassing theory of objects as such and of objects in their totality: not an overall science of the specific sciences, but an a priori science of the utmost generality and extension. In Meinong's own words, the Theory of Objects includes, by its apriority, the objects of mathematics, but it is at the same time more general than metaphysics, since the latter considers only the totality of the existing or real: its catalogue comprises a totality of objects of much smaller extension than the totality of the objects of knowledge. The first part of the paper gives a short introduction to the character and work of the Austrian thinker, with a review of the Theory of Objects and the famous debate with Bertrand Russell. The second part presents the neo-Meinongian discussion between the two main currents, eliminativism and realism.


Author(s):  
D. E. Luzzi
L. D. Marks
M. I. Buckett

As the HREM becomes increasingly used for the study of dynamic localized phenomena, the development of techniques to recover the desired information from a real image becomes important. Often the important features scatter only weakly in comparison to the matrix material and are, in addition, masked by statistical and amorphous noise. The desired information usually involves accurate knowledge of the position and intensity of the contrast. To extract this information from a complex image, cross-correlation (xcf) techniques can be utilized. Unlike other image-processing methods which rely on data massaging (e.g. high/low-pass filtering or Fourier filtering), the cross-correlation method is a rigorous data-reduction technique with no a priori assumptions. We have examined basic cross-correlation procedures using images of discrete Gaussian peaks and have developed an iterative procedure to greatly enhance the capabilities of these techniques when the contrast from the peaks overlaps.
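To make the idea concrete, here is a minimal sketch (not the authors' iterative procedure) of how cross-correlation with a zero-mean template locates a weak Gaussian peak in a noisy synthetic image; the image size, peak amplitude, and noise level are illustrative assumptions.

```python
import numpy as np
from scipy.signal import correlate2d

def gaussian_peak(size, sigma):
    """Discrete 2-D Gaussian peak of unit height, centred in a size x size patch."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    return np.exp(-(xx**2 + yy**2) / (2 * sigma**2))

# Synthetic test image: a weak peak at a known position, buried in noise.
rng = np.random.default_rng(0)
image = rng.normal(0.0, 0.3, size=(128, 128))   # statistical noise
template = gaussian_peak(15, sigma=2.0)
row, col = 40, 70                               # true peak position
image[row - 7:row + 8, col - 7:col + 8] += template

# Correlate with a zero-mean template so a flat background contributes nothing;
# the maximum of the cross-correlation map then estimates the peak position.
xcf = correlate2d(image, template - template.mean(), mode="same")
est = np.unravel_index(np.argmax(xcf), xcf.shape)
print("true:", (row, col), "estimated:", est)
```

Because the correlation sums the signal over the whole template while the noise averages out, the peak stands out in the xcf map even when it is barely visible per pixel; handling overlapping peaks is what the iterative procedure described above addresses.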


Author(s):  
H.S. von Harrach
D.E. Jesson
S.J. Pennycook

Phase-contrast TEM has been the leading technique for high-resolution imaging of materials for many years, whilst STEM has been the principal method for high-resolution microanalysis. However, it was demonstrated many years ago that low-angle dark-field STEM imaging is a priori capable of almost 50% higher point resolution than coherent bright-field imaging (i.e. phase-contrast TEM or STEM). This advantage was not exploited until Pennycook developed the high-angle annular dark-field (ADF) technique, which can provide an incoherent image showing both high image resolution and atomic-number contrast. This paper describes the design and first results of a 300 kV field-emission STEM (VG Microscopes HB603U) which has improved ADF STEM image resolution towards the 1 angstrom target. The instrument uses a cold field-emission gun, generating a 300 kV beam of up to 1 μA from an 11-stage accelerator. The beam is focussed on to the specimen by two condensers and a condenser-objective lens with a spherical aberration coefficient of 1.0 mm.
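The quoted resolution advantage follows from the standard point-resolution criteria, d ≈ 0.61 λ/α for coherent and d ≈ 0.43 λ/α for incoherent imaging. The sketch below applies these textbook Rayleigh-type coefficients at 300 kV; the 10 mrad aperture semi-angle is an illustrative assumption, not the HB603U's actual optics.

```python
import math

def electron_wavelength(volts):
    """Relativistic electron wavelength in metres for a given accelerating voltage."""
    h, m0 = 6.626e-34, 9.109e-31          # Planck constant, electron rest mass
    e, c = 1.602e-19, 2.998e8             # elementary charge, speed of light
    return h / math.sqrt(2 * m0 * e * volts * (1 + e * volts / (2 * m0 * c**2)))

lam = electron_wavelength(300e3)          # ~1.97 pm at 300 kV
alpha = 10e-3                             # assumed objective semi-angle, 10 mrad

d_coh = 0.61 * lam / alpha                # coherent (bright-field) point resolution
d_inc = 0.43 * lam / alpha                # incoherent (dark-field) point resolution

print(f"coherent  : {d_coh * 1e10:.2f} angstrom")
print(f"incoherent: {d_inc * 1e10:.2f} angstrom")
print(f"gain      : {d_coh / d_inc - 1:.0%}")   # ~42%, i.e. 'almost 50%'
```

At these assumed values the incoherent limit falls below 1 angstrom, consistent with the 1 angstrom target mentioned above.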


2019
Vol 4 (5)
pp. 878-892
Author(s):  
Joseph A. Napoli
Linda D. Vallino

Purpose The 2 most commonly used operations to treat velopharyngeal inadequacy (VPI) are superiorly based pharyngeal flap and sphincter pharyngoplasty, both of which may result in hyponasal speech and airway obstruction. The purpose of this article is to (a) describe the bilateral buccal flap revision palatoplasty (BBFRP) as an alternative technique to manage VPI while minimizing these risks and (b) conduct a systematic review of the evidence of BBFRP on speech and other clinical outcomes. A report comparing the speech of a child with hypernasality before and after BBFRP is presented.

Method A review of databases was conducted for studies of buccal flaps to treat VPI. Using the principles of a systematic review, the articles were read, and data were abstracted for study characteristics that were developed a priori. With respect to the case report, speech and instrumental data from a child with repaired cleft lip and palate and hypernasal speech were collected and analyzed before and after surgery.

Results Eight articles were included in the analysis. The results were positive, and the evidence is in favor of BBFRP in improving velopharyngeal function, while minimizing the risk of hyponasal speech and obstructive sleep apnea. Before surgery, the child's speech was characterized by moderate hypernasality, and after surgery, it was judged to be within normal limits.

Conclusion Based on clinical experience and results from the systematic review, there is sufficient evidence that the buccal flap is effective in improving resonance and minimizing obstructive sleep apnea. We recommend BBFRP as another approach in selected patients to manage VPI. Supplemental Material https://doi.org/10.23641/asha.9919352


Addiction
1997
Vol 92 (12)
pp. 1671-1698
Author(s):  
Project Match Research Group
Keyword(s):  
A Priori

Diagnostica
2002
Vol 48 (3)
pp. 115-120
Author(s):  
Stefan Troche
Beatrice Rammstedt
Thomas Rammsayer
Keyword(s):
A Priori

The increasing use of computer-based diagnostic instruments inevitably raises the question of equivalence between conventional paper-and-pencil versions and their computerized counterparts. To test the equivalence of the computer-based version of the Leistungsprüfsystem (LPS) in the Hogrefe Testsystem against the paper-and-pencil version, 131 participants were tested with both instruments. Heterogeneous results between the paper-and-pencil and the computer version demonstrate that the equivalence of the two versions cannot be assumed a priori, and they point emphatically to the necessity of systematic equivalence testing. A check of the retest reliability of the computer-based version of the LPS, carried out on a second sample of 40 participants with a retest interval of two weeks, yielded reliability coefficients between rtt = 0.55 and rtt = 0.94. The discussion identifies possible reasons for the non-equivalence of the two LPS versions.
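For readers unfamiliar with the statistic, rtt is simply the correlation between two administrations of the same test on the same sample. A minimal sketch with purely synthetic data follows; the study's actual subscale scores are not reproduced here.

```python
import numpy as np

def retest_reliability(t1, t2):
    """Retest reliability r_tt: Pearson correlation between two administrations."""
    return np.corrcoef(t1, t2)[0, 1]

# Synthetic example: 40 test-takers, two administrations two weeks apart.
rng = np.random.default_rng(1)
ability = rng.normal(100, 15, size=40)             # stable trait
t1 = ability + rng.normal(0, 5, size=40)           # administration 1 + error
t2 = ability + rng.normal(0, 5, size=40)           # administration 2 + error

print(f"r_tt = {retest_reliability(t1, t2):.2f}")  # high, since error is small
```

The larger the measurement error relative to true-score variance, the lower rtt falls, which is one way a coefficient as low as 0.55 can arise.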

