The Doctrine of Ideality and the A Priori in the Logical Investigations

Author(s):  
Daniele De Santis
2015 ◽  
Vol 2 (1) ◽  
pp. 90-107
ABOUT THE LIMITS AND SCOPE OF INTERPRETATION: REFLECTIONS FROM HEIDEGGER, HUSSERL AND WITTGENSTEIN

Author(s):  
Luciana de Souza Gracioso ◽  
Lourival Pereira Pinto

Abstract: The expanded and reconfigured flow of information today suggests new conditions under which the acts of reading and interpretation can be established. In order to reflect on the implications involved in these acts, we develop a partial dialectical examination of three works - Being and Time, by M. Heidegger; Logical Investigations, Sixth Investigation: Elements of a Phenomenological Elucidation of Knowledge, by E. Husserl; and Philosophical Investigations, by L. Wittgenstein - which, although they did not aim directly at analyzing reading and interpretation, seem to offer elements for reflecting on these phenomena in the contemporary world. In closing, we argue for interpretation as an a priori social action and comment briefly on the configuration of current information systems in light of that condition.


2020 ◽  
Vol 60 (4) ◽  
pp. 449-471
Author(s):  
Stathis Livadas ◽  

Phenomenology can be roughly described as the theory of the pure essences of phenomena. Yet the meaning of essence and of concepts traditionally tied to it (such as the concepts of the a priori and of essential necessity) are far from settled. This is especially true given the impact modern science has had on established philosophical views and the need for revisiting certain core notions of philosophy. In this paper I intend to review Husserl's view on thingness-essence and his conception of the essence of individuals, based mainly on his writings from the time of the Logical Investigations, Ideas, and later Experience and Judgment. Taking account of the work of Lothar Eley in Die Krise des Apriori, among others, I will inquire into the ways in which phenomenology may undermine (one could even say fully "destroy") the view of essences as non-factual, as well as undermine their ontological priority. Doing so may help to shape a conception of material or formal individual essences, and generally of essences as concrete objects of experience in virtue of well-defined epistemic ones.


Author(s):  
Françoise Dastur ◽  
Robert Vallier

This chapter examines Edmund Husserl's philosophical reflections on pure logical grammar. When we talk about the meaning and genealogy of the notion of a "philosophical grammar," the fourth of Husserl's Logical Investigations comes to mind. The central idea of the fourth Investigation is that all formal logic in the current sense, that is, the logic of validity, presupposes a logic of meaning that prevents non-sense and that is not concerned with objective validity, but instead only with the a priori laws that establish the conditions for the unity of meaning. In the first Investigation, Husserl proposes to establish the juncture between pure logic and language. The chapter suggests that the theme of the fourth Investigation is not the edification of a universal grammar, but rather of a pure grammar capable of serving as a foundation for logic.


Author(s):  
D. E. Luzzi ◽  
L. D. Marks ◽  
M. I. Buckett

As the HREM becomes increasingly used for the study of dynamic localized phenomena, the development of techniques to recover the desired information from a real image is important. Often the important features scatter only weakly in comparison to the matrix material and are further masked by statistical and amorphous noise. The desired information usually involves accurate knowledge of the position and intensity of the contrast. To decipher such information from a complex image, cross-correlation (xcf) techniques can be utilized. Unlike other image processing methods that rely on data massaging (e.g., high/low-pass filtering or Fourier filtering), cross-correlation is a rigorous data reduction technique with no a priori assumptions. We have examined basic cross-correlation procedures using images of discrete Gaussian peaks and have developed an iterative procedure to greatly enhance the capabilities of these techniques when the contrast from the peaks overlaps.
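The peak-location step described above can be sketched in a few lines. The following is a minimal 1-D analogue, not the authors' implementation: a weak Gaussian peak buried in noise is located by cross-correlating the data with a mean-removed template of the expected peak shape (all parameter values here are illustrative).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D analogue: a weak Gaussian peak buried in noise.
x = np.arange(256)
true_pos = 170.0
sigma = 4.0
peak = 0.5 * np.exp(-(x - true_pos) ** 2 / (2 * sigma ** 2))
noisy = peak + rng.normal(0.0, 0.2, size=x.size)

# Template: the expected peak shape, centred in its own window.
t = np.arange(-16, 17)
template = np.exp(-t ** 2 / (2 * sigma ** 2))
template -= template.mean()          # remove DC so a flat background averages out

# Cross-correlation; the argmax estimates the peak position.
xcf = np.correlate(noisy - noisy.mean(), template, mode="same")
est_pos = int(np.argmax(xcf))
print(est_pos)   # close to the true position, 170
```

Because the template is symmetric and mean-removed, the argmax of the correlation estimates the peak position directly even when the peak is well below the noise floor of a single pixel; the same idea extends to 2-D images via, e.g., scipy.signal.correlate2d.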


Author(s):  
H.S. von Harrach ◽  
D.E. Jesson ◽  
S.J. Pennycook

Phase contrast TEM has been the leading technique for high-resolution imaging of materials for many years, whilst STEM has been the principal method for high-resolution microanalysis. However, it was demonstrated many years ago that low-angle dark-field STEM imaging is a priori capable of almost 50% higher point resolution than coherent bright-field imaging (i.e., phase contrast TEM or STEM). This advantage was not exploited until Pennycook developed the high-angle annular dark-field (ADF) technique, which can provide an incoherent image showing both high image resolution and atomic number contrast. This paper describes the design and first results of a 300 kV field-emission STEM (VG Microscopes HB603U) which has improved ADF STEM image resolution towards the 1 angstrom target. The instrument uses a cold field-emission gun, generating a 300 kV beam of up to 1 μA from an 11-stage accelerator. The beam is focused onto the specimen by two condensers and a condenser-objective lens with a spherical aberration coefficient of 1.0 mm.


2019 ◽  
Vol 4 (5) ◽  
pp. 878-892
Author(s):  
Joseph A. Napoli ◽  
Linda D. Vallino

Purpose: The 2 most commonly used operations to treat velopharyngeal inadequacy (VPI) are superiorly based pharyngeal flap and sphincter pharyngoplasty, both of which may result in hyponasal speech and airway obstruction. The purpose of this article is to (a) describe the bilateral buccal flap revision palatoplasty (BBFRP) as an alternative technique to manage VPI while minimizing these risks and (b) conduct a systematic review of the evidence of BBFRP on speech and other clinical outcomes. A report comparing the speech of a child with hypernasality before and after BBFRP is presented.
Method: A review of databases was conducted for studies of buccal flaps to treat VPI. Using the principles of a systematic review, the articles were read, and data were abstracted for study characteristics that were developed a priori. With respect to the case report, speech and instrumental data from a child with repaired cleft lip and palate and hypernasal speech were collected and analyzed before and after surgery.
Results: Eight articles were included in the analysis. The results were positive, and the evidence is in favor of BBFRP in improving velopharyngeal function, while minimizing the risk of hyponasal speech and obstructive sleep apnea. Before surgery, the child's speech was characterized by moderate hypernasality, and after surgery, it was judged to be within normal limits.
Conclusion: Based on clinical experience and results from the systematic review, there is sufficient evidence that the buccal flap is effective in improving resonance and minimizing obstructive sleep apnea. We recommend BBFRP as another approach in selected patients to manage VPI.
Supplemental Material: https://doi.org/10.23641/asha.9919352


Addiction ◽  
1997 ◽  
Vol 92 (12) ◽  
pp. 1671-1698 ◽  
Author(s):  
Project MATCH Research Group
Keyword(s):  
A Priori ◽  

Diagnostica ◽  
2002 ◽  
Vol 48 (3) ◽  
pp. 115-120 ◽  
Author(s):  
Stefan Troche ◽  
Beatrice Rammstedt ◽  
Thomas Rammsayer
Keyword(s):  

Abstract. The increasing use of computer-based diagnostic instruments inevitably raises the question of equivalence between conventional paper-and-pencil versions and their computerized counterparts. To test the equivalence between the computerized version of the Leistungsprüfsystem (LPS) in the Hogrefe Testsystem and the paper-and-pencil version, 131 participants were tested with both instruments. Heterogeneous results between the paper-and-pencil and the computerized version show that the equivalence of the two versions cannot be assumed a priori, and they emphatically underline the need for systematic equivalence testing. An examination of the retest reliability of the computerized LPS, carried out on a second sample of 40 participants with a retest interval of two weeks, yielded reliability coefficients between rtt = 0.55 and rtt = 0.94. Possible reasons for the non-equivalence of the two LPS versions are discussed.
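The retest reliability coefficients rtt reported above are Pearson correlations between two administrations of the same test to the same persons. A minimal sketch with simulated scores (the sample size matches the abstract's second sample; the score scale and error magnitudes are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated scores for 40 persons tested twice, two weeks apart:
# a stable trait plus independent measurement error at each time point.
n = 40
true_score = rng.normal(100, 15, size=n)
test1 = true_score + rng.normal(0, 5, size=n)   # first administration
test2 = true_score + rng.normal(0, 5, size=n)   # retest after two weeks

# Retest reliability r_tt: Pearson correlation of the two administrations.
r_tt = np.corrcoef(test1, test2)[0, 1]
print(f"r_tt = {r_tt:.2f}")   # high here, since trait variance dominates error
```

In this model rtt shrinks as the error variance grows relative to the trait variance, which is one way to read the wide 0.55-0.94 range across LPS subtests.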

