CAN THE MATHEMATICAL STRUCTURE OF SPACE BE KNOWN A PRIORI? A TALE OF TWO POSTULATES

2014 ◽  
Author(s):  
Edwin Mares


Mathematics ◽
2020 ◽  
Vol 8 (5) ◽  
pp. 799 ◽  
Author(s):  
Won-Kwang Park

It is well known that subspace migration is a stable and effective non-iterative imaging technique for inverse scattering problems. For it to be applied properly, however, a priori information about the shape of the target must be estimated; without it, subspace migration does not retrieve good results. In this paper, we identify the mathematical structure of single- and multi-frequency subspace migration without any a priori information about the unknown targets and explore certain of its properties. The analysis rests on the fact that the elements of the so-called multi-static response (MSR) matrix can be represented by an asymptotic expansion formula. Based on the structure thus examined, we then improve subspace migration and consider its multi-frequency version. Various numerical simulations with noisy data support our investigation.
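As a rough illustration of the kind of computation involved (not the paper's exact formulation), single-frequency subspace migration can be read as: take the SVD of the MSR matrix, keep the singular vectors spanning the signal subspace, and project a test vector onto them at each search point. The plane-wave test vector, its normalization, and the assumed signal-subspace dimension below are all illustrative assumptions.

```python
import numpy as np

def subspace_migration(K, directions, k, grid, n_signal):
    """Hedged sketch of a single-frequency subspace-migration imaging function.

    K          : (N, N) multi-static response (MSR) matrix
    directions : (N, 2) unit observation/incidence directions
    k          : wavenumber
    grid       : (P, 2) search points x
    n_signal   : assumed dimension of the signal subspace
    """
    # Signal subspace from the SVD of the MSR matrix.
    U, _, Vh = np.linalg.svd(K)
    U, V = U[:, :n_signal], Vh.conj().T[:, :n_signal]

    image = np.zeros(len(grid))
    for p, x in enumerate(grid):
        # Illustrative plane-wave test vector W(x), normalized.
        W = np.exp(1j * k * directions @ x)
        W /= np.linalg.norm(W)
        # Magnitude of the summed projections onto the signal subspace.
        image[p] = np.abs(np.sum((W.conj() @ U) * (W.conj() @ V)))
    return image
```

In practice n_signal would be chosen from the singular-value spectrum of K, which is exactly where the a priori information discussed in the abstract enters.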


1982 ◽  
Vol 47 (3) ◽  
pp. 495-548 ◽  
Author(s):  
Haim Gaifman ◽  
Marc Snir

The basic concept underlying probability theory and statistics is a function assigning numerical values (probabilities) to events. An “event” in this context is any conceivable state of affairs, including the so-called “empty event”—an a priori impossible state. Informally, events are described in everyday language (e.g., “by playing this strategy I shall win $1000 before going broke”). But in the current mathematical framework (first proposed by Kolmogoroff [Ko 1]) they are identified with subsets of some all-inclusive set Q. The family of all events constitutes a field, or σ-field, and the logical connectives ‘and’, ‘or’ and ‘not’ are translated into the set-theoretic operations of intersection, union and complementation. The points of Q can be regarded as possible worlds, and an event as the set of all worlds in which it takes place. The concept of a field of sets is wide enough to accommodate all cases and to allow for a general abstract foundation of the theory. On the other hand, it does not reflect distinctions that arise out of the linguistic structure that goes into the description of our events. Since events are always described in some language, they can be identified with the sentences that describe them, and the probability function can be regarded as an assignment of values to sentences. The extensive accumulated knowledge concerning formal languages makes such a project feasible. The study of probability functions defined over the sentences of a rich enough formal language yields interesting insights in more than one direction.

Our present approach is not an alternative to the accepted Kolmogoroff axiomatics. In fact, given some formal language L, we can consider a rich enough set, say Q, of models for L (also called “worlds” in this work) and associate with every sentence the set of all worlds in Q in which the sentence is true. Thus our probabilities can also be considered as measures over some field of sets. But the introduction of the language adds mathematical structure and makes for distinctions expressing basic intuitions that cannot otherwise be expressed. As an example we mention the concept of a random sequence or, more generally, a random world: a world which is typical of a certain probability distribution.
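The identification of events with sentences can be made concrete in a few lines. The toy propositional language, the world set, and the uniform measure below are illustrative assumptions, not the authors' construction; the point is only that connectives translate into set operations and that a probability on sentences is a measure on the corresponding events.

```python
from itertools import product

# A toy language with two atomic sentences; a world assigns each a truth value.
atoms = ("p", "q")
Q = [dict(zip(atoms, vals)) for vals in product([True, False], repeat=len(atoms))]

def event(sentence):
    """The event of a sentence: the set of worlds in Q where it is true."""
    return frozenset(i for i, w in enumerate(Q) if sentence(w))

def P(sentence):
    """Probability of a sentence as the (here uniform) measure of its event."""
    return len(event(sentence)) / len(Q)

p = lambda w: w["p"]
q = lambda w: w["q"]

# 'and' becomes intersection of events, and P obeys the usual calculus.
assert event(lambda w: p(w) and q(w)) == event(p) & event(q)
assert P(lambda w: p(w) or q(w)) == P(p) + P(q) - P(lambda w: p(w) and q(w))
```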


PeerJ ◽  
2019 ◽  
Vol 7 ◽  
pp. e8127
Author(s):  
Evangelos Vlachos

Background: In order to designate the various concepts of taxa in biology, evolution and paleontology, scientists have developed rules on how to create unique names for taxa. Different Codes of Nomenclature have been developed for animals, plants, fungi, bacteria, etc., with standard sets of Rules that govern the formation, publication and application of the nomina of extant and extinct species. These Codes are the result of decades of discussions, workshops, publications and revisions, and their structure and complexity have been criticized many times by zoologists. This project aims, using the International Code of Zoological Nomenclature as a case study, to show that the structure of these Codes is better reflected and understood as a network.

Methods: The majority of the text of the Code was divided into hundreds of Nodes of different types, connected to each other by different types of Edges to form a network. The various mathematical descriptors of the entire system, as well as of the elements of the network, were conceptually framed to help describe and understand the Code as a network.

Results: The network of the Code comprises 1,379 Nodes connected by 11,276 Edges. The structure of the Code can be accurately described as a network, a mathematical structure better suited than any linear text publication to reflect it.

Discussion: Thinking of the Code as a network allows a better, more in-depth understanding of the Code itself: the user can navigate it more efficiently and can depict and analyze the implied connections between its various parts that are not immediately visible. The network of the Code is an open-access tool that could also help in teaching, using and disseminating the Code. More importantly, it is a powerful tool for identifying a priori the parts of the Code that could be affected by upcoming amendments and revisions. This kind of analysis is not limited to nomenclature; it could be applied to other fields that use complex textbooks with long editing histories, such as Law, Medicine and Linguistics.
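A minimal sketch of the kind of network analysis the paper describes, using networkx; the node names and cross-references below are invented placeholders, not the actual 1,379-node network of the Code.

```python
import networkx as nx

# Toy fragment: Nodes are provisions/glossary entries, Edges are the
# cross-references between them (placeholders, not the real Code network).
G = nx.DiGraph()
G.add_edges_from([
    ("Art. 16 (new names)", "Art. 11 (availability)"),
    ("Art. 16 (new names)", "Glossary: published work"),
    ("Art. 11 (availability)", "Art. 8 (publication)"),
    ("Art. 8 (publication)", "Glossary: published work"),
])

# Which parts could an amendment to Art. 8 affect? Every provision that
# depends on it, i.e. every node from which Art. 8 is reachable.
print(sorted(nx.ancestors(G, "Art. 8 (publication)")))

# Centrality flags the provisions that most other parts rely on.
print(nx.in_degree_centrality(G))
```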


1954 ◽  
Vol 48 (3) ◽  
pp. 787-792 ◽  
Author(s):  
L. S. Shapley ◽  
Martin Shubik

In the following paper we offer a method for the a priori evaluation of the division of power among the various bodies and members of a legislature or committee system. The method is based on a technique of the mathematical theory of games, applied to what are known there as “simple games” and “weighted majority games.” We apply it here to a number of illustrative cases, including the United States Congress, and discuss some of its formal properties.

The designing of the size and type of a legislative body is a process that may continue for many years, with frequent revisions and modifications aimed at reflecting changes in the social structure of the country; we may cite the role of the House of Lords in England as an example. The effect of a revision usually cannot be gauged in advance except in the roughest terms; it can easily happen that the mathematical structure of a voting system conceals a bias in power distribution unsuspected and unintended by the authors of the revision. How, for example, is one to predict the degree of protection which a proposed system affords to minority interests? Can a consistent criterion for “fair representation” be found? It is difficult even to describe the net effect of a double representation system such as is found in the U.S. Congress (i.e., by states and by population) without attempting to deduce it a priori. The method of measuring “power” which we present in this paper is intended as a first step in the attack on these problems.
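The measure proposed here is now known as the Shapley–Shubik power index: over all orderings in which voters might join a coalition, it counts how often each voter is pivotal, i.e. turns a losing coalition into a winning one. A brute-force sketch for a small weighted majority game (the quota and weights below are illustrative):

```python
from fractions import Fraction
from itertools import permutations

def shapley_shubik(weights, quota):
    """Shapley-Shubik power index by enumerating all voter orderings.

    A voter is pivotal in an ordering if the running total of weights
    first reaches the quota when that voter joins.
    """
    n = len(weights)
    pivots = [0] * n
    for order in permutations(range(n)):
        total = 0
        for voter in order:
            total += weights[voter]
            if total >= quota:
                pivots[voter] += 1
                break
    total_orderings = sum(pivots)  # one pivot per ordering, so this is n!
    return [Fraction(c, total_orderings) for c in pivots]

# Weighted majority game [4; 3, 2, 1, 1]:
print(shapley_shubik([3, 2, 1, 1], quota=4))  # [1/2, 1/6, 1/6, 1/6]
```

The example already shows the effect the authors warn about: a voter holding 3 of the 7 votes commands 1/2 of the power under this measure, a bias that the raw weights conceal.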


Author(s):  
D. E. Luzzi ◽  
L. D. Marks ◽  
M. I. Buckett

As the HREM becomes increasingly used for the study of dynamic localized phenomena, the development of techniques to recover the desired information from a real image is important. Often the important features scatter only weakly in comparison with the matrix material, in addition to being masked by statistical and amorphous noise. The desired information usually involves accurate knowledge of the position and intensity of the contrast. To decipher this information from a complex image, cross-correlation (xcf) techniques can be utilized. Unlike other image-processing methods that rely on data massaging (e.g., high/low-pass filtering or Fourier filtering), the cross-correlation method is a rigorous data-reduction technique with no a priori assumptions. We have examined basic cross-correlation procedures using images of discrete Gaussian peaks and have developed an iterative procedure to greatly enhance the capabilities of these techniques when the contrast from the peaks overlaps.
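A minimal sketch of the basic idea (template cross-correlation locating a weak Gaussian peak in a noisy image); the peak shape, noise level and scipy-based implementation are illustrative, and the authors' iterative refinement for overlapping peaks is not reproduced here.

```python
import numpy as np
from scipy.signal import correlate2d

def gaussian_peak(size, sigma):
    """Discrete 2-D Gaussian template, centered."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    return np.exp(-(xx**2 + yy**2) / (2 * sigma**2))

rng = np.random.default_rng(0)
image = rng.normal(0.0, 0.3, (128, 128))            # statistical noise
image[40:49, 70:79] += 0.8 * gaussian_peak(9, 2.0)  # weak peak centered at (44, 74)

# Cross-correlate with the known peak shape: a data-reduction step with
# no filtering and no assumptions beyond the template itself.
template = gaussian_peak(9, 2.0)
xcf = correlate2d(image, template, mode="same")
print(np.unravel_index(np.argmax(xcf), xcf.shape))  # recovers ~(44, 74)
```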


Author(s):  
H.S. von Harrach ◽  
D.E. Jesson ◽  
S.J. Pennycook

Phase-contrast TEM has been the leading technique for high-resolution imaging of materials for many years, whilst STEM has been the principal method for high-resolution microanalysis. However, it was demonstrated many years ago that low-angle dark-field STEM imaging is a priori capable of almost 50% higher point resolution than coherent bright-field imaging (i.e., phase-contrast TEM or STEM). This advantage was not exploited until Pennycook developed the high-angle annular dark-field (ADF) technique, which can provide an incoherent image showing both high image resolution and atomic-number contrast.

This paper describes the design and first results of a 300 kV field-emission STEM (VG Microscopes HB603U) which has improved ADF STEM image resolution towards the 1 Å target. The instrument uses a cold field-emission gun, generating a 300 kV beam of up to 1 μA from an 11-stage accelerator. The beam is focussed on to the specimen by two condensers and a condenser-objective lens with a spherical aberration coefficient of 1.0 mm.


2019 ◽  
Vol 4 (5) ◽  
pp. 878-892
Author(s):  
Joseph A. Napoli ◽  
Linda D. Vallino

Purpose: The 2 most commonly used operations to treat velopharyngeal inadequacy (VPI) are the superiorly based pharyngeal flap and sphincter pharyngoplasty, both of which may result in hyponasal speech and airway obstruction. The purpose of this article is to (a) describe bilateral buccal flap revision palatoplasty (BBFRP) as an alternative technique to manage VPI while minimizing these risks and (b) conduct a systematic review of the evidence of BBFRP on speech and other clinical outcomes. A report comparing the speech of a child with hypernasality before and after BBFRP is presented.

Method: A review of databases was conducted for studies of buccal flaps to treat VPI. Using the principles of a systematic review, the articles were read, and data were abstracted for study characteristics that were developed a priori. With respect to the case report, speech and instrumental data from a child with repaired cleft lip and palate and hypernasal speech were collected and analyzed before and after surgery.

Results: Eight articles were included in the analysis. The results were positive, and the evidence is in favor of BBFRP in improving velopharyngeal function while minimizing the risk of hyponasal speech and obstructive sleep apnea. Before surgery, the child's speech was characterized by moderate hypernasality; after surgery, it was judged to be within normal limits.

Conclusion: Based on clinical experience and results from the systematic review, there is sufficient evidence that the buccal flap is effective in improving resonance and minimizing obstructive sleep apnea. We recommend BBFRP as another approach in selected patients to manage VPI.

Supplemental Material: https://doi.org/10.23641/asha.9919352


Addiction ◽  
1997 ◽  
Vol 92 (12) ◽  
pp. 1671-1698 ◽  
Author(s):  
Project MATCH Research Group
Keyword(s):  
A Priori ◽  

Diagnostica ◽  
2002 ◽  
Vol 48 (3) ◽  
pp. 115-120 ◽  
Author(s):  
Stefan Troche ◽  
Beatrice Rammstedt ◽  
Thomas Rammsayer

Abstract. The increasing use of computer-based diagnostic instruments inevitably raises the question of the equivalence between conventional paper-and-pencil versions and their computerized counterparts. To test the equivalence of the computerized version of the Leistungsprüfsystem (LPS) in the Hogrefe Testsystem against the paper-and-pencil version, 131 participants were tested with both instruments. Heterogeneous results between the paper-and-pencil and the computerized version show that the equivalence of the two versions cannot be assumed a priori, and they point emphatically to the necessity of systematic equivalence testing. A check of the retest reliability of the computerized LPS version, carried out on a second sample of 40 participants with a retest interval of two weeks, yielded reliability coefficients between rtt = 0.55 and rtt = 0.94. Possible reasons for the non-equivalence of the two LPS versions are identified in the discussion.
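The two quantities at issue, systematic differences between versions and retest reliability, are simple to compute once paired scores are available; the synthetic data and the paired t-test plus Pearson r below are a generic illustration, not the study's exact analysis.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
paper = rng.normal(100, 15, 131)           # synthetic paper-and-pencil scores
computer = paper + rng.normal(3, 6, 131)   # computerized version, shifted

# Equivalence cannot be assumed a priori: test for a systematic mean shift.
t, p = stats.ttest_rel(computer, paper)
print(f"paired t = {t:.2f}, p = {p:.4f}")

# Retest reliability of one version: Pearson r over a two-session sample.
test = rng.normal(100, 15, 40)
retest = 100 + 0.8 * (test - 100) + rng.normal(0, 8, 40)
r_tt, _ = stats.pearsonr(test, retest)
print(f"r_tt = {r_tt:.2f}")
```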

