Nonlinear stochastic filtration of satellite measurements

2021 ◽  
Vol 2131 (2) ◽  
pp. 022128
Author(s):  
I V Reshetnikova ◽  
S V Sokolov ◽  
A A Manin ◽  
M V Polyakova ◽  
M S Gerasimenko

Abstract Existing methods for processing satellite measurements are based either on the least squares method in its various versions or, when a model of the object's motion is known, on various modifications of the Kalman filter. The Kalman approach is the more accurate of the two, since it takes into account the dynamics of the object's motion and the history of estimates, but its significant drawback is the need for a priori knowledge of the object's equations of motion. In this regard, a new approach is proposed for estimating the navigation parameters of an object from satellite measurements. On the one hand, this approach takes into account the dynamic nature of the motion parameters and the history of estimates; on the other, it is free from the restriction of requiring exact knowledge of the equations of motion. The effectiveness of the proposed approach in comparison with the existing traditional methods of processing satellite measurements is confirmed by the results of numerical modeling.
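To make the contrast concrete, here is a minimal sketch of the kind of linear Kalman predict/update cycle the abstract refers to. The constant-velocity state model, the noise covariances, and the scalar position measurement are illustrative assumptions, not the authors' equations.

```python
import numpy as np

# Assumed constant-velocity model: state x = [position, velocity].
dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition (assumed known)
H = np.array([[1.0, 0.0]])              # we observe position only
Q = 0.01 * np.eye(2)                    # process noise covariance
R = np.array([[0.25]])                  # measurement noise covariance

def kalman_step(x, P, z):
    """One predict/update cycle of a standard linear Kalman filter."""
    # Predict: propagate the estimate through the assumed dynamics.
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update: correct the prediction with the new satellite measurement z.
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(2) - K @ H) @ P_pred
    return x_new, P_new

# Usage: filter a short run of noisy position measurements.
x, P = np.zeros(2), np.eye(2)
for z in np.array([[0.9], [2.1], [3.0], [4.2]]):
    x, P = kalman_step(x, P, z)
```

Note that the predict step depends entirely on the transition matrix F; this is precisely the a priori knowledge of the equations of motion that the proposed approach aims to do without.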

Traditio ◽  
1943 ◽  
Vol 1 ◽  
pp. 355-408 ◽  
Author(s):  
Gaines Post

By the end of the thirteenth century the royal writ of summons to Parliament usually specified that communities send representatives with “full power” to consent to whatever should be ordained by the king in his court and council. This “full power” was the famous plena potestas which was stated in the mandates carried by knights and burgesses to Parliament and by delegates of cities and towns to Cortes and States General, and which is still current in proxies for stockholders' meetings. It has, of course, like almost every word of the terminology in documents relating to representation, challenged interpretation: on the one side is the argument of J. G. Edwards, who confines himself to England, that plena potestas implied an almost political or sovereign consent which limited the royal authority; on the other, the assumption that it was an expression of involuntary consent to the acts and decisions of the royal government. In general, of course, whatever modern scholars have decided as to the right of consent has resulted either from modern conceptions of representation or from a strict interpretation of the terminology in the sources for the history of assemblies. No one has examined plena potestas in the light of the legal theory and procedure of the thirteenth century. It is possible that by studying how legists and canonists viewed the meaning of plena potestas—for it, like most of the terminology in the mandate, came from Roman Law—we can find at least a relatively new approach to the problem of medieval consent.


1977 ◽  
Vol 9 (1-2) ◽  
pp. 84-104 ◽  
Author(s):  
Maria Amélia Cabral ◽  
Jorge Afonso Garcia

The study and analysis of the various factors influencing insurance risks constitutes an intricate and usually quite extensive problem. We have to consider, on the one hand, the nature and heterogeneity of the elements we have been able to measure, and on the other, the problem of deciding—without knowing exactly what results to expect—on the types of analysis to carry out and the form in which to present the results. These difficulties, essentially stemming from the fact that we cannot easily define “a priori” a measure of influence, can be overcome only by using highly sophisticated mathematical models. The researcher must define his objectives clearly if he is to avoid spending too much of his time exploring such models. Whether for these reasons or for our lack of experience in this field, we were led to the study of three models with entirely different characteristics, though all based on the analysis and behaviour of mean-value fluctuations, measured by their variances or by the least-squares method. Our first model, described in II.1, associates the notion of influence with the notion of variance. It analyses in detail how the variance of the mean values changes when what we refer to as a “margination” is executed in the parameter space, taking each of the parameters in turn. We start off with n distinct parameters, reducing them by one at each step.
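The “margination” is described only informally here; the sketch below gives one plausible reading of it, in which the data layout, the factor names, and the influence score are all assumptions made for illustration: group the observations by all n rating factors, take the variance of the cell means as a baseline, then average each factor away in turn and score its influence by how much that variance drops.

```python
import pandas as pd

def influence_by_margination(df, factors, value="claim_cost"):
    """Score each factor by how much the variance of the cell means
    drops when that factor is averaged away ("marginated").
    An illustrative reading of the abstract, not the authors' model."""
    base = df.groupby(factors)[value].mean().var()   # variance over all cells
    influence = {}
    for f in factors:
        rest = [g for g in factors if g != f]
        marg = df.groupby(rest)[value].mean().var()  # f averaged away
        influence[f] = base - marg   # large drop => f drove much of the spread
    return influence

# Hypothetical usage on a toy portfolio with three rating factors.
df = pd.DataFrame({
    "age_band":   ["young", "young", "old", "old"] * 2,
    "region":     ["north", "south"] * 4,
    "vehicle":    ["car"] * 4 + ["van"] * 4,
    "claim_cost": [120, 90, 60, 55, 150, 110, 80, 70],
})
print(influence_by_margination(df, ["age_band", "region", "vehicle"]))
```

Repeating the procedure on the reduced parameter set mirrors the abstract's stepwise reduction from n parameters to n − 1.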


2007 ◽  
Vol 61 ◽  
pp. 179-201 ◽  
Author(s):  
James Ladyman

According to logical positivism, so the story goes, metaphysical questions are meaningless, since they do not admit of empirical confirmation or refutation. However, the logical positivists did not in fact reject as meaningless all questions about, for example, the structure of space and time. Rather, key figures such as Reichenbach and Schlick believed that scientific theories often presupposed a conceptual framework that was not itself empirically testable, but which was required for the theory as a whole to be empirically testable. For example, the theory of Special Relativity relies upon the simultaneity convention introduced by Einstein that assumes that the one-way speed of light is the same in all directions of space. Hence, the logical positivists accepted an a priori component to physical theories. However, they denied that this a priori component is necessarily true. Whereas for Kant, metaphysics is the a priori science of the necessary structure of rational thought about reality (rather than about things in themselves), the logical positivists were forced by the history of science to accept that the a priori structure of theories could change. Hence, they defended a notion of what Michael Friedman (1999) calls the ‘relativised’ or the ‘constitutive’ a priori. Carnap and Reichenbach held that such an a priori framework was conventional, whereas Schlick seems to have been more of a realist and held that the overall relative simplicity of different theories could count as evidence for their truth, notwithstanding the fact that some parts of them are not directly testable. All this is part of the story of how the verification principle came to be abandoned, and how logical positivism transmuted into logical empiricism.
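For readers unfamiliar with the convention at issue, it can be stated in one line; this is the standard Reichenbach formulation found in textbooks, not anything specific to this paper. A light signal leaves clock A at time $t_A$, is reflected at B, and returns to A at $t'_A$; clock B is set at the reflection event to

$$t_B = t_A + \varepsilon\,(t'_A - t_A), \qquad 0 < \varepsilon < 1.$$

Einstein's choice $\varepsilon = 1/2$ is exactly the assumption that the one-way speed of light is the same in both directions; since only the round-trip speed is measurable, any other admissible $\varepsilon$ is empirically indistinguishable, which is why Carnap and Reichenbach regarded the choice as conventional.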


1898 ◽  
Vol 18 ◽  
pp. 129-132 ◽  
Author(s):  
Paul F. Perdrizet

The Imperial Ottoman Museum has recently acquired a very valuable and interesting gold ring (Fig. 1) which was found in 1894 or 1895 in a tomb at Lampsacus. The Museum authorities subsequently undertook further excavations in the necropolis of which this tomb formed part, and it is a matter for great regret that no detailed report of the results was drawn up; we are therefore forced to content ourselves with the somewhat meagre information given by the late Baltazzi-Bey to M. Salomon Reinach, according to which the necropolis yielded fragments of red-figured pottery and specimens of silver autonomous coins of Lampsacus. Both these details are of importance in fixing the date of the ring; for on the one hand silver coins of this class belong almost exclusively to the fourth century, and on the other, the manufacture of painted vases was not continued after that date. When we add that the evidence of coins and inscriptions proves that this was the most flourishing period in the history of Lampsacus, we have strong a priori reason for assigning the ring to this century, while a consideration of the style of the intaglio may help us to fix the date within narrower limits.


2015 ◽  
Vol 9 (2) ◽  
pp. 240-257 ◽  
Author(s):  
Dalia Nassar

In 1785 Kant published a series of critical reviews of Johann Gottfried Herder’s Ideas for a Philosophy of the History of Humanity (1784–1785), in which he not only challenges Herder’s conception of nature but also, and more importantly, his methodology. Kant’s complaint is that by relying on analogy, Herder draws deeply mistaken conclusions that overlook fundamental differences between human and nonhuman beings. But was Kant’s critique of Herder entirely fair? And how does it compare to Kant’s own use of analogy? My claim is that Herder’s use of analogy posed a fundamental methodological challenge to Kant, a challenge he sought to meet in the years following the reviews. In so doing, however, Kant found himself in the untenable situation of, on the one hand, granting analogy greater significance, and, on the other, severely restricting its use. By tracing the shifts in Kant’s thought through the lens of analogy, I aim to show that Kant’s transformed understanding of analogy reveals a fundamental tension between his a priori “metaphysics of nature” and empirical science, a tension that fundamentally shaped the philosophies of nature after Kant.


Author(s):  
Concha Gómez-Mena

The plants we eat are the outcome of humans' long history of domesticating wild species. The introduction of CRISPR/Cas gene-editing technology has provided a new approach to crop improvement and offers an interesting range of possibilities for obtaining varieties with new and healthier characteristics. The technology rests on two fundamental pillars: on the one hand, knowing complete genome sequences, and on the other, identifying gene functions. In less than a decade, the prospect of being able to design plants on demand has turned from a dream into a real possibility.


2017 ◽  
Author(s):  
Paola Bonizzoni ◽  
Simone Ciccolella ◽  
Gianluca Della Vedova ◽  
Mauricio Soto

Abstract Most evolutionary history reconstruction approaches are based on the infinite sites assumption, which underlies the Perfect Phylogeny model and whose main consequence is that an acquired mutation can never be lost. This results in the clonal model used to explain cancer evolution. Some recent results give strong evidence that recurrent and back mutations are present in the evolutionary history of tumors [5,21], showing that models more general than the Perfect Phylogeny are required. We propose a new approach that incorporates the possibility of losing a previously acquired mutation, extending the Persistent Phylogeny model [1]. We exploit our model to provide an ILP formulation of the problem of reconstructing trees on mixed populations, where the input data consist of the fraction of cells in a set of samples that carry a certain mutation. This is a fundamental problem in cancer genomics, where the goal is to study the evolutionary history of a tumor. An experimental analysis shows the usefulness of allowing mutation losses: on several real and simulated datasets our ILP approach provides a better interpretation than the one obtained under the Perfect Phylogeny assumption. Finally, we show how to incorporate multiple back mutations and recurrent mutations in our model.
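The infinite sites assumption that the authors relax can be checked directly. A binary cells-by-mutations matrix admits a perfect phylogeny if and only if no two mutation columns jointly exhibit the patterns (1,0), (0,1) and (1,1); the sketch below implements that classical two-column test as an illustration of the assumption, not of the authors' ILP.

```python
from itertools import combinations
import numpy as np

def admits_perfect_phylogeny(M):
    """Classical test: a binary cells-x-mutations matrix M admits a
    perfect phylogeny (infinite sites, no mutation losses) iff no pair
    of columns contains all three patterns (1,0), (0,1), (1,1)."""
    M = np.asarray(M)
    for i, j in combinations(range(M.shape[1]), 2):
        a, b = M[:, i], M[:, j]
        if (((a == 1) & (b == 0)).any()
                and ((a == 0) & (b == 1)).any()
                and ((a == 1) & (b == 1)).any()):
            return False   # conflict: some cell must have lost a mutation
    return True

# A back mutation (mutation 0 lost in the third cell) breaks the test:
M = [[1, 0],
     [1, 1],
     [0, 1]]
print(admits_perfect_phylogeny(M))   # False: a loss-aware model is needed
```

Matrices that fail this test are exactly the inputs for which a loss-aware model, such as the Persistent Phylogeny or the extension proposed here, becomes necessary.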


1963 ◽  
Vol 4 (1) ◽  
pp. 22-30
Author(s):  
Sartono Kartodirdjo

It should be plainly stated that historical study and writing in Indonesia have so far played a very slight part in academic work, but there is reason to expect that this situation will change in the years to come. Indonesia entered the postwar period with a heritage consisting mostly of Dutch colonial historiography. The national revolution deeply affected the cultural scene, particularly the field of history, and the need for a reconstruction and rewriting of Indonesian history was urgently felt. Many conceptions had to be reviewed and many facts reinterpreted. A growing acceptance of this new approach has come to focus historical study on old regional or local sources on the one hand and on the formulation of the idea of the history of Indonesia as a national history on the other.


2021 ◽  
Vol 2021 (1) ◽  
Author(s):  
Benedetta Tondi ◽  
Andrea Costanzo ◽  
Dequ Huang ◽  
Bin Li

Abstract Estimating the primary quantization matrix of double JPEG compressed images is a problem of relevant importance in image forensics, since it allows one to infer important information about the past history of an image. In addition, inconsistencies of the primary quantization matrices across different image regions can be used to localize splicing in double JPEG tampered images. Traditional model-based approaches work under specific assumptions on the relationship between the first and second compression qualities and on the alignment of the JPEG grid. Recently, a deep learning-based estimator capable of working under a wide variety of conditions has been proposed that outperforms tailored existing methods in most cases. The method is based on a convolutional neural network (CNN) that is trained to solve the estimation as a standard regression problem. By exploiting the integer nature of the quantization coefficients, in this paper we propose a deep learning technique that performs the estimation by resorting to a simil-classification architecture. The CNN is trained with a loss function that takes into account both the accuracy and the mean square error (MSE) of the estimation. Results confirm the superior performance of the proposed technique compared to the state-of-the-art methods based on statistical analysis and, in particular, to deep learning regression. Moreover, the capability of the method to work under general operative conditions, regarding the alignment of the second compression grid with that of the first compression and the combination of JPEG qualities of the first and second compression, is very relevant in practical applications, where this information is unknown a priori.
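One plausible reading of the combined loss can be sketched as follows; the tensor shapes, the range of integer coefficient values, and the mixing weight are assumptions for illustration and need not match the authors' released code. Each quantization coefficient is predicted as a distribution over its admissible integer values, and a cross-entropy term (driving accuracy) is mixed with the MSE of the distribution's expected value.

```python
import torch
import torch.nn.functional as F

def simil_classification_loss(logits, targets, alpha=0.5):
    """Hypothetical combined loss for integer quantization coefficients.

    logits:  (batch, n_coeffs, n_classes), one score per admissible
             integer value of each quantization coefficient.
    targets: (batch, n_coeffs) integer class indices (the true values).
    """
    b, c, k = logits.shape
    # Accuracy term: standard cross-entropy over the integer classes.
    ce = F.cross_entropy(logits.reshape(b * c, k), targets.reshape(b * c))
    # MSE term: squared error of the predicted distribution's expectation.
    probs = logits.softmax(dim=-1)
    values = torch.arange(k, dtype=probs.dtype, device=probs.device)
    expected = (probs * values).sum(dim=-1)          # (batch, n_coeffs)
    mse = F.mse_loss(expected, targets.to(probs.dtype))
    return alpha * ce + (1.0 - alpha) * mse

# Toy usage: 15 quantization coefficients, 121 admissible integer values.
logits = torch.randn(8, 15, 121, requires_grad=True)
targets = torch.randint(0, 121, (8, 15))
loss = simil_classification_loss(logits, targets)
loss.backward()
```

Treating the coefficients as classes exploits their integer nature, while the MSE term keeps predictions that land near the true value cheaper than distant ones.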

