METHOD FOR ESTIMATING PARAMETERS OF GENERALIZED MODEL OF LUMINOSITY OF SPACE OBJECTS BASED ON DATA FROM DIFFERENT RANGES

2018
pp. 57-62
Author(s):  
E. I. Gundrova
A. P. Lukyanov
A. V. Pruglo
S. S. Ravdin

Previously, the authors proposed a generalized model for estimating the parameters of the luminosity distribution law of space objects, in which not only successful but also unsuccessful measurement results are taken into account. Estimation was performed on observations obtained under similar conditions of phase angle, range, and telescope sensitivity. Under these limitations, the algorithm was tested on model data and real measurements. However, the results showed that the algorithm was not suitable for cases in which the range to the space object changes. In this work, a new algorithm is proposed that merges information obtained at different ranges to the observed space object. Luminosity values are reduced to a reference distance of 1000 km, taking the sensitivity of the telescope into account. Parameter estimates are obtained using the Cramér–von Mises–Smirnov criterion. The algorithm was tested on model data, and results of its operation on real data were obtained. The data showed correct operation of the algorithm and also confirmed the practicability of registering unsuccessful measurements.
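Reducing observed brightness to a reference range follows the standard inverse-square (5 log10) magnitude correction. A minimal sketch, assuming brightness is expressed in stellar magnitudes (the function name is illustrative, not from the article):

```python
import math

def reduce_to_reference(m_observed, range_km, ref_km=1000.0):
    """Reduce an observed magnitude to a reference range.

    Brightness falls off as 1/range^2, which in the magnitude scale
    becomes a 5*log10(range / ref) correction.
    """
    return m_observed - 5.0 * math.log10(range_km / ref_km)
```

For example, an object measured at 10 000 km appears 5 magnitudes fainter than it would at the 1000 km reference distance.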

Author(s):  
A.O. Zhukov
N.A. Kupriyanov
S.V. Logunov
D.K. Khegai
B.P. Sidorov

The article deals with the use of coordinate measurements of catalogued space objects, made by a long-range detection radar station, for differential correction of coordinate-time and navigation support for consumers. We describe the idea of comparing trajectory data, which allows us to calculate the total electron content in the direction of a catalogued space object under the assumption of a thin layer at the height of the ionosphere maximum. The main stages of the method of differential correction of coordinate-time and navigation support for consumers based on the results of comparing trajectory data are described. Modeling results are presented that allow us to evaluate the possible positive effect of the proposed approach.
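A thin-layer (thin-shell) model of this kind relates a slant TEC measurement to vertical TEC through a simple geometric mapping function. A minimal sketch (the shell height and Earth-radius values are illustrative assumptions, not taken from the article):

```python
import math

def vertical_tec(slant_tec, elevation_deg, shell_height_km=350.0):
    """Map slant TEC to vertical TEC with a thin-shell ionosphere model."""
    R_E = 6371.0  # mean Earth radius, km (assumed value)
    z = math.radians(90.0 - elevation_deg)  # zenith angle at the receiver
    # Zenith angle at the ionospheric pierce point on the shell:
    sin_zp = R_E / (R_E + shell_height_km) * math.sin(z)
    return slant_tec * math.sqrt(1.0 - sin_zp ** 2)  # VTEC = STEC * cos(z')
```

At zenith (elevation 90°) the mapping is the identity; at lower elevations the slant path samples more ionosphere, so the equivalent vertical TEC is smaller than the slant value.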


Author(s):  
Edmond Boulle

This chapter outlines certain core legal topics that arise in connection with the delivery of a separated payload into or beyond Earth orbit. The first part deals with some of the established approaches to procuring launch services, as well as some of the common features of launch service agreements that balance the interests of the launch service provider and its customer. The second part of the chapter looks at governmental authorization required to carry out a launch. While safety standards and success rates continually improve, launching a space object is still the riskiest part of most space missions and is therefore a carefully regulated aspect of space activity, with participants having to obtain prior authorization from a competent national authority. Finally, the third part explores some of the legal consequences in international law of launching a space object, including the maintenance of a register of space objects launched, and the burden of liability that is placed on “launching states.”


Algorithms
2020
Vol 13 (5)
pp. 107
Author(s):
Otmane Azeroual
Włodzimierz Lewoniewski

The quality assurance of publication data in collaborative knowledge bases and in current research information systems (CRIS) becomes increasingly relevant with the use of freely available spatial information in different application scenarios. When integrating these data into a CRIS, it is necessary to be able to recognize and assess their quality. Only then is it possible to compile a result from the available data that fulfills its purpose for the user, namely to deliver reliable data and information. This paper discusses the quality problems of source metadata in Wikipedia and CRIS. Based on real data from over 40 million Wikipedia articles in various languages, we performed a preliminary quality analysis of the metadata of scientific publications using a data quality tool. So far, no data quality measurements have been implemented in Python to assess the quality of metadata from scientific publications in Wikipedia and CRIS. With this in mind, we implemented the methods and algorithms as code, presented here in the form of pseudocode, to measure quality along objective data quality dimensions such as completeness, correctness, consistency, and timeliness. This was prepared as a macro service so that users can apply the measurement results with the program code to make a statement about their scientific publication metadata, allowing management to rely on high-quality data when making decisions.
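A completeness measurement of the kind described can be sketched as the fraction of required metadata fields that are populated; the field names below are hypothetical examples, not the schema used in the paper:

```python
def completeness(record, required=("title", "authors", "year", "journal")):
    """Fraction of required metadata fields that are present and non-empty.

    `record` is a dict of metadata for one publication; empty strings and
    None count as missing.
    """
    if not required:
        return 1.0
    filled = sum(1 for field in required if record.get(field))
    return filled / len(required)
```

A record with three of four required fields filled scores 0.75; aggregating this score over a corpus gives the completeness dimension for the whole metadata set.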


2020
Vol 2020
pp. 1-9
Author(s):
Xiaoyuan Ren
Libing Jiang
Zhuang Wang

Estimating the 3D pose of a space object from a single image is an important but challenging task. Most existing methods estimate the 3D pose of known space objects and assume that the detailed geometry of a specific object is known; they are therefore not applicable to unknown objects whose geometry is unavailable. In contrast to previous works, this paper is devoted to estimating the 3D pose of an unknown space object from a single image. Our method estimates not only the pose but also the shape of the unknown object from a single image. A hierarchical shape model is proposed to represent the prior structural information of typical space objects. On this basis, the pose and shape parameters are estimated simultaneously for unknown space objects. Experimental results demonstrate the effectiveness of our method in estimating the 3D pose and inferring the geometry of unknown typical space objects from a single image, and also show the advantage of our approach over methods relying on the known geometry of the object.


2019
Vol 490 (1)
pp. 909-926
Author(s):
M S Cunha
P P Avelino
J Christensen-Dalsgaard
D Stello
M Vrard
...  

The characterization of stellar cores may be accomplished through the modelling of asteroseismic data from stars exhibiting either gravity-mode or mixed-mode pulsations, potentially shedding light on the physical processes responsible for the production, mixing, and segregation of chemical elements. In this work, we validate against model data an analytical expression for the period spacing that will facilitate the inference of the properties of stellar cores, including the detection and characterization of buoyancy glitches (strong chemical gradients). This asymptotically based analytical expression is tested both in models with and without buoyancy glitches. It does not assume that glitches are small and, consequently, predicts non-sinusoidal glitch-induced period-spacing variations, as often seen in model and real data. We show that the glitch position and width inferred from the fitting of the analytical expression to model data consisting of pure gravity modes are in close agreement (typically better than 7 per cent relative difference) with the properties measured directly from the stellar models. In the case of fitting mixed-mode model data, the same expression is shown to reproduce well the numerical results, when the glitch properties are known a priori. In addition, the fits performed to mixed-mode model data reveal a frequency dependence of the coupling coefficient, q, for a moderate-luminosity red-giant-branch model star. Finally, we find that fitting the analytical expression to the mixed-mode period spacings may provide a way to infer the frequencies of the pure acoustic dipole modes that would exist if no coupling took place between acoustic and gravity waves.
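For reference, the asymptotic period spacing of high-order gravity modes of degree $\ell$, about which the glitch-induced variations discussed above oscillate, takes the standard form

```latex
\Delta\Pi_\ell = \frac{2\pi^2}{\sqrt{\ell(\ell+1)}}
                 \left(\int_{r_1}^{r_2}\frac{N}{r}\,\mathrm{d}r\right)^{-1},
```

where $N$ is the buoyancy (Brunt–Väisälä) frequency and the integral is taken over the g-mode cavity $[r_1, r_2]$; buoyancy glitches appear as departures of the measured period spacing from this constant value.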


Information
2019
Vol 10 (10)
pp. 296
Author(s):
Wanjie Lu
Qing Xu
Chaozhen Lan

With the advancement of various technologies, the research and application of the space object optical characteristic (SOOC), one of the main characteristics of space objects, face new challenges. The diverse structures of massive SOOC data currently cannot be stored and retrieved effectively. Moreover, SOOC processing and application platforms are inconvenient to build and deploy, and researchers' innovative algorithms cannot be applied effectively, thereby limiting the promotion of research achievements. To provide a scaffolding platform for users with different needs, this paper proposes SOOCP, a SOOC data and analysis service platform based on a microservice architecture. Using a hybrid Structured Query Language (SQL)/NoSQL service, the platform provides efficient data storage and retrieval services for users at different levels. To promote research achievements and reuse existing online services, the proposed heterogeneous function integration service assists researchers and developers in independently integrating algorithmic modules, functional modules, and existing online services to meet high-concurrency requests with a unified interface. To evaluate the platform, three research cases with different requirement levels were considered. The results showed that SOOCP performs well, providing various data and function integration services for different levels of demand.


Geophysics
2019
Vol 84 (5)
pp. C217-C227
Author(s):
Baoqing Tian
Jiangjie Zhang

High-resolution imaging has become more popular recently in exploration geophysics. Conventionally, geophysicists image the subsurface using the isotropy approximation. When the anisotropy effects are considered, one can expect to obtain an imaging profile with higher accuracy than the isotropy approach allows. Orthorhombic anisotropy is considered an ideal approximation in the realistic case and has been used in the industry for several years. Although attractive, broad application of orthorhombic anisotropy still poses many problems. We have developed a novel approach to prestack time migration in the orthorhombic case. The traveltime and amplitude of a wave propagating in orthorhombic media are calculated directly by introducing new anisotropic velocity and anisotropy parameters. We validate our methods with synthetic data and also demonstrate them on a model data set and real data. The results show that our methods work well for prestack time migration in orthorhombic media.
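In orthorhombic media the normal-moveout (NMO) velocity traces an ellipse as a function of source-receiver azimuth (the NMO ellipse of Grechka and Tsvankin). A minimal sketch of azimuth-dependent hyperbolic moveout under this standard approximation (not the authors' exact traveltime computation):

```python
import math

def nmo_velocity(azimuth_deg, v_fast, v_slow, axis_azimuth_deg=0.0):
    """NMO velocity on the azimuthal ellipse for orthorhombic media.

    v_fast / v_slow are the semi-axes of the NMO ellipse; axis_azimuth_deg
    orients the fast symmetry axis.
    """
    a = math.radians(azimuth_deg - axis_azimuth_deg)
    inv_v_sq = math.cos(a) ** 2 / v_fast ** 2 + math.sin(a) ** 2 / v_slow ** 2
    return 1.0 / math.sqrt(inv_v_sq)

def moveout_time(t0, offset, v_nmo):
    """Hyperbolic moveout with the azimuth-dependent NMO velocity."""
    return math.sqrt(t0 ** 2 + (offset / v_nmo) ** 2)
```

Along the fast axis the effective velocity is `v_fast`, perpendicular to it `v_slow`, and intermediate azimuths interpolate smoothly on the ellipse.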


2018
pp. 45-49
Author(s):
P. S. Galkin
V. N. Lagutkin

An algorithm is developed for estimating and compensating the influence of the ionosphere on the measurement of motion parameters of space objects in a two-position radar system, taking into account radio-physical effects that depend on elevation angles and operating frequency. It is assumed that the observed space object is a tracked object whose orbital parameters are well known, including the dependence of velocity on the point of the orbit, and that the uncertainty of the object's current coordinates is caused mainly by the forecast error of its position in orbit (longitudinal error). To estimate the true position of the space object in orbit and the parameter determining the influence of the ionosphere, a joint optimal processing is performed of the range measurements to the object obtained by the two separated radars, taking into account the corresponding ionospheric propagation delays and the available a priori data on the observed object's trajectory. Estimates of the unknown parameters are obtained on the basis of the criterion of maximum a posteriori probability density, taking into account the measured and a priori data. The search for the maximum of the a posteriori probability density is reduced to minimizing a weighted sum of squares, for which a cascade iterative algorithm is implemented. The accuracy of estimating the position of space objects in orbit after compensation of the ionospheric influence is studied by the Monte Carlo method. Dependencies of the root-mean-square error of the position estimate on elevation angle, operating frequency, and solar activity are obtained. It is shown that the effectiveness of the algorithm increases with the spatial baseline of the measurements (for a fixed orbit of the object).
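The first-order ionospheric contribution to a range measurement scales as TEC/f², which is what makes the joint two-radar estimation frequency-dependent. A minimal sketch of this standard group-delay relation (not the article's full algorithm):

```python
def iono_range_error(tec, freq_hz):
    """First-order ionospheric group-delay range error in metres.

    tec is the slant total electron content in electrons/m^2;
    freq_hz is the carrier frequency in Hz. The 40.3 constant is the
    standard first-order ionospheric coefficient (m^3/s^2 per electron).
    """
    return 40.3 * tec / freq_hz ** 2
```

For instance, a slant TEC of 1 TECU (1e16 electrons/m²) at a carrier near 1.575 GHz produces roughly 0.16 m of extra apparent range, and the error grows rapidly at the lower frequencies typical of long-range radars.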


2020
pp. 22-28
Author(s):
Aleksandr V. Lapko
Vasiliy A. Lapko

When choosing the optimal number of sampling intervals of the range of values of a one-dimensional random variable, it is established that the functional of the square of the probability density is a constant, whose values are independent of the probability density parameters. The functional dependencies of the studied constant on the antikurtosis coefficient of the distribution law of a random variable are determined. The established dependencies are analyzed for families of lognormal probability densities, Student's distribution laws, and families of probability densities with Gaussian distribution. Based on the results obtained, a generalized model relating the studied constant and the antikurtosis coefficient is formed. The generalized model does not depend on the type of probability density but is determined by the estimate of the antikurtosis coefficient. On this basis, we develop a method for estimating the integral of the square of the probability density, which involves the following actions. The interval of the random variable and the antikurtosis coefficient are estimated from the initial sample. At known values of these estimates, the integral of the square of the probability density is calculated. The effectiveness of the proposed method is confirmed by the results of computational experiments. The conditions of the computational experiments differ significantly from the information used in the synthesis of the models of the dependence between the studied constant and the antikurtosis coefficient. The conditions of applicability of the method for estimating the integral of the squared probability density from the antikurtosis coefficient are established using the proposed models of the dependence of the studied constant on the conditions of the computational experiment.
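As a concrete reference point for this functional, the integral of the squared density has a closed form in the Gaussian case, 1/(2σ√π), which such estimation methods can be checked against. A minimal sketch:

```python
import math

def gaussian_density_square_integral(sigma):
    """Integral over the real line of the squared normal density.

    For N(mu, sigma^2) the integral of f(x)^2 dx equals 1 / (2*sigma*sqrt(pi)),
    independent of the mean mu.
    """
    return 1.0 / (2.0 * sigma * math.sqrt(math.pi))
```

For the standard normal (σ = 1) this gives about 0.2821, and the value scales as 1/σ, illustrating why a parameter-free constant only emerges after the normalization over the sampled interval that the article describes.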


Author(s):  
Olha Oliynyk
Yurii Taranenko

An error in identifying the distribution law entails an incorrect assessment of other characteristics (standard deviation, kurtosis, antikurtosis, etc.). The article is devoted to the development of accessible and simple software products for identifying distribution laws and determining the optimal size of a data sample. The paper describes a modified, software-implemented method for identifying the law of data distribution by visual analysis of the proximity of histograms as the sample size is reduced. The method allows choosing the most probable distribution law from a wide base of candidate laws. The essence of the method consists in calculating the entropy coefficient and the absolute entropy error for the full and half data samples, determining the optimal method for processing the histogram by visual analysis of the proximity of histograms, and identifying the data distribution law. The experimental data processing model makes it possible to take into account the statistical properties of real data, can be applied to various arrays, and reduces the sample size required for analysis. An automated system for identifying the laws of data distribution with a simple and intuitive interface has been developed. Results obtained on real data indicate an increase in the reliability of identifying the data distribution law.

