APPLICATION OF STATISTICAL METHODS TO ASSESS THE METROLOGICAL CHARACTERISTICS OF RADIO-HOLOGRAPHIC MEASURING COMPLEXES

2018 ◽  
Vol 9 (2) ◽  
pp. 173-181
Author(s):  
A. P. Grinchuk ◽  
A. G. Buday ◽  
A. V. Gromyko

Practical application of the radio-holographic method for measuring the characteristics of antennas, especially when conducting acceptance testing of systems, requires an adequate assessment of the errors in the recovery of far-field characteristics. These errors are a superposition of contributions from various sources of different natures, with different time characteristics and different degrees of influence on the final result. The purpose of this work was to develop a practical technique for determining the influence of random errors in measuring the amplitude-phase distribution of the field of the antenna under study on the accuracy of restoring its far-field characteristics (primarily the antenna pattern), the technique being based only on processing the experimental results obtained with the given measuring complex. Such a method, built on correlation and spectral analysis, has been developed and experimentally confirmed. Its main advantage over mathematical modeling is that the accuracy of the reconstructed antenna patterns is estimated from experimental data obtained on a specific measuring complex, without imposing any a priori requirements on the statistical parameters of the errors. The developed procedure for estimating the influence of random errors can be used to develop a methodology for metrological certification of measuring systems as measuring instruments.
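As an illustration of the spirit of this approach, the sketch below (synthetic data, with hypothetical scan sizes and noise levels; not the authors' implementation) estimates the random error of an amplitude-phase distribution from two repeated scans and propagates it through the Fourier transform used to recover the pattern:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two repeated 1-D scans of the complex amplitude-phase distribution (APD).
# In practice these come from the measuring complex; here they are synthetic.
x = np.linspace(-0.5, 0.5, 256)                      # aperture coordinate, m
apd_true = np.exp(-(x / 0.2) ** 2).astype(complex)   # smooth "true" APD
scan1 = apd_true + 0.01 * (rng.standard_normal(256) + 1j * rng.standard_normal(256))
scan2 = apd_true + 0.01 * (rng.standard_normal(256) + 1j * rng.standard_normal(256))

# The difference of repeated scans isolates the random (non-repeatable) error;
# its autocorrelation / power spectrum characterizes the noise statistics
# without any a priori assumptions about them.
noise = (scan1 - scan2) / np.sqrt(2)
noise_power = np.mean(np.abs(noise) ** 2)

# The far-field pattern is recovered by Fourier transforming the APD, so the
# same linear transform propagates the noise into the pattern estimate.
pattern = np.fft.fftshift(np.fft.fft(0.5 * (scan1 + scan2)))
pattern_db = 20 * np.log10(np.abs(pattern) / np.abs(pattern).max())

# Standard error of one FFT bin of the pattern recovered from a single scan
# (white-noise case: each bin accumulates N times the per-sample variance).
pattern_sigma = np.sqrt(noise_power * len(x))
print(f"APD noise power: {noise_power:.2e}, pattern sigma: {pattern_sigma:.3f}")
```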

2017 ◽  
Vol 17 (11) ◽  
pp. 6663-6678 ◽  
Author(s):  
Shreeya Verma ◽  
Julia Marshall ◽  
Mark Parrington ◽  
Anna Agustí-Panareda ◽  
Sebastien Massart ◽  
...  

Abstract. Airborne observations of greenhouse gases are a very useful reference for validation of satellite-based column-averaged dry air mole fraction data. However, since aircraft data are available only up to about 9–13 km altitude, these profiles do not fully represent the depth of the atmosphere observed by satellites and therefore need to be extended synthetically into the stratosphere. In the near future, observations of CO2 and CH4 made from passenger aircraft are expected to be available through the In-Service Aircraft for a Global Observing System (IAGOS) project. In this study, we analyse three different data sources that are available for the stratospheric extension of aircraft profiles, comparing the error each introduces into the total column, and provide recommendations regarding the best approach. First, we analyse CH4 fields from two different models of atmospheric composition – the European Centre for Medium-Range Weather Forecasts (ECMWF) Integrated Forecasting System for Composition (C-IFS) and the TOMCAT/SLIMCAT 3-D chemical transport model. Secondly, we consider scenarios that simulate the effect of using CH4 climatologies such as those based on balloons or satellite limb soundings. Thirdly, we assess the impact of using the a priori profiles employed in the satellite retrievals for the stratospheric part of the total column. We find that the models considered in this study estimate stratospheric CH4 better than the climatology-based data and the satellite a priori profiles. Both the C-IFS and TOMCAT models have a bias of about −9 ppb at the locations where tropospheric vertical profiles will be measured by IAGOS. The C-IFS model, however, has a lower random error (6.5 ppb) than TOMCAT (12.8 ppb). These values are well within the minimum desired accuracy and precision of satellite total column XCH4 retrievals (10 and 34 ppb, respectively). In comparison, the a priori profile from the University of Leicester Greenhouse Gases Observing Satellite (GOSAT) Proxy XCH4 retrieval and the climatology-based data introduce larger random errors into the total column, being limited in spatial coverage and temporal variability. Furthermore, we find that the bias in the models varies with latitude and season. Therefore, applying an appropriate bias correction to the model fields before using them for profile extension is expected to further decrease the error contributed by the stratospheric part of the profile to the total column.
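The bookkeeping behind such comparisons can be illustrated with a short sketch (illustrative layer values and a simple pressure-thickness weighting; no retrieval averaging kernel is applied):

```python
import numpy as np

# Pressure levels (hPa), surface to top; the aircraft covers ~1000-200 hPa,
# a model fills in the stratosphere above. All values are illustrative.
p = np.array([1000., 800., 600., 400., 200., 100., 50., 10.])
ch4_aircraft = np.array([1900., 1890., 1880., 1870., 1850.])   # ppb, measured
ch4_model = np.array([1700., 1500., 1200.])                    # ppb, modelled

profile = np.concatenate([ch4_aircraft, ch4_model])

# Simple pressure-thickness weights as a stand-in for a retrieval's
# pressure weighting function.
dp = -np.diff(np.append(p, 0.0))
w = dp / dp.sum()

xch4 = np.sum(w * profile)                         # column-averaged mole fraction

# Error contributed by the stratospheric part: a bias of b ppb over the
# stratospheric layers maps into the column in proportion to their weight.
strat_weight = w[len(ch4_aircraft):].sum()
bias_total_column = -9.0 * strat_weight            # -9 ppb model bias (from the text)
print(f"XCH4 = {xch4:.1f} ppb, stratospheric bias contribution = {bias_total_column:.2f} ppb")
```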


Geophysics ◽  
2010 ◽  
Vol 75 (1) ◽  
pp. H1-H6
Author(s):  
Bruno Goutorbe ◽  
Violaine Combier

In 3D seismic acquisition, reconstructing the shape of the streamer(s) for each shot is an essential step prior to data processing. Depending on the survey, several kinds of constraints help achieve this purpose: local azimuths given by compasses, absolute positions recorded by global positioning system (GPS) devices, and distances calculated between pairs of acoustic ranging devices. Most reconstruction methods are restricted to a particular type of constraint and do not estimate the final uncertainties. The generalized inversion formalism using the least-squares criterion provides a robust framework for such a problem: it handles several kinds of constraints together, does not require an a priori parameterization of the streamer shape, extends naturally to any configuration of streamer(s), and gives rigorous uncertainties. We explicitly derive the equations governing the algorithm for a marine seismic survey using a single streamer with compasses distributed all along it and GPS devices located on the tail buoy and on the vessel. Reconstruction tests conducted on several synthetic examples show that the algorithm performs well, with a mean error of a few meters in realistic cases. The accuracy logically degrades if higher random errors are added to the synthetic data or if the streamer is deformed at short length scales.
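A minimal sketch of this kind of joint inversion, assuming a single streamer with synthetic compass, GPS, and cable-length constraints (hypothetical geometry and noise levels, solved with a generic nonlinear least-squares routine rather than the authors' explicit equations):

```python
import numpy as np
from scipy.optimize import least_squares

# Unknowns: (x, y) of n nodes along a single streamer. Constraints: compass
# azimuths between adjacent nodes, GPS fixes on the first (vessel) and last
# (tail buoy) nodes, and the known cable length between nodes.
n = 20
spacing = 100.0                               # receiver spacing, m
true_az = np.deg2rad(90.0 + 5.0 * np.sin(np.linspace(0, np.pi, n - 1)))
true_xy = np.zeros((n, 2))
for i in range(1, n):
    true_xy[i] = true_xy[i - 1] + spacing * np.array(
        [np.sin(true_az[i - 1]), np.cos(true_az[i - 1])])

rng = np.random.default_rng(1)
obs_az = true_az + np.deg2rad(0.5) * rng.standard_normal(n - 1)   # compasses
obs_gps = true_xy[[0, -1]] + 3.0 * rng.standard_normal((2, 2))    # GPS, ~3 m

def residuals(params):
    xy = params.reshape(n, 2)
    d = np.diff(xy, axis=0)
    az = np.arctan2(d[:, 0], d[:, 1])
    r_az = (az - obs_az) / np.deg2rad(0.5)                 # sigma-weighted
    r_gps = (xy[[0, -1]] - obs_gps).ravel() / 3.0
    r_len = (np.hypot(d[:, 0], d[:, 1]) - spacing) / 0.5   # cable length
    return np.concatenate([r_az, r_gps, r_len])

fit = least_squares(residuals, true_xy.ravel() + 20 * rng.standard_normal(2 * n))
err = np.hypot(*(fit.x.reshape(n, 2) - true_xy).T)
print(f"mean reconstruction error: {err.mean():.2f} m")
```

The sigma-weighting of each residual class is what makes the problem a generalized least-squares inversion in which heterogeneous constraints are combined on a common footing.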


1976 ◽  
Vol 66 (1) ◽  
pp. 173-187
Author(s):  
Ray Buland

Abstract. A complete reexamination of Geiger's method in the light of modern numerical analysis indicates that numerical stability can be ensured by use of the QR algorithm and the convergence domain considerably enlarged by the introduction of step-length damping. In order to make maximum use of all data, the method is developed assuming a priori estimates of the statistics of the random errors at each station. Numerical experiments indicate that the bulk of the joint probability density of the location parameters lies in the linear region, allowing simple estimates of the standard errors of the parameters. The location parameters are found to be distributed as one minus chi squared with m degrees of freedom, where m is the number of parameters, allowing the simple construction of confidence levels. The chi-squared test with n-m degrees of freedom, where n is the number of data, is introduced as a means of qualitatively evaluating the correctness of the earth model.
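A toy sketch of one damped Gauss-Newton iteration in the spirit of Geiger's method, with the step computed through a QR factorization and simple step-halving damping (synthetic straight-ray travel times in a uniform half-space; not Buland's code):

```python
import numpy as np

rng = np.random.default_rng(2)
v = 6.0                                        # km/s, uniform half-space
stations = rng.uniform(-100, 100, size=(12, 2))
true_src, true_t0 = np.array([10.0, -20.0]), 5.0
dist = np.linalg.norm(stations - true_src, axis=1)
sigma = 0.05                                   # a priori picking error, s
t_obs = true_t0 + dist / v + sigma * rng.standard_normal(12)

def misfit(p):
    d = np.linalg.norm(stations - p[:2], axis=1)
    return np.sum(((t_obs - (p[2] + d / v)) / sigma) ** 2)

x = np.array([0.0, 0.0, 0.0])                  # [x, y, t0] starting guess
for it in range(20):
    d = np.linalg.norm(stations - x[:2], axis=1)
    r = (t_obs - (x[2] + d / v)) / sigma       # weighted residuals
    # Jacobian of predicted times w.r.t. [x, y, t0], weighted by sigma
    A = np.column_stack([-(stations - x[:2]) / (v * d[:, None]),
                         np.ones(len(d))]) / sigma
    Q, R = np.linalg.qr(A)                     # QR keeps the step stable
    step = np.linalg.solve(R, Q.T @ r)
    # Step-length damping: halve the step until the misfit decreases
    lam = 1.0
    while misfit(x + lam * step) > misfit(x) and lam > 1e-4:
        lam *= 0.5
    x = x + lam * step

chi2 = misfit(x)                               # ~ chi-squared with n - m dof
print(f"location: {x[:2]}, t0: {x[2]:.3f}, chi2 (n-m=9 dof): {chi2:.1f}")
```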


2010 ◽  
Vol 20-23 ◽  
pp. 346-351
Author(s):  
Ke Qiang Dong ◽  
Peng Jian Shang ◽  
Hong Zhang

We propose a new method, called the multi-dependent Hurst exponent, to investigate the correlation properties of nonstationary time series. The method is validated on artificial series including both short-range and long-range correlated data. The results indicate that the multi-dependent Hurst exponents fluctuate around the a priori known correlation exponent H. An application to traffic time series is also presented, and a comparison is made between the artificial and the traffic time series.
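The multi-dependent Hurst exponent itself is not reproduced here, but the validation logic, checking that an estimator fluctuates around a known exponent H on artificial series, can be sketched with a classical rescaled-range (R/S) estimator and spectrally synthesized fractional Gaussian noise:

```python
import numpy as np

def fgn(n, H, rng):
    """Approximate fractional Gaussian noise via spectral (FFT) filtering."""
    f = np.fft.rfftfreq(n)
    f[0] = f[1]                                  # avoid division by zero at DC
    amp = f ** (-(2 * H - 1) / 2)                # 1/f^(2H-1) power spectrum
    phase = np.exp(2j * np.pi * rng.random(len(f)))
    x = np.fft.irfft(amp * phase, n)
    return x / x.std()

def hurst_rs(x, scales):
    """Classical rescaled-range (R/S) estimate of the Hurst exponent."""
    rs = []
    for s in scales:
        vals = []
        for i in range(len(x) // s):
            seg = x[i * s:(i + 1) * s]
            z = np.cumsum(seg - seg.mean())      # profile of the segment
            if seg.std() > 0:
                vals.append((z.max() - z.min()) / seg.std())
        rs.append(np.mean(vals))
    # Slope of log(R/S) vs log(scale) estimates H
    return np.polyfit(np.log(scales), np.log(rs), 1)[0]

rng = np.random.default_rng(3)
scales = np.unique(np.logspace(1, 3, 12).astype(int))
for H in (0.5, 0.7, 0.9):
    print(f"target H = {H}, R/S estimate = {hurst_rs(fgn(8192, H, rng), scales):.2f}")
```

The estimates scatter around the nominal H, mirroring the paper's validation strategy; R/S is known to be biased at short lengths, which is why checking against series with a priori known exponents matters.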


Dependability ◽  
2017 ◽  
Vol 17 (4) ◽  
pp. 23-26
Author(s):  
A. Yu. Kolobov ◽  
E. V. Dikoun

Aim. Many space technology products are one-off (unique) or are manufactured in small batches of 3 to 5. In accordance with the regulatory documentation, design and development activities in the space industry must include quality assurance of products with interval estimation of dependability indicators. However, for one-off (unique) spacecraft, which account for a fair share of the overall space industry output, acquiring such estimates runs into the problem of the availability of original statistical data. That is due to the high cost of both the spacecraft itself and its testing, which rules out testing large numbers of samples during spacecraft development. In the context of restricted funding of the space industry, a practice has arisen of conducting each planned type of test on a single sample. The test samples have different configurations and versions of components (dimension and mass models, thermal analogues, etc.). In this case, it is impossible to acquire homogeneous statistical data to substantiate compliance with the dependability requirements.

Results. The article proposes a method of interval estimation of the probability of no-failure of a one-off spacecraft based on the results of flight tests, using a priori information acquired at the stage of pre-delivery and acceptance testing. The authors compare the feasibility of computational, experimental, and combined computational-experimental methods of evaluating spacecraft dependability indicators. As initial data, the results of the electric and radio engineering tests and the thermal vacuum tests of the spacecraft flight model are used. Only the electric and radio engineering test results are taken into consideration because the dependability of a spacecraft is primarily defined by the dependability of its electronic equipment. The scope of tests (normally about 50 per spacecraft) allows obtaining highly reliable and informative estimates. The method can also be used at the operation stage to evaluate and supervise dependability, e.g. after a year of operation. The correctness of aggregating the a priori information with the information obtained at that stage is verified with Fisher's Z-value.

Conclusions. The proposed method allows estimating point values of the probability of no-failure of a one-off spacecraft, its lower confidence bounds, and the mean-square deviation of the probability of no-failure at the stages of pre-delivery and acceptance testing, flight testing, and operation, using a priori information. An example is given of interval estimation of the probability of no-failure of a one-off spacecraft based on the results of flight operation, using a priori information obtained at the stages of pre-delivery and acceptance testing.
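One standard way to combine a priori test data with flight results is a conjugate beta-binomial (success-run) model; the sketch below uses it as a stand-in for the paper's procedure (illustrative trial counts; the Fisher Z-value homogeneity check is omitted):

```python
from scipy.stats import beta

# A priori data: electric/radio-engineering tests of the flight model
# (illustrative counts; the text cites ~50 tests per spacecraft).
n_prior, f_prior = 50, 0          # trials and failures before delivery
n_flight, f_flight = 12, 0        # flight-test results

# Conjugate beta-binomial update: posterior Beta(a, b) for the probability
# of no-failure p, starting from a uniform Beta(1, 1) prior.
a = 1 + (n_prior - f_prior) + (n_flight - f_flight)
b = 1 + f_prior + f_flight

p_point = a / (a + b)                       # point estimate
p_lower = beta.ppf(0.10, a, b)              # 90% one-sided lower bound
p_sigma = beta.std(a, b)                    # mean-square deviation

print(f"P(no failure): {p_point:.4f}, "
      f"90% lower bound: {p_lower:.4f}, sigma: {p_sigma:.4f}")
```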


2018 ◽  
Vol 6 (2) ◽  
pp. 431-450 ◽  
Author(s):  
Bradley A. Weymer ◽  
Phillipe Wernette ◽  
Mark E. Everett ◽  
Chris Houser

Abstract. Shorelines exhibit long-range dependence (LRD) and have been shown in some environments to be described in the wave number domain by a power-law characteristic of scale independence. Recent evidence suggests that the geomorphology of barrier islands can, however, exhibit scale dependence as a result of systematic variations in the underlying framework geology. The LRD of framework geology, which influences island geomorphology and its response to storms and sea level rise, has not been previously examined. Electromagnetic induction (EMI) surveys conducted along Padre Island National Seashore (PAIS), Texas, United States, reveal that the EMI apparent conductivity (σa) signal and, by inference, the framework geology exhibits LRD at scales of up to 10^1 to 10^2 km. Our study demonstrates the utility of describing EMI σa and lidar spatial series by a fractional autoregressive integrated moving average (ARIMA) process that specifically models LRD. This method offers a robust and compact way of quantifying the geological variations along a barrier island shoreline using three statistical parameters (p, d, q). We discuss how ARIMA models that use a single parameter d provide a quantitative measure for determining free and forced barrier island evolutionary behavior across different scales. Statistical analyses at regional, intermediate, and local scales suggest that the geologic framework within an area of paleo-channels exhibits a first-order control on dune height. The exchange of sediment amongst nearshore, beach, and dune in areas outside this region are scale independent, implying that barrier islands like PAIS exhibit a combination of free and forced behaviors that affect the response of the island to sea level rise.
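The long-memory parameter d of a fractional ARIMA process can be estimated on its own by log-periodogram (Geweke-Porter-Hudak) regression; a sketch on synthetic fractionally integrated noise (not the EMI data or the authors' fitting code):

```python
import numpy as np

def gph_d(x, frac=0.5):
    """Geweke-Porter-Hudak log-periodogram estimate of the ARFIMA d."""
    n = len(x)
    m = int(n ** frac)                           # number of low frequencies
    freqs = 2 * np.pi * np.arange(1, m + 1) / n
    pgram = np.abs(np.fft.fft(x - x.mean())[1:m + 1]) ** 2 / (2 * np.pi * n)
    reg = np.log(4 * np.sin(freqs / 2) ** 2)
    slope = np.polyfit(reg, np.log(pgram), 1)[0]
    return -slope                                # d = -slope of the regression

# Synthetic long-memory series: fractionally integrate white noise via a
# truncated MA(inf) expansion of (1 - B)^(-d), psi_k = psi_{k-1}*(k-1+d)/k.
rng = np.random.default_rng(4)
d_true, n = 0.3, 4096
w = np.ones(200)
for k in range(1, 200):
    w[k] = w[k - 1] * (k - 1 + d_true) / k
x = np.convolve(rng.standard_normal(n + 200), w, mode="valid")[:n]

print(f"true d = {d_true}, GPH estimate = {gph_d(x):.2f}")
```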


Geophysics ◽  
1986 ◽  
Vol 51 (11) ◽  
pp. 2051-2066 ◽  
Author(s):  
Hiroshi Inoue

A new method of multivariate smooth fitting of scattered, noisy data using cubic B-splines was developed. An optimum smoothing function was defined to minimize the L2 norm composed of the data residuals and the first and second derivatives, which represent the total misfit, fluctuation, and roughness of the function, respectively. The function is approximated by a cubic B-spline expansion with equispaced knots. The solution can be interpreted in three ways. From the stochastic viewpoint, it is the maximum-likelihood estimate among the admissible functions under the a priori information that the first and second derivatives are zero everywhere up to random errors, i.e., white noise. From the physical viewpoint, it is the finite-element approximation for the lateral displacement of a bar or a plate under tension which is pulled to the data points by springs. From a technical viewpoint, it is an improved spline-fitting algorithm. The additional condition of minimizing the derivative norms stabilizes the linear equation system for the expansion coefficients.
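A sketch of the same construction in modern P-spline form, assuming an equispaced cubic B-spline basis and a coefficient-difference penalty standing in for the exact derivative norms (the design-matrix helper requires SciPy 1.8+):

```python
import numpy as np
from scipy.interpolate import BSpline

rng = np.random.default_rng(5)
x = np.sort(rng.random(200))                       # scattered sample points
y = np.sin(2 * np.pi * x) + 0.2 * rng.standard_normal(200)

k = 3                                              # cubic B-splines
n_interior = 20                                    # equispaced interior knots
t = np.r_[np.zeros(k), np.linspace(0, 1, n_interior + 2), np.ones(k)]
B = BSpline.design_matrix(x, t, k).toarray()       # (n_data, n_basis)
n_basis = B.shape[1]

# Roughness penalty: second differences of the coefficients stand in for
# the second-derivative norm on an equispaced basis (first differences
# would play the role of the first-derivative norm).
D2 = np.diff(np.eye(n_basis), n=2, axis=0)
lam = 1.0                                          # smoothing weight

# Minimize |y - Bc|^2 + lam*|D2 c|^2  ->  (B'B + lam D2'D2) c = B'y.
# The penalty term is what stabilizes the otherwise ill-conditioned system.
c = np.linalg.solve(B.T @ B + lam * D2.T @ D2, B.T @ y)
fit = BSpline(t, c, k)

print(f"rms misfit: {np.sqrt(np.mean((fit(x) - y) ** 2)):.3f}")
```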


2004 ◽  
Vol 11 (4) ◽  
pp. 495-503 ◽  
Author(s):  
D. Maraun ◽  
H. W. Rust ◽  
J. Timmer

Abstract. We study the inference of long-range correlations by means of Detrended Fluctuation Analysis (DFA) and argue that power-law scaling of the fluctuation function, and thus long memory, may not be assumed a priori but has to be established. This requires the investigation of the local slopes. We account for the variability characteristic of stochastic processes by calculating empirical confidence regions. Comparing a long-memory with a short-memory model shows that the inference of long-range correlations from a finite amount of data by means of DFA is not specific. We remark that scaling cannot be concluded from a straight-line fit to the fluctuation function in a log-log representation. Furthermore, we show that a local slope larger than α=0.5 at large scales does not necessarily imply long memory. We also demonstrate that a finite scaling region of the fluctuation function does not permit conclusions about an equivalent scaling region of the autocorrelation function. Finally, we review DFA results for the Prague temperature data set and show that long-range correlations cannot be concluded unambiguously.
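A minimal DFA-1 sketch that also reports the local slopes of the fluctuation function, the quantity the authors argue must be inspected instead of a single straight-line fit (no empirical confidence regions are computed here):

```python
import numpy as np

def dfa(x, scales, order=1):
    """DFA: fluctuation function F(s) from polynomial-detrended segments."""
    y = np.cumsum(x - np.mean(x))                  # integrated profile
    F = []
    for s in scales:
        rms = []
        for i in range(len(y) // s):
            seg = y[i * s:(i + 1) * s]
            tt = np.arange(s)
            trend = np.polyval(np.polyfit(tt, seg, order), tt)
            rms.append(np.mean((seg - trend) ** 2))
        F.append(np.sqrt(np.mean(rms)))
    return np.asarray(F)

rng = np.random.default_rng(6)
x = rng.standard_normal(2 ** 14)                   # short memory: alpha ~ 0.5
scales = np.unique(np.logspace(1.0, 3.2, 20).astype(int))
F = dfa(x, scales)

# Local slopes in log-log space: their constancy across scales, not a
# single straight-line fit, is what would justify claiming power-law scaling.
alpha_local = np.diff(np.log(F)) / np.diff(np.log(scales))
print("local slopes:", np.round(alpha_local, 2))
```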


2007 ◽  
Vol 20 (15) ◽  
pp. 4047-4062 ◽  
Author(s):  
Steven C. Sherwood

Abstract All instrumental climate records are affected by instrumentation changes and variations in sampling over time. While much attention has been paid to the problem of detecting “change points” in time series, little has been paid to the statistical properties of climate signals that result after adjusting (“homogenizing”) the data—or to the effects of the irregular sampling and serial correlation exhibited by real climate records. These issues were examined here by simulating multistation datasets. Simple homogenization methods, which remove apparent artifacts and then calculate trends, tended to remove some of the real signal. That problem became severe when change-point times were not known a priori, leading to significant underestimation of real and/or artificial trends. A key cause is false detection of change points, even with nominally strict significance testing, due to serial correlation in the data. One conclusion is that trends in previously homogenized radiosonde datasets should be viewed with caution. Two-phase regression reduced but did not resolve this problem. A new approach is proposed in which trends, change points, and natural variability are estimated simultaneously. This is accomplished here for the case of incomplete data from a fixed station network by an adaptation of the “iterative universal Kriging” method, which converges to maximum-likelihood parameters by iterative imputation of missing values. With careful implementation this method’s trend estimates had low random errors and were nearly unbiased in these tests. It is argued that error-free detection of change points is neither realistic nor necessary, and that success should be measured instead by the integrity of climate signals.
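A toy version of the simultaneous-estimation idea, iterating between imputing missing values from the current signal estimate and re-fitting trend plus periodic variability by least squares (a highly simplified stand-in for iterative universal Kriging, with no change points or serial correlation):

```python
import numpy as np

rng = np.random.default_rng(7)
n_t, n_s = 240, 8                                  # months x stations
t = np.arange(n_t) / 12.0                          # time in years
common = 0.02 * t + 0.3 * np.sin(2 * np.pi * t)    # trend + natural variability
data = common[:, None] + 0.1 * rng.standard_normal((n_t, n_s))
data[rng.random((n_t, n_s)) < 0.3] = np.nan        # 30% missing, irregular

filled = np.where(np.isnan(data), np.nanmean(data), data)
for _ in range(20):
    # Fit trend + annual cycle simultaneously, one fit per station
    X = np.column_stack([np.ones(n_t), t,
                         np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
    beta = np.linalg.lstsq(X, filled, rcond=None)[0]
    signal = X @ beta
    # Impute: replace only the missing entries with the current estimate
    filled = np.where(np.isnan(data), signal, data)

trend_per_decade = 10 * beta[1].mean()
print(f"estimated trend: {trend_per_decade:.3f} per decade (true: 0.200)")
```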

