INSIGHT INTO SPATIAL ANALYSIS OF GPS BROADCAST ORBIT ERROR TOWARDS ORBITAL IMPROVEMENT

Author(s):  
H. S. Lee ◽  
T. A. Musa ◽  
W. A. Wan Aris ◽  
A. Z. Sha’ameri

Abstract. Broadcast orbits are compared against the final orbit to obtain the broadcast orbit error. The errors are analysed by presenting them over space, especially longitude. The satellite trajectory is divided into three sectors, namely the northern, southern, and transitional sectors. The spatial analysis shows that the error is correlated with latitude and longitude, and consistent patterns can be observed in the spatial distribution of the error. Standard deviation (SD) is used to quantify this consistency, providing more quantitative insight into the spatial analysis. Four patterns can be observed in the error distribution: consistency within the northern and southern sectors, consistency within the transitional sector, changes after the transitional sector, and correlation between the ΔX and ΔY components. The spatial analysis shows potential for use in broadcast orbit error estimation and prediction. A model that uses the predicted broadcast orbit error as a correction will be designed in future work to improve broadcast orbit accuracy.
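The per-sector SD analysis described above can be sketched as follows; the ±10° transitional band, the function names, and the synthetic error series are illustrative assumptions, not the authors' actual procedure.

```python
import numpy as np

def sector_of(lat_deg, transition_band=10.0):
    """Classify a sub-satellite latitude into one of the three sectors.
    The +/-10 degree transitional band is an illustrative assumption."""
    if lat_deg > transition_band:
        return "northern"
    if lat_deg < -transition_band:
        return "southern"
    return "transitional"

def sector_sd(latitudes, errors):
    """Standard deviation of orbit error (broadcast minus final), per sector."""
    errors = np.asarray(errors, dtype=float)
    out = {}
    for name in ("northern", "southern", "transitional"):
        mask = np.array([sector_of(lat) == name for lat in latitudes])
        out[name] = float(np.std(errors[mask])) if mask.any() else None
    return out

# Synthetic example: errors in metres along a pass from south to north.
lats = np.linspace(-80, 80, 161)
errs = 0.5 + 0.2 * np.sin(np.radians(lats))  # illustrative only
print(sector_sd(lats, errs))
```

A low SD within a sector would indicate the kind of consistency the abstract reports, suggesting the error is predictable there.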

Author(s):  
Andrew Curtis ◽  
Michael Leitner

In the opening chapters, GIS was broken into four general components, one of which was the spatial analysis of data. This is probably the least utilized of all GIS functions outside of an academic environment. A point that is often missed when discussing GIS is that the technology often exceeds the capabilities of the user. This is especially true if the user has not received any academic training in spatial data and GIS use. In Chapter VI a more sophisticated overview of the latest spatial analysis techniques will be presented, along with examples of their implementation. Although the number of “spatially” trained scientists continues to grow, there is still a gap between the number of available skilled GIS modelers and the community programs needing GIS analysis. This chapter is designed to provide a stopgap approach, using simpler spatial statistical approaches that can be applied to gain a reasonable first insight into a birth outcome surface.


2014 ◽  
Vol 3 (1) ◽  
pp. 60-68 ◽  
Author(s):  
Justyna Smolarek ◽  
Leszek Marynowski ◽  
Wiesław Trela

Abstract. The aim of this research is to reconstruct palaeoredox conditions during sedimentation of the Jeleniów Claystone Formation deposits, using framboidal pyrite diameter measurements. Analysis of the pyrite framboid diameter distribution is an effective method in palaeoenvironmental interpretation which allows a more detailed insight into redox conditions, and thus the distinction between euxinic, dysoxic and anoxic conditions. Most of the samples are characterized by framboid indicators typical of anoxic/euxinic conditions in the water column, with mean diameters ranging from 5.29 to 6.02 μm and quite low standard deviation (SD) values ranging from 1.49 to 3.0. The remaining samples show slightly higher framboid diameters typical of upper dysoxic conditions, with mean values of 6.37 to 7.20 μm and low standard deviation (SD) values of 1.88 to 2.88. From the depth of 75.5 m to the shallowest part of the Jeleniów Claystone Formation, two samples have been examined and no framboids have been detected. Because secondary weathering can be excluded, the lack of framboids possibly indicates oxic conditions in the water column. Oxic conditions continue within the Wólka Formation, based on the lack of framboids in sample ZB 51.6.
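The mean/SD classification logic implied by the abstract can be sketched as a simple decision rule; the numeric thresholds below are assumptions chosen only so the reported sample statistics fall into the reported classes, not the criteria actually used in the study.

```python
def classify_redox(mean_um, sd_um):
    """Illustrative classification of water-column redox state from framboid
    diameter statistics. Thresholds are assumptions for demonstration,
    loosely following the pattern in the abstract: small mean and low SD
    suggest anoxic/euxinic, slightly larger mean suggests upper dysoxic."""
    if mean_um is None:
        return "no framboids (possibly oxic)"
    if mean_um <= 6.1 and sd_um <= 3.0:
        return "anoxic/euxinic"
    if mean_um <= 7.5:
        return "upper dysoxic"
    return "indeterminate"

print(classify_redox(5.29, 1.49))
print(classify_redox(6.37, 1.88))
print(classify_redox(None, None))
```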


2015 ◽  
Vol 733 ◽  
pp. 982-985 ◽  
Author(s):  
Wu Yan Fu ◽  
Yong Zhou

This article uses ArcGIS 10.0 software to apply the standard deviational ellipse (SDE) spatial analysis model to the high-tech industrial clusters of each city in the Guanzhong–Tianshui Economic Zone, exploring the spatial evolution of each cluster and of economic trends in the region, and providing an empirical study of ArcGIS applications in cluster analysis.
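A minimal sketch of what an SDE computation involves, assuming the common textbook definition (mean centre, principal-axis orientation, and the dispersion along each axis); the exact variant implemented in ArcGIS 10.0, which additionally scales the axes by √2, is not reproduced here.

```python
import numpy as np

def standard_deviational_ellipse(x, y):
    """Compute the centre, orientation, and axis standard deviations of a
    standard deviational ellipse for a point set. A sketch of the common
    definition; ArcGIS additionally applies a sqrt(2) axis correction."""
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    xc, yc = x.mean(), y.mean()
    dx, dy = x - xc, y - yc
    a = (dx**2).sum() - (dy**2).sum()
    b = 2.0 * (dx * dy).sum()
    theta = 0.5 * np.arctan2(b, a)        # orientation of the major axis
    c, s = np.cos(theta), np.sin(theta)
    sx = np.sqrt(((dx * c + dy * s) ** 2).mean())   # SD along major axis
    sy = np.sqrt(((-dx * s + dy * c) ** 2).mean())  # SD along minor axis
    return (xc, yc), theta, (sx, sy)

# Cluster of synthetic points stretched along the x axis.
rng = np.random.default_rng(0)
pts_x = rng.normal(0, 3, 500)
pts_y = rng.normal(0, 1, 500)
center, angle, axes = standard_deviational_ellipse(pts_x, pts_y)
print(center, angle, axes)
```

Tracking how the centre, orientation, and axis lengths of each cluster's ellipse change between census years is what reveals the spatial evolution the article studies.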


1978 ◽  
Vol 42 (3_suppl) ◽  
pp. 1071-1074
Author(s):  
Joseph C. Bledsoe

Analysis of responses from end-of-course evaluations of 24 graduate statistics classes taught by the author on a 26-item evaluation form yielded highly reliable evaluations for both classes (.93) and items (.96). Reliable differences were found for six consecutive stages of four classes each, with a strong positive trend and a weak but significant cubic trend apparent. Items accounted for .423 of the variance, with stage explaining .166 and the interaction of stage and item .085. A stable hierarchy of item characteristics indicated consensus, with mean rating correlating −.76 with standard deviation. Most favorable ratings were given on instructor-student relations, motivation-stimulation, reasonable work load and tests, and clearness of grading procedures, with least favorable ratings on subject organization and competence.


2016 ◽  
Vol 62 (236) ◽  
pp. 1181-1185
Author(s):  
PAUL MUZIKAR

Abstract. If a series of glacial advances occurs over the same pathway, the moraines that are now present may constitute an incomplete record of the total history. This is because a given advance can destroy the moraine left by a previous one, if the previous advance was less extensive. Gibbons, Megeath and Pierce (GMP) formulated an elegant stochastic model for this process; the key quantity in their analysis is P(n|N), the probability that n moraines are preserved after N glacial advances. In their paper, GMP derive a recursion formula satisfied by P(n|N), and use this formula to compute values of P for a range of values of n and N. In the present paper, we derive an explicit general answer for P(n|N), and show explicit, exact results for the mean value and standard deviation of n. We use these results to develop more insight into the consequences of the GMP model; for example, to a good approximation, 〈n〉 increases as ln(N). We explain how a Bayesian approach can be used to analyze P(N|n), the probability that there were N advances, given that we now observe n moraines.
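The ln(N) behaviour can be checked with a short Monte Carlo sketch. Assuming, as the GMP model does, that a moraine survives only if no later advance reaches at least as far, and taking advance extents as i.i.d. random draws, the survivor count equals the number of running maxima scanned from the latest advance backwards, whose exact mean is the harmonic number H_N ≈ ln(N) + 0.577.

```python
import math
import random

def preserved_moraines(extents):
    """Count surviving moraines for a sequence of advance extents: a moraine
    survives only if no later advance was at least as extensive (GMP rule)."""
    count = 0
    running_max = -math.inf
    for e in reversed(extents):   # scan from the latest advance backwards
        if e > running_max:
            count += 1
            running_max = e
    return count

def mean_preserved(N, trials=20000, seed=1):
    """Monte Carlo estimate of <n> after N advances with i.i.d. extents."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        total += preserved_moraines([rng.random() for _ in range(N)])
    return total / trials

N = 50
harmonic = sum(1.0 / k for k in range(1, N + 1))  # exact <n> for i.i.d. extents
print(mean_preserved(N), harmonic, math.log(N))
```

For N = 50 the estimate sits near H_50 ≈ 4.50, close to ln(50) ≈ 3.91 plus the Euler–Mascheroni constant, consistent with the slow logarithmic growth noted in the abstract.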


1979 ◽  
Vol 25 (3) ◽  
pp. 394-400 ◽  
Author(s):  
J O Westgard ◽  
H Falk ◽  
T Groth

Abstract. A computer-simulation study has been performed to determine how the performance characteristics of quality-control rules are affected by the presence of a between-run component of variation, the choice of control limits (calculated from within-run vs. total standard deviations), and the shape of the error distribution. When a between-run standard deviation (Sb) exists and control limits are calculated from the total standard deviation (St, which includes Sb as well as the within-run standard deviation, Sw), there is generally a loss in ability to detect analytical disturbances or errors. With control limits calculated from Sw, there is generally an increase in the level of false rejections. The presence of a non-Gaussian error distribution appears to have considerably less effect. It can be recommended that random error be controlled by use of a chi-square or range-control rule, with control limits calculated from Sw. Optimal control of systematic errors is difficult when Sb exists. An effort should be made to reduce Sb, and this will lead to increased ability to detect analytical errors. When Sb is tolerated or accepted as part of the baseline state of operation for the analytical method, then further increases in the number of control observations will be necessary to achieve a given probability for error detection.
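The false-rejection trade-off can be illustrated with a simplified simulation; the one-observation-per-run design, the 3-SD rule, and the numeric values of Sw and Sb are illustrative assumptions, not the study's actual protocol.

```python
import random

def false_rejection_rate(s_w=1.0, s_b=0.8, limit_sd=1.0, k=3.0,
                         runs=20000, seed=2):
    """Each run draws a between-run shift (SD s_b) plus within-run noise
    (SD s_w); the run is rejected when the single control observation
    exceeds k * limit_sd. Returns the observed rejection fraction."""
    rng = random.Random(seed)
    rejected = 0
    for _ in range(runs):
        shift = rng.gauss(0.0, s_b)        # between-run component
        obs = shift + rng.gauss(0.0, s_w)  # control observation in this run
        if abs(obs) > k * limit_sd:
            rejected += 1
    return rejected / runs

s_w, s_b = 1.0, 0.8
s_t = (s_w**2 + s_b**2) ** 0.5
print("limits from Sw:", false_rejection_rate(s_w, s_b, limit_sd=s_w))
print("limits from St:", false_rejection_rate(s_w, s_b, limit_sd=s_t))
```

With limits from Sw the rejection rate under a stable (error-free) process rises well above the nominal 0.27% of a 3-SD Gaussian rule, reproducing the increased false rejections described above; limits from St restore the nominal rate at the cost of error-detection power.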


2019 ◽  
pp. 136248061987162
Author(s):  
Meirav Aharon-Gutman

This article explores one spatial aspect of crime in the divided city: the disproportionate concentration of crime events along Jerusalem’s former socio-historical border (known as the ‘Green Line’), which is clearly reflected in a spatial analysis of crime. Offering insight into this phenomenon, an ethnographic investigation reveals the manner in which neighbourhood residents cope with crime by blocking entry to it from the east, thereby reinforcing and reproducing already existing urban divisions. This second, qualitative layer of research enables us to follow urban boundary work in action, which is important, as focusing on boundary work (as opposed to borders) offers insight not only into divided cities as fact but also into the mechanisms, logic and culture that reproduce and reshape their urban divisions. In contrast to hegemonic analyses that highlight the importance of macro-politics in shaping the lines that divide the divided city, this article considers crime, and the way residents struggle against it from below, as a central mechanism that reinforces and reproduces the divisions of the divided city.


2008 ◽  
Vol 5 (5) ◽  
pp. 1311-1324 ◽  
Author(s):  
G. Lasslop ◽  
M. Reichstein ◽  
J. Kattge ◽  
D. Papale

Abstract. Eddy covariance data are increasingly used to estimate parameters of ecosystem models. For proper maximum likelihood parameter estimates, the error structure in the observed data has to be fully characterized. In this study we propose a method to characterize the random error of eddy covariance flux data, and analyse the error distribution, standard deviation, cross- and autocorrelation of CO2 and H2O flux errors at four different European eddy covariance flux sites. Moreover, we examine how the treatment of those errors and additional systematic errors influences statistical estimates of parameters and their associated uncertainties with three models of increasing complexity: a hyperbolic light response curve, a light response curve coupled to water fluxes, and the SVAT scheme BETHY. In agreement with previous studies, we find that the error standard deviation scales with the flux magnitude. The previously found strongly leptokurtic error distribution is revealed to be largely due to a superposition of almost Gaussian distributions with standard deviations varying by flux magnitude. The cross-correlations of CO2 and H2O fluxes were in all cases negligible (R2 below 0.2), while the autocorrelation is usually below 0.6 at a lag of 0.5 h and decays rapidly at larger time lags. This implies that in these cases the weighted least squares criterion yields maximum likelihood estimates. To study the influence of the observation errors on model parameter estimates, we used synthetic datasets based on observations of two different sites. We first fitted the respective models to observations and then added the random error estimates described above and the systematic error, respectively, to the model output. This strategy enables us to compare the estimated parameters with the true parameters.
We illustrate that correctly implementing the scaling of the random error standard deviation with flux magnitude significantly reduces the parameter uncertainty and often yields parameter retrievals closer to the true value than those obtained with ordinary least squares. The systematic error leads to systematically biased parameter estimates, but its impact varies by parameter. The parameter uncertainty slightly increases, but the true parameter is not within the uncertainty range of the estimate. This means that the uncertainty is underestimated with current approaches that neglect selective systematic errors in flux data. Hence, we conclude that potential systematic errors in flux data need to be addressed more thoroughly in data assimilation approaches, since otherwise uncertainties will be vastly underestimated.
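The core idea — weighting residuals by a flux-magnitude-dependent error SD when fitting a hyperbolic light response curve — can be sketched as below. The parameter values, the grid-search fitter, and the linear SD-vs-flux model are illustrative assumptions standing in for the paper's actual maximum-likelihood machinery.

```python
import numpy as np

def light_response(par, alpha, beta):
    """Rectangular hyperbola commonly used for flux light response."""
    return alpha * beta * par / (alpha * par + beta)

def fit_grid(par, flux, weights, alphas, betas):
    """Brute-force weighted least squares over a parameter grid (a simple
    stand-in for the nonlinear optimisers used in practice)."""
    best, best_cost = None, np.inf
    for a in alphas:
        for b in betas:
            r = flux - light_response(par, a, b)
            cost = np.sum(weights * r**2)
            if cost < best_cost:
                best, best_cost = (a, b), cost
    return best

rng = np.random.default_rng(3)
par = np.linspace(10, 2000, 200)
true_a, true_b = 0.05, 20.0
clean = light_response(par, true_a, true_b)
# Random error whose SD grows with flux magnitude, as found in the study.
sigma = 0.1 + 0.1 * np.abs(clean)
flux = clean + rng.normal(0.0, sigma)

alphas = np.linspace(0.01, 0.1, 46)
betas = np.linspace(5, 40, 71)
ols = fit_grid(par, flux, np.ones_like(flux), alphas, betas)    # equal weights
wls = fit_grid(par, flux, 1.0 / sigma**2, alphas, betas)        # 1/sigma^2 weights
print("OLS:", ols, "WLS:", wls, "true:", (true_a, true_b))
```

Weighting by 1/σ² down-weights the noisy high-flux observations, which is what makes the weighted criterion a maximum likelihood estimator when the errors are independent Gaussians with known, magnitude-dependent SD.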

