Systematic Error Correction of Dynamical Seasonal Prediction of Sea Surface Temperature Using a Stepwise Pattern Project Method

2008 · Vol 136 (9) · pp. 3501–3512
Author(s): Jong-Seong Kug, June-Yi Lee, In-Sik Kang

Abstract. Every dynamical climate prediction model has significant errors in its mean state and anomaly field, which degrade its performance in climate prediction. In addition to correcting the model's systematic errors in the mean state, it is also possible to correct systematic errors in the predicted anomalies by means of dynamical or statistical postprocessing. In this study, a new statistical model has been developed, based on the pattern projection method, to empirically correct dynamical seasonal climate predictions. The strength of the present model lies in its objective and automatic selection of optimal predictor grid points. The statistical model was applied to the systematic error correction of SST anomalies predicted by the Seoul National University (SNU) coupled GCM and evaluated in terms of temporal correlation skill and standardized root-mean-square error. It turns out that the statistical error correction improves the SST prediction over most regions of the global ocean at most forecast lead times up to 6 months. In particular, the SST predictions over the western Pacific and Indian Ocean, where the SNU coupled GCM shows a large error, are improved significantly.
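To make the pattern-projection idea concrete, here is a minimal Python sketch: for one predictand grid point, predictor grid points are selected by covariance strength and the forecast field is projected onto that covariance pattern. This illustrates the general technique only; the function name, the simple top-n selection rule, and all parameters are ours, not the paper's stepwise algorithm.

```python
import numpy as np

def pattern_projection_correct(pred_train, obs_train, pred_new, n_points=50):
    """Correct one forecast value at a target grid point (illustrative sketch).

    pred_train : (nt, ng) model-predicted anomalies over nt training years
    obs_train  : (nt,)    observed anomaly at the target grid point
    pred_new   : (ng,)    model anomaly field of the forecast to correct
    """
    # Covariance of every predictor grid point with the observed predictand
    cov = (pred_train - pred_train.mean(0)).T @ (obs_train - obs_train.mean())
    # Stand-in for the stepwise search: keep the n_points grid points
    # with the strongest absolute covariance
    sel = np.argsort(np.abs(cov))[-n_points:]
    # Project the predictor fields onto the covariance pattern
    proj_train = pred_train[:, sel] @ cov[sel]
    proj_new = pred_new[sel] @ cov[sel]
    # Linear regression from the projection index to the observed anomaly
    slope, intercept = np.polyfit(proj_train, obs_train, 1)
    return slope * proj_new + intercept
```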

Author(s): Steven K. Albanese, John D. Chodera, Andrea Volkamer, Simon Keng, Robert Abel, ...

Abstract. Alchemical free energy calculations are now widely used to drive or maintain potency in small-molecule lead optimization, with a roughly 1 kcal/mol accuracy. Despite this, the potential to use free energy calculations to drive optimization of compound selectivity between two similar targets has been relatively unexplored in published studies. In the most optimistic scenario, the similarity of binding sites might lead to a fortuitous cancellation of errors and allow selectivity to be predicted more accurately than affinity. Here, we assess the accuracy with which selectivity can be predicted in the context of small-molecule kinase inhibitors, considering the very similar binding sites of the human kinases CDK2 and CDK9, as well as another series of ligands attempting to achieve selectivity between the more distantly related kinases CDK2 and ERK2. Using a Bayesian analysis approach, we separate systematic from statistical error and quantify the correlation in systematic errors between selectivity targets. We find that, in the CDK2/CDK9 case, a high correlation in systematic errors suggests free energy calculations can have a significant impact in aiding chemists to achieve selectivity, while the correlation in systematic error for the more distantly related kinases (CDK2/ERK2) suggests that fortuitous cancellation may occur even between systems that are not closely related. In both cases, the correlation in systematic error suggests that longer simulations are beneficial, properly balancing statistical against systematic error so as to take full advantage of the increased apparent accuracy of free energy calculations in selectivity prediction.
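The benefit of correlated systematic errors for selectivity follows from the standard error-propagation identity for a difference of two free energies, var(ΔΔG) = σ₁² + σ₂² − 2ρσ₁σ₂. A minimal sketch, with made-up error magnitudes rather than the paper's estimates:

```python
import numpy as np

# Hedged sketch: how correlated systematic errors shrink the error in a
# selectivity prediction DDG_sel = DG(target1) - DG(target2).
# sigma and rho values are illustrative placeholders, not the paper's.
sigma1, sigma2 = 1.0, 1.0    # per-target systematic error (kcal/mol)
for rho in (0.0, 0.5, 0.9):  # correlation of systematic errors
    var_sel = sigma1**2 + sigma2**2 - 2 * rho * sigma1 * sigma2
    print(f"rho={rho:.1f}: selectivity error = {np.sqrt(var_sel):.2f} kcal/mol")
```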


2013 · Vol 12 (01) · pp. 1350007
Author(s): G. GIUSI, G. SCANDURRA, C. CIOFI

Spectrum estimation in the field of low-frequency noise measurements (LFNMs) is almost always performed by resorting to discrete Fourier transform (DFT) based spectrum analyzers. In this approach, the input signal is sampled at a proper frequency fs, and the power spectra of sequences of N samples at a time are calculated and averaged in order to obtain an estimate of the spectrum at discrete frequency values fk = kΔf, where the integer k is the frequency index and Δf = fs/N is the frequency resolution. As the number of averages increases, the statistical error, which is inversely proportional to the resolution bandwidth, can be made very small. However, if the spectrum of the signal is not a slowly changing function of frequency, as in the case of 1/fγ processes, spectrum estimation by means of the DFT also results in systematic errors. In this paper we discuss the dependence of these errors on spectral parameters (the spectrum amplitude, the frequency f, the spectral exponent γ, and the DC power) and on measurement parameters (the spectral window, the resolution bandwidth Δf, and the instrumentation AC cutoff frequency). Quantitative expressions for the systematic errors are obtained that, besides helping in the interpretation of the results of actual LFNMs, can be used as a guideline for the optimization of the measurement parameters and/or for the estimation of the maximum accuracy that can be obtained in given experimental conditions. This quantitative analysis is particularly important for two reasons. First, while the systematic error at a given frequency fk = kΔf can in general be made small by making k large, this implies that Δf must be much smaller than fk, possibly in contrast with the need for a Δf as large as possible in order to reduce the measurement time; moreover, the magnitude of the error depends on the selected spectral window. Second, the role of the instrumentation AC cutoff frequency fAC on the systematic error is also investigated and quantified, and it is demonstrated that the error increases as fAC decreases. This last result is very important since fAC is often chosen much lower than the frequencies of interest, and this choice may result in an increase of the systematic error.
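As a rough illustration of the measurement setup being analyzed, the sketch below estimates the spectrum of a synthetic 1/f²-like process with an averaged-DFT (Welch) analyzer and two different spectral windows; the bias at the lowest frequency indices k is where the systematic errors discussed above show up. All parameter values are placeholders, not those of the paper:

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(0)
fs, N = 1000.0, 4096          # sampling rate and samples per DFT segment
# Brownian noise (cumulative sum of white noise) has a ~1/f^2 spectrum,
# a steep spectrum of the kind that produces systematic estimation errors
x = np.cumsum(rng.standard_normal(200 * N))

for window in ("boxcar", "hann"):
    f, pxx = signal.welch(x, fs=fs, window=window, nperseg=N)
    # With ~400 averaged segments the statistical error is small, so the
    # remaining window-dependent deviation at small k is the systematic error
    print(window, pxx[1:4])
```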


2019 · Vol 13 (1) · pp. 14
Author(s): Hendro Supratikno, David Premana

Parking is the temporary state of a vehicle that is not moving because it has been left by its driver. The definition of parking includes every vehicle that stops at a certain place, whether indicated by traffic signs or not, and not solely for the purpose of picking up and/or dropping off people and/or goods. Campus 3 of the Lumajang State Community Academy has facilities and infrastructure provided by the Lumajang Regency government. However, the parking lots provided cannot accommodate vehicles optimally because the ratio between the number of vehicles and the area of the parking lot is inappropriate. This is because the area of the parking lot was not analyzed for measurement error. Every measurement is assumed to contain errors, whether systematic errors, random errors, or blunders, so the measurement of the parking lot certainly contains errors as well. The authors therefore conducted this study to determine the propagation of systematic error, and its magnitude, in the measured area of the Campus 3 parking lot of the Lumajang State Community Academy. The methods used in this study include preparing materials and tools, sketching the site, dividing it into sections, measuring distances with a theodolite, deriving the equation for the land area, and computing the propagation of systematic error. The final goal of this study is thus to determine the magnitude of the systematic error in the parking area of Campus 3 of the Lumajang State Community Academy.
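As an illustration of the kind of computation involved, the sketch below propagates an assumed systematic distance error into a lot area via first-order error propagation (systematic contributions add linearly, unlike random errors, which add in quadrature). The dimensions and error value are invented for illustration and are not the study's measurements:

```python
# Hedged sketch of systematic-error propagation for a rectangular lot,
# A = length * width. Measured values and the per-distance systematic
# error are invented placeholders.
length, width = 40.0, 25.0   # measured sides (m)
d_sys = 0.05                 # assumed systematic error per distance (m)

area = length * width
# First-order propagation: dA = |dA/dL| * dL + |dA/dW| * dW
d_area = width * d_sys + length * d_sys
print(f"area = {area:.1f} m^2 +/- {d_area:.1f} m^2 (systematic)")
```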


Ocean Science · 2016 · Vol 12 (5) · pp. 1067–1090
Author(s): Marie-Isabelle Pujol, Yannice Faugère, Guillaume Taburet, Stéphanie Dupuy, Camille Pelloquin, ...

Abstract. The new DUACS DT2014 reprocessed products have been available since April 2014. Numerous innovative changes have been introduced at each step of an extensively revised data processing protocol. The use of a new 20-year altimeter reference period in place of the previous 7-year reference significantly changes the sea level anomaly (SLA) patterns and thus has a strong user impact. The use of up-to-date altimeter standards and geophysical corrections, reduced smoothing of the along-track data, and refined mapping parameters, including spatial and temporal correlation-scale refinement and measurement errors, all contribute to an improved high-quality DT2014 SLA data set. Although all of the DUACS products have been upgraded, this paper focuses on the enhancements to the gridded SLA products over the global ocean. As part of this exercise, 21 years of data have been homogenized, allowing us to retrieve accurate large-scale climate signals such as global and regional mean sea level (MSL) trends and interannual signals, as well as better-refined mesoscale features. An extensive assessment exercise has been carried out on this data set, which allows us to establish a consolidated error budget. The errors at mesoscale are about 1.4 cm² in low-variability areas, increase to an average of 8.9 cm² in coastal regions, and reach nearly 32.5 cm² in areas of high mesoscale activity. The DT2014 products, compared to the previous DT2010 version, retain signals at wavelengths below ∼250 km, inducing SLA variance and mean EKE increases of +5.1 % and +15 %, respectively. Comparisons with independent measurements highlight the improved mesoscale representation within this new data set. The error reduction at the mesoscale reaches nearly 10 % of the error observed with DT2010. DT2014 also presents an improved coastal signal, with a nearly 2 to 4 % mean error reduction. High-latitude areas are also more accurately represented in DT2014, with improved consistency between spatial coverage and sea-ice edge position. An error budget is used to highlight the limitations of the new gridded products, with notable errors in areas with strong internal tides.
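As a pointer to how the EKE diagnostic mentioned above is typically computed, the sketch below derives geostrophic velocities and eddy kinetic energy from a gridded SLA field via the geostrophic relations u = −(g/f) ∂η/∂y, v = (g/f) ∂η/∂x. The grid, SLA values, and parameters are toy placeholders, not DUACS data:

```python
import numpy as np

g, f = 9.81, 1e-4     # gravity (m/s^2), midlatitude Coriolis parameter (1/s)
dx = dy = 25e3        # grid spacing (m)
# Toy SLA snapshot (m); axis 0 is taken as y, axis 1 as x
sla = np.random.default_rng(1).standard_normal((50, 50)) * 0.1

u = -g / f * np.gradient(sla, dy, axis=0)  # zonal geostrophic velocity
v = g / f * np.gradient(sla, dx, axis=1)   # meridional geostrophic velocity
eke = 0.5 * (u**2 + v**2)                  # eddy kinetic energy per unit mass
print("mean EKE:", eke.mean(), "m^2/s^2")
```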


2018 · Vol 175 · pp. 13020
Author(s): Christopher Kelly

We discuss the ongoing effort by the RBC & UKQCD collaborations to improve our lattice calculation of the measure of Standard Model direct CP violation, ε′, with physical kinematics. We present our progress in decreasing the (dominant) statistical error and discuss other related activities aimed at reducing the systematic errors.


2005 · Vol 57 (3) · pp. 375–386
Author(s): Philippe Rogel, Anthony T. Weaver, Nicolas Daget, Sophie Ricci, Eric Machu

2011 · Vol 4 (4) · pp. 5147–5182
Author(s): V. A. Velazco, M. Buchwitz, H. Bovensmann, M. Reuter, O. Schneising, ...

Abstract. Carbon dioxide (CO2) is the most important man-made greenhouse gas (GHG) causing global warming. With electricity generation by fossil-fuel power plants now the economic sector constituting the largest source of CO2, monitoring power plant emissions has become more important than ever in the fight against global warming. In a previous study by Bovensmann et al. (2010), random and systematic errors of power plant CO2 emission estimates were quantified using a single overpass of a proposed CarbonSat instrument. In this study, we quantify the errors of power plant annual emission estimates from a hypothetical CarbonSat and from constellations of several CarbonSats, taking into account that power plant CO2 emissions are time dependent. Our focus is on estimating the systematic errors arising from sparse temporal sampling, as well as the random errors that depend primarily on wind speed. We used hourly emissions data from the US Environmental Protection Agency (EPA) combined with assimilated and reanalyzed meteorological fields from the National Centers for Environmental Prediction (NCEP). CarbonSat orbits were simulated for a sun-synchronous low-Earth-orbiting (LEO) satellite with an 828 km orbit height and a local time of ascending node (LTAN) of 13:30; global coverage is achieved after 5 days. We show that, despite the variability of the power plant emissions and the limited satellite overpasses, one CarbonSat can verify reported US annual CO2 emissions from large power plants (≥5 Mt CO2 yr−1) with a systematic error of less than ~4.9 % for 50 % of all the power plants. For 90 % of all the power plants, the systematic error was less than ~12.4 %. We additionally investigated two different configurations using a constellation of 5 CarbonSats. One achieves global coverage every day but samples the targets only at fixed local times. The other configuration samples the targets five times at two-hour intervals approximately every sixth day, but achieves global coverage only after 5 days. From the statistical analyses we found, as expected, that the random errors improve by approximately a factor of two if 5 satellites are used. On the other hand, more satellites do not result in a large reduction of the systematic error. The systematic error is somewhat smaller for the constellation configuration achieving global coverage every day. Finally, we recommend the CarbonSat constellation configuration that achieves daily global coverage.
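The sampling-driven systematic error can be illustrated with a toy experiment: sample a synthetic hourly emission time series only at a fixed local time every 5 days (mimicking a 13:30 LTAN overpass) and compare the inferred annual total with the truth. The emission model below is invented, not the EPA data used in the paper:

```python
import numpy as np

rng = np.random.default_rng(2)
hours = 24 * 365
base = 1000.0                                         # t CO2 per hour
diurnal = 200.0 * np.sin(2 * np.pi * np.arange(hours) / 24)
emissions = base + diurnal + 50.0 * rng.standard_normal(hours)

true_annual = emissions.sum()
# One overpass every 5 days at a fixed ~13:00 local time (hour index 13);
# sampling a diurnal cycle at a fixed phase biases the annual estimate
samples = emissions[13::24 * 5]
estimated_annual = samples.mean() * hours
bias = (estimated_annual - true_annual) / true_annual
print(f"systematic error from sparse sampling: {100 * bias:.1f} %")
```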


Author(s): Yaqiong Wang, Ke Xu, Shaomin Li

In recent years, with rapid industrialization and massive energy consumption, ground-level ozone (O3) has become one of the most severe air pollutants. In this paper, we propose a functional spatio-temporal statistical model to analyze air quality data. Firstly, since pollutant data from a monitoring network usually have strong spatial and temporal correlation, a spatio-temporal statistical model is a reasonable way to reveal the spatial correlation structure and temporal dynamics in the data. Secondly, covariate effects are introduced to explore the formation mechanism of ozone pollution. Thirdly, given the pronounced diurnal pattern of ozone data, we explore the diurnal cycle of O3 pollution using a functional data analysis approach. The spatio-temporal model shows great potential for application in comparison with other models. Applying it to O3 pollution data from 36 stations in Beijing, China, we explain the effects of covariates, such as other pollutants and meteorological variables, on ozone pollution, and we discuss the diurnal cycle of ozone pollution.
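A minimal sketch of the functional-data idea applied to a diurnal cycle: represent one station-day of hourly O3 as a smooth function by regressing on a small Fourier basis. The synthetic values and basis size are our illustrative choices, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(24)  # hour of day
# Synthetic hourly O3 (ug/m^3) with an afternoon peak plus noise
o3 = 60 + 30 * np.sin(2 * np.pi * (t - 9) / 24) + 5 * rng.standard_normal(24)

# Design matrix: constant term plus the first two Fourier harmonics
X = np.column_stack([
    np.ones(24),
    np.sin(2 * np.pi * t / 24), np.cos(2 * np.pi * t / 24),
    np.sin(4 * np.pi * t / 24), np.cos(4 * np.pi * t / 24),
])
coef, *_ = np.linalg.lstsq(X, o3, rcond=None)
smooth = X @ coef  # the smooth diurnal curve: one "functional" datum
print(np.round(smooth[:6], 1))
```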

