stochastic errors
Recently Published Documents

TOTAL DOCUMENTS: 44 (five years: 9)
H-INDEX: 7 (five years: 1)

2021 · Vol 8 (3) · pp. 60-67
Author(s): Halyna Voznyak, Taras Kloba

In current conditions, budget security is one of the essential components of financial security. The budget is the leading financial plan of the state: it reflects most of the economic processes in the country, redistributes and accumulates revenues, and finances vital expenditures. Proper use of the budget as a financial basis for state regulation of a market economy should steer economic and social processes in the interests of society and create conditions for the country's economic development. On the one hand, budget security reflects the regularities of the functioning of the budget as an objective reality; on the other hand, it reflects the subjective manifestations of human activity that find their expression in fiscal policy.

The article aims to substantiate the theoretical and conceptual basis of budget security at the regional and local levels. First, scientific approaches to the definition of "budget security" are generalized. Second, budget risks are considered which, under the influence of certain factors, can be transformed into threats to budget security: discrete control; insolvency of taxpayers; political and military actions; destabilization of the financial sector of the region/communities; imbalance of revenues and expenditures; the structure of revenues and expenditures; external dependence; violation of the stability of payments; debt growth (internal/external); stochastic errors; corruption; growth of the shadow sector; and irrational budgeting. Third, the concept of "threat" is theoretically conceptualized, and its essence and characteristics are substantiated. Fourth, the relationship between the concepts of "budget risk" and "threat to budget security" is investigated. Finally, the functions and tasks that the budget security of territories should perform under growing financial and economic instability are considered.

Clarifying the theoretical and conceptual basis of the budget security of regions and communities, and specifying its existing and potential risk factors, functions, and tasks, will help to minimize and neutralize threats to budget security, strengthen it at the regional and local levels, and achieve important strategic guidelines.


2021 · Vol 7 (1)
Author(s): Yingkai Ouyang

Abstract. Coherent errors, which arise from collective couplings, are a dominant form of noise in many realistic quantum systems and are more damaging than the oft-considered stochastic errors. Here, we propose integrating stabilizer codes with constant-excitation codes by code concatenation. Namely, by concatenating an [[n, k, d]] stabilizer outer code with dual-rail inner codes, we obtain a [[2n, k, d]] constant-excitation code that is immune to coherent phase errors and also equivalent to a Pauli-rotated stabilizer code. When the stabilizer outer code is fault-tolerant, the constant-excitation code has a positive fault-tolerant threshold against stochastic errors. Setting the outer code as a four-qubit amplitude damping code yields an eight-qubit constant-excitation code that corrects a single amplitude damping error, and we analyze this code's potential as a quantum memory.
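
To see where the immunity comes from, consider the standard dual-rail encoding used as the inner code (a worked sketch in standard notation; only the dual-rail encoding itself is taken from the abstract):

```latex
% Dual-rail encoding: exactly one excitation per qubit pair.
\[
  \lvert 0_L \rangle = \lvert 01 \rangle, \qquad
  \lvert 1_L \rangle = \lvert 10 \rangle .
\]
% A coherent collective phase error rotates every physical qubit identically:
\[
  U(\theta) = \exp\!\Bigl( -i \tfrac{\theta}{2} \textstyle\sum_j Z_j \Bigr).
\]
% On each pair, Z_1 + Z_2 annihilates both logical states
% (the +1 eigenvalue of |0> cancels the -1 eigenvalue of |1>):
\[
  (Z_1 + Z_2)\, \lvert 0_L \rangle = 0, \qquad
  (Z_1 + Z_2)\, \lvert 1_L \rangle = 0,
\]
% so U(theta) acts trivially on the code space (up to a global phase) and
% the coherent phase error never reaches the logical information.
```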


Author(s): Yoo-Ah Kim, Mark D.M. Leiserson, Priya Moorjani, Roded Sharan, Damian Wojtowicz, ...

Mutations are the driving force of evolution, yet they underlie many diseases, in particular cancer. They are thought to arise from a combination of stochastic errors in DNA processing, naturally occurring DNA damage (e.g., the spontaneous deamination of methylated CpG sites), replication errors, and dysregulation of DNA repair mechanisms. High-throughput sequencing has made it possible to generate large datasets with which to study mutational processes in health and disease. Since the emergence of the first mutational process studies in 2012, this field has been gaining increasing attention and has already accumulated a host of computational approaches and biomedical applications. Expected final online publication date for the Annual Review of Biomedical Data Science, Volume 4 is July 2021. Please see http://www.annualreviews.org/page/journal/pubdates for revised estimates.
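
As a concrete illustration of the computational approaches this field has accumulated, one widely used technique is non-negative matrix factorization (NMF) of a mutation-count catalog into mutational signatures; the following is a minimal sketch with hypothetical data, not code from the review:

```python
# Minimal sketch: extracting mutational signatures from a mutation catalog
# via non-negative matrix factorization (NMF), a common approach in this
# field. The catalog and its shape are hypothetical.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
# Catalog: rows = 96 trinucleotide mutation contexts, columns = samples.
catalog = rng.poisson(lam=5.0, size=(96, 50)).astype(float)

# Factor the catalog as catalog ~ signatures @ exposures.
n_signatures = 4  # chosen by model selection in practice
model = NMF(n_components=n_signatures, init="nndsvda",
            max_iter=500, random_state=0)
signatures = model.fit_transform(catalog)   # (96, 4): per-context weights
exposures = model.components_               # (4, 50): per-sample activities

# Normalize each signature to a probability distribution over contexts
# (for a consistent reconstruction, exposures would be rescaled inversely).
signatures /= signatures.sum(axis=0, keepdims=True)
```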


Sensors · 2021 · Vol 21 (4) · pp. 1327
Author(s): Ahmed A. Youssef, Naif Al-Subaie, Naser El-Sheimy, Mohamed Elhabiby

Various high-budget industries that use wheel-based vehicles rely on wheel odometry as an integral part of their navigation process. This research introduces a low-cost alternative to the wheel encoders typically used to determine the on-track speed of vehicles. The proposed system is referred to as an Accelerometer-based Wheel Odometer for Kinematics determination (AWOK). It comprises just a single-axis accelerometer mounted radially at the center of any given wheel. Unlike typical wheel speedometers, which provide only velocities, the AWOK system provides distances directly, giving it an advantage over typical wheel odometers. In addition, the system combines a simple assembly with a highly efficient data-processing algorithm, and it handles high dynamics better than similar approaches in previous related work. Furthermore, the AWOK system is not affected by the stochastic errors inherent in micro-electro-mechanical systems (MEMS) inertial sensors, whether short-term or long-term. Above all, the AWOK system achieved a relative accuracy of 0.15% in determining the distance covered by a car.
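
The abstract does not spell out the data-processing algorithm. One plausible reading of a single-axis accelerometer mounted radially on a wheel is that gravity modulates its output once per revolution, so cumulative rotation, and hence distance, can be recovered from the signal; the following sketch rests on that assumption and is not the AWOK algorithm itself:

```python
# Hedged sketch of the idea behind an accelerometer-based wheel odometer:
# a single-axis accelerometer mounted radially on a wheel sees the gravity
# component g*cos(theta(t)) modulate once per revolution. Counting cycles
# of that signal yields revolutions, and distance = 2*pi*R * revolutions.
# This is an illustrative reconstruction, not the published AWOK algorithm.
import numpy as np

def distance_from_radial_accel(signal: np.ndarray, wheel_radius_m: float) -> float:
    """Estimate distance travelled from a radial accelerometer trace."""
    centered = signal - np.mean(signal)          # remove sensor bias / offsets
    # Count rising zero crossings: one per wheel revolution.
    crossings = np.sum((centered[:-1] < 0) & (centered[1:] >= 0))
    return 2.0 * np.pi * wheel_radius_m * crossings

# Synthetic example: 10 revolutions of a 0.3 m wheel sampled at 100 Hz.
t = np.linspace(0.0, 10.0, 1000)
accel = 9.81 * np.cos(2.0 * np.pi * 1.0 * t)     # 1 revolution per second
print(distance_from_radial_accel(accel, 0.3))    # ~ 2*pi*0.3*10 = 18.85 m
```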


2021 · Vol 12 (1) · pp. 251-281
Author(s): Yves Breitmoser

Experimenters make theoretically irrelevant decisions concerning user interfaces and the ordering or labeling of options. Reanalyzing dictator games, I first show that such decisions may drastically affect comparative statics and cause results to appear contradictory across experiments. This obstructs model testing, preference analyses, and policy predictions. I then propose a simple model of choice incorporating both presentation effects and stochastic errors, and test the model by reanalyzing the dictator game experiments. Controlling for presentation effects, preference estimates become consistent across experiments and predictive out-of-sample. This highlights both the necessity and the possibility of controlling for presentation in economic experiments.
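
The abstract does not give the model's functional form; as an illustrative sketch, a random-utility (logit) model shows how presentation effects and stochastic errors can enter jointly (notation is assumed, not Breitmoser's specification):

```latex
% Illustrative random-utility choice model with a presentation term.
% u(x_j): preference value of allocation x_j; p_j: a presentation effect
% (e.g. ordering, labeling, or interface salience of option j);
% lambda: precision of the stochastic (logit) choice error.
\[
  \Pr(\text{choose } j) =
  \frac{\exp\bigl( \lambda \, [\, u(x_j) + p_j \,] \bigr)}
       {\sum_k \exp\bigl( \lambda \, [\, u(x_k) + p_k \,] \bigr)} .
\]
% Estimating the p_j jointly with u is what "controlling for presentation"
% means here: the preference parameters in u then become comparable across
% experiments that display the same options differently.
```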


Author(s): Zaur A. Alderov, Evgeny V. Rozengauz, Denis Nesterov

Estimating changes in lesion size is one of the most widely used ways to follow up oncological disease, and volumetry is among the most accurate approaches to lesion size estimation. However, volumetry errors can reach 60%, which significantly limits the applicability of the method. The purpose of this study was to estimate the effect of reconstruction parameters on volumetry error.
Materials and methods. 32 patients with pulmonary metastases underwent CT scanning; 326 pulmonary foci were detected and segmented. The volumetry error was estimated for every lesion under each combination of slice thickness and reconstruction kernel, and the effect was measured with linear regression analysis.
Results. Systematic and stochastic errors are affected by slice thickness, reconstruction kernel, lesion position, and lesion diameter. The FC07 kernel and larger slice thickness are associated with a high systematic error. Both systematic and stochastic errors decrease as lesions enlarge. Intrapulmonary lesions have the lowest error regardless of the reconstruction parameters. A linear regression model was built to predict the error rate; its standard error was 6.7%. The deviation of the model residuals correlated with slice thickness, reconstruction kernel, lesion position, and lesion diameter.
Conclusion. The systematic error depends on the lesion diameter, slice thickness, and reconstruction kernel, and can be estimated using the proposed model with a 6% error. The stochastic error mainly depends on lesion size.
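
A minimal sketch of the kind of linear error model the abstract describes, with assumed covariate names, encodings, and synthetic data (the published model's coefficients are not given in the abstract):

```python
# Minimal sketch of a linear model predicting volumetry error from
# reconstruction parameters, in the spirit of the abstract. Covariate
# names, encodings, and the synthetic data are assumptions; only the
# directions of the trends follow the reported results.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n = 326
slice_thickness_mm = rng.choice([1.0, 2.0, 3.0, 5.0], size=n)
sharp_kernel = rng.integers(0, 2, size=n)        # 1 = sharp kernel (FC07-like)
lesion_diameter_mm = rng.uniform(4.0, 30.0, size=n)
intrapulmonary = rng.integers(0, 2, size=n)      # 1 = intrapulmonary position

# Synthetic "observed" volumetry error (%) with the reported trend directions:
error_pct = (5.0 + 4.0 * slice_thickness_mm + 8.0 * sharp_kernel
             - 0.6 * lesion_diameter_mm - 5.0 * intrapulmonary
             + rng.normal(0.0, 6.7, size=n))

X = np.column_stack([slice_thickness_mm, sharp_kernel,
                     lesion_diameter_mm, intrapulmonary])
model = LinearRegression().fit(X, error_pct)
print(model.intercept_, model.coef_)             # fitted error model
```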


2020 · Vol 2020 · pp. 1-21
Author(s): Luhua Zhu, Erlei Yao

This paper is an extension of the random amplitude-based improved Hilbert spectral representation method (IHSRM) that the authors developed previously for the simulation of spatially correlated earthquake ground motions (SCEGMs) possessing the nonstationary characteristics of natural earthquake records. Depending on the fundamental type (random phase method or random amplitude method) and the matrix decomposition method (Cholesky decomposition, root decomposition, or eigendecomposition), the IHSRM comes in several variants. To evaluate the influence of these variants on the statistical errors, i.e., bias errors and stochastic errors, an error assessment of the method was conducted. First, the random phase-based IHSRM was derived, and its reliability was proven by theoretical deduction; unified formulas were given for the random phase- and random amplitude-based IHSRMs. Then, closed-form solutions for the statistical errors of the simulated seismic motions were derived, and their validity was proven by comparing them with estimated values. Finally, the stochastic errors of covariance (i.e., variance and cross-covariance) for the different variants were compared, and the results showed that (1) the proposed IHSRM is not ergodic; (2) the random amplitude-based IHSRMs possess higher stochastic errors of covariance than the random phase-based ones; and (3) the stochastic error of covariance for the random phase-based IHSRM depends on the matrix decomposition method, while that of the random amplitude-based one does not.
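
For context on the two fundamental types mentioned above, here is the classical random-phase spectral representation that such methods build on (standard textbook notation, not the paper's own formulas):

```latex
% Classical random-phase spectral representation (Shinozuka-Deodatis form)
% for simulating an n-variate process with cross-spectral density matrix
% S(omega); the IHSRM builds nonstationarity on top of this idea.
% H(omega) comes from a decomposition S = H H^{*T} (e.g. Cholesky), and
% the Phi_{ml} are i.i.d. phases, uniform on [0, 2*pi).
\[
  f_j(t) = 2 \sum_{m=1}^{n} \sum_{l=1}^{N}
  \lvert H_{jm}(\omega_l) \rvert \,\sqrt{\Delta\omega}\;
  \cos\!\bigl( \omega_l t - \vartheta_{jm}(\omega_l) + \Phi_{ml} \bigr),
\]
% where vartheta_{jm} is the complex phase of H_{jm}(omega_l). The
% random-amplitude variant draws random amplitudes at each frequency as
% well, which is what drives the difference in stochastic errors of
% covariance compared in the paper.
```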


2019 · Vol 12 (4) · pp. 177
Author(s): Thibaut Denoël, Luca Pedrelli, Giuseppe Pantaleo, John O. Prior

The immunoreactive fraction r provides important information on the functional purity of radiolabeled proteins. It is traditionally determined by saturating the radioimmunoconjugate with an increasing excess of antigen, followed by linear extrapolation to infinite antigen excess in a double inverse “Lindmo plot”. Although several reports have described shortcomings in the Lindmo plot, a systematic examination is lacking. Using an experimental and simulation-based approach, we compared—for accuracy, precision and robustness—the Lindmo plot with the “rectangular hyperbola” extrapolation method based on the Langmuir model. The differences between the theoretical and extrapolated r values demonstrate that nonequilibrium and antigen depletion are important sources of error. The mathematical distortions resulting from the linearization of the data in the Lindmo plot induce fragility towards stochastic errors and make it necessary to exclude low bound fractions. The rectangular hyperbola provides robust and precise r estimates from raw binding data, even for slow kinetics.
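
A minimal sketch of the rectangular-hyperbola approach on raw binding data, using a Langmuir-form fit (data values and parameter names are hypothetical):

```python
# Minimal sketch: estimating the immunoreactive fraction r by fitting the
# rectangular hyperbola (Langmuir form) directly to raw binding data,
# instead of linearizing as in the Lindmo plot. Data values are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def rectangular_hyperbola(antigen_conc, r, k):
    """Bound fraction B = r * [Ag] / (k + [Ag]); r is the plateau at excess."""
    return r * antigen_conc / (k + antigen_conc)

antigen_nM = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0, 50.0])
bound_frac = np.array([0.18, 0.30, 0.44, 0.60, 0.69, 0.74, 0.78])

(r_hat, k_hat), cov = curve_fit(rectangular_hyperbola, antigen_nM, bound_frac,
                                p0=[0.8, 2.0], bounds=([0.0, 0.0], [1.0, np.inf]))
r_err = np.sqrt(np.diag(cov))[0]
print(f"immunoreactive fraction r = {r_hat:.3f} +/- {r_err:.3f}")
```

Fitting the raw data avoids the double-inverse transformation, which is what amplifies stochastic errors at low bound fractions in the Lindmo plot.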


Ocean Science · 2019 · Vol 15 (2) · pp. 249-268
Author(s): Johannes Schulz-Stellenfleth, Joanna Staneva

Abstract. In many coastal areas there is an increasing number and variety of observation data available, which are often very heterogeneous in their temporal and spatial sampling characteristics. With the advent of new systems, like the radar altimeter on board the Sentinel-3A satellite, many questions arise concerning the accuracy and added value of different instruments and numerical models. Quantification of errors is a key factor for applications like data assimilation and forecast improvement. In the past, the triple collocation method for estimating systematic and stochastic errors of measurements and numerical models has been successfully applied to various data sets. This method relies on the assumption that three independent data sets provide estimates of the same quantity. In coastal areas with strong gradients, even small distances between measurements can lead to larger differences, and this assumption can become critical. In this study the triple collocation method is extended in several ways with the specific problems of the coast in mind. In addition to the nearest-neighbour approximations considered so far, the presented method allows a large variety of interpolation approaches to be used to take spatial variations in the observed area into account. Observation and numerical model errors can therefore be estimated even if the distance between the different data sources is too large to assume that they measure the same quantity. If the number of observations is sufficient, the method can also be used to estimate error correlations between certain data source components. As a second novelty, an estimator for the uncertainty in the derived observation errors is derived as a function of the covariance matrices of the input data and the number of available samples. In the first step, the method is assessed using synthetic observations and Monte Carlo simulations. The technique is then applied to a data set of Sentinel-3A altimeter measurements, in situ wave observations, and numerical wave model data with a focus on the North Sea. Stochastic observation errors for the significant wave height, as well as bias and calibration errors, are derived for the model and the altimeter. The analysis indicates a slight overestimation of altimeter wave heights, which becomes more pronounced at higher sea states. The smallest stochastic errors are found for the in situ measurements. Different observation geometries of in situ data and altimeter tracks are furthermore analysed, considering 1-D and 2-D interpolation approaches. For example, the geometry of an altimeter track passing between two in situ wave instruments is considered, with model data being available at the in situ locations. It is shown that for a sufficiently large sample, the errors of all data sources, as well as the error correlations of the model, can be estimated with the new method.
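
For reference, the classical covariance-based triple collocation estimator that this study extends can be sketched as follows (standard formulation; the paper's extensions for interpolation and error correlations are not reproduced here):

```python
# Minimal sketch of classical triple collocation: estimate the stochastic
# error variance of three independent, collocated estimates of the same
# quantity from their pairwise covariances. This is the standard
# covariance-notation form that the paper's method generalizes.
import numpy as np

def triple_collocation_error_variances(x1, x2, x3):
    """Return error variances (sigma1^2, sigma2^2, sigma3^2)."""
    c = np.cov(np.vstack([x1, x2, x3]))          # 3x3 sample covariance
    s1 = c[0, 0] - c[0, 1] * c[0, 2] / c[1, 2]
    s2 = c[1, 1] - c[0, 1] * c[1, 2] / c[0, 2]
    s3 = c[2, 2] - c[0, 2] * c[1, 2] / c[0, 1]
    return s1, s2, s3

# Synthetic check: one true signal observed three times with known noise.
rng = np.random.default_rng(2)
truth = rng.normal(0.0, 1.0, 10000)
obs = [truth + rng.normal(0.0, s, truth.size) for s in (0.1, 0.2, 0.3)]
print(triple_collocation_error_variances(*obs))  # ~ (0.01, 0.04, 0.09)
```

The estimator works because, for independent errors, every cross-covariance equals the variance of the shared true signal, so it can be cancelled out; the coastal extension relaxes exactly the assumption that all three sources see the same signal.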


2018
Author(s): Xuejun Guo, Dong Yang, Xiangyuan Zhang

Although the phenomenological relationship between epigenetics and aging phenotypes is well established, an intrinsic connection between epigenetics and aging remains to be theoretically illuminated. In this study, we propose that epigenetic recording of a varied cell environment and complex history could be an origin of cellular aging. Through epigenetic modifications, the environment and historical events can shift the chromatin template into activated or repressive accessible structures, thereby shaping the DNA template into a spectrum of chromatin states. The diversity and conflicts borne by the cell environment and its historical events are hence recorded into the chromatin template. This could result in a dissipated spectrum of chromatin states and chaos in overall gene expression, inducing an unavoidable degradation of epigenome entropy, similar to Shannon entropy. The resulting disorder in the epigenome, characterized by corrosion of epigenome entropy as reflected in the chromatin template, can be stably memorized and propagated through cell divisions. Furthermore, the hysteretic nature of epigenetic responses to an emerging environment could exacerbate the degradation of epigenome entropy. Beyond stochastic errors, we propose that epigenetic disorder and chaos derived from an unordered environment and complex cell experiences play an essential role in epigenetic drift and the resulting cellular aging.
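
To make the Shannon analogy concrete, a Shannon-type epigenome entropy could be formalized as follows (an illustrative definition; the paper's exact formulation is not given in the abstract):

```latex
% Illustrative Shannon-type epigenome entropy: p_i is the frequency of
% chromatin state i (e.g. a combination of histone marks and chromatin
% accessibility) across loci, or across cells in a population.
\[
  H(\text{epigenome}) = - \sum_{i} p_i \log_2 p_i .
\]
% Recording a noisy, conflicting environment disperses the p_i over many
% chromatin states; the resulting shift in H formalizes the "disorder"
% and "chaos" the authors describe as accumulating with age.
```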

