rigorous error bounds
Recently Published Documents

TOTAL DOCUMENTS: 13 (FIVE YEARS: 3)
H-INDEX: 5 (FIVE YEARS: 1)

2021 ◽  
Vol 5 (1) ◽  
pp. 380-386
Author(s):  
Richard P. Brent

We show that a well-known asymptotic series for the logarithm of the central binomial coefficient is strictly enveloping in the sense of Pólya and Szegö, so the error incurred in truncating the series is of the same sign as the next term, and is bounded in magnitude by that term. We consider closely related asymptotic series for Binet's function, for \(\ln\Gamma(z+\frac12)\), and for the Riemann-Siegel theta function, and make some historical remarks.
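The enveloping property can be checked numerically. The sketch below uses the standard expansion \(\ln\binom{2n}{n} \sim 2n\ln 2 - \frac12\ln(\pi n) - \frac{1}{8n} + \frac{1}{192n^3} - \frac{1}{640n^5} + \cdots\), whose coefficients follow from Stirling's series; this is an illustration of the enveloping claim, not the paper's proof, and the choice of test values of \(n\) is ours.

```python
import math

# First three correction terms of the asymptotic series for ln C(2n, n).
TERMS = [lambda n: -1.0 / (8 * n),
         lambda n: 1.0 / (192 * n**3),
         lambda n: -1.0 / (640 * n**5)]

def exact_log_central_binomial(n):
    # ln C(2n, n) = ln (2n)! - 2 ln n!, computed via lgamma.
    return math.lgamma(2 * n + 1) - 2 * math.lgamma(n + 1)

def truncated_series(n, k):
    """Main term 2n ln 2 - (1/2) ln(pi n), plus the first k correction terms."""
    s = 2 * n * math.log(2) - 0.5 * math.log(math.pi * n)
    for t in TERMS[:k]:
        s += t(n)
    return s

# Enveloping: the truncation error has the sign of, and is bounded in
# magnitude by, the first omitted term.
for n in (5, 10, 20):
    for k in range(len(TERMS)):
        err = exact_log_central_binomial(n) - truncated_series(n, k)
        nxt = TERMS[k](n)
        assert math.copysign(1.0, err) == math.copysign(1.0, nxt)
        assert abs(err) <= abs(nxt)
```

Note that the signs alternate, so successive truncations bracket the true value, which is what makes enveloping series attractive for rigorous error bounds.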


Quantum ◽  
2020 ◽  
Vol 4 ◽  
pp. 227 ◽  
Author(s):  
Evgeny Mozgunov ◽  
Daniel Lidar

Markovian master equations are a ubiquitous tool in the study of open quantum systems, but deriving them from first principles involves a series of compromises. On the one hand, the Redfield equation is valid for fast environments (whose correlation function decays much faster than the system relaxation time) regardless of the relative strength of the coupling to the system Hamiltonian, but is notoriously non-completely-positive. On the other hand, the Davies equation preserves complete positivity but is valid only in the ultra-weak coupling limit and for systems with a finite level spacing, which makes it incompatible with arbitrarily fast time-dependent driving. Here we show that a recently derived Markovian coarse-grained master equation (CGME), already known to be completely positive, has a much expanded range of applicability compared to the Davies equation, and moreover, is locally generated and can be generalized to accommodate arbitrarily fast driving. This generalization, which we refer to as the time-dependent CGME, is thus suitable for the analysis of fast operations in gate-model quantum computing, such as quantum error correction and dynamical decoupling. Our derivation proceeds directly from the Redfield equation and allows us to place rigorous error bounds on all three equations: Redfield, Davies, and coarse-grained. Our main result is thus a completely positive Markovian master equation that is a controlled approximation to the true evolution for any time-dependence of the system Hamiltonian, and works for systems with arbitrarily small level spacing. We illustrate this with an analysis showing that dynamical decoupling can extend coherence times even in a strictly Markovian setting.
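The practical payoff of complete positivity is that a Lindblad-form (GKSL) generator, such as the CGME, can never drive the density matrix out of the physical set. The sketch below is a generic illustration of that point, not the paper's CGME: it evolves a qubit under a GKSL equation with amplitude damping using plain Euler steps and checks that the trace and positivity survive; the Hamiltonian, rate, and step size are illustrative choices.

```python
import math

# Minimal 2x2 complex-matrix helpers (row-major lists of lists).
def mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def add(A, B, s=1.0):
    return [[A[i][j] + s * B[i][j] for j in range(2)] for i in range(2)]

def dag(A):
    return [[A[j][i].conjugate() for j in range(2)] for i in range(2)]

def lindblad_rhs(rho, H, L, gamma):
    """GKSL generator: -i[H, rho] + gamma (L rho L† - ½{L†L, rho})."""
    comm = add(mul(H, rho), mul(rho, H), -1.0)          # [H, rho]
    unit = [[-1j * comm[i][j] for j in range(2)] for i in range(2)]
    LdL = mul(dag(L), L)
    diss = add(mul(mul(L, rho), dag(L)),
               add(mul(LdL, rho), mul(rho, LdL)), -0.5)
    return add(unit, diss, gamma)

H = [[0.5, 0.0], [0.0, -0.5]]        # (Delta/2) sigma_z with Delta = 1
L = [[0.0, 1.0], [0.0, 0.0]]         # sigma_minus: decay |1> -> |0>
rho = [[0.5, 0.5], [0.5, 0.5]]       # initial state |+><+|
dt, gamma = 1e-3, 0.5

for _ in range(2000):
    rho = add(rho, lindblad_rhs(rho, H, L, gamma), dt)
    tr = (rho[0][0] + rho[1][1]).real
    # Eigenvalues of a 2x2 Hermitian matrix from its trace and determinant.
    det = (rho[0][0] * rho[1][1] - rho[0][1] * rho[1][0]).real
    disc = math.sqrt(max(tr * tr - 4 * det, 0.0))
    lowest_eigenvalue = (tr - disc) / 2
    assert abs(tr - 1.0) < 1e-9           # trace preserved
    assert lowest_eigenvalue > -1e-7      # no eigenvalue goes negative beyond FP noise
```

A Redfield generator run through the same loop can, by contrast, produce genuinely negative eigenvalues for some initial states, which is the non-complete-positivity the abstract refers to.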


Entropy ◽  
2019 ◽  
Vol 21 (7) ◽  
pp. 644
Author(s):  
Baobin Wang ◽  
Ting Hu

In the framework of statistical learning, we study the online gradient descent algorithm generated by correntropy-induced losses in reproducing kernel Hilbert spaces (RKHS). As a generalized correlation measure, correntropy has been widely applied in practice owing to its robustness. Although online gradient descent is an efficient way to handle the maximum correntropy criterion (MCC) in nonparametric estimation, it has lacked a consistency analysis and rigorous error bounds. We provide a theoretical understanding of the online algorithm for MCC and show that, with a suitably chosen scaling parameter, its convergence rate can be minimax optimal (up to a logarithmic factor) in regression analysis. Our results show that the scaling parameter plays an essential role in both robustness and consistency.
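The robustness mechanism is visible in a small sketch. Under the correntropy-induced loss \(\sigma^2(1 - e^{-e^2/(2\sigma^2)})\) the gradient in the prediction is \(-e\,e^{-e^2/(2\sigma^2)}\), so a gross outlier (large residual \(e\)) contributes an exponentially small update, while small residuals behave as under squared loss. The following online kernel gradient descent comparison is our own illustration of this effect, not the paper's algorithm or rates; the kernel bandwidth, step size, and data are illustrative.

```python
import math

def gauss_kernel(x, z, bw=0.3):
    return math.exp(-(x - z) ** 2 / (2 * bw ** 2))

def predict(model, x):
    return sum(a * gauss_kernel(xi, x) for xi, a in model)

def train(xs, ys, loss="mcc", sigma=1.0, eta=0.2, passes=50):
    """Online (kernel) gradient descent; each step appends one expansion coefficient."""
    model = []
    for _ in range(passes):
        for x, y in zip(xs, ys):
            e = y - predict(model, x)
            if loss == "mcc":
                # Gradient factor of the correntropy loss: e * exp(-e^2 / (2 sigma^2)).
                g = e * math.exp(-e ** 2 / (2 * sigma ** 2))
            else:                               # plain squared loss
                g = e
            model.append((x, eta * g))
    return model

xs = [0.1 * i for i in range(31)]               # grid on [0, 3]
ys = [math.sin(x) for x in xs]
ys[15] = 8.0                                    # gross outlier at x = 1.5

f_mcc = train(xs, ys, loss="mcc")
f_ls = train(xs, ys, loss="ls")

# MCC ignores the outlier and keeps its prediction near sin(1.5) ~ 1.0,
# while squared loss is dragged toward the corrupted value 8.
assert abs(predict(f_mcc, 1.5) - math.sin(1.5)) < 1.0
assert predict(f_ls, 1.5) - predict(f_mcc, 1.5) > 0.5
```

The scaling parameter \(\sigma\) controls the trade-off the abstract describes: a very large \(\sigma\) makes MCC behave like squared loss (consistency but no robustness), while a very small \(\sigma\) suppresses even informative residuals.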


2017 ◽  
Vol 44 (2) ◽  
pp. 1-27 ◽  
Author(s):  
Mioara Joldes ◽  
Jean-Michel Muller ◽  
Valentina Popescu

Author(s):  
Simon H. Tindemans ◽  
Goran Strbac

Data-driven risk analysis involves the inference of probability distributions from measured or simulated data. In the case of a highly reliable system, such as the electricity grid, the amount of relevant data is often exceedingly limited, but the impact of estimation errors may be very large. This paper presents a robust non-parametric Bayesian method to infer possible underlying distributions. The method obtains rigorous error bounds even for small samples taken from ill-behaved distributions. The approach taken has a natural interpretation in terms of the intervals between ordered observations, where allocation of probability mass across intervals is well specified, but the location of that mass within each interval is unconstrained. This formulation gives rise to a straightforward computational resampling method: Bayesian interval sampling. In a comparison with common alternative approaches, it is shown to satisfy strict error bounds even for ill-behaved distributions. This article is part of the themed issue ‘Energy management: flexibility, risk and optimization’.
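The interval construction can be sketched directly: with a Dirichlet(1, …, 1) allocation of probability mass over the intervals between ordered observations, placing each interval's mass at its left or right endpoint gives conservative lower and upper estimates of a statistic such as the mean. The code below is an illustrative reduction of this idea assuming a known bounded support, not the authors' implementation; sample values and draw counts are arbitrary.

```python
import random

def interval_mean_bounds(sample, support, n_draws=1000, seed=0):
    """Resample Dirichlet(1,...,1) mass over the intervals between ordered
    observations; putting each interval's mass at its left/right endpoint
    yields a lower/upper estimate of the mean for each draw."""
    a, b = support
    xs = sorted(sample)
    edges = [a] + xs + [b]                  # n observations -> n + 1 intervals
    rng = random.Random(seed)
    lows, highs = [], []
    for _ in range(n_draws):
        # Dirichlet(1,...,1) weights via normalized exponentials.
        w = [rng.expovariate(1.0) for _ in range(len(edges) - 1)]
        total = sum(w)
        w = [wi / total for wi in w]
        lows.append(sum(wi * left for wi, left in zip(w, edges[:-1])))
        highs.append(sum(wi * right for wi, right in zip(w, edges[1:])))
    return sum(lows) / n_draws, sum(highs) / n_draws

sample = [0.21, 0.35, 0.5, 0.62, 0.58]      # five observations on [0, 1]
lo, hi = interval_mean_bounds(sample, (0.0, 1.0))
assert 0.0 <= lo < hi <= 1.0                # bounds stay inside the support
```

Because the location of mass inside each interval is left unconstrained, the gap between `lo` and `hi` reflects exactly the ignorance that a small sample cannot resolve, which is what makes the bounds honest for ill-behaved distributions.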


2008 ◽  
Vol 46 (1) ◽  
pp. 180-200 ◽  
Author(s):  
Christian Jansson ◽  
Denis Chaykin ◽  
Christian Keil
