On the worst-case error of least squares algorithms for L2-approximation with high probability

2020, Vol 60, pp. 101484
Author(s): Mario Ullrich

Author(s): Lutz Kämmerer, Tino Ullrich, Toni Volkmer

Abstract: We construct a least squares approximation method for the recovery of complex-valued functions from a reproducing kernel Hilbert space on $D \subset \mathbb{R}^d$. The nodes are drawn at random for the whole class of functions, and the error is measured in $L_2(D,\varrho_D)$. We prove worst-case recovery guarantees by explicitly controlling all the involved constants. This leads to new preasymptotic recovery bounds with high probability for the error of hyperbolic Fourier regression on multivariate data. In addition, we further investigate its counterpart, hyperbolic wavelet regression, also based on least squares, to recover non-periodic functions from random samples. Finally, we reconsider the analysis of a cubature method based on plain random points with optimal weights and reveal near-optimal worst-case error bounds with high probability. It turns out that this simple method can compete with the quasi-Monte Carlo methods in the literature which are based on lattices and digital nets.
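As a rough illustration of the estimator itself (a minimal sketch only: the test function, the frequency set, the uniform sampling density, and the node count are illustrative choices, not the construction analysed in the paper), least squares recovery from random samples amounts to drawing nodes, evaluating the function there, assembling a design matrix of basis functions, and solving the resulting overdetermined system:

```python
# Minimal sketch of least squares recovery from random samples (illustrative
# choices throughout; not the paper's exact construction or error analysis).
import numpy as np

rng = np.random.default_rng(0)

def fourier_design(x, freqs):
    """Complex exponentials e^{2*pi*i*k*x} evaluated at the nodes x."""
    return np.exp(2j * np.pi * np.outer(x, freqs))

f = lambda x: 1.0 / (1.1 - np.cos(2 * np.pi * x))   # smooth 1-periodic test function
freqs = np.arange(-8, 9)                            # truncated frequency set (assumed)
n = 200                                             # number of random samples (assumed)

x = rng.random(n)                                   # nodes drawn i.i.d. uniformly on [0, 1)
A = fourier_design(x, freqs)                        # n x m design matrix
coef, *_ = np.linalg.lstsq(A, f(x), rcond=None)     # plain least squares fit

# Monte Carlo estimate of the L2 approximation error on the domain.
xt = rng.random(20000)
err = np.sqrt(np.mean(np.abs(f(xt) - fourier_design(xt, freqs) @ coef) ** 2))
print(f"estimated L2 error: {err:.3e}")
```

The paper's contribution lies in the worst-case, high-probability error bounds for such estimators with explicit constants; the sketch only shows the mechanics of the least squares step.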


1994, Vol 27 (8), pp. 395-400
Author(s): Hüseyin Akçay, Håkan Hjalmarsson

1998, Vol 33 (1), pp. 19-24
Author(s): Hüseyin Akçay, Brett Ninness

1987, Vol 41 (8), pp. 1324-1329
Author(s): Charles K. Mann, Thomas J. Vickers, James D. Womack

The problems encountered in applying Raman spectroscopy to direct qualitative and quantitative analysis for minor impurities in nominally pure, colorless solids have been examined. Samples of sulfamethoxazole spiked with 0.5 to 5% of sulfanilamide and sulfanilic acid were used as test materials. A procedure is described which permits detection of spectral features of the specified impurities at the 0.5% level. Least-squares fitting and cross-correlation data treatment procedures for the determination of sulfanilamide in sulfamethoxazole, with limits of detection of about 0.1% for either approach, are described. Computer simulations have been used to examine detection of impurity peaks for a variety of conditions, including the worst-case scenario in which the impurity features coincide with the strongest features of the spectrum of the host material. A least-squares fitting approach is described which permits detection of the impurity peak at the 0.5% level, even under worst-case conditions.
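The least-squares treatment can be pictured as fitting the measured spectrum with a linear combination of host and impurity reference spectra plus a baseline term. The sketch below uses synthetic Gaussian bands, not the paper's measured spectra; band positions, widths, and noise level are assumptions made only for illustration.

```python
# Illustrative sketch (synthetic data): model a measured spectrum as
# host + c * impurity + baseline and recover c by linear least squares.
import numpy as np

def band(x, center, width):
    return np.exp(-0.5 * ((x - center) / width) ** 2)

x = np.linspace(200, 1800, 1600)                       # Raman shift axis (cm^-1), assumed
host = band(x, 800, 15) + 0.6 * band(x, 1150, 20)      # synthetic host reference spectrum
impurity = band(x, 805, 15) + 0.4 * band(x, 1400, 18)  # synthetic impurity, partly overlapping

true_c = 0.005                                         # 0.5% impurity level
rng = np.random.default_rng(1)
measured = host + true_c * impurity + 1e-4 * rng.standard_normal(x.size)

# Design matrix: host spectrum, impurity spectrum, constant baseline.
A = np.column_stack([host, impurity, np.ones_like(x)])
coef, *_ = np.linalg.lstsq(A, measured, rcond=None)
print(f"recovered impurity coefficient: {coef[1]:.4f} (true {true_c})")
```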


Author(s): Robert Kleinberg, Kevin Leyton-Brown, Brendan Lucier

Algorithm configuration methods have achieved much practical success, but to date have not been backed by meaningful performance guarantees. We address this gap with a new algorithm configuration framework, Structured Procrastination. With high probability and nearly as quickly as possible in the worst case, our framework finds an algorithm configuration that provably achieves near-optimal performance. Moreover, its running time requirements asymptotically dominate those of existing methods.


Author(s): Mikhail V. Berlinkov, Cyril Nicaud

In this paper we address the question of synchronizing random automata in the critical setting of almost-group automata. Group automata are automata in which every letter acts as a permutation on the set of states, and they are not synchronizing (unless they have a single state). In almost-group automata, one of the letters acts as a permutation on [Formula: see text] states, while the remaining letters act as permutations of the whole state set. We prove that this small change is enough for automata to become synchronizing with high probability. More precisely, we establish that the probability that a strongly connected almost-group automaton is not synchronizing is [Formula: see text], for a [Formula: see text]-letter alphabet. We also present an efficient algorithm that decides whether a strongly connected almost-group automaton is synchronizing. For a natural model of computation, we establish a [Formula: see text] worst-case lower bound for this problem ([Formula: see text] for the average case), which is almost matched by our algorithm.
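For context, synchronizability of a deterministic automaton can always be decided with the standard pair-automaton criterion: the automaton is synchronizing if and only if every pair of states can be mapped by some word to a single state. The sketch below implements that generic check (not the specialized algorithm from the paper) via a backward breadth-first search over state pairs; the example automaton is the classical Černý automaton on four states, chosen purely for illustration.

```python
# Generic synchronizability check via backward BFS on state pairs
# (standard criterion; not the paper's specialized algorithm).
from collections import deque
from itertools import combinations

def is_synchronizing(delta):
    """delta[letter][state] -> next state; True iff the DFA is synchronizing."""
    n = len(next(iter(delta.values())))
    # Pre-images of each state under each letter.
    pre = {a: [[] for _ in range(n)] for a in delta}
    for a, row in delta.items():
        for s, t in enumerate(row):
            pre[a][t].append(s)

    # Start from the "diagonal" pairs (s, s), which are trivially mergeable,
    # and propagate mergeability backwards through the pair automaton.
    seen = set((s, s) for s in range(n))
    queue = deque(seen)
    while queue:
        p, q = queue.popleft()
        for a in delta:
            for u in pre[a][p]:
                for v in pre[a][q]:
                    pair = (min(u, v), max(u, v))
                    if pair not in seen:
                        seen.add(pair)
                        queue.append(pair)
    # Synchronizing iff every pair of distinct states is mergeable.
    return all((p, q) in seen for p, q in combinations(range(n), 2))

# Černý automaton C_4 (known to be synchronizing): 'a' cycles the states,
# 'b' merges states 0 and 1.
delta = {"a": [1, 2, 3, 0], "b": [1, 1, 2, 3]}
print(is_synchronizing(delta))  # True
```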


2020, Vol 2020 (28), pp. 264-269
Author(s): Yi-Tun Lin, Graham D. Finlayson

Spectral reconstruction (SR) algorithms attempt to map RGB images to hyperspectral images. Classically, simple pixel-based regression is used to solve for this SR mapping; more recently, patch-based Deep Neural Networks (DNNs) have been considered, with a modest performance increment. For either method, the training process typically minimizes a Mean-Squared-Error (MSE) loss. Curiously, in recent research, SR algorithms are evaluated and ranked based on a relative percentage error, the so-called Mean Relative Absolute Error (MRAE), which behaves very differently from the MSE loss function. The most recent DNN approaches, perhaps unsurprisingly, directly optimize for this MRAE error in training so as to match the new evaluation criterion.

In this paper, we show how pixel-based regression methods can also be reformulated so that they too optimize a relative spectral error. Our Relative Error Least-Squares (RELS) approach minimizes an error that is similar to MRAE. Experiments demonstrate that regression models based on RELS deliver better spectral recovery, with up to a 10% improvement in mean performance and a 20% improvement in worst-case performance, depending on the method.
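The relative-error idea can be sketched as a reweighted least squares problem: dividing each equation by its target value turns the relative residual into an ordinary one, so a standard solver applies. The snippet below illustrates this on synthetic data; the exact RELS formulation in the paper may differ, and all variable names, dimensions, and data are illustrative assumptions.

```python
# Sketch of relative-error least squares: minimize ||(Xb - y) / y||^2 by
# rescaling each row of X and entry of y by 1/y_i (synthetic data only).
import numpy as np

rng = np.random.default_rng(2)
n, d = 500, 3                                     # samples (e.g. pixels) and input channels
X = rng.random((n, d)) + 0.05                     # synthetic "RGB" responses
b_true = np.array([0.7, 1.3, 0.4])
y = X @ b_true * (1 + 0.02 * rng.standard_normal(n))  # one synthetic spectral band, positive

# Ordinary (MSE) least squares.
b_mse, *_ = np.linalg.lstsq(X, y, rcond=None)

# Relative-error least squares: reweight each row by 1 / y_i.
w = 1.0 / y
b_rels, *_ = np.linalg.lstsq(X * w[:, None], y * w, rcond=None)

rel_err = lambda b: np.mean(np.abs(X @ b - y) / y)    # MRAE-style evaluation
print(f"mean relative error: MSE fit {rel_err(b_mse):.4f}, RELS fit {rel_err(b_rels):.4f}")
```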

