Constructive Approximation
Latest Publications


TOTAL DOCUMENTS: 1275 (last five years: 105)

H-INDEX: 50 (last five years: 2)

Published by Springer-Verlag

ISSN: 1432-0940, 0176-4276

Author(s): B. Eichinger, P. Yuditskii

Abstract: The standard Remez inequality gives an upper estimate of the values of polynomials on $$[-1,1]$$ if they are bounded by 1 on a subset of $$[-1,1]$$ of fixed Lebesgue measure. The extremal solution is given by the rescaled Chebyshev polynomials for one interval. Andrievskii asked about the maximal value of a polynomial at a fixed point if it is again bounded by 1 on a set of fixed size. We show that the extremal polynomials are either Chebyshev (one interval) or Akhiezer polynomials (two intervals) and prove Totik–Widom bounds for the extremal value, thereby providing a complete asymptotic solution to the Andrievskii problem.
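For reference, the classical Remez bound mentioned in the abstract can be written as follows. This is the standard textbook formulation, not a statement taken from the paper itself:

```latex
% Classical Remez inequality (T_n is the Chebyshev polynomial of the
% first kind): if p is a polynomial of degree n, E \subseteq [-1,1] is
% measurable with |E| \ge 2 - s for some 0 < s < 2, and |p| \le 1 on E,
% then
\[
  \sup_{x \in [-1,1]} |p(x)| \;\le\; T_n\!\left(\frac{2+s}{2-s}\right),
\]
% with equality for E = [-1, 1-s], attained by the Chebyshev polynomial
% rescaled from [-1,1] to [-1, 1-s].
```

The extremal role of the rescaled Chebyshev polynomial here is exactly the "one interval" case referred to in the abstract.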


Author(s): Javad Mashreghi, Pierre-Olivier Parisé, Thomas Ransford

Author(s): Adrian Ebert, Peter Kritzer, Onyekachi Osisiogu, Tetiana Stepaniuk

Author(s): Lutz Kämmerer, Tino Ullrich, Toni Volkmer

Abstract: We construct a least squares approximation method for the recovery of complex-valued functions from a reproducing kernel Hilbert space on $$D \subset \mathbb {R}^d$$. The nodes are drawn at random for the whole class of functions, and the error is measured in $$L_2(D,\varrho _{D})$$. We prove worst-case recovery guarantees by explicitly controlling all the involved constants. This leads to new preasymptotic recovery bounds with high probability for the error of hyperbolic Fourier regression on multivariate data. In addition, we investigate its counterpart, hyperbolic wavelet regression, also based on least squares, to recover non-periodic functions from random samples. Finally, we reconsider the analysis of a cubature method based on plain random points with optimal weights and reveal near-optimal worst-case error bounds with high probability. It turns out that this simple method can compete with the quasi-Monte Carlo methods in the literature which are based on lattices and digital nets.
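A toy illustration of least squares Fourier regression on random nodes, in the spirit of the abstract above. This is not the authors' method: the hyperbolic-cross frequency set is replaced here by a small full grid, the domain and target function are illustrative choices, and no error guarantee from the paper is reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative periodic target on D = [0, 1); its frequencies lie
# inside the basis chosen below, so exact recovery is expected.
def f(x):
    return np.sin(2 * np.pi * x) + 0.5 * np.cos(4 * np.pi * x)

# Nodes drawn i.i.d. uniformly at random (a stand-in for sampling
# w.r.t. the measure varrho_D in the abstract).
n = 200
x = rng.random(n)

# Fourier basis e^{2 pi i k x} for |k| <= K (a full grid rather than
# the hyperbolic-cross index set used in the paper).
K = 4
freqs = np.arange(-K, K + 1)
A = np.exp(2j * np.pi * np.outer(x, freqs))

# Least squares fit of the Fourier coefficients.
coef, *_ = np.linalg.lstsq(A, f(x).astype(complex), rcond=None)

# Evaluate the approximant on a fine grid and measure the error.
t = np.linspace(0.0, 1.0, 101)
approx = (np.exp(2j * np.pi * np.outer(t, freqs)) @ coef).real
err = np.max(np.abs(approx - f(t)))
```

Since `f` lies in the span of the chosen basis and there are far more nodes than coefficients, the residual error is at the level of numerical round-off.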


Author(s): Wolfgang Dahmen, Ronald A. DeVore, Philipp Grohs

Author(s): Stefan Kahler

Abstract: In the theory of orthogonal polynomials, as well as in its intersection with harmonic analysis, it is an important problem to decide whether a given orthogonal polynomial sequence $$(P_n(x))_{n\in \mathbb {N}_0}$$ satisfies nonnegative linearization of products, i.e., the product of any two $$P_m(x),P_n(x)$$ is a conical combination of the polynomials $$P_{|m-n|}(x),\ldots ,P_{m+n}(x)$$. Since the coefficients in the arising expansions are often of cumbersome structure or not explicitly available, such considerations are generally very nontrivial. Gasper (Can J Math 22:582–593, 1970) was able to determine the set V of all pairs $$(\alpha ,\beta )\in (-1,\infty )^2$$ for which the corresponding Jacobi polynomials $$(R_n^{(\alpha ,\beta )}(x))_{n\in \mathbb {N}_0}$$, normalized by $$R_n^{(\alpha ,\beta )}(1)\equiv 1$$, satisfy nonnegative linearization of products. Szwarc (Inzell Lectures on Orthogonal Polynomials, Adv. Theory Spec. Funct. Orthogonal Polynomials, vol 2, Nova Sci. Publ., Hauppauge, NY, pp 103–139, 2005) asked to solve the analogous problem for the generalized Chebyshev polynomials $$(T_n^{(\alpha ,\beta )}(x))_{n\in \mathbb {N}_0}$$, which are the quadratic transformations of the Jacobi polynomials and are orthogonal w.r.t. the measure $$(1-x^2)^{\alpha }|x|^{2\beta +1}\chi _{(-1,1)}(x)\,\mathrm {d}x$$. In this paper, we give the solution and show that $$(T_n^{(\alpha ,\beta )}(x))_{n\in \mathbb {N}_0}$$ satisfies nonnegative linearization of products if and only if $$(\alpha ,\beta )\in V$$, so the generalized Chebyshev polynomials share this property with the Jacobi polynomials. Moreover, we reconsider the Jacobi polynomials themselves, simplify Gasper's original proof, and characterize strict positivity of the linearization coefficients. Our results can also be regarded as sharpenings of Gasper's.
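Spelled out, the nonnegative linearization property described in the abstract reads as follows (writing $g(m,n,k)$ for the linearization coefficients):

```latex
% Nonnegative linearization of products: for all m, n \in \mathbb{N}_0,
\[
  P_m(x)\, P_n(x) \;=\; \sum_{k=|m-n|}^{m+n} g(m,n,k)\, P_k(x),
  \qquad g(m,n,k) \ge 0 .
\]
```

"Conical combination" in the abstract means exactly that all coefficients $g(m,n,k)$ in this expansion are nonnegative.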

