Scattered Data Approximation
Recently Published Documents


TOTAL DOCUMENTS: 37 (five years: 3)
H-INDEX: 11 (five years: 1)

2021 ◽ Vol 31 (3) ◽ Author(s): Filip Tronarp, Simo Särkkä, Philipp Hennig

Abstract: There is a growing interest in probabilistic numerical solutions to ordinary differential equations. In this paper, the maximum a posteriori estimate is studied under the class of $\nu$ times differentiable linear time-invariant Gauss–Markov priors, and it can be computed with an iterated extended Kalman smoother. The maximum a posteriori estimate corresponds to an optimal interpolant in the reproducing kernel Hilbert space associated with the prior, which in the present case is equivalent to a Sobolev space of smoothness $\nu+1$. Subject to mild conditions on the vector field, convergence rates of the maximum a posteriori estimate are then obtained via methods from nonlinear analysis and scattered data approximation. These results closely resemble classical convergence results in the sense that a $\nu$ times differentiable prior process obtains a global order of $\nu$, which is demonstrated in numerical examples.
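The prior family in this abstract has a simple state-space form. The sketch below is a minimal illustration (Python), assuming the $\nu$-times integrated Wiener process as a representative linear time-invariant Gauss–Markov prior; the function name and parameters are illustrative, not the authors' code. It builds the discrete-time transition and process-noise matrices that a Kalman filter or smoother would iterate over.

```python
import numpy as np
from math import factorial

def iwp_transition(nu, h, sigma2=1.0):
    """Discrete-time transition A(h) and process-noise covariance Q(h) for a
    nu-times integrated Wiener process prior (illustrative sketch, not the
    paper's implementation). The state is (f, f', ..., f^(nu))."""
    d = nu + 1
    A = np.zeros((d, d))
    Q = np.zeros((d, d))
    for i in range(d):
        for j in range(i, d):
            A[i, j] = h ** (j - i) / factorial(j - i)
    for i in range(d):
        for j in range(d):
            p = 2 * nu + 1 - i - j
            Q[i, j] = sigma2 * h ** p / (p * factorial(nu - i) * factorial(nu - j))
    return A, Q

# Example: once-integrated Wiener process (nu = 1), step size h = 0.1.
A, Q = iwp_transition(nu=1, h=0.1)
print(A)  # [[1, h], [0, 1]]
print(Q)  # [[h^3/3, h^2/2], [h^2/2, h]] scaled by sigma2
```

Each filtering and smoothing step would propagate the state mean and covariance through A and Q and then condition on the ODE information, which is where the iterated extended Kalman smoother of the abstract enters.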


Mathematics ◽ 2020 ◽ Vol 8 (11) ◽ pp. 1923 ◽ Author(s): Sanpeng Zheng, Renzhong Feng, Aitong Huang

Radial basis functions (RBFs) are a class of approximation functions commonly used in interpolation and least squares fitting, and they are especially well suited to scattered data approximation and high-dimensional function approximation. The smoothness and approximation accuracy of an RBF are affected by its shape parameter. There has been some research on the shape parameter, but little on the optimal shape parameter for RBF-based least squares. This paper proposes a criterion for the optimal shape parameter of RBF-based least squares approximation and an algorithm that computes it. The method treats the shape parameter as an optimization variable of the least squares problem, so that the linear least squares problem becomes nonlinear. A dimensionality reduction is applied to the nonlinear least squares problem to simplify the objective function, and the reduced problem is then solved efficiently with derivative-free optimization. Numerical experiments indicate that the proposed method is efficient and reliable. Several kinds of RBFs are tested and compared, and RBF least squares with the optimal shape parameter is found to be far more accurate than polynomial least squares. The method is also applied successfully to the fitting of real data.
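A minimal sketch of this idea in Python follows; the Gaussian RBF, the synthetic data, and the bounded derivative-free search are illustrative assumptions, not the paper's exact criterion or algorithm. For a fixed shape parameter the coefficients are eliminated by linear least squares (the dimensionality reduction), leaving a one-dimensional problem in the shape parameter that a derivative-free search can handle.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)

# Synthetic scattered 1-D data (stands in for the paper's test or real data).
x = rng.uniform(-1.0, 1.0, 200)
y = np.sin(4 * x) + 0.05 * rng.standard_normal(x.size)
centers = np.linspace(-1.0, 1.0, 25)

def rbf_design(x, centers, eps):
    """Gaussian RBF design matrix: phi_j(x) = exp(-(eps * (x - c_j))^2)."""
    return np.exp(-(eps * (x[:, None] - centers[None, :])) ** 2)

def residual_norm(eps):
    """Least-squares residual for a fixed shape parameter eps.
    The linear coefficients are eliminated by lstsq, so only eps remains."""
    Phi = rbf_design(x, centers, eps)
    coeffs, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return np.linalg.norm(Phi @ coeffs - y)

# Derivative-free bounded 1-D search over the shape parameter
# (the paper's actual optimizer may differ).
result = minimize_scalar(residual_norm, bounds=(0.1, 50.0), method="bounded")
print("optimal shape parameter:", result.x, "residual:", result.fun)
```

With the shape parameter fixed at the optimized value, the fitted coefficients from the final least squares solve give the approximant to evaluate at new points.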


2018 ◽ Vol 8 (3) ◽ pp. 407-443 ◽ Author(s): Axel Flinth, Pierre Weiss

Abstract: We study the solutions of infinite dimensional inverse problems over Banach spaces. The regularizer is defined as the total variation of a linear mapping of the function to recover, while the data fitting term is a nearly arbitrary function. The first contribution describes the solution's structure: we show that under mild assumptions, there always exists an $m$-sparse solution, where $m$ is the number of linear measurements of the signal. Our second contribution concerns the computation of the solution. While most existing works first discretize the problem, we show that exact solutions of the infinite dimensional problem can be obtained by solving one or two consecutive finite dimensional convex programs, depending on the structure of the measurement functions. We finish with an application to scattered data approximation. These results extend recent advances in the understanding of total-variation regularized inverse problems.
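As a rough, self-contained illustration of the kind of finite-dimensional total-variation-regularized convex program involved, the Python sketch below recovers a piecewise-constant signal from a few linear measurements. The grid, measurement operator, regularization weight, and the use of cvxpy are made-up illustrative choices (a discretized surrogate), not the authors' exact pair of programs.

```python
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(1)

# Discretized illustration: recover a piecewise-constant signal u on a grid
# from m random linear measurements, with regularizer ||D u||_1, where D is
# the finite-difference operator (a discrete stand-in for total variation).
n, m = 200, 20
u_true = np.zeros(n)
u_true[60:120] = 1.0
u_true[140:170] = -0.5

A = rng.standard_normal((m, n)) / np.sqrt(m)   # m linear measurements
y = A @ u_true + 0.01 * rng.standard_normal(m)

D = np.diff(np.eye(n), axis=0)                 # finite-difference operator
lam = 0.05                                     # illustrative weight

u = cp.Variable(n)
objective = cp.Minimize(cp.sum_squares(A @ u - y) + lam * cp.norm1(D @ u))
cp.Problem(objective).solve()

print("relative recovery error:",
      np.linalg.norm(u.value - u_true) / np.linalg.norm(u_true))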


Sadhana ◽ 2018 ◽ Vol 43 (1) ◽ Author(s): Bibin Francis, Sanjay Viswanath, Muthuvel Arigovindan

2015 ◽ Vol 266 ◽ pp. 893-902 ◽ Author(s): Grand Roman Joldes, Habibullah Amin Chowdhury, Adam Wittek, Barry Doyle, Karol Miller
