Modified nonparametric kernel estimates of a regression function and their consistencies with rates

1987 ◽  
Vol 39 (3) ◽  
pp. 549-562
Author(s):  
Radhey S. Singh ◽  
Manzoor Ahmad


2012 ◽  
Vol 28 (5) ◽  
pp. 935-958 ◽  
Author(s):  
Degui Li ◽  
Zudi Lu ◽  
Oliver Linton

Local linear fitting is a popular nonparametric method in statistical and econometric modeling. Lu and Linton (2007, Econometric Theory 23, 37–70) established the pointwise asymptotic distribution for the local linear estimator of a nonparametric regression function under the condition of near epoch dependence. In this paper, we further investigate the uniform consistency of this estimator. The uniform strong and weak consistencies, with convergence rates, of the local linear fit are established under mild conditions. Furthermore, general results regarding uniform convergence rates for nonparametric kernel-based estimators are provided. The results of this paper will be of wide potential interest in semiparametric time series modeling.
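As a point of reference for the estimator discussed above, the following is a minimal sketch of local linear kernel regression in Python; the Gaussian kernel, the bandwidth, and the simulated data are illustrative assumptions, not the authors' implementation or setting.

```python
import numpy as np

def local_linear(x_eval, x, y, h):
    """Local linear kernel regression estimate of m(x) at the points x_eval.

    At each evaluation point, fit a weighted least-squares line with kernel
    weights K((x - x0)/h) and return its intercept as the estimate of m(x0).
    """
    est = np.empty(len(x_eval), dtype=float)
    for i, x0 in enumerate(x_eval):
        u = (x - x0) / h
        w = np.exp(-0.5 * u ** 2)                 # Gaussian kernel weights (illustrative choice)
        X = np.column_stack([np.ones_like(x), x - x0])
        W = np.diag(w)
        beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
        est[i] = beta[0]                          # intercept = local linear estimate at x0
    return est

# Illustrative simulated data (not from the paper)
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 1, 200))
y = np.sin(2 * np.pi * x) + 0.2 * rng.standard_normal(200)
grid = np.linspace(0.05, 0.95, 50)
m_hat = local_linear(grid, x, y, h=0.08)
```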


2009 ◽  
Vol 25 (6) ◽  
pp. 1716-1733 ◽  
Author(s):  
P.M. Robinson

The central limit theorem for nonparametric kernel estimates of a smooth trend, with linearly generated errors, indicates asymptotic independence and homoskedasticity across fixed points, irrespective of whether disturbances have short memory, long memory, or antipersistence. However, the asymptotic variance depends on the kernel function in a way that varies across these three circumstances, and in the latter two it involves a double integral that cannot necessarily be evaluated in closed form. For a particular class of kernels, we obtain analytic formulas. We discuss extensions to more general settings, including ones involving possible cross-sectional or spatial dependence.
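For orientation only, below is a toy sketch of the kind of kernel trend estimate analysed above, evaluated at fixed points of a noisy time series; the Epanechnikov kernel, the normalised (Nadaraya–Watson) weights, and the AR(1) short-memory errors are illustrative assumptions rather than the paper's setting.

```python
import numpy as np

def kernel_trend(t_eval, y, h):
    """Kernel estimate of a smooth trend from a series y observed at t = 1..n.

    Uses normalised Epanechnikov kernel weights at rescaled times t/n; this is
    an illustrative choice, not the estimator class studied in the paper.
    """
    n = len(y)
    t = np.arange(1, n + 1) / n
    est = []
    for t0 in t_eval:
        u = (t - t0) / h
        k = 0.75 * np.maximum(1 - u ** 2, 0.0)    # Epanechnikov kernel
        est.append(np.sum(k * y) / np.sum(k))      # normalised kernel weights
    return np.array(est)

# Illustrative smooth trend plus AR(1) (short-memory) errors
rng = np.random.default_rng(1)
n = 500
trend = np.sin(2 * np.pi * np.arange(1, n + 1) / n)
e = np.zeros(n)
for i in range(1, n):
    e[i] = 0.5 * e[i - 1] + rng.standard_normal()
y = trend + 0.3 * e
print(kernel_trend(np.array([0.25, 0.5, 0.75]), y, h=0.05))
```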


2010 ◽  
Vol 2010 ◽  
pp. 1-9 ◽  
Author(s):  
Xingfang Huang ◽  
Peihua Qiu

In many applications, observed signals are contaminated by both random noise and blur. This paper proposes a blind deconvolution procedure for estimating a regression function with possible jumps preserved, by removing both noise and blur when recovering the signals. Our procedure is based on three local linear kernel estimates of the regression function, constructed from observations in a left-side, a right-side, and a two-sided neighborhood of a given point, respectively. The estimated function at the given point is then defined by the one of the three estimates with the smallest weighted residual sum of squares. To better remove the noise and blur, this estimate can also be updated iteratively. Performance of this procedure is investigated by both simulation and real data examples, from which it can be seen that our procedure performs well in various cases.
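A rough sketch of the selection step described above (one-sided versus two-sided local linear fits chosen by weighted residual sum of squares), ignoring the blur-removal component; the Epanechnikov weights, the bandwidth, and the small ridge stabiliser are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def fit_side(x, y, x0, h, side):
    """Local linear fit in a one- or two-sided neighborhood of x0.

    Returns (estimate at x0, weighted residual sum of squares).
    side is 'left', 'right', or 'both'.
    """
    u = (x - x0) / h
    w = 0.75 * np.maximum(1 - u ** 2, 0.0)         # Epanechnikov weights (illustrative)
    if side == 'left':
        w = np.where(x <= x0, w, 0.0)
    elif side == 'right':
        w = np.where(x >= x0, w, 0.0)
    X = np.column_stack([np.ones_like(x), x - x0])
    W = np.diag(w)
    # Tiny ridge term keeps the one-sided fits well conditioned near the boundary
    beta = np.linalg.solve(X.T @ W @ X + 1e-10 * np.eye(2), X.T @ W @ y)
    resid = y - X @ beta
    wrss = np.sum(w * resid ** 2) / max(np.sum(w), 1e-10)
    return beta[0], wrss

def jump_preserving_estimate(x, y, grid, h):
    """At each grid point, keep the side estimate with the smallest weighted RSS."""
    out = []
    for x0 in grid:
        fits = [fit_side(x, y, x0, h, s) for s in ('left', 'right', 'both')]
        out.append(min(fits, key=lambda f: f[1])[0])
    return np.array(out)
```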


Author(s):  
Jitka Poměnková

Kernel smoothers are among the most popular nonparametric function estimates. They provide a simple way of finding structure in data, and kernel smoothing applies very naturally to the regression model. In the context of kernel estimates of a regression function, the choice of kernel can be investigated from several points of view. The main idea of this paper is to present the construction of the optimal kernel and the edge-optimal (boundary) kernel by means of Gegenbauer and Legendre polynomials.
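As a small illustration of writing a polynomial kernel in an orthogonal-polynomial basis, the sketch below expresses the Epanechnikov kernel, a classical optimal kernel of order (0, 2), in the Legendre basis and checks its moments; this is only a toy example of the general idea, not the paper's construction of optimal and boundary kernels.

```python
import numpy as np
from numpy.polynomial import legendre as L

# Epanechnikov kernel K(x) = 3/4 (1 - x^2) on [-1, 1].
# Algebraic identity used here: K = (1/2) P_0 - (1/2) P_2 in Legendre polynomials.
coeffs = np.array([0.5, 0.0, -0.5])        # coefficients of P_0, P_1, P_2

x = np.linspace(-1, 1, 5)
k_legendre = L.legval(x, coeffs)           # evaluate the Legendre expansion
k_direct = 0.75 * (1 - x ** 2)             # direct formula
assert np.allclose(k_legendre, k_direct)

# Moment checks on [-1, 1]: the kernel integrates to 1 and has zero first moment
grid = np.linspace(-1, 1, 2001)
k = 0.75 * (1 - grid ** 2)
dx = grid[1] - grid[0]
print(np.sum(k) * dx, np.sum(grid * k) * dx)   # approximately 1 and 0
```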


2019 ◽  
Vol 46 (1) ◽  
pp. 53-83
Author(s):  
Amina Angelika Bouchentouf ◽  
Abbes Rabhi ◽  
Aboubacar Traore

Sensors ◽  
2019 ◽  
Vol 19 (23) ◽  
pp. 5092
Author(s):  
Matthieu Saumard ◽  
Marwa Elbouz ◽  
Michaël Aron ◽  
Ayman Alfalou ◽  
Christian Brosseau

Optical correlation has a rich history in image recognition applications that draw on a database. In practice, it is simple to implement optically using two lenses or numerically using two Fourier transforms. Although correlation is a reliable method for image recognition, decision making can be compromised depending on the location, height, and shape of the correlation peak within the correlation plane. Correlation is also very sensitive to image rotation and scale. To overcome these issues, in this study, we propose a method of nonparametric modelling of the correlation plane. Our method is based on a kernel estimate of the regression function used to classify the individual images in the correlation plane. The basic idea is to improve the decision by taking into account the shape and distribution of the energy in the correlation plane. The method relies on the Hausdorff distance between the target correlation plane (of the image to recognize) and the correlation planes computed from the database images. Our method is tested on a face recognition application using the Pointing Head Pose Image Database (PHPID). Overall, the results demonstrate good performance of this method compared with competitive methods, in terms of detection rate and a very low false-alarm rate.
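A crude sketch of two ingredients mentioned above: a correlation plane computed with two Fourier transforms, and a Hausdorff distance between planes. Reducing each plane to the coordinates of its high-energy pixels via a quantile threshold is an illustrative simplification, not the paper's kernel-regression modelling of the plane, and the random arrays stand in for database images.

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def correlation_plane(img, ref):
    """Correlation plane via two Fourier transforms (matched-filter style)."""
    C = np.fft.fft2(img) * np.conj(np.fft.fft2(ref, s=img.shape))
    return np.abs(np.fft.ifft2(C))

def plane_to_points(plane, q=0.99):
    """Represent a plane by the coordinates of its high-energy pixels (illustrative)."""
    thr = np.quantile(plane, q)
    return np.argwhere(plane >= thr).astype(float)

def hausdorff(plane_a, plane_b):
    """Symmetric Hausdorff distance between two correlation planes (as point sets)."""
    a, b = plane_to_points(plane_a), plane_to_points(plane_b)
    return max(directed_hausdorff(a, b)[0], directed_hausdorff(b, a)[0])

# Toy usage with random 'images' as stand-ins for a target and two database entries
rng = np.random.default_rng(2)
target = rng.random((64, 64))
ref1, ref2 = rng.random((64, 64)), rng.random((64, 64))
p_target = correlation_plane(target, target)
scores = [hausdorff(p_target, correlation_plane(target, r)) for r in (ref1, ref2)]
print(scores)   # smaller distance -> correlation plane closer to the target's
```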

