NONPARAMETRIC COINTEGRATING REGRESSION WITH NNH ERRORS

2012 ◽  
Vol 29 (1) ◽  
pp. 1-27 ◽  
Author(s):  
Qiying Wang ◽  
Ying Xiang Rachel Wang

This paper studies a nonlinear cointegrating regression model with nonlinear nonstationary heteroskedastic (NNH) error processes. We establish uniform consistency for the conventional kernel estimate of the unknown regression function and develop a two-stage approach for estimating the heterogeneity generating function.

2016 ◽  
Vol 5 (2) ◽  
pp. 29
Author(s):  
Mounir ARFI

We give the rate of uniform convergence for the kernel estimate of the regression function over a sequence of compact sets that increases to $\mathbb{R}^{d}$ as $n$ tends to infinity, when the observed process is $\varphi$-mixing. The estimator used for the regression function is the kernel estimator proposed by Nadaraya and Watson (1964).
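As context for the Nadaraya–Watson estimator named above, here is a minimal sketch in NumPy (the Gaussian kernel, bandwidth, and test function are illustrative choices, not from the paper):

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x_eval, h):
    """Nadaraya-Watson kernel regression with a Gaussian kernel:
    m_hat(x) = sum_i K((x - x_i)/h) y_i / sum_i K((x - x_i)/h)."""
    # Pairwise scaled distances between evaluation and training points.
    u = (x_eval[:, None] - x_train[None, :]) / h
    w = np.exp(-0.5 * u ** 2)            # Gaussian kernel weights
    return (w * y_train).sum(axis=1) / w.sum(axis=1)

# Recover a smooth function from noisy observations.
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(-3, 3, 500))
y = np.sin(x) + rng.normal(scale=0.2, size=x.size)
m_hat = nadaraya_watson(x, y, x, h=0.3)
```

The uniform-convergence results above concern how the worst-case error of such an estimate shrinks over growing compact sets as the sample size grows.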


2018 ◽  
Vol 15 (2) ◽  
pp. 20 ◽  
Author(s):  
Budi Lestari

Abstract: A bi-response nonparametric regression model describes the relationship between two response variables and one or more predictor variables, where the first and second responses are correlated with each other. In this paper, we estimate the regression functions of the bi-response nonparametric regression model using two different estimation techniques, namely smoothing spline and kernel. The study shows that, under both techniques, the resulting regression function estimator is linear in the observations. Moreover, the two estimators differ mathematically only in their smoother matrices. Keywords: kernel estimator, smoothing spline estimator, regression function, bi-response nonparametric regression model.


2012 ◽  
Vol 28 (5) ◽  
pp. 935-958 ◽  
Author(s):  
Degui Li ◽  
Zudi Lu ◽  
Oliver Linton

Local linear fitting is a popular nonparametric method in statistical and econometric modeling. Lu and Linton (2007, Econometric Theory 23, 37–70) established the pointwise asymptotic distribution for the local linear estimator of a nonparametric regression function under the condition of near epoch dependence. In this paper, we further investigate the uniform consistency of this estimator. The uniform strong and weak consistencies with convergence rates for the local linear fitting are established under mild conditions. Furthermore, general results regarding uniform convergence rates for nonparametric kernel-based estimators are provided. The results of this paper will be of wide potential interest in time series semiparametric modeling.
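A minimal sketch of the local linear estimator itself (Gaussian weights, bandwidth, and data are illustrative; the paper concerns this estimator's asymptotics, not any particular implementation):

```python
import numpy as np

def local_linear(x_train, y_train, x_eval, h):
    """Local linear regression: at each point x0, fit a weighted
    least-squares line with Gaussian kernel weights centered at x0
    and return its intercept as the fitted value."""
    est = np.empty(x_eval.size)
    for j, x0 in enumerate(x_eval):
        w = np.exp(-0.5 * ((x_train - x0) / h) ** 2)
        X = np.column_stack([np.ones_like(x_train), x_train - x0])
        # Solve the weighted normal equations (X' W X) beta = X' W y.
        A = X.T * w
        beta = np.linalg.solve(A @ X, A @ y_train)
        est[j] = beta[0]               # fitted value at x0
    return est

rng = np.random.default_rng(2)
x = np.sort(rng.uniform(0.0, 1.0, 400))
y = x ** 2 + rng.normal(scale=0.1, size=x.size)
fit = local_linear(x, y, x, h=0.08)
```

Compared with the Nadaraya–Watson estimator, the local linear fit removes the first-order bias at the boundary of the design, which is one reason for its popularity.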


2014 ◽  
Vol 2014 ◽  
pp. 1-7
Author(s):  
Christophe Chesneau

We investigate the estimation of a multiplicative separable regression function from a bidimensional nonparametric regression model with random design. We present a general estimator for this problem and study its mean integrated squared error (MISE) properties. A wavelet version of this estimator is developed. In some situations, we prove that it attains the standard unidimensional rate of convergence under the MISE over Besov balls.


Author(s):  
Loc Nguyen

The expectation maximization (EM) algorithm is a powerful mathematical tool for estimating statistical parameters when the data sample contains a hidden part and an observed part. EM is applied to learn a finite mixture model, in which the overall distribution of the observed variable is a weighted sum of component distributions. The coverage ratio of each component distribution is specified by the probability of the hidden variable. One application of mixture models is soft clustering, in which each cluster is modeled by the hidden variable: a data point can be assigned to more than one cluster, and the degree of each assignment is represented by the probability of the hidden variable. However, this probability is simplified to a single parameter in the traditional mixture model, which can cause a loss of valuable information. Therefore, in this research I propose a so-called conditional mixture model (CMM) in which the probability of the hidden variable is modeled as a full probability density function (PDF) with its own parameters. CMM aims to extend the mixture model. I also propose an application of CMM called the adaptive regressive model (ARM). A traditional regression model is effective when the data sample is scattered evenly. If the data points are grouped into clusters, a regression model tries to learn a single unified regression function that passes through all data points. Such a unified function is clearly not effective for evaluating the response variable on grouped data points. The term "adaptive" in ARM means that ARM solves this ineffectiveness by first selecting the best cluster of data points and then evaluating the response variable within that best cluster. In other words, ARM reduces the estimation space of the regression model so as to gain higher accuracy in calculation.
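For reference, the standard EM recursion for the finite mixture model that CMM extends can be sketched as follows (a two-component 1-D Gaussian mixture; this illustrates ordinary EM, not the proposed CMM, and the initialization is an illustrative choice):

```python
import numpy as np

def em_gmm_1d(data, n_iter=200):
    """Standard EM for a two-component 1-D Gaussian mixture.
    E-step: responsibilities (posterior of the hidden cluster variable).
    M-step: weighted maximum-likelihood updates of weights, means, variances."""
    pi = np.array([0.5, 0.5])
    mu = np.array([data.min(), data.max()])    # spread-out initialization
    var = np.array([data.var(), data.var()])
    for _ in range(n_iter):
        # E-step: r[i, k] = P(cluster k | x_i) -- the soft assignment.
        dens = (np.exp(-0.5 * (data[:, None] - mu) ** 2 / var)
                / np.sqrt(2 * np.pi * var))
        r = pi * dens
        r /= r.sum(axis=1, keepdims=True)
        # M-step: weighted ML updates.
        nk = r.sum(axis=0)
        pi = nk / data.size
        mu = (r * data[:, None]).sum(axis=0) / nk
        var = (r * (data[:, None] - mu) ** 2).sum(axis=0) / nk
    return pi, mu, var

rng = np.random.default_rng(3)
data = np.concatenate([rng.normal(-2, 0.5, 300), rng.normal(3, 1.0, 700)])
pi, mu, var = em_gmm_1d(data)
```

In this standard model the mixing weights `pi` are exactly the simplified scalar probabilities of the hidden variable that the proposed CMM replaces with a full density.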


2021 ◽  
Vol 19 (1) ◽  
pp. 1056-1068
Author(s):  
Yingxia Chen

Abstract In this paper, we consider the regression model with fixed design: $Y_i = g(x_i) + \varepsilon_i$, $1 \le i \le n$, where $\{x_i\}$ are the nonrandom design points, $\{\varepsilon_i\}$ is a sequence of martingale differences, and $g$ is an unknown function. A nonparametric estimator $g_n(x)$ of $g(x)$ is introduced and its strong convergence properties are established.
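For an equally spaced fixed design, one common kernel estimator $g_n(x)$ is the Priestley–Chao form sketched below (the abstract does not specify which estimator the paper uses; this is a standard choice for illustration, with a Gaussian kernel and illustrative constants):

```python
import numpy as np

def priestley_chao(x, y, x_eval, h):
    """Priestley-Chao kernel estimate for a fixed equally spaced design:
    g_n(t) = (1/h) * sum_i (x_i - x_{i-1}) * K((t - x_i)/h) * Y_i."""
    dx = np.diff(x, prepend=2 * x[0] - x[1])        # spacings x_i - x_{i-1}
    u = (x_eval[:, None] - x[None, :]) / h
    K = np.exp(-0.5 * u ** 2) / np.sqrt(2 * np.pi)  # Gaussian kernel
    return (K * (dx * y)).sum(axis=1) / h

n = 500
x = np.linspace(0.0, 1.0, n)
rng = np.random.default_rng(4)
y = np.exp(-x) + rng.normal(scale=0.1, size=n)      # i.i.d. errors for the demo
g_hat = priestley_chao(x, y, x, h=0.05)
```

The paper's contribution is establishing strong convergence of such an estimator when the errors form a martingale difference sequence rather than the i.i.d. noise used in this demonstration.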


2001 ◽  
Vol 26 (4) ◽  
pp. 443-468 ◽  
Author(s):  
Yeow Meng Thum ◽  
Suman K. Bhattacharya

A substantial literature on switches in linear regression functions considers situations in which the regression function is discontinuous at an unknown value of the regressor, $X_k$, where $k$ is the so-called unknown "change point." The regression model is thus a two-phase composite of $y_i \sim N(\beta_{01} + \beta_{11}x_i, \sigma_1^2)$, $i = 1, 2, \ldots, k$, and $y_i \sim N(\beta_{02} + \beta_{12}x_i, \sigma_2^2)$, $i = k+1, k+2, \ldots, n$. Solutions to this single-series problem are considerably more complex when we consider a wrinkle frequently encountered in evaluation studies of system interventions, in that a system typically comprises multiple members ($j = 1, 2, \ldots, m$) and that members of the system cannot all be expected to change synchronously. For example, schools differ not only in whether a program, implemented system-wide, improves their students' test scores, but depending on the resources already in place, schools may also differ in when they start to show effects of the program. If ignored, heterogeneity among schools in when the program takes initial effect undermines any program evaluation that assumes that change points are known and that they are the same for all schools. To describe individual behavior within a system better, and using a sample of longitudinal test scores from a large urban school system, we consider hierarchical Bayes estimation of a multilevel linear regression model in which each individual regression slope of test score on time switches at some unknown point in time, $k_j$. We further explore additional results employing models that accommodate case weights and shorter time series.
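For the single-series two-phase model, the unknown change point $k$ is often estimated by profiling over candidate splits. A minimal least-squares sketch (simulated data; all constants are illustrative, and the paper's hierarchical Bayes approach is far richer than this):

```python
import numpy as np

def fit_change_point(x, y, min_seg=3):
    """Search for a single unknown change point k: fit a separate
    least-squares line to y[:k] and y[k:] for each candidate k and
    keep the split minimizing the total residual sum of squares."""
    def rss(xs, ys):
        X = np.column_stack([np.ones_like(xs), xs])
        beta, *_ = np.linalg.lstsq(X, ys, rcond=None)
        r = ys - X @ beta
        return r @ r

    best_k, best_rss = None, np.inf
    for k in range(min_seg, x.size - min_seg):
        total = rss(x[:k], y[:k]) + rss(x[k:], y[k:])
        if total < best_rss:
            best_k, best_rss = k, total
    return best_k

# Two-phase data: regime switches at observation index 35.
rng = np.random.default_rng(5)
x = np.arange(60, dtype=float)
y = np.where(x < 35, 1.0 + 0.5 * x, -5.0 + 0.8 * x)
y += rng.normal(scale=0.5, size=x.size)
k_hat = fit_change_point(x, y)
```

The multilevel setting in the paper replaces this single $k$ with member-specific change points $k_j$ estimated jointly across the system.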


Author(s):  
Jitka Poměnková

Kernel smoothers belong to the most popular nonparametric functional estimates. They provide a simple way of finding structure in data. Kernel smoothing can be applied very well to the regression model. In the context of kernel estimates of a regression function, the choice of kernel can be investigated from different points of view. The main idea of this paper is to present the construction of the optimal kernel and the edge-optimal kernel by means of Gegenbauer and Legendre polynomials.
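As a concrete illustration of writing kernels in an orthogonal-polynomial basis, the Epanechnikov kernel, the classical minimum-variance second-order kernel, is $\tfrac{1}{2}P_0(u) - \tfrac{1}{2}P_2(u)$ in the Legendre basis on $[-1,1]$ (a textbook example, not the paper's construction):

```python
import numpy as np
from numpy.polynomial import legendre

# Epanechnikov kernel in the Legendre basis on [-1, 1]:
# K(u) = (3/4)(1 - u^2) = (1/2) P0(u) - (1/2) P2(u).
coeffs = [0.5, 0.0, -0.5]            # coefficients of P0, P1, P2
u = np.linspace(-1.0, 1.0, 2001)
K = legendre.legval(u, coeffs)

# Kernel properties: matches the closed form, integrates to 1,
# and has zero first moment (i.e., it is a second-order kernel).
closed_form = 0.75 * (1.0 - u ** 2)
du = u[1] - u[0]
mass = K.sum() * du                  # simple quadrature; K vanishes at +-1
first_moment = (u * K).sum() * du
```

Expressing kernels in such a basis makes moment conditions linear constraints on the coefficients, which is what makes polynomial-based constructions of optimal kernels convenient.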


2020 ◽  
Vol 12 (6) ◽  
pp. 74
Author(s):  
Kouame Florent Kouakou ◽  
Armel Fabrice Evrard Yode

We study the problem of multivariate estimation in the nonparametric regression model with random design. We assume that the regression function to be estimated possesses a partially linear structure, where the parametric and nonparametric components are both unknown. Based on the Goldenshluger–Lepski methodology, we propose an estimation procedure that adapts to the smoothness of the nonparametric component by selecting from a family of specific kernel estimators. We establish a global oracle inequality (under the $L_p$-norm, $1 \le p < \infty$) and examine its performance over anisotropic Hölder spaces.

