New Insights Into Learning With Correntropy-Based Regression

2021 · Vol 33 (1) · pp. 157-173
Author(s):  
Yunlong Feng

Stemming from information-theoretic learning, the correntropy criterion and its applications to machine learning tasks have been studied extensively. Its application to regression problems leads to a robustness-enhanced regression paradigm: correntropy-based regression. Alongside a great variety of successful real-world applications, its theoretical properties have recently been investigated in a series of studies from a statistical learning viewpoint. The resulting big picture is that correntropy-based regression robustly regresses toward the conditional mode function or the conditional mean function under certain conditions. Continuing this line of work, in this study we report some new insights into this problem. First, we show that under the additive-noise regression model, this regression paradigm can be deduced from minimum distance estimation, implying that the resulting estimator is essentially a minimum distance estimator and thus possesses robustness properties. Second, we show that the paradigm in fact provides a unified approach to regression in that it approaches the conditional mean, the conditional mode, and the conditional median functions under certain conditions. Third, we present new results on learning the conditional mean function by developing error bounds and exponential convergence rates under conditional ([Formula: see text])-moment assumptions. The saturation effect on the established convergence rates, which was observed under ([Formula: see text])-moment assumptions, still occurs, indicating the inherent bias of the regression estimator. These insights deepen our understanding of correntropy-based regression, help cement its theoretical framework, and enable us to investigate learning schemes induced by general bounded nonconvex loss functions.
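To make the robustness mechanism concrete: the correntropy-induced loss is a bounded, nonconvex (Welsch-type) function of the residual, so gross outliers contribute essentially nothing to the gradient. The following is a minimal sketch under an additive-noise location model; the function names, step size, and data are illustrative assumptions, not the paper's algorithm or data.

```python
import numpy as np

def correntropy_loss(r, sigma=1.0):
    # Correntropy-induced (Welsch-type) loss: bounded and nonconvex,
    # which is what gives the resulting estimator its robustness.
    return sigma**2 * (1.0 - np.exp(-(r / sigma) ** 2))

def fit_location(y, sigma=1.0, lr=0.1, n_iter=500):
    # Minimize the empirical correntropy risk for a location model
    # y_i = mu + eps_i by plain gradient descent (illustrative only).
    mu = np.median(y)  # robust starting point
    for _ in range(n_iter):
        r = y - mu
        # d/dmu of correntropy_loss(y - mu); large residuals are
        # exponentially down-weighted, so outliers barely move mu.
        grad = np.mean(-2.0 * r * np.exp(-(r / sigma) ** 2))
        mu -= lr * grad
    return mu

rng = np.random.default_rng(0)
# 95 clean points around 2 plus 5 gross outliers around 50
y = np.concatenate([rng.normal(2.0, 0.3, 95), rng.normal(50.0, 1.0, 5)])
print(fit_location(y))  # close to 2, the clean-data mean
print(y.mean())         # dragged toward the outliers
```

The sample mean is pulled far from 2 by the contaminated points, while the correntropy estimate stays near the clean-data center, illustrating the minimum-distance-style robustness discussed in the abstract.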

2021
Author(s):  
Likai Chen ◽  
Ekaterina Smetanina ◽  
Wei Biao Wu

Abstract This paper presents a multiplicative nonstationary nonparametric regression model which allows for a broad class of nonstationary processes. We propose a three-step estimation procedure to uncover the conditional mean function and establish uniform convergence rates and asymptotic normality of our estimators. The new model can also be seen as a dimension-reduction technique for a general two-dimensional time-varying nonparametric regression model, which is especially useful in small samples and for estimating explicitly multiplicative structural models. We consider two applications: estimating a pricing equation for the US aggregate economy to model consumption growth, and estimating the shape of the monthly risk premium for S&P 500 Index data.
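The paper's three-step procedure is not reproduced here; as a hedged illustration of the basic building block, the sketch below uses a standard Nadaraya-Watson kernel smoother to recover a conditional mean function nonparametrically. The data-generating function, bandwidth, and sample size are illustrative assumptions.

```python
import numpy as np

def nw_estimate(x_grid, x, y, h=0.05):
    # Nadaraya-Watson kernel estimate of the conditional mean
    # m(x) = E[y | x], with a Gaussian kernel and bandwidth h.
    K = np.exp(-0.5 * ((x_grid[:, None] - x[None, :]) / h) ** 2)
    return (K @ y) / K.sum(axis=1)

rng = np.random.default_rng(1)
x = rng.uniform(0, 1, 400)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.1, 400)  # true m(x) = sin(2*pi*x)

grid = np.array([0.25, 0.75])
print(nw_estimate(grid, x, y))  # roughly [1, -1] = [sin(pi/2), sin(3*pi/2)]
```

The bandwidth h governs the usual bias-variance trade-off; the paper's contribution lies in handling nonstationarity and the multiplicative structure, which this generic smoother does not address.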


2020 · pp. 1-33
Author(s):  
Abdelhakim Aknouche ◽  
Christian Francq

We consider a positive-valued time series whose conditional distribution has a time-varying mean, which may depend on exogenous variables. The main applications concern count or duration data. Under a contraction condition on the mean function, it is shown that stationarity and ergodicity hold when the mean and stochastic orders of the conditional distribution are the same. The latter condition holds for the exponential family parametrized by the mean, but also for many other distributions. We also provide conditions for the existence of marginal moments and for the geometric decay of the beta-mixing coefficients. We give conditions for consistency and asymptotic normality of the Exponential Quasi-Maximum Likelihood Estimator of the conditional mean parameters. Simulation experiments and illustrations on series of stock market volumes and greenhouse gas concentrations show that the multiplicative-error form of standard duration models deserves to be relaxed, as allowed in this paper.
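A minimal sketch of the exponential QMLE idea for one simple member of this model class, an INARCH(1)-type count series with linear conditional mean. The model, parameter values, and crude grid-search optimizer are illustrative assumptions; the key point is that only the conditional mean needs to be correctly specified, not the full distribution.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulate a Poisson count series with linear conditional mean
# lambda_t = omega + alpha * y_{t-1} (an INARCH(1)-type example;
# the paper's framework is considerably more general).
omega_true, alpha_true, n = 1.0, 0.5, 2000
y = np.zeros(n)
y[0] = rng.poisson(omega_true / (1 - alpha_true))  # start near the stationary mean
for t in range(1, n):
    y[t] = rng.poisson(omega_true + alpha_true * y[t - 1])

def exp_qll(omega, alpha, y):
    # Exponential quasi-log-likelihood: treats y_t | past as if exponential
    # with mean lambda_t; consistent as long as the mean is well specified.
    lam = omega + alpha * y[:-1]
    return np.sum(-np.log(lam) - y[1:] / lam)

# Crude grid search for the EQMLE (a sketch; in practice one would
# use a numerical optimizer).
grid_w = np.linspace(0.5, 1.5, 41)
grid_a = np.linspace(0.2, 0.8, 61)
_, w_hat, a_hat = max((exp_qll(w, a, y), w, a) for w in grid_w for a in grid_a)
print(w_hat, a_hat)  # near the true values (1.0, 0.5)
```

Here the data are Poisson, not exponential, yet the exponential quasi-likelihood still recovers the conditional mean parameters, which is the consistency phenomenon the abstract refers to.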

