Universal approximation theorem for uninorm-based fuzzy systems modeling

2003 ◽  
Vol 140 (2) ◽  
pp. 331-339 ◽  
Author(s):  
Ronald R. Yager ◽  
Vladik Kreinovich

A Universal Approximation Theorem for Mixture-of-Experts Models
2016 ◽  
Vol 28 (12) ◽  
pp. 2585-2593 ◽  
Author(s):  
Hien D. Nguyen ◽  
Luke R. Lloyd-Jones ◽  
Geoffrey J. McLachlan

The mixture-of-experts (MoE) model is a popular neural network architecture for nonlinear regression and classification. The class of MoE mean functions is known to uniformly approximate any unknown target function, under the assumptions that the target function belongs to a sufficiently differentiable Sobolev space and that the domain of estimation is the compact unit hypercube. We provide an alternative result, which shows that the class of MoE mean functions is dense in the class of all continuous functions over arbitrary compact domains of estimation. Our result can be viewed as a universal approximation theorem for MoE models. The theorem allows MoE users to apply such models with confidence when data arise from nonlinear and nondifferentiable generative processes.
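To make the density result concrete, the following minimal numpy sketch (our illustration, not the authors' construction) builds an MoE mean function with softmax gating over two linear experts and measures how well it approximates f(x) = |x|, a target that is continuous but not differentiable at 0 and so falls outside the Sobolev-space assumption. All parameter values are invented for the example.

```python
import numpy as np

# Minimal MoE mean function: softmax gates mixing linear expert means.
def moe_mean(x, a, W, b):
    """x: (n,) inputs; a: (K,) gate slopes; W, b: (K,) expert slopes/intercepts."""
    logits = np.outer(x, a)                       # (n, K) gating logits
    logits -= logits.max(axis=1, keepdims=True)   # stabilize the softmax
    g = np.exp(logits)
    g /= g.sum(axis=1, keepdims=True)             # softmax gating weights
    experts = np.outer(x, W) + b                  # linear expert means, (n, K)
    return (g * experts).sum(axis=1)              # gated mixture mean

x = np.linspace(-1.0, 1.0, 2001)
target = np.abs(x)   # continuous, nondifferentiable at 0

# Two experts: slope -1 for the left branch, +1 for the right branch.
for steep in (5.0, 20.0, 80.0):
    a = np.array([-steep, steep])   # gate steepness controls the switch at 0
    W = np.array([-1.0, 1.0])
    b = np.array([0.0, 0.0])
    err = np.max(np.abs(moe_mean(x, a, W, b) - target))
    print(f"gate steepness {steep:5.1f}: sup-norm error = {err:.4f}")
```

With this choice of experts, the MoE mean reduces to x·tanh(s·x) for gate steepness s, whose sup-norm distance from |x| scales like 1/s, so the printed error shrinks as the gate sharpens, in line with the uniform-approximation claim.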


Author(s):  
Young Hoon Joo ◽  
Guanrong Chen

The basic objective of system modeling is to establish an input-output mapping that satisfactorily describes the system's behavior, using the available input-output data together with physical or empirical knowledge about the structure of the unknown system.
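As a generic sketch of that objective (ours, not Joo and Chen's method), the snippet below identifies an input-output mapping for a toy unknown system from observed data; choosing a smooth polynomial basis plays the role of empirical structural knowledge. The system, basis, and parameter values are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the unknown system, observed only through input-output data.
def unknown_system(u):
    return np.sin(2.0 * u) + 0.3 * u

u = rng.uniform(-2.0, 2.0, size=200)                        # input data
y = unknown_system(u) + 0.05 * rng.standard_normal(u.size)  # noisy outputs

# Structural (empirical) knowledge: assume a smooth map, so pick a
# low-order polynomial basis and identify the mapping by least squares.
degree = 5
Phi = np.vander(u, degree + 1)                   # regressor (design) matrix
theta, *_ = np.linalg.lstsq(Phi, y, rcond=None)  # fitted coefficients

# The fitted input-output mapping can now stand in for the system.
u_new = np.linspace(-2.0, 2.0, 5)
y_hat = np.vander(u_new, degree + 1) @ theta
print(np.round(y_hat, 3))
```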

