A new classifier combination method based on TSK-type fuzzy system

Author(s):  
Liu Ming ◽  
Yuan Bao-zong ◽  
Feng Ze-shu ◽  
Chen Jiang-feng

2012 ◽  
pp. 243-252
Author(s):  
Lei Xu ◽  
Shun-ichi Amari

Expert combination is a classic strategy that has been widely used in various problem-solving tasks. A team of individuals with diverse and complementary skills tackles a task jointly, so that a performance better than any single individual could achieve is obtained by integrating the individuals' strengths. Starting in the late 1980s, studies in the handwritten character recognition literature have examined combining multiple classifiers. Also, from the early 1990s, efforts in the fields of neural networks and machine learning have been made under the names of ensemble learning and mixture of experts on how to jointly learn a mixture of experts (parametric models) and a combining strategy that integrates them in an optimal sense. This article aims at a general sketch of these two streams of studies, not only re-elaborating the essential tasks, basic ingredients, and typical combining rules, but also suggesting a general combination framework (in particular, a concise and more practical one-parameter-modulated special case called α-integration) that unifies a number of typical classifier combination rules and several mixture-based learning models, as well as the max rule and min rule used in the fuzzy systems literature.
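A minimal sketch of how such a one-parameter combiner can behave is given below. It assumes the α-mean form of Amari's α-integration; the function name, the uniform weights, and the example posteriors are illustrative assumptions, not material from the article.

```python
import numpy as np

def alpha_integrate(p, w=None, alpha=-1.0):
    """Combine positive classifier scores p (n_models x n_classes) with a weighted alpha-mean (assumed form)."""
    p = np.asarray(p, dtype=float)
    if w is None:
        w = np.full(p.shape[0], 1.0 / p.shape[0])   # uniform weights summing to 1
    w = np.asarray(w, dtype=float)[:, None]
    if alpha == 1.0:
        # alpha = 1: weighted geometric mean, i.e. the product rule (log-domain average)
        return np.exp(np.sum(w * np.log(p), axis=0))
    k = (1.0 - alpha) / 2.0                          # map scores through f(p) = p^k, average, invert
    return np.sum(w * p ** k, axis=0) ** (1.0 / k)

# Two classifiers' posterior estimates over three classes (made-up numbers)
p = [[0.7, 0.2, 0.1],
     [0.5, 0.3, 0.2]]
print(alpha_integrate(p, alpha=-1.0))    # arithmetic mean  -> sum rule
print(alpha_integrate(p, alpha=1.0))     # geometric mean   -> product rule
print(alpha_integrate(p, alpha=-200.0))  # alpha -> -inf    -> approaches the max rule
print(alpha_integrate(p, alpha=200.0))   # alpha -> +inf    -> approaches the min rule
```

Sweeping the single parameter α thus moves the combiner continuously between the sum, product, max, and min rules mentioned in the abstract.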


2006 ◽  
Vol 68 (3) ◽  
pp. 274-279 ◽  
Author(s):  
Akira TAKAHASHI ◽  
Naoya YAMAZAKI ◽  
Akifumi YAMAMOTO ◽  
Kouji YOSHINO ◽  
Kenjiro NAMIKAWA ◽  
...  
