A neural learning classifier system with self-adaptive constructivism

Author(s):  
L. Bull ◽  
J. Hurst
2010 ◽  
Vol 19 (01) ◽  
pp. 275-296 ◽  
Author(s):  
Olgierd Unold

This article introduces a new kind of self-adaptation in the discovery mechanism of the XCS learning classifier system. Unlike previous approaches, which incorporate self-adaptive parameters into the representation of an individual, the proposed model evolves a competitive population of reduced XCSs that adapt both their classifiers and their genetic parameters. Experimental comparisons of the self-adaptive mutation rate XCS and the standard XCS on the 11-bit, 20-bit, and 37-bit multiplexer environments are provided. It is shown that adapting the mutation rate can give performance equivalent to or better than known good fixed parameter settings, especially for computationally complex tasks. Moreover, the self-adaptive XCS is able to solve problems under parameter settings that are inappropriate for a standard XCS.
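To make the idea of a competing population of learners with self-adapted genetic parameters concrete, here is a minimal sketch in Python. It is not the authors' code: the class names, the log-normal perturbation of the mutation rate, and the clone-best/replace-worst selection step are all assumptions used only to illustrate the general mechanism described in the abstract.

```python
import math
import random

class ReducedXCS:
    """Hypothetical stand-in for a reduced XCS; only the self-adaptive
    mutation rate and a fitness score are modelled here."""
    def __init__(self, mutation_rate):
        self.mutation_rate = mutation_rate
        self.fitness = 0.0  # e.g. a moving average of reward or accuracy

    def adapt_mutation_rate(self):
        # Log-normal perturbation, a common self-adaptation scheme in EAs;
        # the paper may use a different operator.
        self.mutation_rate *= math.exp(random.gauss(0.0, 0.1))
        self.mutation_rate = min(max(self.mutation_rate, 1e-4), 1.0)

def evolve_step(learners):
    """One competitive step: clone the best learner, perturb the clone's
    genetic parameters, and replace the worst learner with the clone."""
    best = max(learners, key=lambda l: l.fitness)
    worst = min(learners, key=lambda l: l.fitness)
    child = ReducedXCS(best.mutation_rate)
    child.adapt_mutation_rate()
    learners[learners.index(worst)] = child
    return learners

# Usage: start from deliberately varied (possibly poor) settings and let
# selection between learners discover a workable mutation rate.
population = [ReducedXCS(mutation_rate=random.uniform(0.001, 0.5))
              for _ in range(10)]
```

The point of the sketch is that the mutation rate is never tuned by hand: learners whose parameter settings yield higher fitness are copied, so good settings spread through the population.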


Author(s):  
Maciej Troć ◽  
Olgierd Unold

Self-adaptation of parameters in a learning classifier system ensemble machine

Self-adaptation is a key feature of evolutionary algorithms (EAs). Although EAs have been used successfully to solve a wide variety of problems, the performance of this technique depends heavily on the selection of the EA parameters, and the process of setting such parameters is considered a time-consuming task. Several research works have tried to deal with this problem; however, the construction of algorithms that let the parameters adapt themselves to the problem remains a critical and open problem of EAs. This work proposes a novel ensemble machine learning method that is able to learn rules, solve problems in parallel, and adapt the parameters used by its components. The self-adaptive ensemble machine consists of simultaneously working extended classifier systems (XCSs) and may be treated as a meta classifier system. The new self-adaptive XCS-based ensemble machine was compared with two other XCS-based ensembles on one-step binary problems: multiplexer, one counts, hidden parity, and randomly generated Boolean functions, including noisy versions. The results of the experiments show the ability of the model to adapt the mutation rate and the tournament size, and are analyzed in detail.
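The sketch below illustrates one plausible shape of such an ensemble: several XCS-like members vote on each input, and the genetic parameters (mutation rate, tournament size) of well-performing members propagate to poorly performing ones. This is an assumption-laden illustration, not the authors' implementation; the member interface and the copy-with-perturbation rule are invented for clarity.

```python
import math
import random
from collections import Counter

class XCSMember:
    """Hypothetical ensemble member; the real XCS cycle is omitted."""
    def __init__(self, mutation_rate, tournament_size):
        self.mutation_rate = mutation_rate      # per-member genetic parameter
        self.tournament_size = tournament_size  # per-member genetic parameter
        self.accuracy = 0.5                     # running performance estimate

    def predict(self, state):
        # Placeholder for the member's match-set / prediction-array cycle.
        return random.choice([0, 1])

class SelfAdaptiveEnsemble:
    def __init__(self, n_members=8):
        self.members = [XCSMember(random.uniform(0.01, 0.2),
                                  random.randint(2, 8))
                        for _ in range(n_members)]

    def predict(self, state):
        # Combine members by majority vote.
        votes = Counter(m.predict(state) for m in self.members)
        return votes.most_common(1)[0][0]

    def adapt(self):
        # Copy the genetic parameters of the best member into the worst one,
        # with a small perturbation, so good settings spread through the ensemble.
        best = max(self.members, key=lambda m: m.accuracy)
        worst = min(self.members, key=lambda m: m.accuracy)
        worst.mutation_rate = best.mutation_rate * math.exp(random.gauss(0, 0.1))
        worst.tournament_size = max(2, best.tournament_size
                                    + random.choice([-1, 0, 1]))
```

Treating the ensemble as a meta classifier system means the adaptation happens at two levels: each XCS evolves its own rules internally, while the ensemble-level step adjusts the parameters those internal searches run with.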


2006 ◽  
Vol 12 (3) ◽  
pp. 353-380 ◽  
Author(s):  
Jacob Hurst ◽  
Larry Bull

For artificial entities to achieve true autonomy and display complex lifelike behavior, they will need to exploit appropriate adaptable learning algorithms. In this context adaptability implies flexibility guided by the environment at any given time and an open-ended ability to learn appropriate behaviors. This article examines the use of constructivism-inspired mechanisms within a neural learning classifier system architecture that exploits parameter self-adaptation as an approach to realize such behavior. The system uses a rule structure in which each rule is represented by an artificial neural network. It is shown that appropriate internal rule complexity emerges during learning at a rate controlled by the learner and that the structure indicates underlying features of the task. Results are presented in simulated mazes before moving to a mobile robot platform.
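As a rough illustration of a neural rule with constructivism, the following Python sketch represents one rule as a tiny feed-forward network whose hidden layer can grow or shrink, with the constructivism rate itself self-adapted per rule. Everything here (class name, tanh hidden units, log-normal rate adaptation, add/remove probabilities) is an assumption made for the example, not Hurst and Bull's architecture.

```python
import math
import random

class NeuralRule:
    """Hypothetical LCS rule encoded as a small neural network."""
    def __init__(self, n_inputs, n_outputs, construct_rate=0.05):
        self.n_inputs = n_inputs
        self.hidden = [self._random_unit(n_inputs)]   # start minimal: one hidden node
        self.out_w = [[random.uniform(-1, 1)] for _ in range(n_outputs)]
        self.construct_rate = construct_rate          # self-adaptive parameter

    def _random_unit(self, fan_in):
        # fan_in weights plus a bias term.
        return [random.uniform(-1, 1) for _ in range(fan_in + 1)]

    def activate(self, inputs):
        h = [math.tanh(sum(w * x for w, x in zip(unit, inputs)) + unit[-1])
             for unit in self.hidden]
        return [sum(w * a for w, a in zip(ws, h)) for ws in self.out_w]

    def construct(self):
        # Self-adapt the constructivism rate, then add or remove a hidden
        # node with that probability, so rule complexity grows only at a
        # rate the learner itself controls.
        self.construct_rate *= math.exp(random.gauss(0.0, 0.1))
        if random.random() < self.construct_rate:
            self.hidden.append(self._random_unit(self.n_inputs))
            for ws in self.out_w:
                ws.append(random.uniform(-1, 1))
        elif len(self.hidden) > 1 and random.random() < self.construct_rate:
            idx = random.randrange(len(self.hidden))
            self.hidden.pop(idx)
            for ws in self.out_w:
                ws.pop(idx)
```

Because each rule starts with a single hidden node and only adds capacity when the self-adapted rate and selection pressure favour it, the emergent network size can serve as an indicator of the underlying task's complexity, which is the behaviour the abstract reports.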

