Evolving graphs with semantic neutral drift

2019
Author(s):
Timothy Atkinson
Detlef Plump
Susan Stepney

Abstract: We introduce the concept of Semantic Neutral Drift (SND) for genetic programming (GP), in which we exploit equivalence laws to design semantics-preserving mutations that are guaranteed to preserve individuals’ fitness scores. A number of digital circuit benchmark problems have been implemented with rule-based graph programs and empirically evaluated, demonstrating quantitative improvements in evolutionary performance. Analysis reveals that the benefits of the designed SND reside in more complex processes than simple growth of individuals, and that there are circumstances where it is beneficial to choose otherwise detrimental parameters for a GP system if that facilitates the inclusion of SND.
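As a concrete illustration of the idea (a minimal Python sketch under stated assumptions, not the paper's rule-based graph programs: the tuple-based expression trees and all helper names are hypothetical), applying an equivalence law such as De Morgan's ¬(a ∧ b) ≡ ¬a ∨ ¬b rewrites an individual without changing its truth table, so its fitness on a circuit benchmark is provably unchanged:

```python
from itertools import product

# Boolean expressions as nested tuples:
# ('VAR', name), ('NOT', e), ('AND', a, b), ('OR', a, b).
def evaluate(expr, env):
    op = expr[0]
    if op == 'VAR':
        return env[expr[1]]
    if op == 'NOT':
        return not evaluate(expr[1], env)
    if op == 'AND':
        return evaluate(expr[1], env) and evaluate(expr[2], env)
    if op == 'OR':
        return evaluate(expr[1], env) or evaluate(expr[2], env)
    raise ValueError(op)

def de_morgan(expr):
    """Rewrite NOT(AND(a, b)) into OR(NOT(a), NOT(b)) at the root if it
    matches. Since the rewrite is an equivalence law, the expression's
    semantics, and hence its fitness on any truth-table benchmark, are
    unchanged: a semantics-preserving ("neutral") mutation."""
    if expr[0] == 'NOT' and expr[1][0] == 'AND':
        _, (_, a, b) = expr
        return ('OR', ('NOT', a), ('NOT', b))
    return expr

def truth_table(expr, names):
    return [evaluate(expr, dict(zip(names, bits)))
            for bits in product([False, True], repeat=len(names))]

parent = ('NOT', ('AND', ('VAR', 'x'), ('VAR', 'y')))
child = de_morgan(parent)
assert truth_table(parent, ['x', 'y']) == truth_table(child, ['x', 'y'])  # fitness preserved
```

The child is syntactically different from the parent, so subsequent (non-neutral) mutations can reach different regions of the search space even though the fitness score never moved.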

Author(s):
Léo Françoso Dal Piccol Sotto
Paul Kaufmann
Timothy Atkinson
Roman Kalkreuth
Márcio Porto Basgalupp

Abstract: Graph representations promise several desirable properties for genetic programming (GP): multiple-output programs, natural representations of code reuse and, in many cases, an innate mechanism for neutral drift. Each graph GP technique provides a program representation, genetic operators and an overarching evolutionary algorithm. This makes it difficult to identify the individual causes of empirical differences, both between these methods and in comparison to traditional GP. In this work, we empirically study the behaviour of Cartesian genetic programming (CGP), linear genetic programming (LGP), evolving graphs by graph programming (EGGP) and traditional GP. By fixing some aspects of the configurations, we study the performance of each graph GP method and GP in combination with three different EAs: generational, steady-state and (1 + λ). In general, we find that the best choice of representation, genetic operator and evolutionary algorithm depends on the problem domain. Further, we find that graph GP methods can increase search performance on complex real-world regression problems and, particularly in combination with the (1 + λ) EA, are significantly better on digital circuit synthesis tasks. We further show that the reuse of intermediate results, tuned via LGP's number of registers and CGP's levels-back parameter, is of utmost importance and contributes significantly to better convergence of an optimization algorithm when solving complex problems that benefit from code reuse.
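A generic sketch of the (1 + λ) scheme referenced above (assumptions: the bit-string genome, point mutation and OneMax fitness are placeholder stand-ins for a real CGP/LGP/EGGP genotype). The ">=" acceptance rule is what admits neutral drift, since an offspring of equal fitness may replace its parent:

```python
import random

def one_plus_lambda(fitness, mutate, init, lam=4, generations=1000):
    """Generic (1 + lambda) EA: keep one parent, create lam mutants per
    generation, and accept the best offspring whose fitness is at least
    as good as the parent's."""
    parent = init()
    parent_fit = fitness(parent)
    for _ in range(generations):
        offspring = [mutate(parent) for _ in range(lam)]
        best = max(offspring, key=fitness)
        if fitness(best) >= parent_fit:          # '>=' enables neutral moves
            parent, parent_fit = best, fitness(best)
    return parent, parent_fit

# Toy usage on a bit-string OneMax problem (illustrative stand-in only):
N = 32
init = lambda: [random.randint(0, 1) for _ in range(N)]
def mutate(g):
    g = g[:]
    g[random.randrange(N)] ^= 1                  # flip one random bit
    return g

best, fit = one_plus_lambda(fitness=sum, mutate=mutate, init=init)
print(fit)
```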


2019
Vol 21 (3)
pp. 471-501
Author(s):
Michael Kommenda
Bogdan Burlacu
Gabriel Kronberger
Michael Affenzeller

Abstract: In this paper we analyze the effects of using nonlinear least squares for parameter identification of symbolic regression models and integrate it as a local search mechanism in tree-based genetic programming. We employ the Levenberg–Marquardt algorithm for parameter optimization and calculate gradients via automatic differentiation. We provide examples where the parameter identification succeeds and fails, and highlight its computational overhead. Using an extensive suite of symbolic regression benchmark problems, we demonstrate the increased performance achieved by incorporating nonlinear least squares within genetic programming. Our results are compared with recently published results obtained by several genetic programming variants and state-of-the-art machine learning algorithms. Genetic programming with nonlinear least squares performs among the best on the defined benchmark suite, and the local search can be easily integrated into different genetic programming algorithms as long as only differentiable functions are used within the models.
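A minimal sketch of this local search step (assumptions: the model structure f(x; c) = c0·sin(c1·x) + c2·x, the data and all names are hypothetical; SciPy's least_squares with method='lm' stands in for the paper's Levenberg–Marquardt implementation, and its default finite-difference Jacobian replaces the paper's automatic differentiation):

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical model structure found by GP; only the numeric constants
# c0, c1, c2 are optimized, the symbolic structure stays fixed:
#   f(x; c) = c0 * sin(c1 * x) + c2 * x
def model(c, x):
    return c[0] * np.sin(c[1] * x) + c[2] * x

def residuals(c, x, y):
    return model(c, x) - y

# Synthetic data with known ground-truth constants [1.5, 2.0, 0.5].
rng = np.random.default_rng(0)
x = np.linspace(0, 4, 50)
y = 1.5 * np.sin(2.0 * x) + 0.5 * x + rng.normal(0, 0.05, x.size)

# Levenberg-Marquardt local search over the model's constants.
fit = least_squares(residuals, x0=np.ones(3), args=(x, y), method='lm')
print(fit.x)   # should recover roughly [1.5, 2.0, 0.5]
```

In a GP loop, this fit would run on each candidate tree (or a subset, given the computational overhead the paper highlights), with the optimized constants written back into the individual before its fitness is recorded.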

