PMCNS

2014 ◽  
Vol 4 (2) ◽  
pp. 1-19 ◽  
Author(s):  
Jorge Gomes ◽  
Paulo Urbano ◽  
Anders Lyhne Christensen

Novelty search is an evolutionary approach in which the population is driven towards behavioural innovation instead of towards a fixed objective. The use of behavioural novelty to score candidate solutions precludes convergence to local optima. However, in novelty search, significant effort may be spent on exploration of novel, but unfit behaviours. We propose progressive minimal criteria novelty search (PMCNS) to overcome this issue. In PMCNS, novelty search can freely explore the behaviour space as long as the solutions meet a progressively stricter fitness criterion. We evaluate the performance of our approach by evolving neurocontrollers for swarms of robots in two distinct tasks. Our results show that PMCNS outperforms fitness-based evolution and pure novelty search, and that PMCNS is superior to linear scalarisation of novelty and fitness scores. An analysis of behaviour space exploration shows that the benefits of novelty search are conserved in PMCNS despite the evolutionary pressure towards progressively fitter behaviours.
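To make the mechanism concrete, below is a minimal Python sketch of one PMCNS-style generation. It assumes a caller-supplied `evaluate` function that returns a (fitness, behaviour vector) pair per genome; the k-nearest-neighbour novelty measure, the archive size, and the percentile-based threshold update in `raise_threshold` are illustrative assumptions, not the paper's exact settings.

```python
import numpy as np

def knn_novelty(behaviour, others, k=15):
    """Mean distance to the k nearest neighbours in behaviour space."""
    dists = sorted(float(np.linalg.norm(np.asarray(behaviour) - np.asarray(o)))
                   for o in others)
    return float(np.mean(dists[:k]))

def pmcns_generation(population, archive, evaluate, threshold):
    """One PMCNS-style evaluation/selection step (sketch).

    `evaluate(genome)` is assumed to return (fitness, behaviour_vector).
    Individuals whose fitness falls below the progressively raised
    `threshold` receive zero novelty, so selection favours behaviours
    that are both novel and minimally fit.
    """
    evaluated = [(g, *evaluate(g)) for g in population]
    behaviours = [b for _, _, b in evaluated] + list(archive)

    scored = []
    for genome, fitness, behaviour in evaluated:
        others = [b for b in behaviours if b is not behaviour]
        novelty = knn_novelty(behaviour, others) if fitness >= threshold else 0.0
        scored.append((novelty, fitness, genome, behaviour))

    # Keep a few of the most novel behaviours for future comparisons.
    scored.sort(key=lambda t: t[0], reverse=True)
    archive.extend(b for _, _, _, b in scored[:3])

    # Truncation selection on novelty; variation is left to the caller.
    survivors = [g for _, _, g, _ in scored[: len(population) // 2]]
    fitnesses = [f for _, f, _, _ in scored]
    return survivors, archive, fitnesses

def raise_threshold(threshold, fitnesses, percentile=50.0, step=0.05):
    """Move the minimal criterion towards a population-fitness percentile
    (percentile and step size are illustrative choices)."""
    target = float(np.percentile(fitnesses, percentile))
    return threshold + step * max(0.0, target - threshold)
```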

2016 ◽  
Vol 24 (3) ◽  
pp. 545-572 ◽  
Author(s):  
A. Nguyen ◽  
J. Yosinski ◽  
J. Clune

The Achilles Heel of stochastic optimization algorithms is getting trapped on local optima. Novelty Search mitigates this problem by encouraging exploration in all interesting directions by replacing the performance objective with a reward for novel behaviors. This reward for novel behaviors has traditionally required a human-crafted behavioral distance function. While Novelty Search is a major conceptual breakthrough and outperforms traditional stochastic optimization on certain problems, it is not clear how to apply it to challenging, high-dimensional problems where specifying a useful behavioral distance function is difficult. For example, in the space of images, how do you encourage novelty to produce hawks and heroes instead of endless pixel static? Here we propose a new algorithm, the Innovation Engine, that builds on Novelty Search by replacing the human-crafted behavioral distance with a Deep Neural Network (DNN) that can recognize interesting differences between phenotypes. The key insight is that DNNs can recognize similarities and differences between phenotypes at an abstract level, where novelty means interesting novelty. For example, a DNN-based novelty search in the image space does not explore in the low-level pixel space, but instead creates a pressure to create new types of images (e.g., churches, mosques, obelisks). Here, we describe the long-term vision for the Innovation Engine algorithm, which involves many technical challenges that remain to be solved. We then implement a simplified version of the algorithm that enables us to explore some of the algorithm’s key motivations. Our initial results, in the domain of images, suggest that Innovation Engines could ultimately automate the production of endless streams of interesting solutions in any domain: for example, producing intelligent software, robot controllers, optimized physical components, and art.
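The abstract gives only the high-level idea; as a rough illustration, the Python sketch below assumes a simplified, MAP-Elites-style setup in which each class of a pretrained image classifier acts as a niche and the classifier's confidence stands in for a hand-crafted behavioural distance. The `render` and `classifier` functions and the `elites` dictionary are hypothetical placeholders, not the authors' actual implementation.

```python
import numpy as np

def innovation_engine_step(genomes, render, classifier, elites):
    """One step of a simplified, MAP-Elites-style Innovation Engine sketch.

    `render(genome)` produces an image and `classifier(image)` returns a
    vector of class probabilities from a pretrained DNN; both are assumed
    placeholders. Each DNN class acts as a niche, and a genome becomes the
    elite of a class when the DNN is more confident that its image belongs
    to that class than it was for the previous champion.
    """
    for genome in genomes:
        probs = classifier(render(genome))      # e.g. a 1000-way softmax
        niche = int(np.argmax(probs))           # class the image most resembles
        confidence = float(probs[niche])
        best = elites.get(niche)
        if best is None or confidence > best[0]:
            elites[niche] = (confidence, genome)  # new champion for this niche
    return elites
```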


2019 ◽  
Vol 42 ◽  
Author(s):  
Marco Del Giudice

The argument against innatism at the heart of Cognitive Gadgets is provocative but premature, and is vitiated by dichotomous thinking, interpretive double standards, and evidence cherry-picking. I illustrate my criticism by addressing the heritability of imitation and mindreading, the relevance of twin studies, and the meaning of cross-cultural differences in theory of mind development. Reaching an integrative understanding of genetic inheritance, plasticity, and learning is a formidable task that demands a more nuanced evolutionary approach.


2004 ◽  
Author(s):  
Brian Peacock ◽  
Jeffrey McCandless ◽  
Sudhakar Rajulu ◽  
Frances Mount ◽  
Melissa Mallis ◽  
...  

2008 ◽  
Author(s):  
R. Darin Ellis ◽  
Thomas G. Edwards ◽  
Lavie Golenberg ◽  
Abhilash Pandya
