neat-GP is a bloat-free GP search, inspired by the insights of Flat-OE and built around the basic features of NEAT.
We provide two open-source implementations of the algorithm.
- The first is based on the GPLAB Matlab package (neatGP-GPLab). For any doubts, questions, or comments, please contact Luis Muñoz Delgado (firstname.lastname@example.org).
- The second is based on the DEAP Python package (neatGP-deap).
The software can be downloaded below. Be sure to read the provided README files to understand how to install and use neat-GP within these evolutionary computation libraries.
| Filename | Size | Date & Time |
| --- | --- | --- |
| neatGP.toolbox.rar | 47.38 KB | 2017-01-30 07:09:38 |
The algorithm is presented in the paper *neat Genetic Programming: Controlling Bloat Naturally*, to appear in Information Sciences.
Bloat is one of the most widely studied phenomena in Genetic Programming (GP); it is normally defined as an increase in mean program size without a corresponding improvement in fitness. Several theories have been proposed in the specialized GP literature to explain why bloat occurs. In particular, the Crossover-Bias Theory states that bloat is caused by the distribution of program sizes during evolution becoming skewed in a way that encourages bloat, punishing small individuals and favouring larger ones. Therefore, several bloat control methods have been proposed that attempt to explicitly control the size distribution of programs within the evolving population. neat-GP implicitly shapes the program size distribution during a GP run; it is based on two key elements: (a) the NeuroEvolution of Augmenting Topologies algorithm (NEAT), a robust heuristic that was originally developed to evolve neural networks; and (b) the Flat Operator Equalization (Flat-OE) bloat control method, which explicitly shapes the program size distribution towards a uniform, or flat, shape.
The goal of this work is to develop a bloat-free GP search, inspired by the insights of Flat-OE and built around the basic features of NEAT, that implicitly shapes the program size distribution. The proposed method is called neat-GP, after the two evolutionary paradigms on which it is based and which serve as its namesakes. The remainder of this section presents a detailed description of neat-GP, describing how the general NEAT methodology was ported to the GP domain.
neat-GP borrows the following features from NEAT:
- Initialization and Speciation: as in NEAT, we start with a small, random population and then use speciation to maintain and promote diversity among individuals. Speciation requires a dissimilarity measure, which we call h.
- Fitness Sharing: once the population is grouped into species, we evaluate each individual of each species and penalize individuals from densely populated species.
- Genetic Operators: neat-GP uses standard subtree mutation without any modification. For crossover, we propose an operator similar to the original NEAT crossover.
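The speciation and fitness-sharing steps above can be sketched in a few lines of Python. This is a minimal illustration, not the authors' implementation: the dissimilarity used here (relative size difference) is only a stand-in for the paper's measure h, which compares tree structure, and the names `speciate` and `shared_fitness` are hypothetical helpers.

```python
def dissimilarity(ind_a, ind_b):
    # Stand-in for the paper's measure h: relative size difference.
    # neat-GP's actual h compares tree structure; this is illustrative only.
    size_a, size_b = len(ind_a), len(ind_b)
    return abs(size_a - size_b) / max(size_a, size_b)

def speciate(population, threshold):
    """Group individuals into species: an individual joins the first
    species whose representative is within the dissimilarity threshold."""
    species = []  # each species is a list of individuals
    for ind in population:
        for sp in species:
            if dissimilarity(ind, sp[0]) <= threshold:
                sp.append(ind)
                break
        else:
            # no compatible species found: start a new one
            species.append([ind])
    return species

def shared_fitness(raw_fitness, species_size):
    """Fitness sharing: divide raw fitness by species size, penalizing
    individuals from crowded species (assumes higher fitness is better)."""
    return raw_fitness / species_size
```

Here individuals are represented simply as sequences whose length stands in for program size; any tree representation with a size and a dissimilarity measure would fit the same scheme.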
Figure 1 presents a flowchart of the basic neat-GP algorithm. In general, the flowchart describes a basic EA or GP, with initialization, fitness evaluation, selection, and survival. However, the figure distinguishes several types of processes, using three types of boxes:
(1) standard GP processes (in some cases with unconventional settings);
(2) a new process based on the NEAT algorithm;
(3) a standard process with a unique implementation based on NEAT.
Here we give a general description of the overall algorithm and then provide a detailed description of each block in the following subsections.
First, the algorithm starts with a randomly generated population of small full trees (block A of Figure 1). Second, speciation is performed on the entire population, based on the size of each solution (block B). Third, fitness evaluation is performed and fitness sharing is applied, to protect individuals within small species and to penalize those from larger species (blocks C and D). Fourth, if the stopping criterion has not been satisfied, then parent selection is performed (block E). Fifth, offspring are generated using the parents selected in the previous step (block F). Moreover, following NEAT, a new crossover operation called neat-crossover is proposed, used along with standard subtree mutation (block G). Sixth, the offspring are inserted into the population and used to perform speciation, while their fitness is assigned using fitness sharing based on the current population (blocks H, I and J). Seventh, an elitist survival strategy is used to replace the pworst% solutions in the current population with the best offspring (block K). Finally, the algorithm proceeds to the following generation by applying fitness sharing to the new population and evaluating the termination criterion.
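The sequence of blocks just described can be summarized in Python-style pseudocode. Every helper name below (`init_small_trees`, `select_parents`, `neat_variation`, `replace_worst`, and so on) is a hypothetical placeholder standing in for the corresponding block of Figure 1, not the authors' API.

```
def neat_gp(pop_size, max_generations, p_worst):
    population = init_small_trees(pop_size)         # block A: small full trees
    species = speciate(population)                  # block B: size-based speciation
    evaluate(population)                            # block C: raw fitness
    share_fitness(population, species)              # block D: fitness sharing
    for gen in range(max_generations):              # stopping criterion
        parents = select_parents(population)        # block E
        offspring = neat_variation(parents)         # blocks F-G: neat-crossover
                                                    #   and subtree mutation
        species = speciate(population + offspring)  # block H
        evaluate(offspring)                         # block I
        share_fitness(offspring, species)           # block J
        population = replace_worst(population,      # block K: elitist survival,
                                   offspring,       #   replace the p_worst%
                                   p_worst)         #   with the best offspring
        share_fitness(population, speciate(population))
    return best_of(population)
```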
| Benchmark Problem | Keijzer-6 |
| --- | --- |
| Fitness: Training | E[1, 50, 1] |
| Fitness: Test | E[1, 120, 1] |
| Method | Description |
| --- | --- |
| GP | Standard GP search, used as a control method. |
| FlatOE | Flat Operator Equalization method. |
| neat-GP | The full neat-GP algorithm. |
| neat-GP-SC | neat-GP where subtree crossover is used instead of neat-crossover. |
| neat-GP-Spe | neat-GP without mating restrictions; the decision in block G is set to NO. |
| neat-GP-Sel | neat-GP with tournament selection; block E uses tournament selection to construct Q. |
| neat-GP-FS | neat-GP without fitness sharing; blocks D and J are omitted. |
Population distribution throughout the generations of Genetic Programming (GP) and neat-GP with variations on key elements of the algorithm. Darker shades indicate denser populations (a higher number of individuals of that size), gray indicates fewer individuals, and white indicates that no individual of that size is present in the population.