Selection Heuristics on Semantic Genetic Programming for Classification Problems

16 Jul 2019  ·  Claudia N. Sánchez, Mario Graff ·

Individuals' semantics have been used to guide the learning process of Genetic Programming when solving supervised learning problems. Semantics has been used to propose novel genetic operators as well as different ways of performing parent selection. The latter is the focus of this contribution, which proposes three heuristics for parent selection that entirely replace the fitness function in the selection mechanism. These heuristics complement previous work by being inspired by the characteristics of the addition, Naive Bayes, and Nearest Centroid functions, and by being applied only when the corresponding function is used to create an offspring. Each heuristic uses a similarity measure among the parents to decide which of them is more appropriate for a given function; the similarity measures considered are cosine similarity, Pearson's correlation, and agreement. We analyze the heuristics' performance against random selection, state-of-the-art selection schemes, and 18 classifiers, including auto-machine-learning techniques, on 30 classification problems with varying numbers of samples, variables, and classes. The results indicate that combining parent selection based on agreement with random selection of the individual to be replaced in the population produces statistically better results than classical and state-of-the-art selection schemes, and it is competitive with state-of-the-art classifiers. Finally, the code is released as open-source software.
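As a rough illustration of the three similarity measures the abstract names, the sketch below computes cosine similarity, Pearson's correlation, and a sign-based agreement score between two parents' semantic vectors (an individual's semantics being its vector of outputs over the training set). The function names and the reading of "agreement" as sign agreement are assumptions for illustration, not the paper's exact definitions.

```python
import math

def cosine_similarity(u, v):
    # Cosine of the angle between the two semantic vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def pearson_correlation(u, v):
    # Pearson's correlation between the parents' semantics.
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    cov = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    su = math.sqrt(sum((a - mu) ** 2 for a in u))
    sv = math.sqrt(sum((b - mv) ** 2 for b in v))
    return cov / (su * sv)

def agreement(u, v):
    # Assumed interpretation: fraction of positions where the two
    # vectors have the same sign, i.e. the parents place the sample
    # on the same side of the decision boundary.
    same = sum(1 for a, b in zip(u, v) if (a >= 0) == (b >= 0))
    return same / len(u)
```

A selection heuristic would then compare candidate second parents against the first parent with one of these measures and pick the most (or least) similar one, depending on the function being applied.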
