Generalized Self-Adapting Particle Swarm Optimization algorithm with archive of samples

28 Feb 2020  ·  Michał Okulewicz, Mateusz Zaborski, Jacek Mańdziuk

In this paper we enhance the Generalized Self-Adapting Particle Swarm Optimization algorithm (GAPSO), initially introduced at the Parallel Problem Solving from Nature 2018 conference, and investigate its properties. The research on GAPSO is underpinned by the following two assumptions: (1) it is possible to achieve good performance of an optimization algorithm through utilization of all of the gathered samples, and (2) the best performance can be accomplished by means of a combination of specialized sampling behaviors (Particle Swarm Optimization, Differential Evolution, and locally fitted square functions). From a software engineering point of view, GAPSO considers a standard Particle Swarm Optimization algorithm as an ideal starting point for creating a general-purpose global optimization framework. Within this framework, hybrid optimization algorithms are developed and various additional techniques (such as algorithm restart management or adaptation schemes) are tested. The paper introduces a new version of the algorithm, abbreviated as M-GAPSO. In comparison with the original GAPSO formulation, it includes the following four features: a global restart management scheme, gathering of samples within an R-Tree based index (an archive/memory of samples), adaptation of a sampling behavior based on global particle performance, and a specific approach to local search. The above-mentioned enhancements resulted in improved performance of M-GAPSO over GAPSO, observed both on the COCO BBOB testbed and in the black-box optimization competition BBComp. Moreover, for lower-dimensionality functions (up to 5D) the results of M-GAPSO are better than or comparable to those of a state-of-the-art version of CMA-ES (namely the KL-BIPOP-CMA-ES algorithm presented at the GECCO 2017 conference).
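To make the framework described above more concrete, below is a minimal Python sketch of the hybrid sampling loop: a population of particles, an archive storing every evaluated sample, and a weighted choice between simplified PSO-style and DE-style moves whose weights are adapted when a behavior improves a particle. All names (Sample, Archive, pso_step, de_step, m_gapso_sketch) and the concrete update rules are illustrative assumptions rather than the authors' implementation; in particular, a plain list stands in for the R-Tree index, and the quadratic-model behavior, restart management, and local search are omitted.

import random
from dataclasses import dataclass, field

@dataclass
class Sample:
    x: list       # decision vector
    value: float  # objective value f(x)

@dataclass
class Archive:
    """Memory of all evaluated samples.  The paper indexes them with an
    R-Tree; a flat list stands in for that structure in this sketch."""
    samples: list = field(default_factory=list)

    def add(self, s):
        self.samples.append(s)

    def best(self):
        return min(self.samples, key=lambda s: s.value)

def pso_step(p, gbest, rng):
    """Toy stand-in for a PSO position update (pull towards the global best)."""
    return [xi + rng.uniform(0.0, 1.0) * (bi - xi) for xi, bi in zip(p.x, gbest.x)]

def de_step(p, archive, rng):
    """Toy stand-in for a DE-style move built from two archived samples."""
    a, b = rng.sample(archive.samples, 2)
    return [xi + 0.5 * (ai - bi) for xi, ai, bi in zip(p.x, a.x, b.x)]

def m_gapso_sketch(f, dim, budget, pop_size=10, seed=0):
    rng = random.Random(seed)
    archive = Archive()
    weights = {"pso": 1.0, "de": 1.0}   # behavior weights, adapted over time
    population = []
    for _ in range(pop_size):
        x = [rng.uniform(-5.0, 5.0) for _ in range(dim)]
        s = Sample(x, f(x))
        population.append(s)
        archive.add(s)
    evals = pop_size
    while evals < budget:
        gbest = archive.best()
        for i, p in enumerate(population):
            # choose a sampling behavior proportionally to its adapted weight
            name = rng.choices(list(weights), weights=list(weights.values()))[0]
            x_new = pso_step(p, gbest, rng) if name == "pso" else de_step(p, archive, rng)
            s_new = Sample(x_new, f(x_new))
            evals += 1
            archive.add(s_new)
            # reward a behavior whenever it improves the particle it moved
            if s_new.value < p.value:
                weights[name] *= 1.05
                population[i] = s_new
            if evals >= budget:
                break
    return archive.best()

if __name__ == "__main__":
    sphere = lambda x: sum(xi * xi for xi in x)
    print(m_gapso_sketch(sphere, dim=3, budget=300).value)

The weight-update rule here is a deliberately crude placeholder for the global-performance-based adaptation mentioned in the abstract; the paper's actual adaptation scheme, restart logic, and locally fitted square-function behavior are described in the full text.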

