Runtime Analysis of a Heavy-Tailed $(1+(λ,λ))$ Genetic Algorithm on Jump Functions

5 Jun 2020 · Denis Antipov, Benjamin Doerr

It was recently observed that the $(1+(\lambda,\lambda))$ genetic algorithm can escape the local optimum of the jump functions benchmark comparatively easily. Consequently, this algorithm can optimize the jump function with jump size $k$ in an expected runtime of only $n^{(k + 1)/2}k^{-k/2}e^{O(k)}$ fitness evaluations (Antipov, Doerr, Karavaev (GECCO 2020)). To obtain this performance, however, a non-standard parameter setting depending on the jump size $k$ was used. To overcome this difficulty, we propose to choose two parameters of the $(1+(\lambda,\lambda))$ genetic algorithm randomly from a power-law distribution. Via a mathematical runtime analysis, we show that, with natural instance-independent choices of the distribution parameters, this algorithm achieves on all jump functions with jump size at most $n/4$ a performance close to that obtained with the best instance-specific parameters in the previous work. The price for instance-independence can be made as small as an $O(n\log(n))$ factor. Given the difficulty of the jump problem and the runtime losses from using mildly suboptimal fixed parameters (also discussed in this work), this appears to be a fair price.
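
As a rough illustration of the heavy-tailed parameter choice described above, the following Python sketch runs a $(1+(\lambda,\lambda))$ GA on the jump benchmark, re-sampling a strength parameter $s$ from a power-law distribution in each iteration and deriving the mutation rate, crossover bias, and offspring population size from it. The specific coupling $p = c = \sqrt{s/n}$ and $\lambda = \sqrt{n/s}$, the power-law exponent, and the upper range $n/4$ are illustrative assumptions, not necessarily the exact parameterization analyzed in the paper.

```python
# Illustrative sketch (not the paper's exact parameterization) of a heavy-tailed
# (1+(lambda,lambda)) GA on the Jump_k benchmark.
import math
import random


def power_law(upper, beta=2.5):
    """Sample an integer from {1, ..., upper} with Pr[i] proportional to i^(-beta)."""
    weights = [i ** (-beta) for i in range(1, upper + 1)]
    return random.choices(range(1, upper + 1), weights=weights)[0]


def jump(x, k):
    """Jump_k fitness: OneMax shifted by k, except in the gap of width k below the optimum."""
    n, ones = len(x), sum(x)
    if ones <= n - k or ones == n:
        return k + ones
    return n - ones


def heavy_tailed_one_plus_ll_ga(n, k, budget=10**6):
    """Run the sketched GA until the optimum (fitness n + k) is found or the budget is spent."""
    x = [random.randint(0, 1) for _ in range(n)]
    fx = jump(x, k)
    evals = 1
    while fx < n + k and evals < budget:
        # Heavy-tailed choice of the strength parameter (assumed coupling, see lead-in).
        s = power_law(max(1, n // 4))
        p = c = math.sqrt(s / n)
        lam = max(1, round(math.sqrt(n / s)))
        # Mutation phase: flip the same number ell ~ Bin(n, p) of random bits in each offspring.
        ell = sum(random.random() < p for _ in range(n))
        best_mut, best_mut_f = x, -1
        for _ in range(lam):
            y = x[:]
            for pos in random.sample(range(n), ell):
                y[pos] = 1 - y[pos]
            fy = jump(y, k)
            evals += 1
            if fy > best_mut_f:
                best_mut, best_mut_f = y, fy
        # Crossover phase: biased crossover between the parent and the mutation winner,
        # taking each bit from the mutation winner with probability c.
        best_cross, best_cross_f = x, fx
        for _ in range(lam):
            y = [mi if random.random() < c else xi for xi, mi in zip(x, best_mut)]
            fy = jump(y, k)
            evals += 1
            if fy >= best_cross_f:
                best_cross, best_cross_f = y, fy
        if best_cross_f >= fx:
            x, fx = best_cross, best_cross_f
    return evals


if __name__ == "__main__":
    print(heavy_tailed_one_plus_ll_ga(n=50, k=3))
```

The re-sampling in every iteration is what makes the algorithm instance-independent: occasionally the power law produces a large $s$, which gives the high mutation rate needed to cross the fitness gap of width $k$, without requiring $k$ to be known in advance.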
