no code implementations • 17 Apr 2024 • Denis Antipov, Aneta Neumann, Frank Neumann, Andrew M. Sutton
Diversity optimization is the class of optimization problems in which we aim to find a diverse set of good solutions.
no code implementations • 9 Apr 2024 • Ishara Hewa Pathiranage, Frank Neumann, Denis Antipov, Aneta Neumann
We introduce a 3-objective formulation that is able to deal with the stochastic and dynamic components at the same time and is independent of the confidence level required for the constraint.
no code implementations • 2 Apr 2024 • Denis Antipov, Benjamin Doerr, Alexandra Ivanova
The only previous result in this direction regarded the less realistic one-bit noise model, required a population size super-linear in the problem size, and proved a runtime guarantee roughly cubic in the noiseless runtime for the OneMax benchmark.
no code implementations • 14 Jul 2023 • Denis Antipov, Aneta Neumann, Frank Neumann
Evolutionary diversity optimization aims at finding a diverse set of solutions that satisfy some constraint on their fitness.
no code implementations • 8 May 2023 • Alexandra Ivanova, Denis Antipov, Benjamin Doerr
Evolutionary algorithms are known to be robust to noise in the fitness evaluation.
no code implementations • 12 Apr 2022 • Aneta Neumann, Denis Antipov, Frank Neumann
Our new Pareto Diversity optimization approach uses this bi-objective formulation to optimize the problem while also maintaining an additional population of high quality solutions for which diversity is optimized with respect to a given diversity measure.
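The abstract leaves the diversity measure abstract. As an illustration only (the paper's actual measure may differ), a common choice for binary populations is the sum of pairwise Hamming distances:

```python
# Illustrative sketch: the sum of pairwise Hamming distances is one
# commonly used diversity measure for populations of bit strings.
# It is NOT necessarily the measure used in the paper above.
from itertools import combinations


def hamming(x, y):
    """Number of positions in which two equal-length bit strings differ."""
    return sum(a != b for a, b in zip(x, y))


def pairwise_hamming_diversity(population):
    """Sum of Hamming distances over all unordered pairs of solutions."""
    return sum(hamming(x, y) for x, y in combinations(population, 2))


pop = [[0, 0, 1], [1, 0, 1], [1, 1, 0]]
print(pairwise_hamming_diversity(pop))  # → 6
```

Maximizing such a measure over the population, subject to a fitness constraint, is the generic template that evolutionary diversity optimization instantiates.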
no code implementations • 14 Apr 2021 • Denis Antipov, Maxim Buzdalov, Benjamin Doerr
On the other hand, this algorithm is also very efficient on jump functions, where the best static parameters are very different from those necessary to optimize simple problems.
no code implementations • 22 Jun 2020 • Denis Antipov, Maxim Buzdalov, Benjamin Doerr
The mathematical runtime analysis of evolutionary algorithms traditionally regards the time an algorithm needs to find a solution of a certain quality when initialized with a random population.
no code implementations • 5 Jun 2020 • Denis Antipov, Benjamin Doerr
To obtain this performance, however, a non-standard parameter setting depending on the jump size $k$ was used.
no code implementations • 14 Apr 2020 • Denis Antipov, Maxim Buzdalov, Benjamin Doerr
In this first runtime analysis of a crossover-based algorithm using a heavy-tailed choice of the mutation rate, we show an even stronger impact.
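A heavy-tailed choice of the mutation rate samples the mutation strength from a power-law distribution instead of fixing it. A minimal sketch of this idea, assuming a power-law exponent beta and the range {1, ..., n/2} (the exact distribution parameters in the paper may differ):

```python
# Hedged sketch of heavy-tailed (power-law) mutation-rate sampling.
# The exponent beta and the sampling range are illustrative assumptions.
import random


def sample_power_law(upper, beta=1.5):
    """Sample k in {1, ..., upper} with probability proportional to k^(-beta)."""
    values = range(1, upper + 1)
    weights = [k ** (-beta) for k in values]
    return random.choices(values, weights=weights)[0]


def heavy_tailed_mutation(x, beta=1.5):
    """Flip each bit of x independently with rate k/n, where k is power-law."""
    n = len(x)
    k = sample_power_law(max(n // 2, 1), beta)
    return [bit ^ (random.random() < k / n) for bit in x]
```

The heavy tail means large mutation strengths still occur with polynomially small (rather than exponentially small) probability, which helps the algorithm cross fitness valleys.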
no code implementations • 14 Apr 2020 • Denis Antipov, Benjamin Doerr, Vitalii Karavaev
In this work, we conduct the first runtime analysis of this algorithm on a multimodal problem class, the jump functions benchmark.
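For reference, one standard definition of the Jump_k benchmark (going back to Droste, Jansen, and Wegener) rewards the number of ones except in a deceptive valley of Hamming width k just below the optimum; a minimal sketch:

```python
# Sketch of the classic Jump_k benchmark on bit strings; the papers above
# may use a slightly different but equivalent formulation.
def jump(x, k):
    """Fitness k + OneMax(x) on the easy region and at the optimum;
    n - OneMax(x) in the valley of the k highest non-optimal levels."""
    n, ones = len(x), sum(x)
    if ones <= n - k or ones == n:
        return k + ones
    return n - ones
```

With this definition the unique optimum is the all-ones string, and an elitist algorithm must jump over the valley, which is why the benchmark is a standard multimodal test case in runtime analysis.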
no code implementations • 15 Apr 2019 • Denis Antipov, Benjamin Doerr, Quentin Yang
Understanding when evolutionary algorithms are efficient or not, and how they efficiently solve problems, is one of the central research tasks in evolutionary computation.
no code implementations • 28 Dec 2018 • Denis Antipov, Benjamin Doerr
In this work, we analyze this long-standing problem and show the asymptotically tight result that the runtime $T$, the number of iterations until the optimum is found, satisfies \[E[T] = \Theta\bigg(\frac{n\log n}{\lambda}+\frac{n}{\lambda / \mu} + \frac{n\log^+\log^+ \lambda/ \mu}{\log^+ \lambda / \mu}\bigg),\] where $\log^+ x := \max\{1, \log x\}$ for all $x > 0$.
no code implementations • 4 Jun 2018 • Denis Antipov, Benjamin Doerr
To gain a better theoretical understanding of how evolutionary algorithms (EAs) cope with plateaus of constant fitness, we propose the $n$-dimensional Plateau$_k$ function as a natural benchmark and analyze how different variants of the $(1 + 1)$ EA optimize it.
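A minimal sketch of the setup, assuming a Plateau$_k$-style function in which all search points with more than $n-k$ ones, except the optimum, share the same fitness (the paper's exact constants may differ), together with the standard $(1+1)$ EA:

```python
# Hedged sketch: a Plateau_k-style benchmark and the standard (1+1) EA.
# The exact fitness values on the plateau are illustrative assumptions.
import random


def plateau(x, k):
    """Fitness grows with the number of ones up to n - k, is flat on the
    plateau just below the optimum, and is maximal only at all-ones."""
    n, ones = len(x), sum(x)
    if ones == n:
        return n
    if ones >= n - k:
        return n - k
    return ones


def one_plus_one_ea(f, n, max_iters=100_000):
    """Standard (1+1) EA: flip each bit with prob 1/n, accept if not worse."""
    x = [random.randrange(2) for _ in range(n)]
    for _ in range(max_iters):
        y = [bit ^ (random.random() < 1 / n) for bit in x]
        if f(y) >= f(x):
            x = y
        if sum(x) == n:
            break
    return x
```

Because all plateau points look equally good to the algorithm, progress across the plateau is an unbiased random walk, which is exactly the behavior whose expected duration the analysis quantifies.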