no code implementations • 24 Apr 2024 • Diederick Vermetten, Johannes Lengler, Dimitri Rusin, Thomas Bäck, Carola Doerr
Optimization problems in dynamic environments have recently been the source of several theoretical studies.
1 code implementation • 18 Apr 2024 • Johannes Lengler, Konstantin Sturm
The one-fifth rule and its generalizations are classical parameter control mechanisms in discrete domains.
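As a rough illustration of the mechanism, a minimal Python sketch of a randomized local search whose mutation strength is controlled by the one-fifth success rule; the function name, the update factor F and the exponent 1/4 are generic textbook choices, not taken from this paper:

    import random

    def one_fifth_rule_rls(f, n, budget, F=1.5):
        # Randomized local search whose mutation strength r is adapted by the
        # one-fifth success rule: increase r after a success, decrease it more
        # gently after a failure, so that r is stable at roughly 1/5 success rate.
        x = [random.randint(0, 1) for _ in range(n)]
        fx = f(x)
        r = 1.0  # current mutation strength (number of bits flipped per step)
        for _ in range(budget):
            y = x[:]
            for i in random.sample(range(n), max(1, min(n, round(r)))):
                y[i] = 1 - y[i]
            fy = f(y)
            if fy > fx:                      # success: accept and enlarge the step
                x, fx, r = y, fy, min(n, r * F)
            else:                            # failure: shrink by F^(1/4), so the
                r = max(1.0, r / F ** 0.25)  # expected log-change is 0 at rate 1/5
        return x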
no code implementations • 18 Apr 2024 • Cella Florescu, Marc Kaufmann, Johannes Lengler, Ulysse Schaller
For the classical benchmark OneMax, the cGA has two different modes of operation: a conservative one with small step sizes $\Theta(1/(\sqrt{n}\log n))$, which is slow but prevents genetic drift, and an aggressive one with large step sizes $\Theta(1/\log n)$, in which genetic drift leads to wrong decisions, but those are corrected efficiently.
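For readers unfamiliar with the algorithm, here is a minimal Python sketch of the cGA on OneMax; the hypothetical population size K is the inverse of the step size discussed above, the frequency caps $1/n$ and $1-1/n$ are the usual textbook borders, and details may differ from the variant analyzed in the paper:

    import random

    def cga_onemax(n, K, budget):
        onemax = sum                            # fitness: number of one-bits
        p = [0.5] * n                           # frequency vector
        for _ in range(budget):
            x = [int(random.random() < q) for q in p]
            y = [int(random.random() < q) for q in p]
            if onemax(x) < onemax(y):
                x, y = y, x                     # make x the fitter sample
            for i in range(n):
                if x[i] != y[i]:                # move frequency one step of size 1/K
                    p[i] += 1.0 / K if x[i] == 1 else -1.0 / K
                    p[i] = min(1 - 1.0 / n, max(1.0 / n, p[i]))
            if all(q >= 1 - 1.0 / n for q in p):
                break                           # all frequencies at the upper border
        return p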
no code implementations • 18 Apr 2024 • Sacha Cerf, Johannes Lengler
Our theoretical understanding of crossover is limited by our ability to analyze how population diversity evolves.
1 code implementation • 15 Apr 2024 • Johannes Lengler, Leon Schiller, Oliver Sieberling
We compare the $(1,\lambda)$-EA and the $(1+\lambda)$-EA on the recently introduced benchmark DisOM, which is the OneMax function with randomly planted local optima.
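To make the comparison concrete, a hedged sketch of one generation of each strategy; standard bit mutation with rate $1/n$ is an assumption for illustration, and the DisOM benchmark itself is not implemented here:

    import random

    def mutate(x):
        # standard bit mutation: flip each bit independently with probability 1/n
        n = len(x)
        return [b ^ (random.random() < 1.0 / n) for b in x]

    def one_generation(x, f, lam, plus):
        offspring = [mutate(x) for _ in range(lam)]
        best = max(offspring, key=f)
        if plus:
            # (1+lambda): elitist, the parent survives unless an offspring is as good
            return best if f(best) >= f(x) else x
        # (1,lambda): non-elitist, the best offspring always replaces the parent,
        # which permits fitness losses and thereby escapes planted local optima
        return best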
no code implementations • 10 Apr 2024 • Andre Opris, Johannes Lengler, Dirk Sudholt
This yields an improved and tight time bound of $O(\mu n \log(k) + 4^k/p_c)$ for a range of $k$ under the mild assumptions $p_c = O(1/k)$ and $\mu \in \Omega(kn)$.
1 code implementation • 13 Nov 2023 • Marc Kaufmann, Maxime Larcher, Johannes Lengler, Oliver Sieberling
Recently, Kaufmann, Larcher, Lengler and Zou conjectured that for the self-adjusting $(1,\lambda)$-EA, Adversarial Dynamic BinVal (ADBV) is the hardest dynamic monotone function to optimize.
no code implementations • 19 Apr 2023 • Johannes Lengler, Andre Opris, Dirk Sudholt
We give an exact formula for the drift of population diversity and show that it is driven towards an equilibrium state.
no code implementations • 19 Apr 2023 • Joost Jorritsma, Johannes Lengler, Dirk Sudholt
For certain parameters, the $(1,\lambda)$-EA finds the target in $\Theta(n \ln n)$ evaluations, with high probability (w.h.p.).
no code implementations • 23 Feb 2023 • Carola Doerr, Duri Andrea Janett, Johannes Lengler
In this paper we investigate how this result generalizes if standard bit mutation is replaced by an arbitrary unbiased mutation operator.
1 code implementation • 14 Apr 2022 • Marc Kaufmann, Maxime Larcher, Johannes Lengler, Xun Zou
In this paper we disprove this conjecture and show that OneMax is not the easiest fitness landscape with respect to finding improving steps.
no code implementations • 13 Apr 2022 • Thomas Helmuth, Johannes Lengler, William La Cava
In this paper we investigate why the running time of lexicase parent selection is empirically much lower than its worst-case bound of $O(N \cdot C)$.
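For context, a minimal sketch of standard lexicase parent selection; the error-matrix layout errors[i][c], giving candidate i's error on test case c, is our own assumption for illustration. Its worst case inspects all $N$ candidates on all $C$ cases, but the surviving pool typically collapses to a single candidate after very few cases:

    import random

    def lexicase_select(errors):
        # errors[i][c]: error of candidate i on test case c
        candidates = list(range(len(errors)))
        cases = list(range(len(errors[0])))
        random.shuffle(cases)                  # fresh random case order per selection
        for c in cases:
            best = min(errors[i][c] for i in candidates)
            candidates = [i for i in candidates if errors[i][c] == best]
            if len(candidates) == 1:           # usually happens long before all
                break                          # C cases have been inspected
        return random.choice(candidates)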
1 code implementation • 1 Apr 2022 • Marc Kaufmann, Maxime Larcher, Johannes Lengler, Xun Zou
Recently, Hevia Fajardo and Sudholt have shown that this setup with $c=1$ is efficient on OneMax for $s<1$, but inefficient if $s \ge 18$.
no code implementations • 28 Mar 2022 • Duri Janett, Johannes Lengler
In this paper we show how to use drift analysis in the case of two random variables $X_1, X_2$, when the drift is approximately given by $A\cdot (X_1, X_2)^T$ for a matrix $A$.
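For intuition only, here is a textbook-style special case; the assumption that $A$ has a left eigenvector with positive entries and negative eigenvalue is made purely for illustration and is not the generality treated in the paper. Suppose
\[
\mathbb{E}\bigl[X_{t+1} - X_t \mid X_t\bigr] \approx A X_t, \qquad X_t = (X_1, X_2)^T .
\]
If $u^T A = \lambda u^T$ with $u_1, u_2 > 0$ and $\lambda < 0$, then the scalar potential $Y_t = u^T X_t$ satisfies
\[
\mathbb{E}\bigl[Y_{t+1} - Y_t \mid X_t\bigr] \approx u^T A X_t = \lambda\, u^T X_t = \lambda Y_t ,
\]
so $Y_t$ has multiplicative drift towards zero and one-dimensional drift theorems apply.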
no code implementations • 26 Oct 2020 • Johannes Lengler, Simone Riedi
We study evolutionary algorithms in a dynamic setting, where for each generation a different fitness function is chosen, and selection is performed with respect to the current fitness function.
no code implementations • 21 Apr 2020 • Johannes Lengler, Jonas Meier
In this paper, we study the effect of larger population sizes for Dynamic BinVal, the extremal form of dynamic linear functions.
no code implementations • 30 Jul 2019 • Johannes Lengler, Xun Zou
In particular, it was known that the $(1+1)$-EA and the $(1+\lambda)$-EA can optimize every monotone function in pseudolinear time if the mutation rate is $c/n$ for some $c<1$, but they need exponential time for some monotone functions for $c>2.2$.
1 code implementation • 7 Feb 2019 • Benjamin Doerr, Carola Doerr, Johannes Lengler
The one-fifth success rule is one of the best-known and most widely accepted techniques to control the parameters of evolutionary algorithms.
no code implementations • 16 Aug 2018 • Hafsteinn Einarsson, Marcelo Matheus Gauy, Johannes Lengler, Florian Meier, Asier Mujika, Angelika Steger, Felix Weissenberger
For the first setup, we give a schedule that achieves a runtime of $(1\pm o(1))\beta n \ln n$, where $\beta \approx 3.552$, which is an asymptotic improvement over the runtime of the static setup.
no code implementations • 3 Aug 2018 • Johannes Lengler, Anders Martinsson, Angelika Steger
Hillclimbing is an essential part of any optimization algorithm.
no code implementations • 6 Jun 2018 • Benjamin Doerr, Timo Kötzing, J. A. Gregor Lagodzinski, Johannes Lengler
While many optimization problems work with a fixed number of decision variables and thus a fixed-length representation of possible solutions, genetic programming (GP) works on variable-length representations.
2 code implementations • 25 May 2018 • Timo Kötzing, J. A. Gregor Lagodzinski, Johannes Lengler, Anna Melnichenko
We show that the Concatenation Crossover GP can efficiently optimize these test functions, while local search cannot be efficient on all three variants, regardless of whether bloat control is employed.
no code implementations • 25 Mar 2018 • Johannes Lengler
We study the same question for a large variety of algorithms, in particular the $(1+\lambda)$-EA, the $(\mu+1)$-EA, the $(\mu+1)$-GA, their fast counterparts such as the fast $(1+1)$-EA, and the $(1+(\lambda,\lambda))$-GA. We find that all considered mutation-based algorithms show a similar dichotomy for HotTopic functions, or even for all monotone functions.
1 code implementation • 12 Mar 2018 • Tomáš Gavenčiak, Barbara Geissmann, Johannes Lengler
We study sorting of permutations by random swaps if each comparison gives the wrong result with some fixed probability $p<1/2$.
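As a concrete picture of the process, a minimal sketch assuming swaps of arbitrary random pairs and a comparison oracle that errs independently with probability $p$; the paper's exact model may differ, e.g. in restricting to adjacent swaps:

    import random

    def noisy_less(a, b, p):
        # returns the truth value of a < b, flipped with probability p
        return (a < b) != (random.random() < p)

    def noisy_swap_sort(perm, p, steps):
        perm = list(perm)
        for _ in range(steps):
            i, j = sorted(random.sample(range(len(perm)), 2))
            if noisy_less(perm[j], perm[i], p):   # noisy claim: the pair is inverted
                perm[i], perm[j] = perm[j], perm[i]
        return perm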
no code implementations • 4 Dec 2017 • Johannes Lengler
Drift analysis is one of the major tools for analysing evolutionary algorithms and nature-inspired search heuristics.
no code implementations • 10 Aug 2016 • Johannes Lengler, Angelika Steger
One of the easiest randomized greedy optimization algorithms is the following evolutionary algorithm, which aims at maximizing a pseudo-Boolean function $f:\{0,1\}^n \to \mathbb{R}$.
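A minimal sketch of such an algorithm, the (1+1)-EA; the mutation rate $1/n$ and the acceptance of equally good offspring are textbook choices, and the variant studied in the paper may differ in these details:

    import random

    def one_plus_one_ea(f, n, budget):
        x = [random.randint(0, 1) for _ in range(n)]
        fx = f(x)
        for _ in range(budget):
            y = [b ^ (random.random() < 1.0 / n) for b in x]  # standard bit mutation
            fy = f(y)
            if fy >= fx:                                      # elitist acceptance
                x, fx = y, fy
        return x, fx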
no code implementations • 8 Apr 2016 • Carola Doerr, Johannes Lengler
We regard the permutation- and bit-invariant version of LeadingOnes and prove that its (1+1) elitist black-box complexity is $\Omega(n^2)$, a bound that is matched by (1+1)-type evolutionary algorithms.
no code implementations • 27 Aug 2015 • Carola Doerr, Johannes Lengler
Black-box complexity theory provides lower bounds for the runtime of black-box optimizers like evolutionary algorithms and serves as an inspiration for the design of new genetic algorithms.
no code implementations • 10 Apr 2015 • Carola Doerr, Johannes Lengler
Black-box complexity studies lower bounds for the efficiency of general-purpose black-box optimization algorithms such as evolutionary algorithms and other search heuristics.