Search Results for author: Johannes Lengler

Found 29 papers, 8 papers with code

Empirical Analysis of the Dynamic Binary Value Problem with IOHprofiler

no code implementations • 24 Apr 2024 • Diederick Vermetten, Johannes Lengler, Dimitri Rusin, Thomas Bäck, Carola Doerr

Optimization problems in dynamic environments have recently been the source of several theoretical studies.

Benchmarking

Self-Adjusting Evolutionary Algorithms Are Slow on Multimodal Landscapes

1 code implementation • 18 Apr 2024 • Johannes Lengler, Konstantin Sturm

The one-fifth rule and its generalizations are a classical parameter control mechanism in discrete domains.

Evolutionary Algorithms
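The one-fifth success rule mentioned above is simple to state. Below is a minimal Python sketch of one such update (the factor, the bounds, and the exact failure penalty are assumptions of this illustration, not taken from the paper):

```python
def one_fifth_update(param, success, factor=1.5, lower=None, upper=None):
    """One-fifth success rule: increase the controlled parameter after a
    success and decrease it by the fourth root of the factor after a
    failure, so that the parameter is stable at a success rate of 1/5."""
    if success:
        param *= factor
    else:
        param /= factor ** 0.25
    if lower is not None:
        param = max(param, lower)
    if upper is not None:
        param = min(param, upper)
    return param
```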

Faster Optimization Through Genetic Drift

no code implementations • 18 Apr 2024 • Cella Florescu, Marc Kaufmann, Johannes Lengler, Ulysse Schaller

For the classical benchmark OneMax, the cGA has two different modes of operation: a conservative one with small step sizes $\Theta(1/(\sqrt{n}\log n))$, which is slow but prevents genetic drift, and an aggressive one with large step sizes $\Theta(1/\log n)$, in which genetic drift leads to wrong decisions, but those are corrected efficiently.
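For context, the compact genetic algorithm (cGA) samples two search points per iteration from a frequency vector and shifts the frequencies towards the better one; the step size mentioned above controls the strength of that shift. A minimal Python sketch of the textbook cGA on OneMax (the boundary handling and parameter names are assumptions of this illustration):

```python
import random

def cga_onemax(n, step_size, max_iters=10**6):
    """Textbook compact GA on OneMax: keep a frequency vector p,
    sample two bit strings per iteration, and shift p towards the winner."""
    p = [0.5] * n
    for _ in range(max_iters):
        x = [1 if random.random() < pi else 0 for pi in p]
        y = [1 if random.random() < pi else 0 for pi in p]
        if sum(y) > sum(x):          # OneMax fitness = number of ones
            x, y = y, x              # x is now the winner
        for i in range(n):
            p[i] += step_size * (x[i] - y[i])
            p[i] = min(max(p[i], 1 / n), 1 - 1 / n)  # standard frequency borders
        if sum(x) == n:
            return x
    return None
```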

How Population Diversity Influences the Efficiency of Crossover

no code implementations • 18 Apr 2024 • Sacha Cerf, Johannes Lengler

Our theoretical understanding of crossover is limited by our ability to analyze how population diversity evolves.

Diversity

Plus Strategies are Exponentially Slower for Planted Optima of Random Height

1 code implementation • 15 Apr 2024 • Johannes Lengler, Leon Schiller, Oliver Sieberling

We compare the $(1,\lambda)$-EA and the $(1 + \lambda)$-EA on the recently introduced benchmark DisOM, which is the OneMax function with randomly planted local optima.
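For readers less familiar with the terminology: the two algorithms differ only in the selection step, as the schematic Python fragment below illustrates (helper names are placeholders, not code from the paper).

```python
def next_parent(parent, offspring, fitness, plus_selection):
    """Selection step shared by both algorithms.
    plus_selection=True  -> (1+lambda)-EA: the parent survives unless an
                            offspring is at least as good.
    plus_selection=False -> (1,lambda)-EA: the best offspring always
                            replaces the parent, even if it is worse."""
    best_child = max(offspring, key=fitness)
    if plus_selection and fitness(parent) > fitness(best_child):
        return parent
    return best_child
```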

A Tight $O(4^k/p_c)$ Runtime Bound for a $(\mu+1)$ GA on Jump$_k$ for Realistic Crossover Probabilities

no code implementations • 10 Apr 2024 • Andre Opris, Johannes Lengler, Dirk Sudholt

This yields an improved and tight time bound of $O(\mu n \log(k) + 4^k/p_c)$ for a range of $k$ under the mild assumptions $p_c = O(1/k)$ and $\mu \in \Omega(kn)$.

Diversity Evolutionary Algorithms

Hardest Monotone Functions for Evolutionary Algorithms

1 code implementation • 13 Nov 2023 • Marc Kaufmann, Maxime Larcher, Johannes Lengler, Oliver Sieberling

Recently, Kaufmann, Larcher, Lengler and Zou conjectured that for the self-adjusting $(1,\lambda)$-EA, Adversarial Dynamic BinVal (ADBV) is the hardest dynamic monotone function to optimize.

Evolutionary Algorithms

Analysing Equilibrium States for Population Diversity

no code implementations • 19 Apr 2023 • Johannes Lengler, Andre Opris, Dirk Sudholt

We give an exact formula for the drift of population diversity and show that it is driven towards an equilibrium state.

Diversity Evolutionary Algorithms

Comma Selection Outperforms Plus Selection on OneMax with Randomly Planted Optima

no code implementations • 19 Apr 2023 • Joost Jorritsma, Johannes Lengler, Dirk Sudholt

For certain parameters, the $(1,\lambda)$ EA finds the target in $\Theta(n \ln n)$ evaluations, with high probability (w.h.p.).

Evolutionary Algorithms

Tight Runtime Bounds for Static Unary Unbiased Evolutionary Algorithms on Linear Functions

no code implementations • 23 Feb 2023 • Carola Doerr, Duri Andrea Janett, Johannes Lengler

In this paper we investigate how this result generalizes if standard bit mutation is replaced by an arbitrary unbiased mutation operator.

Evolutionary Algorithms
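By the classical characterization of Lehre and Witt, a unary unbiased mutation operator can be described as drawing a number $r$ of bits to flip from some distribution and then flipping $r$ uniformly chosen positions. A small Python sketch of this view (the distribution `radius_dist` is an arbitrary placeholder):

```python
import random

def unary_unbiased_mutation(x, radius_dist):
    """Unary unbiased mutation in the radius view: draw the number r of bits
    to flip from radius_dist, then flip r uniformly chosen distinct positions.
    Standard bit mutation with rate c/n corresponds to r ~ Binomial(n, c/n)."""
    n = len(x)
    r = radius_dist()
    y = list(x)
    for i in random.sample(range(n), r):
        y[i] = 1 - y[i]
    return y
```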

OneMax is not the Easiest Function for Fitness Improvements

1 code implementation • 14 Apr 2022 • Marc Kaufmann, Maxime Larcher, Johannes Lengler, Xun Zou

In this paper we disprove this conjecture and show that OneMax is not the easiest fitness landscape with respect to finding improving steps.

Population Diversity Leads to Short Running Times of Lexicase Selection

no code implementations • 13 Apr 2022 • Thomas Helmuth, Johannes Lengler, William La Cava

In this paper we investigate why the running time of lexicase parent selection is empirically much lower than its worst-case bound of $O(N \cdot C)$.

Diversity Program Synthesis
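For reference, lexicase parent selection filters the candidates case by case in a random order, which is where the worst-case factor of $N \cdot C$ comes from. A minimal Python sketch (names and data layout are assumptions of this illustration):

```python
import random

def lexicase_select(population, errors, num_cases):
    """Lexicase parent selection: shuffle the test cases, then repeatedly
    keep only the candidates with the best error on the current case.
    errors[ind][c] is assumed to be the error of individual ind on case c."""
    candidates = list(population)
    cases = list(range(num_cases))
    random.shuffle(cases)
    for c in cases:
        best = min(errors[ind][c] for ind in candidates)
        candidates = [ind for ind in candidates if errors[ind][c] == best]
        if len(candidates) == 1:
            break
    return random.choice(candidates)
```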

Self-adjusting Population Sizes for the $(1,\lambda)$-EA on Monotone Functions

1 code implementation • 1 Apr 2022 • Marc Kaufmann, Maxime Larcher, Johannes Lengler, Xun Zou

Recently, Hevia Fajardo and Sudholt have shown that this setup with $c=1$ is efficient on OneMax for $s<1$, but inefficient if $s \ge 18$.
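The self-adjusting mechanism in question is a success-based rule for the offspring population size: after an improving generation $\lambda$ is divided by a factor $F$, otherwise it is multiplied by $F^{1/s}$. A hedged Python sketch of one such update (the concrete factor and the lower bound are illustrative assumptions):

```python
def update_lambda(lam, improved, s, F=1.5, lam_min=1.0):
    """Success-based control of the offspring population size lambda:
    shrink lambda after an improving generation, grow it by F**(1/s)
    otherwise; the target success rate is roughly 1/(s+1)."""
    if improved:
        lam = max(lam / F, lam_min)
    else:
        lam = lam * F ** (1.0 / s)
    return lam
```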

Two-Dimensional Drift Analysis: Optimizing Two Functions Simultaneously Can Be Hard

no code implementations • 28 Mar 2022 • Duri Janett, Johannes Lengler

In this paper we show how to use drift analysis in the case of two random variables $X_1, X_2$, when the drift is approximately given by $A\cdot (X_1, X_2)^T$ for a matrix $A$.
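Spelled out, the drift condition is of the form
$$\mathbb{E}\left[\begin{pmatrix} X_1^{(t+1)} \\ X_2^{(t+1)} \end{pmatrix} - \begin{pmatrix} X_1^{(t)} \\ X_2^{(t)} \end{pmatrix} \,\middle|\, X_1^{(t)}, X_2^{(t)} \right] \approx A \begin{pmatrix} X_1^{(t)} \\ X_2^{(t)} \end{pmatrix},$$
so the long-term behaviour of the pair is governed by the matrix $A$ (roughly, by its spectral properties) rather than by a single scalar drift term.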


Runtime analysis of the $(\mu+1)$-EA on the Dynamic BinVal function

no code implementations • 26 Oct 2020 • Johannes Lengler, Simone Riedi

We study evolutionary algorithms in a dynamic setting, where for each generation a different fitness function is chosen, and selection is performed with respect to the current fitness function.

Evolutionary Algorithms
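Dynamic BinVal, the benchmark used here, re-assigns the exponentially decreasing BinVal weights to the bit positions via a fresh uniformly random permutation in each generation. A short Python sketch of drawing one such instance (a simplified illustration, not code from the paper):

```python
import random

def dynamic_binval_instance(n):
    """Draw one Dynamic BinVal instance: a fresh uniformly random permutation
    assigns the BinVal weights 2^(n-1), ..., 2, 1 to the n bit positions.
    In the dynamic setting a new instance is drawn in every generation and
    used for all selection decisions of that generation."""
    perm = list(range(n))
    random.shuffle(perm)
    return lambda x: sum(2 ** (n - 1 - i) * x[perm[i]] for i in range(n))
```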

Large Population Sizes and Crossover Help in Dynamic Environments

no code implementations • 21 Apr 2020 • Johannes Lengler, Jonas Meier

In this paper, we study the effect of larger population sizes for Dynamic BinVal, the extremal form of dynamic linear functions.

Exponential Slowdown for Larger Populations: The $(\mu+1)$-EA on Monotone Functions

no code implementations • 30 Jul 2019 • Johannes Lengler, Xun Zou

In particular, it was known that the $(1+1)$-EA and the $(1+\lambda)$-EA can optimize every monotone function in pseudolinear time if the mutation rate is $c/n$ for some $c<1$, but they need exponential time for some monotone functions for $c>2.2$.

Evolutionary Algorithms

Self-Adjusting Mutation Rates with Provably Optimal Success Rules

1 code implementation • 7 Feb 2019 • Benjamin Doerr, Carola Doerr, Johannes Lengler

The one-fifth success rule is one of the best-known and most widely accepted techniques to control the parameters of evolutionary algorithms.

Evolutionary Algorithms

The linear hidden subset problem for the (1+1) EA with scheduled and adaptive mutation rates

no code implementations • 16 Aug 2018 • Hafsteinn Einarsson, Marcelo Matheus Gauy, Johannes Lengler, Florian Meier, Asier Mujika, Angelika Steger, Felix Weissenberger

For the first setup, we give a schedule that achieves a runtime of $(1\pm o(1))\beta n \ln n$, where $\beta \approx 3.552$, which is an asymptotic improvement over the runtime of the static setup.

Evolutionary Algorithms

Bounding Bloat in Genetic Programming

no code implementations • 6 Jun 2018 • Benjamin Doerr, Timo Kötzing, J. A. Gregor Lagodzinski, Johannes Lengler

While many optimization problems work with a fixed number of decision variables and thus a fixed-length representation of possible solutions, genetic programming (GP) works on variable-length representations.

Destructiveness of Lexicographic Parsimony Pressure and Alleviation by a Concatenation Crossover in Genetic Programming

2 code implementations • 25 May 2018 • Timo Kötzing, J. A. Gregor Lagodzinski, Johannes Lengler, Anna Melnichenko

We show that the Concatenation Crossover GP can efficiently optimize these test functions, while local search cannot be efficient for all three variants independent of employing bloat control.

A General Dichotomy of Evolutionary Algorithms on Monotone Functions

no code implementations • 25 Mar 2018 • Johannes Lengler

We study the same question for a large variety of algorithms, particularly for $(1+\lambda)$-EA, $(\mu+1)$-EA, $(\mu+1)$-GA, their fast counterparts like fast $(1+1)$-EA, and for $(1+(\lambda,\lambda))$-GA. We find that all considered mutation-based algorithms show a similar dichotomy for HotTopic functions, or even for all monotone functions.

Evolutionary Algorithms

Sorting by Swaps with Noisy Comparisons

1 code implementation • 12 Mar 2018 • Tomáš Gavenčiak, Barbara Geissmann, Johannes Lengler

We study sorting of permutations by random swaps if each comparison gives the wrong result with some fixed probability $p<1/2$.
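One natural process of this kind, shown below only to illustrate the setting and not necessarily the exact variant analysed in the paper: repeatedly pick a random adjacent pair, query a comparison that errs with probability $p$, and swap the pair if the (possibly wrong) answer reports it as out of order.

```python
import random

def noisy_adjacent_swap_step(a, p):
    """One step: pick a random adjacent pair, query a comparison that is
    wrong with probability p, and swap the pair if the (possibly wrong)
    answer reports it as out of order."""
    i = random.randrange(len(a) - 1)
    out_of_order = a[i] > a[i + 1]
    if random.random() < p:
        out_of_order = not out_of_order
    if out_of_order:
        a[i], a[i + 1] = a[i + 1], a[i]
```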

Drift Analysis

no code implementations • 4 Dec 2017 • Johannes Lengler

Drift analysis is one of the major tools for analysing evolutionary algorithms and nature-inspired search heuristics.

Evolutionary Algorithms
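A prototypical statement from this toolbox, in its simplest (additive) form: if a non-negative process $(X_t)_{t \ge 0}$ satisfies $\mathbb{E}[X_t - X_{t+1} \mid X_t = s] \ge \delta$ for all $s > 0$ and some $\delta > 0$, then the first hitting time $T = \min\{t : X_t = 0\}$ satisfies
$$\mathbb{E}[T \mid X_0] \le \frac{X_0}{\delta}.$$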

Drift Analysis and Evolutionary Algorithms Revisited

no code implementations • 10 Aug 2016 • Johannes Lengler, Angelika Steger

One of the easiest randomized greedy optimization algorithms is the following evolutionary algorithm which aims at maximizing a boolean function $f:\{0, 1\}^n \to {\mathbb R}$.

Evolutionary Algorithms
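The algorithm referred to is, up to the choice of mutation rate, the classical $(1+1)$-EA with standard bit mutation. A minimal Python sketch (the iteration budget and the parameter $c$ are assumptions of this illustration):

```python
import random

def one_plus_one_ea(f, n, c=1.0, max_iters=10**6):
    """(1+1)-EA with standard bit mutation: flip each bit independently
    with probability c/n and accept the offspring if f does not decrease."""
    x = [random.randint(0, 1) for _ in range(n)]
    fx = f(x)
    for _ in range(max_iters):
        y = [bit ^ 1 if random.random() < c / n else bit for bit in x]
        fy = f(y)
        if fy >= fx:          # elitist acceptance of equal-or-better offspring
            x, fx = y, fy
    return x
```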

The (1+1) Elitist Black-Box Complexity of LeadingOnes

no code implementations • 8 Apr 2016 • Carola Doerr, Johannes Lengler

We regard the permutation- and bit-invariant version of LeadingOnes and prove that its (1+1) elitist black-box complexity is $\Omega(n^2)$, a bound that is matched by (1+1)-type evolutionary algorithms.

Evolutionary Algorithms

Introducing Elitist Black-Box Models: When Does Elitist Selection Weaken the Performance of Evolutionary Algorithms?

no code implementations • 27 Aug 2015 • Carola Doerr, Johannes Lengler

Black-box complexity theory provides lower bounds for the runtime of black-box optimizers like evolutionary algorithms and serves as an inspiration for the design of new genetic algorithms.

Evolutionary Algorithms

OneMax in Black-Box Models with Several Restrictions

no code implementations • 10 Apr 2015 • Carola Doerr, Johannes Lengler

Black-box complexity studies lower bounds for the efficiency of general-purpose black-box optimization algorithms such as evolutionary algorithms and other search heuristics.

Evolutionary Algorithms
