no code implementations • 16 Dec 2021 • Weijie Zheng, Yufei Liu, Benjamin Doerr

The non-dominated sorting genetic algorithm II (NSGA-II) is the most intensively used multi-objective evolutionary algorithm (MOEA) in real-world applications.

no code implementations • 14 Sep 2021 • Shouda Wang, Weijie Zheng, Benjamin Doerr

Our finding that the unary unbiased black-box complexity is only $O(n^2)$ suggests the Metropolis algorithm as an interesting candidate and we prove that it solves the DLB problem in quadratic time.

no code implementations • 7 May 2021 • Henry Bambury, Antoine Bultel, Benjamin Doerr

We prove that several previous results extend to this more general class: for all {$k \le \frac{n^{1/3}}{\ln{n}}$} and $\delta < k$, the optimal mutation rate for the $(1+1)$~EA is $\frac{\delta}{n}$, and the fast $(1+1)$~EA runs faster than the classical $(1+1)$~EA by a factor super-exponential in $\delta$.

no code implementations • 14 Apr 2021 • Denis Antipov, Maxim Buzdalov, Benjamin Doerr

Most evolutionary algorithms have multiple parameters, and their values drastically affect performance.

no code implementations • 7 Apr 2021 • Benjamin Doerr, Timo Kötzing

One of the first and easiest-to-use techniques for proving run time bounds for evolutionary algorithms is the so-called method of fitness levels by Wegener.
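
The bound behind this method can be sketched as follows (a standard textbook formulation in our own notation, not quoted from the paper):

```latex
% Fitness level method (standard formulation; notation is ours).
% Partition the search space into levels A_1, ..., A_m of strictly
% increasing fitness, with the optimum contained in A_m. If, from any
% point in A_i, the algorithm reaches a strictly higher level with
% probability at least s_i in each iteration, then the expected
% optimization time T satisfies
\[
  E[T] \;\le\; \sum_{i=1}^{m-1} \frac{1}{s_i}.
\]
% Example: for the (1+1) EA on OneMax, taking A_i = \{x : \mathrm{OneMax}(x) = i\}
% gives s_i \ge (n-i)/(en), hence E[T] \le \sum_{i=0}^{n-1} en/(n-i) = e n H_n = O(n \log n).
```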

no code implementations • 1 Feb 2021 • Quentin Renau, Johann Dreo, Carola Doerr, Benjamin Doerr

We show that the classification accuracy transfers to settings in which several instances are involved in training and testing.

no code implementations • 14 Dec 2020 • Benjamin Doerr, Weijie Zheng

To improve the performance, we combine the GSEMO with two approaches, a heavy-tailed mutation operator and a stagnation detection strategy, that showed advantages in single-objective multi-modal problems.

no code implementations • 16 Jul 2020 • Benjamin Doerr, Martin S. Krejca

In their recent work, Lehre and Nguyen (FOGA 2019) show that the univariate marginal distribution algorithm (UMDA) needs time exponential in the parent population size to optimize the DeceptiveLeadingBlocks (DLB) problem.

no code implementations • 30 Jun 2020 • Benjamin Doerr, Frank Neumann

The theory of evolutionary computation for discrete search spaces has made significant progress in the last ten years.

no code implementations • 22 Jun 2020 • Denis Antipov, Maxim Buzdalov, Benjamin Doerr

This could suggest that evolutionary algorithms better exploiting good initial solutions are still to be found.

no code implementations • 19 Jun 2020 • Quentin Renau, Carola Doerr, Johann Dreo, Benjamin Doerr

While, not unexpectedly, increasing the number of sample points gives more robust estimates for the feature values, to our surprise we find that the feature value approximations for different sampling strategies do not converge to the same value.

no code implementations • 8 Jun 2020 • Benjamin Doerr

We use an elementary argument building on group actions to prove that the selection-free steady state genetic algorithm analyzed by Sutton and Witt (GECCO 2019) takes an expected number of $\Omega(2^n / \sqrt n)$ iterations to find any particular target search point.

no code implementations • 5 Jun 2020 • Denis Antipov, Benjamin Doerr

To obtain this performance, however, a non-standard parameter setting depending on the jump size $k$ was used.

no code implementations • 2 May 2020 • Benjamin Doerr

We discuss in more detail Lehre's (PPSN 2010) \emph{negative drift in populations} method, one of the most general tools to prove lower bounds on the runtime of non-elitist mutation-based evolutionary algorithms for discrete search spaces.

no code implementations • 20 Apr 2020 • Maxim Buzdalov, Benjamin Doerr, Carola Doerr, Dmitry Vinokurov

In this work, we conduct an in-depth study on the advantages and the limitations of fixed-target analyses.

no code implementations • 15 Apr 2020 • Benjamin Doerr, Weijie Zheng

One of the key difficulties in using estimation-of-distribution algorithms is choosing the population size(s) appropriately: Too small values lead to genetic drift, which can cause enormous difficulties.

no code implementations • 14 Apr 2020 • Denis Antipov, Benjamin Doerr, Vitalii Karavaev

In this work, we conduct the first runtime analysis of this algorithm on a multimodal problem class, the jump functions benchmark.

no code implementations • 14 Apr 2020 • Denis Antipov, Maxim Buzdalov, Benjamin Doerr

In this first runtime analysis of a crossover-based algorithm using a heavy-tailed choice of the mutation rate, we show an even stronger impact.

no code implementations • 13 Apr 2020 • Benjamin Doerr

We argue that proven exponential upper bounds on runtimes, an established area in classic algorithms, are interesting also in heuristic search and we prove several such results.

no code implementations • 10 Apr 2020 • Benjamin Doerr, Martin Krejca

With elementary means, we prove a stronger run time guarantee for the univariate marginal distribution algorithm (UMDA) optimizing the LeadingOnes benchmark function in the desirable regime with low genetic drift.

no code implementations • 2 Apr 2020 • Benjamin Doerr

This is the first runtime result for a non-elitist algorithm on a multi-modal problem that is tight apart from lower order terms.

no code implementations • 26 Nov 2019 • Benjamin Doerr, Carola Doerr, Aneta Neumann, Frank Neumann, Andrew M. Sutton

In this paper, we investigate submodular optimization problems with chance constraints.

no code implementations • 31 Oct 2019 • Benjamin Doerr, Weijie Zheng

This paper further proves that for PBIL with parameters $\mu$, $\lambda$, and $\rho$, in an expected number of $\Theta(\mu/\rho^2)$ iterations the sampling frequency of a neutral bit leaves the interval $[\Theta(\rho/\mu), 1-\Theta(\rho/\mu)]$ and then always the same value is sampled for this bit, that is, the frequency approaches the corresponding boundary value with maximum speed.

no code implementations • 18 Aug 2019 • Benjamin Doerr

We prove that any choice of the hypothetical population size leads to a runtime that, with high probability, is at least exponential in the jump size $k$.

no code implementations • 17 Apr 2019 • Benjamin Doerr

In this work, we show that any choice of the hypothetical population size leads to a runtime that, with high probability, is at least exponential in the jump size.

no code implementations • 15 Apr 2019 • Denis Antipov, Benjamin Doerr, Quentin Yang

Understanding when evolutionary algorithms are efficient or not, and how they efficiently solve problems, is one of the central research tasks in evolutionary computation.

no code implementations • 11 Apr 2019 • Benjamin Doerr, Timo Kötzing

Drift analysis aims at translating the expected progress of an evolutionary algorithm (or more generally, a random process) into a probabilistic guarantee on its run time (hitting time).
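
The simplest instance of such a translation is the additive drift theorem; a standard statement, in our own notation (not quoted from the paper):

```latex
% Additive drift theorem (standard statement; notation is ours).
% Let (X_t) be a non-negative random process and T = \min\{t : X_t = 0\}
% its hitting time of zero. If for all t < T the expected one-step
% progress satisfies E[X_t - X_{t+1} \mid X_t] \ge \delta for some
% \delta > 0, then
\[
  E[T] \;\le\; \frac{E[X_0]}{\delta}.
\]
```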

no code implementations • 28 Mar 2019 • Benjamin Doerr, Andrei Lissovoi, Pietro S. Oliveto

Recently it has been proved that simple GP systems can efficiently evolve the conjunction of $n$ variables if they are equipped with the minimal required components.

no code implementations • 26 Mar 2019 • Benjamin Doerr

We prove that the compact genetic algorithm (cGA) with hypothetical population size $\mu = \Omega(\sqrt n \log n) \cap \text{poly}(n)$ with high probability finds the optimum of any $n$-dimensional jump function with jump size $k < \frac 1 {20} \ln n$ in $O(\mu \sqrt n)$ iterations.
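
For reference, a minimal sketch of the compact GA on a jump function (our own illustrative code; the parameter names `K` for the hypothetical population size and `budget` are ours, and the concrete values below are assumptions, not the paper's settings):

```python
import random

def jump(x, k):
    """n-dimensional jump function with jump size k."""
    n, s = len(x), sum(x)
    return k + s if (s <= n - k or s == n) else n - s

def cga(n, k, K, budget, seed=0):
    """Compact GA: evolve a frequency vector instead of a population."""
    rng = random.Random(seed)
    p = [0.5] * n                            # marginal frequencies
    best, best_f = None, float("-inf")
    for _ in range(budget):
        # sample two offspring from the product distribution
        x = [1 if rng.random() < pi else 0 for pi in p]
        y = [1 if rng.random() < pi else 0 for pi in p]
        if jump(x, k) < jump(y, k):
            x, y = y, x                      # x is the winner
        for i in range(n):                   # shift frequencies toward winner
            if x[i] != y[i]:
                p[i] += (1 / K) if x[i] == 1 else (-1 / K)
                p[i] = min(1 - 1 / n, max(1 / n, p[i]))  # standard borders
        if jump(x, k) > best_f:
            best, best_f = x, jump(x, k)
    return best, best_f, p
```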

1 code implementation • 7 Feb 2019 • Benjamin Doerr, Carola Doerr, Johannes Lengler

The one-fifth success rule is one of the best-known and most widely accepted techniques to control the parameters of evolutionary algorithms.
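
The rule itself is simple to state; a minimal sketch of the parameter update (the factor `F = 1.5` is an assumed illustrative choice, not one prescribed by the paper):

```python
def one_fifth_update(param, success, F=1.5):
    """One-fifth success rule: raise the parameter on success, lower it
    on failure, calibrated so the parameter stays roughly constant when
    about one in five iterations succeeds (F * (F**-0.25)**4 == 1)."""
    return param * F if success else param * F ** (-0.25)
```

One success followed by four failures returns the parameter to its starting value, which is exactly the equilibrium the rule targets.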

no code implementations • 1 Feb 2019 • Benjamin Doerr, Carola Doerr, Frank Neumann

We propose a simple diversity mechanism that prevents this behavior, thereby reducing the re-optimization time for LeadingOnes to $O(\gamma\delta n)$, where $\gamma$ is the population size used by the diversity mechanism and $\delta \le \gamma$ the Hamming distance of the new optimum from the previous solution.

no code implementations • 28 Dec 2018 • Denis Antipov, Benjamin Doerr

In this work, we analyze this long-standing problem and show the asymptotically tight result that the runtime $T$, the number of iterations until the optimum is found, satisfies \[E[T] = \Theta\bigg(\frac{n\log n}{\lambda}+\frac{n}{\lambda / \mu} + \frac{n\log^+\log^+ \lambda/ \mu}{\log^+ \lambda / \mu}\bigg),\] where $\log^+ x := \max\{1, \log x\}$ for all $x > 0$.

no code implementations • 20 Dec 2018 • Peyman Afshani, Manindra Agrawal, Benjamin Doerr, Carola Doerr, Kasper Green Larsen, Kurt Mehlhorn

We study the query complexity of a permutation-based variant of the guessing game Mastermind.

no code implementations • 9 Dec 2018 • Benjamin Doerr, Weijie Zheng

On the technical side, we observe that the strong stochastic dependencies in the random experiment describing a run of BDE prevent us from proving all desired results with the mathematical rigor that was successfully used in the analysis of other evolutionary algorithms.

no code implementations • 30 Nov 2018 • Benjamin Doerr, Carsten Witt, Jing Yang

We propose and analyze a self-adaptive version of the $(1,\lambda)$ evolutionary algorithm in which the current mutation rate is part of the individual and thus also subject to mutation.

no code implementations • 10 Jul 2018 • Benjamin Doerr, Martin Krejca

Estimation-of-distribution algorithms (EDAs) are randomized search heuristics that create a probabilistic model of the solution space, which is updated iteratively, based on the quality of the solutions sampled according to the model.
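
This iterative model-update loop can be made concrete with a minimal UMDA sketch (our own illustrative code; the parameter values `lam`, `mu`, and `iters` are assumptions, not taken from the paper):

```python
import random

def umda(fitness, n, lam=50, mu=10, iters=100, seed=0):
    """Univariate marginal distribution algorithm, minimal sketch:
    sample lam bit strings from a product distribution, keep the mu
    best, and set each marginal frequency to the mean of the selected
    bits, clamped to the usual borders [1/n, 1 - 1/n]."""
    rng = random.Random(seed)
    p = [0.5] * n
    best, best_f = None, float("-inf")
    for _ in range(iters):
        pop = [[1 if rng.random() < pi else 0 for pi in p] for _ in range(lam)]
        pop.sort(key=fitness, reverse=True)
        sel = pop[:mu]
        p = [sum(x[i] for x in sel) / mu for i in range(n)]
        p = [min(1 - 1 / n, max(1 / n, pi)) for pi in p]
        if fitness(pop[0]) > best_f:
            best, best_f = pop[0], fitness(pop[0])
    return best, best_f
```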

no code implementations • 9 Jul 2018 • Benjamin Doerr, Carola Doerr, Jing Yang

It has been observed that some working principles of evolutionary algorithms, in particular, the influence of the parameters, cannot be understood from results on the asymptotic order of the runtime, but only from more precise results.

no code implementations • 6 Jun 2018 • Benjamin Doerr, Timo Kötzing, J. A. Gregor Lagodzinski, Johannes Lengler

While many optimization problems work with a fixed number of decision variables and thus a fixed-length representation of possible solutions, genetic programming (GP) works on variable-length representations.

no code implementations • 4 Jun 2018 • Denis Antipov, Benjamin Doerr

To gain a better theoretical understanding of how evolutionary algorithms (EAs) cope with plateaus of constant fitness, we propose the $n$-dimensional Plateau$_k$ function as natural benchmark and analyze how different variants of the $(1 + 1)$ EA optimize it.

no code implementations • 16 Apr 2018 • Benjamin Doerr, Carola Doerr

Parameter control aims at realizing performance gains through a dynamic choice of the parameters which determine the behavior of the underlying optimization algorithm.

no code implementations • 20 Jan 2018 • Benjamin Doerr

This chapter collects several probabilistic tools that proved to be useful in the analysis of randomized search heuristics.

no code implementations • 13 Jan 2018 • Benjamin Doerr

Apart from few exceptions, the mathematical runtime analysis of evolutionary algorithms is mostly concerned with expected runtimes.

no code implementations • 1 Dec 2017 • Benjamin Doerr

We give an elementary proof of the fact that a binomial random variable $X$ with parameters $n$ and $0.29/n \le p < 1$ strictly exceeds its expectation with probability at least $1/4$.
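
The claim is easy to check numerically for concrete parameters; an exact computation via the binomial pmf (our own sanity check, not part of the paper's proof):

```python
from math import comb

def prob_exceeds_mean(n, p):
    """Exact P(X > E[X]) for X ~ Bin(n, p), summing the binomial pmf
    over all outcomes strictly above the expectation n*p."""
    mean = n * p
    return sum(comb(n, k) * p ** k * (1 - p) ** (n - k)
               for k in range(n + 1) if k > mean)
```

For instance, `prob_exceeds_mean(10, 0.3)` is about 0.35, comfortably above the stated bound of 1/4.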

no code implementations • 14 Apr 2017 • Maxim Buzdalov, Benjamin Doerr

We show that this problem can be overcome by equipping the self-adjusting GA with an upper limit for the population size.

no code implementations • 7 Apr 2017 • Benjamin Doerr, Christian Gießen, Carsten Witt, Jing Yang

We propose a new way to self-adjust the mutation rate in population-based evolutionary algorithms in discrete search spaces.

2 code implementations • 9 Mar 2017 • Benjamin Doerr, Huu Phuoc Le, Régis Makhmara, Ta Duy Nguyen

We prove that the $(1+1)$ EA with this heavy-tailed mutation rate optimizes any $\mathrm{Jump}_{m,n}$ function in a time that is only a small polynomial (in $m$) factor above the one stemming from the optimal rate for this $m$.
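
The heavy-tailed operator draws a mutation strength $\alpha$ from a power-law distribution and flips each bit with rate $\alpha/n$; a sketch (the exponent `beta = 1.5` is a commonly suggested choice in the fast-GA literature, and this code is illustrative only, not the paper's implementation):

```python
import random

def power_law_weights(m, beta=1.5):
    """Unnormalized power-law weights: P(alpha = i) proportional to
    i**(-beta) for i = 1, ..., m."""
    return [i ** (-beta) for i in range(1, m + 1)]

def heavy_tailed_mutate(x, rng, beta=1.5):
    """Flip each bit with rate alpha/n, where alpha is drawn from a
    power law on {1, ..., n//2}. Small alpha is most likely, but large
    jumps retain non-negligible probability."""
    n = len(x)
    m = max(1, n // 2)
    alpha = rng.choices(range(1, m + 1), weights=power_law_weights(m, beta))[0]
    return [1 - b if rng.random() < alpha / n else b for b in x]
```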

no code implementations • 12 Apr 2016 • Benjamin Doerr, Carola Doerr, Timo Kötzing

The most common representation in evolutionary computation is the bit string.

no code implementations • 4 Apr 2016 • Benjamin Doerr

The $(1+(\lambda,\lambda))$ genetic algorithm is one of the few algorithms for which a super-constant speed-up through the use of crossover could be proven.

no code implementations • 19 Jun 2015 • Benjamin Doerr, Carola Doerr

We first improve the upper bound on the runtime to $O(\max\{n\log(n)/\lambda, n\lambda \log\log(\lambda)/\log(\lambda)\})$.

no code implementations • 19 Jun 2015 • Benjamin Doerr, Carola Doerr, Timo Kötzing

For their setting, in which the solution length is sampled from a geometric distribution, we provide mutation rates that yield an expected optimization time that is of the same order as that of the (1+1) EA knowing the solution length.

no code implementations • 15 Jun 2015 • Laurent Hoeltgen, Markus Mainberger, Sebastian Hoffmann, Joachim Weickert, Ching Hoo Tang, Simon Setzer, Daniel Johannsen, Frank Neumann, Benjamin Doerr

Moreover, it is more generic than other data optimisation approaches for the sparse inpainting problem, since it can also be extended to nonlinear inpainting operators such as EED.

no code implementations • 13 Apr 2015 • Benjamin Doerr, Carola Doerr

While evolutionary algorithms are known to be very successful for a broad range of applications, the algorithm designer is often left with many algorithmic choices, for example, the size of the population, the mutation rates, and the crossover rates of the algorithm.

no code implementations • 30 Mar 2014 • Benjamin Doerr, Carola Doerr, Timo Kötzing

We analyze the unbiased black-box complexity of jump functions with small, medium, and large sizes of the fitness plateau surrounding the optimal solution.

no code implementations • 29 Aug 2013 • Benjamin Doerr, Carola Doerr

Motivated by a problem in the theory of randomized search heuristics, we give a very precise analysis for the coupon collector problem where the collector starts with a random set of coupons (chosen uniformly from all sets).
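
For the classical process the expectation is elementary; a quick exact computation of the expected remaining collection time when $k$ of $n$ coupons are already owned (illustrative only, not the paper's refined analysis for a random starting set):

```python
from fractions import Fraction

def expected_collection_time(n, k=0):
    """Expected number of draws to collect all n coupons when k distinct
    coupons are already owned. With m coupons missing, a draw finds a
    new one with probability m/n, so the wait is n/m; summing over
    m = n-k, ..., 1 gives n * H_{n-k} exactly."""
    return sum(Fraction(n, m) for m in range(1, n - k + 1))
```

For example, `expected_collection_time(2)` gives the familiar 3 draws for two coupons.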
