Search Results for author: Christoph Hertrich

Found 7 papers, 2 papers with code

Training Neural Networks is NP-Hard in Fixed Dimension

no code implementations NeurIPS 2023 Vincent Froese, Christoph Hertrich

We also answer a question by Froese et al. [JAIR '22] proving W[1]-hardness for four ReLUs (or two linear threshold neurons) with zero training error.

Lower Bounds on the Depth of Integral ReLU Neural Networks via Lattice Polytopes

no code implementations 24 Feb 2023 Christian Haase, Christoph Hertrich, Georg Loho

We prove that the set of functions representable by ReLU neural networks with integer weights strictly increases with the network depth while allowing arbitrary width.
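As a quick illustration of the function class in question (an illustrative sketch, not a construction from the paper): a ReLU network with integer weights and one hidden layer can compute the maximum of two numbers via max(x, y) = y + relu(x − y), and composing this gadget computes the maximum of more numbers at greater depth.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def max2(x, y):
    # max(x, y) = y + relu(x - y): one hidden ReLU unit, integer weights only.
    return y + relu(x - y)

def max4(a, b, c, d):
    # Composing the gadget computes the maximum of four numbers at greater depth.
    return max2(max2(a, b), max2(c, d))
```

Whether such maxima can be computed in *less* depth with integer weights is exactly the kind of question the lattice-polytope lower bounds address.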

Training Fully Connected Neural Networks is $\exists\mathbb{R}$-Complete

no code implementations NeurIPS 2023 Daniel Bertschinger, Christoph Hertrich, Paul Jungeblut, Tillmann Miltzow, Simon Weber

We consider the problem of finding weights and biases for a two-layer fully connected neural network to fit a given set of data points as well as possible, also known as empirical risk minimization.
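A minimal sketch of the objective being analyzed, assuming squared loss (the concrete weights and data below are hypothetical, for illustration only):

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def two_layer_net(x, W1, b1, w2, b2):
    # Fully connected two-layer network: input -> hidden ReLU layer -> scalar output.
    return w2 @ relu(W1 @ x + b1) + b2

def empirical_risk(data, W1, b1, w2, b2):
    # Sum of squared errors over the data points -- the quantity to be minimized
    # over (W1, b1, w2, b2) in empirical risk minimization.
    return sum((two_layer_net(x, W1, b1, w2, b2) - y) ** 2 for x, y in data)

# Hypothetical example: two hidden units computing relu(x) + relu(-x) = |x|
# fit the data {(2, 2), (-3, 3)} with zero risk.
W1 = np.array([[1.0], [-1.0]])
b1 = np.zeros(2)
w2 = np.array([1.0, 1.0])
b2 = 0.0
data = [(np.array([2.0]), 2.0), (np.array([-3.0]), 3.0)]
```

The hardness result concerns deciding whether weights achieving a given risk exist at all, not any particular training heuristic.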

Towards Lower Bounds on the Depth of ReLU Neural Networks

1 code implementation NeurIPS 2021 Christoph Hertrich, Amitabh Basu, Marco Di Summa, Martin Skutella

We contribute to a better understanding of the class of functions that can be represented by a neural network with ReLU activations and a given architecture.

The Computational Complexity of ReLU Network Training Parameterized by Data Dimensionality

no code implementations 18 May 2021 Vincent Froese, Christoph Hertrich, Rolf Niedermeier

In particular, we extend a known polynomial-time algorithm for constant $d$ and convex loss functions to a more general class of loss functions, matching our running time lower bounds also in these cases.

ReLU Neural Networks of Polynomial Size for Exact Maximum Flow Computation

no code implementations 12 Feb 2021 Christoph Hertrich, Leon Sering

This paper studies the expressive power of artificial neural networks with rectified linear units.
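To hint at why ReLU units are a natural fit for flow computations (an illustrative sketch under my own assumptions, not the paper's construction): the bottleneck capacity of a path is a nested minimum, and min(x, y) = x − relu(x − y) is exactly representable by a single ReLU unit plus a linear combination.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def min2(x, y):
    # min(x, y) = x - relu(x - y): exact, piecewise linear, one ReLU unit.
    return x - relu(x - y)

def path_bottleneck(capacities):
    # Folding min2 along a path yields its bottleneck capacity, a basic
    # primitive in augmenting-path max-flow algorithms.
    result = capacities[0]
    for c in capacities[1:]:
        result = min2(result, c)
    return result
```

Turning such primitives into a polynomial-size network that outputs an exact maximum flow is the nontrivial contribution of the paper itself.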

Combinatorial Optimization

Provably Good Solutions to the Knapsack Problem via Neural Networks of Bounded Size

1 code implementation 28 May 2020 Christoph Hertrich, Martin Skutella

The development of a satisfying and rigorous mathematical understanding of the performance of neural networks is a major challenge in artificial intelligence.

Combinatorial Optimization
