Search Results for author: Christoph Hertrich

Found 11 papers, 2 papers with code

Depth-Bounds for Neural Networks via the Braid Arrangement

no code implementations 13 Feb 2025 Moritz Grillo, Christoph Hertrich, Georg Loho

We contribute towards resolving the open question of how many hidden layers are required in ReLU networks for exactly representing all continuous and piecewise linear functions on $\mathbb{R}^d$.
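
For context, a standard construction (textbook background, not a claim of this paper) shows where the candidate depth bound comes from: every continuous piecewise linear function on $\mathbb{R}^d$ can be written as a maximum of minima of affine functions, each minimum ranging over at most $d+1$ pieces,

$$f(x) = \max_{j} \min_{i \in S_j} \left( a_i^\top x + b_i \right), \qquad |S_j| \le d+1,$$

and evaluating a $(d+1)$-fold maximum by pairwise ReLU gadgets yields $\lceil \log_2(d+1) \rceil$ hidden layers. The open question is whether this logarithmic depth is actually necessary.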

Neural Networks and (Virtual) Extended Formulations

no code implementations 5 Nov 2024 Christoph Hertrich, Georg Loho

In an attempt to prove similar bounds for general neural networks as well, we introduce the notion of virtual extension complexity $\mathrm{vxc}(P)$, which generalizes $\mathrm{xc}(P)$ and describes the number of inequalities needed to represent the linear optimization problem over $P$ as a difference of two linear programs.

Combinatorial Optimization
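
One natural reading of the definition sketched above (our hedged paraphrase; see the paper for the precise statement): polyhedra $Q$ and $R$ virtually extend $P$ if linear optimization over $P$ splits as a difference of the two associated linear programs,

$$\max\{c^\top x : x \in P\} = \max\{c^\top y : y \in Q\} - \max\{c^\top z : z \in R\} \quad \text{for all } c,$$

and $\mathrm{vxc}(P)$ is the minimum of $\mathrm{xc}(Q) + \mathrm{xc}(R)$ over all such pairs. Choosing a trivial $R$ recovers ordinary extended formulations, which is the sense in which $\mathrm{vxc}(P)$ generalizes $\mathrm{xc}(P)$.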

Decomposition Polyhedra of Piecewise Linear Functions

no code implementations 7 Oct 2024 Marie-Charlotte Brandenburg, Moritz Grillo, Christoph Hertrich

Finally, we improve upon previous constructions of neural networks for a given convex CPWL function and apply our framework to obtain results in the nonconvex case.
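
For intuition, a one-dimensional instance of such a decomposition (a standard example, not drawn from the paper): the nonconvex hat function

$$f(x) = \min(x,\ 2 - x) = x - 2\max(0,\ x - 1)$$

is the difference $f = g - h$ of the convex CPWL functions $g(x) = x$ and $h(x) = 2\max(0, x - 1)$; the decomposition polyhedra of the title collect all such pairs $(g, h)$ for a given $f$.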

Mode Connectivity in Auction Design

no code implementations NeurIPS 2023 Christoph Hertrich, Yixin Tao, László A. Végh

Mode connectivity has recently been investigated as an intriguing empirical and theoretically justifiable property of neural networks used for prediction problems.
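
To make the notion concrete, a minimal sketch of mode connectivity in general (our illustration; the paper works in the auction-design setting, and loss_fn, theta_0, theta_1 are hypothetical placeholders): two optima are connected if the loss stays low along some path between them, the simplest candidate being linear interpolation.

    import numpy as np

    def loss_along_path(loss_fn, theta_0, theta_1, num_points=50):
        # Evaluate loss_fn on the segment (1 - t) * theta_0 + t * theta_1
        # for t in [0, 1]; a flat, low profile indicates the two optima
        # are (linearly) mode-connected.
        ts = np.linspace(0.0, 1.0, num_points)
        return [loss_fn((1.0 - t) * theta_0 + t * theta_1) for t in ts]

    # Toy usage with a quadratic loss; in practice loss_fn would be a
    # network's training loss as a function of its flattened weights.
    theta_0, theta_1 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
    profile = loss_along_path(lambda th: float(np.sum(th ** 2)), theta_0, theta_1)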

Training Neural Networks is NP-Hard in Fixed Dimension

no code implementations NeurIPS 2023 Vincent Froese, Christoph Hertrich

We also answer a question by Froese et al. [JAIR '22] by proving W[1]-hardness for four ReLUs (or two linear threshold neurons) with zero training error.
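
For concreteness, the underlying decision problem can be sketched as follows (our paraphrase of the standard two-layer setup, not quoted from the paper): given data points $(x_1, y_1), \dots, (x_n, y_n) \in \mathbb{R}^d \times \mathbb{R}$, decide whether $k$ ReLU neurons can fit them exactly,

$$\exists\, a_j \in \mathbb{R}^d,\ b_j, c_j \in \mathbb{R}:\quad \sum_{j=1}^{k} c_j \max\big(0,\ a_j^\top x_i + b_j\big) = y_i \quad \text{for all } i;$$

the hardness result above concerns $k = 4$ with zero training error.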

Lower Bounds on the Depth of Integral ReLU Neural Networks via Lattice Polytopes

no code implementations 24 Feb 2023 Christian Haase, Christoph Hertrich, Georg Loho

We prove that the set of functions representable by ReLU neural networks with integer weights strictly increases with the network depth while allowing arbitrary width.
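
To our reading (hedged; see the paper for the precise statement), the headline consequence is a depth lower bound for the maximum function that matches the pairwise-gadget upper bound under integrality: computing $\max(x_1, \dots, x_n)$ with integer weights requires $\lceil \log_2 n \rceil$ hidden layers, while for arbitrary real weights the analogous lower bound remains open.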

Training Fully Connected Neural Networks is $\exists\mathbb{R}$-Complete

no code implementations NeurIPS 2023 Daniel Bertschinger, Christoph Hertrich, Paul Jungeblut, Tillmann Miltzow, Simon Weber

We consider the problem of finding weights and biases for a two-layer fully connected neural network to fit a given set of data points as well as possible, also known as empirical risk minimization.
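
Spelled out (our hedged paraphrase of the standard formulation), the optimization problem is

$$\min_{a_j,\, b_j,\, c_j}\ \sum_{i=1}^{n} \Big( \sum_{j=1}^{k} c_j \max\big(0,\ a_j^\top x_i + b_j\big) - y_i \Big)^{2},$$

and $\exists\mathbb{R}$-completeness of the associated decision version means the problem is polynomial-time equivalent to deciding the truth of existential formulas over the reals, hence presumably not even contained in NP.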

Towards Lower Bounds on the Depth of ReLU Neural Networks

1 code implementation NeurIPS 2021 Christoph Hertrich, Amitabh Basu, Marco Di Summa, Martin Skutella

We contribute to a better understanding of the class of functions that can be represented by a neural network with ReLU activations and a given architecture.
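
As a hands-on companion (a standard gadget, not code from the paper's repository): one hidden layer of ReLUs computes the maximum of two numbers exactly, and a tree of such gadgets realizes the maximum of $n$ numbers with $\lceil \log_2 n \rceil$ hidden layers; which functions truly require that depth is the paper's subject.

    import numpy as np

    def relu(z):
        return np.maximum(0.0, z)

    def max2(x, y):
        # max(x, y) = (x + y)/2 + relu(x - y)/2 + relu(y - x)/2:
        # one hidden layer (the linear term can also be routed through
        # ReLUs via z = relu(z) - relu(-z) without adding depth).
        return 0.5 * (x + y) + 0.5 * relu(x - y) + 0.5 * relu(y - x)

    def max4(x1, x2, x3, x4):
        # Composing gadgets in a tree: two hidden layers for four inputs.
        return max2(max2(x1, x2), max2(x3, x4))

    assert max4(3.0, -1.0, 2.0, 5.0) == 5.0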

The Computational Complexity of ReLU Network Training Parameterized by Data Dimensionality

no code implementations 18 May 2021 Vincent Froese, Christoph Hertrich, Rolf Niedermeier

In particular, we extend a known polynomial-time algorithm for constant $d$ and convex loss functions to a more general class of loss functions, matching our running-time lower bounds in these cases as well.

ReLU Neural Networks of Polynomial Size for Exact Maximum Flow Computation

no code implementations 12 Feb 2021 Christoph Hertrich, Leon Sering

This paper studies the expressive power of artificial neural networks with rectified linear units.

Combinatorial Optimization
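
A hedged building-block illustration (ours, not the paper's actual construction): ReLU networks express exact minima, and on a single $s$-$t$ path the maximum flow value is simply the minimum edge capacity.

    import numpy as np

    def relu(z):
        return np.maximum(0.0, z)

    def min2(x, y):
        # min(x, y) = x - relu(x - y): a one-hidden-layer ReLU gadget.
        return x - relu(x - y)

    def path_capacity(capacities):
        # Fold min2 along the path; network depth grows with path length.
        value = capacities[0]
        for c in capacities[1:]:
            value = min2(value, c)
        return value

    assert path_capacity([4.0, 2.0, 7.0]) == 2.0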

Provably Good Solutions to the Knapsack Problem via Neural Networks of Bounded Size

1 code implementation 28 May 2020 Christoph Hertrich, Martin Skutella

The development of a satisfying and rigorous mathematical understanding of the performance of neural networks is a major challenge in artificial intelligence.

Combinatorial Optimization
