Search Results for author: Sjoerd Dirksen

Found 5 papers, 1 paper with code

Memorization with neural nets: going beyond the worst case

1 code implementation · 30 Sep 2023 · Sjoerd Dirksen, Patrick Finke, Martin Genzel

In practice, deep neural networks are often able to easily interpolate their training data.

Memorization

The Separation Capacity of Random Neural Networks

no code implementations · NeurIPS 2023 · Sjoerd Dirksen, Martin Genzel, Laurent Jacques, Alexander Stollenwerk

Neural networks with random weights appear in a variety of machine learning applications, most prominently as the initialization of many deep learning algorithms and as a computationally cheap alternative to fully learned neural networks.

Memorization
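The abstract above concerns networks whose weights are drawn at random and then frozen, e.g. as a computationally cheap feature map. A minimal sketch of what such a random layer looks like (this is an illustrative random-feature construction, not the paper's specific model; the dimensions and scaling are assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
d, width = 10, 256        # input dimension and number of random neurons (assumed)

# Weights and biases are sampled once and never trained,
# as in random-initialization / random-feature settings.
W = rng.standard_normal((width, d)) / np.sqrt(d)
b = rng.standard_normal(width)

def random_features(x):
    """Map x through the fixed random ReLU layer."""
    return np.maximum(W @ x + b, 0.0)

x = rng.standard_normal(d)
z = random_features(x)
print(z.shape)  # (256,)
```

Downstream, one would typically train only a linear map on top of `z`, which is what makes the random layer a cheap alternative to a fully learned network.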

Statistical post-processing of wind speed forecasts using convolutional neural networks

no code implementations · 8 Jul 2020 · Simon Veldkamp, Kirien Whan, Sjoerd Dirksen, Maurice Schmeits

Current statistical post-processing methods for probabilistic weather forecasting are not capable of using full spatial patterns from the numerical weather prediction (NWP) model.

Density Estimation · Weather Forecasting

Dimensionality reduction with subgaussian matrices: a unified theory

no code implementations · 17 Feb 2014 · Sjoerd Dirksen

We present a theory for Euclidean dimensionality reduction with subgaussian matrices which unifies several restricted isometry property and Johnson-Lindenstrauss type results obtained earlier for specific data sets.
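The norm-preservation property this abstract refers to can be illustrated with a Gaussian matrix, one canonical subgaussian example; a minimal sketch on a small random point set (the dimensions and point count below are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 1000, 200          # ambient and target dimensions (assumed)
k = 20                    # number of data points (assumed)

# Gaussian entries are subgaussian; scaling by 1/sqrt(m)
# gives E||Ax||^2 = ||x||^2 for every fixed x.
A = rng.standard_normal((m, n)) / np.sqrt(m)

X = rng.standard_normal((n, k))   # k points in R^n
orig = np.linalg.norm(X, axis=0)
emb = np.linalg.norm(A @ X, axis=0)
distortion = np.max(np.abs(emb / orig - 1.0))
print(f"max relative distortion: {distortion:.3f}")
```

For a fixed finite point set this is the Johnson-Lindenstrauss phenomenon; restricted isometry results make the analogous statement uniformly over structured (e.g. sparse) vectors.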

Dimensionality Reduction

Toward a unified theory of sparse dimensionality reduction in Euclidean space

no code implementations11 Nov 2013 Jean Bourgain, Sjoerd Dirksen, Jelani Nelson

Let $\Phi\in\mathbb{R}^{m\times n}$ be a sparse Johnson-Lindenstrauss transform [KN14] with $s$ non-zeroes per column.

Dimensionality Reduction · LEMMA
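The abstract defines a sparse Johnson-Lindenstrauss transform with $s$ non-zeroes per column. A simplified sketch of such a matrix (placing $s$ random $\pm 1/\sqrt{s}$ entries in each column; the [KN14] construction is more structured, and the dimensions here are assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, s = 500, 100, 4     # ambient dim, target dim, nonzeros per column (assumed)

# Each column gets exactly s nonzero entries in random rows,
# with values +-1/sqrt(s), so that E||Phi x||^2 = ||x||^2.
Phi = np.zeros((m, n))
for j in range(n):
    rows = rng.choice(m, size=s, replace=False)
    Phi[rows, j] = rng.choice([-1.0, 1.0], size=s) / np.sqrt(s)

x = rng.standard_normal(n)
ratio = np.linalg.norm(Phi @ x) / np.linalg.norm(x)
print(f"norm ratio: {ratio:.3f}")  # concentrates near 1
```

Sparsity matters because applying $\Phi$ to a vector costs $O(sn)$ rather than $O(mn)$ operations, which is the point of the sparse JL construction.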
