Search Results for author: Julius Kunze

Found 8 papers, 4 papers with code

Adaptive Optimization with Examplewise Gradients

1 code implementation • 30 Nov 2021 • Julius Kunze, James Townsend, David Barber

We propose a new, more general approach to the design of stochastic gradient-based optimization methods for machine learning.

BIG-bench Machine Learning

HiLLoC: Lossless Image Compression with Hierarchical Latent Variable Models

1 code implementation • ICLR 2020 • James Townsend, Thomas Bird, Julius Kunze, David Barber

We make the following striking observation: fully convolutional VAE models trained on 32x32 ImageNet can generalize well, not just to 64x64 but also to far larger photographs, with no changes to the model.

Image Compression
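
The observation above relies on the model being fully convolutional: no layer's weights depend on the input's spatial size, so the same trained weights apply to images of any resolution. The sketch below illustrates that architectural property in PyTorch; the class name and layer sizes are illustrative assumptions, not the hierarchical VAE used in HiLLoC.

```python
import torch
import torch.nn as nn

# Minimal fully convolutional encoder/decoder: every layer is a convolution,
# so no weight shape is tied to the input's height or width.
class FullyConvAutoencoder(nn.Module):
    def __init__(self, channels=3, hidden=32, latent=8):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(channels, hidden, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
            nn.Conv2d(hidden, latent, kernel_size=3, stride=2, padding=1),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(latent, hidden, kernel_size=4, stride=2, padding=1),
            nn.ReLU(),
            nn.ConvTranspose2d(hidden, channels, kernel_size=4, stride=2, padding=1),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = FullyConvAutoencoder()
# The same weights process a 32x32 image and a much larger one.
for size in (32, 256):
    x = torch.randn(1, 3, size, size)
    print(size, model(x).shape)  # output spatial size matches the input
```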

Gaussian Mean Field Regularizes by Limiting Learned Information

no code implementations • 12 Feb 2019 • Julius Kunze, Louis Kirsch, Hippolyt Ritter, David Barber

Variational inference with a factorized Gaussian posterior estimate is a widely used approach for learning parameters and hidden variables.

Variational Inference
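
For context, a factorized (mean-field) Gaussian posterior assigns each parameter an independent mean and variance, and training maximizes the evidence lower bound via the reparameterization trick. Below is a minimal sketch on a toy regression problem; the prior, likelihood, and variable names are illustrative assumptions, not the paper's code.

```python
import torch

# Toy data: linear regression y = x @ w_true + noise.
torch.manual_seed(0)
x = torch.randn(100, 10)
y = x @ torch.randn(10) + 0.1 * torch.randn(100)

# Mean-field Gaussian posterior q(w) = N(mu, diag(sigma^2)) over the weights.
mu = torch.zeros(10, requires_grad=True)
log_sigma = torch.full((10,), -2.0, requires_grad=True)
opt = torch.optim.Adam([mu, log_sigma], lr=0.05)

for step in range(200):
    eps = torch.randn(10)
    w = mu + torch.exp(log_sigma) * eps                      # reparameterization trick
    nll = 0.5 * ((x @ w - y) ** 2).sum()                     # Gaussian likelihood (unit variance)
    sigma2 = torch.exp(2 * log_sigma)
    kl = 0.5 * (sigma2 + mu ** 2 - 1 - 2 * log_sigma).sum()  # KL[q(w) || N(0, I)]
    loss = nll + kl                                          # negative ELBO
    opt.zero_grad()
    loss.backward()
    opt.step()
```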

Noisy Information Bottlenecks for Generalization

no code implementations • 27 Sep 2018 • Julius Kunze, Louis Kirsch, Hippolyt Ritter, David Barber

We propose Noisy Information Bottlenecks (NIB) to limit mutual information between learned parameters and the data through noise.
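
The sentence above describes the mechanism at a high level: noise injected into the parameters bounds how much information they can carry about the training data. The sketch below shows generic parameter-noise injection in a single layer; it is an illustration of that general idea under our own assumptions, not the authors' exact NIB construction.

```python
import torch
import torch.nn as nn

# Sketch of an information bottleneck via parameter noise: Gaussian noise is
# added to the weights on every forward pass, so the (noisy) parameters act as
# a noisy channel between the training data and the predictions.
class NoisyLinear(nn.Module):
    def __init__(self, in_features, out_features, noise_std=0.1):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_features, in_features) * 0.01)
        self.bias = nn.Parameter(torch.zeros(out_features))
        self.noise_std = noise_std

    def forward(self, x):
        noisy_weight = self.weight + self.noise_std * torch.randn_like(self.weight)
        return x @ noisy_weight.t() + self.bias

layer = NoisyLinear(20, 5)
out = layer(torch.randn(8, 20))  # predictions made through the noisy parameters
```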

Stochastic Variational Optimization

no code implementations • 13 Sep 2018 • Thomas Bird, Julius Kunze, David Barber

These variational optimization approaches are of particular interest because they are parallelizable.
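
Variational optimization replaces direct minimization of f(theta) with minimization of E_{theta ~ q_phi}[f(theta)], whose gradient can be estimated from independent black-box evaluations of f, which is where the parallelism comes from. Below is a minimal sketch with a Gaussian q and a score-function (REINFORCE) estimator; the toy objective and hyperparameters are illustrative assumptions, not the specific estimators studied in the paper.

```python
import torch

# Non-differentiable black-box objective (illustrative).
def f(theta):
    return torch.abs(theta - 3.0).round()

mu = torch.tensor(0.0)   # mean of q(theta) = N(mu, sigma^2)
sigma = 1.0
lr = 0.1

for step in range(200):
    thetas = mu + sigma * torch.randn(64)   # population; each f(theta_i) could run in parallel
    values = f(thetas)
    baseline = values.mean()                # simple variance reduction
    # Score-function gradient of E[f(theta)] w.r.t. mu for a Gaussian:
    # grad_mu = E[f(theta) * (theta - mu) / sigma^2]
    grad_mu = ((values - baseline) * (thetas - mu) / sigma ** 2).mean()
    mu = mu - lr * grad_mu
```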
