no code implementations • 26 Jan 2023 • Matthew J. Muckley, Alaaeldin El-Nouby, Karen Ullrich, Hervé Jégou, Jakob Verbeek
Lossy image compression aims to represent images in as few bits as possible while maintaining fidelity to the original.
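As a worked illustration of this trade-off (generic notation, not taken from the paper): lossy neural codecs are typically trained on a rate-distortion Lagrangian, where $R$ is the expected bit rate of the quantized latent $\hat{y}$, $D$ a distortion such as MSE between the image $x$ and reconstruction $\hat{x}$, and $\lambda$ sets the operating point,
$$ \mathcal{L} \;=\; R + \lambda D \;=\; \mathbb{E}_x\big[-\log_2 p(\hat{y})\big] \;+\; \lambda\, \mathbb{E}_x\big[d(x, \hat{x})\big]. $$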
no code implementations • 28 Dec 2022 • Ricky T. Q. Chen, Matthew Le, Matthew Muckley, Maximilian Nickel, Karen Ullrich
We empirically verify our approach on multiple domains involving compression of video and motion capture sequences, showing that our approaches can automatically achieve reductions in bit rates by learning how to discretize.
no code implementations • 14 Dec 2022 • Alaaeldin El-Nouby, Matthew J. Muckley, Karen Ullrich, Ivan Laptev, Jakob Verbeek, Hervé Jégou
In this work, we attempt to bring these lines of research closer by revisiting vector quantization for image compression.
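A minimal sketch of the operation being revisited (plain nearest-neighbour vector quantization, not the paper's specific method): each patch vector is replaced by the index of its closest codebook entry, so transmitting a patch costs only log2(codebook size) bits.

```python
import numpy as np

def vq_encode(vectors, codebook):
    """Map each vector to the index of its nearest codebook entry (L2 distance)."""
    # pairwise squared distances, shape (N, K)
    d = ((vectors[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
    return d.argmin(axis=1)

def vq_decode(indices, codebook):
    """Reconstruct vectors from their codebook indices."""
    return codebook[indices]

# toy example: 8-dimensional patches, codebook of 256 entries (hypothetical sizes)
rng = np.random.default_rng(0)
codebook = rng.normal(size=(256, 8))
patches = rng.normal(size=(1000, 8))
idx = vq_encode(patches, codebook)   # each patch costs log2(256) = 8 bits
recon = vq_decode(idx, codebook)
```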
no code implementations • 2 Nov 2022 • Julius Berner, Lorenz Richter, Karen Ullrich
In particular, we derive a Hamilton-Jacobi-Bellman equation that governs the evolution of the log-densities of the underlying SDE marginals.
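A hedged sketch of the kind of relation meant (the standard Hopf-Cole/Fokker-Planck manipulation in my own notation, not necessarily the paper's exact equation): for an SDE $dX_t = f(X_t,t)\,dt + \sigma\,dW_t$ with marginal densities $p_t$, the log-density $V = \log p_t$ satisfies
$$ \partial_t V \;=\; -\nabla\!\cdot\! f \;-\; f\cdot\nabla V \;+\; \frac{\sigma^2}{2}\big(\Delta V + \|\nabla V\|^2\big), $$
where the quadratic term $\|\nabla V\|^2$ is what gives the equation its Hamilton-Jacobi-Bellman character.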
1 code implementation • 15 Jul 2021 • Daniel Severo, James Townsend, Ashish Khisti, Alireza Makhzani, Karen Ullrich
Current methods that compress multisets at an optimal rate have computational complexity that scales linearly with alphabet size, making them too slow to be practical in many real-world settings.
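For intuition (a standard counting fact, not code from the paper): encoding a multiset rather than an ordered sequence can save up to the log of the number of distinct orderings, $\log_2\!\big(n!/\prod_k m_k!\big)$ bits for symbol counts $m_k$.

```python
import math
from collections import Counter

def multiset_saving_bits(sequence):
    """Bits saved by encoding the multiset instead of the ordered sequence:
    log2 of the number of distinct orderings, n! / prod_k(m_k!)."""
    counts = Counter(sequence)
    n = len(sequence)
    log_orderings = math.lgamma(n + 1) - sum(math.lgamma(m + 1) for m in counts.values())
    return log_orderings / math.log(2)

# toy example: a 1100-symbol byte sequence
print(multiset_saving_bits(b"hello world" * 100))
```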
1 code implementation • NeurIPS 2021 • Yann Dubois, Benjamin Bloem-Reddy, Karen Ullrich, Chris J. Maddison
Most data is automatically collected and only ever "seen" by algorithms.
Ranked #1 on Image Compression on ImageNet (using extra training data)
1 code implementation • ICLR Workshop Neural_Compression 2021 • Yangjun Ruan, Karen Ullrich, Daniel Severo, James Townsend, Ashish Khisti, Arnaud Doucet, Alireza Makhzani, Chris J. Maddison
Naively applied, our schemes would require more initial bits than the standard bits-back coder, but we show how to drastically reduce this additional cost with couplings in the latent space.
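As a hedged reminder of the standard bits-back accounting (background, not the paper's contribution): after the bits recovered by re-encoding the posterior sample are subtracted, the net rate is the negative ELBO, but decoding the very first sample must be seeded with roughly $-\log_2 q(z\mid x)$ auxiliary "initial bits",
$$ \text{net rate} \;=\; \mathbb{E}_{q(z\mid x)}\big[-\log_2 p(x\mid z) - \log_2 p(z) + \log_2 q(z\mid x)\big]. $$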
no code implementations • 30 Mar 2020 • Karen Ullrich, Fabio Viola, Danilo Jimenez Rezende
Reliably transmitting messages despite information loss due to a noisy channel is a core problem of information theory.
1 code implementation • 18 Jun 2019 • Karen Ullrich, Rianne van den Berg, Marcus Brubaker, David Fleet, Max Welling
Finally, we demonstrate how the reconstruction algorithm can be extended with an amortized inference scheme on unknown attributes such as object pose.
no code implementations • 17 Nov 2017 • Marco Federici, Karen Ullrich, Max Welling
Compression of Neural Networks (NNs) has become a highly studied topic in recent years.
3 code implementations • 16 Jul 2017 • Eelco van der Wel, Karen Ullrich
This data set is the first publicly available set in OMR research with sufficient size to train and evaluate deep learning models.
3 code implementations • NeurIPS 2017 • Christos Louizos, Karen Ullrich, Max Welling
Compression and computational efficiency in deep learning have become problems of great significance.
3 code implementations • 13 Feb 2017 • Karen Ullrich, Edward Meeds, Max Welling
The success of deep learning in numerous application domains has created the desire to run and train such models on mobile devices.