Search Results for author: Lucas Theis

Found 27 papers, 10 papers with code

High-Fidelity Image Compression with Score-based Generative Models

no code implementations · 26 May 2023 · Emiel Hoogeboom, Eirikur Agustsson, Fabian Mentzer, Luca Versari, George Toderici, Lucas Theis

Despite the tremendous success of diffusion generative models in text-to-image generation, replicating this success in the domain of image compression has proven difficult.

Image Compression · Text-to-Image Generation

Lossy Compression with Gaussian Diffusion

no code implementations · 17 Jun 2022 · Lucas Theis, Tim Salimans, Matthew D. Hoffman, Fabian Mentzer

Unlike modern compression schemes which rely on transform coding and quantization to restrict the transmitted information, DiffC relies on the efficient communication of pixels corrupted by Gaussian noise.


An Introduction to Neural Data Compression

1 code implementation · 14 Feb 2022 · Yibo Yang, Stephan Mandt, Lucas Theis

Neural compression is the application of neural networks and other machine learning methods to data compression.

BIG-bench Machine Learning · Data Compression +1

Optimal Compression of Locally Differentially Private Mechanisms

no code implementations · 29 Oct 2021 · Abhin Shah, Wei-Ning Chen, Johannes Ballé, Peter Kairouz, Lucas Theis

Compressing the output of ε-locally differentially private (LDP) randomizers naively leads to suboptimal utility.

Algorithms for the Communication of Samples

no code implementations · 25 Oct 2021 · Lucas Theis, Noureldin Yosri

The efficient communication of noisy data has applications in several areas of machine learning, such as neural compression or differential privacy, and is also known as reverse channel coding or the channel simulation problem.


A coding theorem for the rate-distortion-perception function

no code implementations · ICLR 2021 Workshop on Neural Compression · Lucas Theis, Aaron B. Wagner

The rate-distortion-perception function (RDPF; Blau and Michaeli, 2019) has emerged as a useful tool for thinking about realism and distortion of reconstructions in lossy compression.
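
In Blau and Michaeli's formulation (notation paraphrased here, not quoted from the paper), the RDPF constrains both the expected distortion and a divergence between the source and reconstruction distributions:

```latex
R(D, P) \;=\; \min_{p_{\hat X \mid X}} \; I(X; \hat X)
\quad \text{s.t.} \quad
\mathbb{E}\big[\Delta(X, \hat X)\big] \le D,
\qquad
d\big(p_X, p_{\hat X}\big) \le P
```

Here \(\Delta\) is a distortion measure and \(d\) a divergence between distributions; the perception constraint \(d(p_X, p_{\hat X}) \le P\) is what distinguishes the RDPF from the classical rate-distortion function.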

Importance weighted compression

no code implementations · ICLR 2021 Workshop on Neural Compression · Lucas Theis, Jonathan Ho

The connection between variational autoencoders (VAEs) and compression is well established and they have been used for both lossless and lossy compression.

On the advantages of stochastic encoders

no code implementations · ICLR 2021 Workshop on Neural Compression · Lucas Theis, Eirikur Agustsson

Stochastic encoders have been used in rate-distortion theory and neural compression because they can be easier to handle.

Universally Quantized Neural Compression

no code implementations · NeurIPS 2020 · Eirikur Agustsson, Lucas Theis

A popular approach to learning encoders for lossy compression is to use additive uniform noise during training as a differentiable approximation to test-time quantization.
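
The training/test mismatch described above can be sketched in a few lines (a minimal NumPy illustration of the general noise-vs-rounding surrogate, not the paper's universal quantization scheme; function names are mine):

```python
import numpy as np

def soft_quantize(y, rng):
    """Training-time surrogate: add uniform noise in [-0.5, 0.5).

    The perturbation is independent of y, so gradients flow through y unchanged."""
    return y + rng.uniform(-0.5, 0.5, size=y.shape)

def hard_quantize(y):
    """Test-time quantization: round each latent to the nearest integer."""
    return np.round(y)

rng = np.random.default_rng(0)
y = np.array([0.2, 1.7, -2.4])
print(soft_quantize(y, rng))  # noisy but differentiable version of y
print(hard_quantize(y))       # what is actually entropy-coded at test time
```

The paper's contribution, roughly, is to close the gap between these two regimes by using universal (dithered) quantization so that the test-time channel matches the noise seen during training.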


Discriminative Topic Modeling with Logistic LDA

1 code implementation · NeurIPS 2019 · Iryna Korshunova, Hanchen Xiong, Mateusz Fedoryszak, Lucas Theis

We propose logistic LDA, a novel discriminative variant of latent Dirichlet allocation which is easy to apply to arbitrary inputs.

Topic Models

Addressing Delayed Feedback for Continuous Training with Neural Networks in CTR prediction

no code implementations · 15 Jul 2019 · Sofia Ira Ktena, Alykhan Tejani, Lucas Theis, Pranay Kumar Myana, Deepak Dilipkumar, Ferenc Huszár, Steven Yoo, Wenzhe Shi

The focus of this paper is to identify the best combination of loss functions and models that enable large-scale learning from a continuous stream of data in the presence of delayed labels.

Click-Through Rate Prediction

HoloGAN: Unsupervised learning of 3D representations from natural images

3 code implementations · ICCV 2019 · Thu Nguyen-Phuoc, Chuan Li, Lucas Theis, Christian Richardt, Yong-Liang Yang

This shows that HoloGAN is the first generative model that learns 3D representations from natural images in an entirely unsupervised manner.

Image Generation · Novel View Synthesis

Faster gaze prediction with dense networks and Fisher pruning

2 code implementations · Twitter 2018 · Lucas Theis, Iryna Korshunova, Alykhan Tejani, Ferenc Huszár

Predicting human fixations from images has recently seen large improvements by leveraging deep representations which were pretrained for object recognition.

Gaze Estimation · Gaze Prediction +3
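
The pruning half of this work scores each feature map by how much removing it would increase the loss, estimated from gradients. A rough NumPy sketch of a Fisher-style importance score (my own illustration of the idea, not the paper's exact estimator; all names are mine):

```python
import numpy as np

def fisher_importance(activations, grads):
    """Fisher-style importance score per channel (illustrative sketch).

    activations, grads: (batch, channels) arrays holding a unit's post-activation
    values and the loss gradients w.r.t. them. A second-order Taylor argument says
    zeroing a unit changes the loss by roughly the mean squared (activation * gradient),
    so channels with low scores are the safest candidates for pruning."""
    return np.mean((activations * grads) ** 2, axis=0)

rng = np.random.default_rng(0)
acts = rng.standard_normal((64, 8))    # pretend activations for 8 channels
grads = rng.standard_normal((64, 8))   # pretend gradients from backprop
scores = fisher_importance(acts, grads)
prune_order = np.argsort(scores)       # prune lowest-importance channels first
```

In practice the scores would be accumulated over many minibatches and traded off against each channel's computational cost before pruning greedily.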

Checkerboard artifact free sub-pixel convolution: A note on sub-pixel convolution, resize convolution and convolution resize

3 code implementations · 10 Jul 2017 · Andrew Aitken, Christian Ledig, Lucas Theis, Jose Caballero, Zehan Wang, Wenzhe Shi

Compared to sub-pixel convolution initialized with schemes designed for standard convolution kernels, it is free from checkerboard artifacts immediately after initialization.
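
The key trick is to initialize the sub-pixel convolution so that, immediately after initialization, it is equivalent to nearest-neighbour resize followed by convolution. A minimal NumPy sketch of this kind of initialization (my own illustration, assuming an (out_ch, in_ch, k, k) kernel layout; names are mine):

```python
import numpy as np

def resize_conv_init(out_ch, in_ch, k, scale, rng):
    """Init a sub-pixel conv kernel so every sub-pixel position starts identical.

    Draw one kernel per *upsampled* output channel, then replicate it scale**2
    times. After the pixel shuffle, neighbouring sub-pixels then agree exactly,
    which is what removes checkerboard artifacts at initialization."""
    assert out_ch % scale**2 == 0
    base = rng.standard_normal((out_ch // scale**2, in_ch, k, k)) * 0.02
    return np.repeat(base, scale**2, axis=0)  # shape (out_ch, in_ch, k, k)

w = resize_conv_init(out_ch=12, in_ch=3, k=3, scale=2, rng=np.random.default_rng(0))
# groups of scale**2 consecutive filters are identical at initialization
print(np.allclose(w[0], w[3]))  # filters 0..3 feed the same upsampled channel
```

Training then proceeds as usual; the constraint is only imposed at initialization, so the kernels are free to diverge afterwards.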

Lossy Image Compression with Compressive Autoencoders

4 code implementations · 1 Mar 2017 · Lucas Theis, Wenzhe Shi, Andrew Cunningham, Ferenc Huszár

We propose a new approach to the problem of optimizing autoencoders for lossy image compression.

Image Compression

Fast Face-swap Using Convolutional Neural Networks

no code implementations · ICCV 2017 · Iryna Korshunova, Wenzhe Shi, Joni Dambre, Lucas Theis

We consider the problem of face swapping in images, where an input identity is transformed into a target identity while preserving pose, facial expression, and lighting.

Face Swapping · Style Transfer

Amortised MAP Inference for Image Super-resolution

no code implementations · 14 Oct 2016 · Casper Kaae Sønderby, Jose Caballero, Lucas Theis, Wenzhe Shi, Ferenc Huszár

We show that, using this architecture, the amortised MAP inference problem reduces to minimising the cross-entropy between two distributions, similar to training generative models.

Denoising · Image Super-Resolution +1

Is the deconvolution layer the same as a convolutional layer?

6 code implementations · 22 Sep 2016 · Wenzhe Shi, Jose Caballero, Lucas Theis, Ferenc Huszár, Andrew Aitken, Christian Ledig, Zehan Wang

In this note, we want to focus on aspects related to two questions most people asked us at CVPR about the network we presented.

Photo-Realistic Single Image Super-Resolution Using a Generative Adversarial Network

133 code implementations · CVPR 2017 · Christian Ledig, Lucas Theis, Ferenc Huszár, Jose Caballero, Andrew Cunningham, Alejandro Acosta, Andrew Aitken, Alykhan Tejani, Johannes Totz, Zehan Wang, Wenzhe Shi

The adversarial loss pushes our solution to the natural image manifold using a discriminator network that is trained to differentiate between the super-resolved images and original photo-realistic images.

Image Super-Resolution
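
The adversarial part of SRGAN's objective takes the standard non-saturating generator form (notation paraphrased; the full perceptual loss also includes a VGG-feature content term):

```latex
\ell_{\text{adv}} \;=\; \sum_{n=1}^{N} -\log D_{\theta_D}\!\big(G_{\theta_G}(I^{LR}_{n})\big)
```

Minimizing this pushes the generator \(G_{\theta_G}\) to produce super-resolved images that the discriminator \(D_{\theta_D}\) assigns a high probability of being real.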

A note on the evaluation of generative models

1 code implementation · 5 Nov 2015 · Lucas Theis, Aäron van den Oord, Matthias Bethge

In particular, we show that three of the currently most commonly used criteria---average log-likelihood, Parzen window estimates, and visual fidelity of samples---are largely independent of each other when the data is high-dimensional.

Denoising · Texture Synthesis
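
Of the three criteria, the Parzen window estimate is the simplest to reproduce: fit a Gaussian kernel density to model samples and score held-out data under it. A minimal NumPy sketch (my own, for illustration only; the paper's point is that this score can disagree badly with log-likelihood and sample quality):

```python
import numpy as np

def parzen_log_likelihood(samples, data, sigma):
    """Log-density of `data` under a Gaussian Parzen window fit to model `samples`.

    samples: (n, d) array of model samples; data: (m, d) array of test points.
    Returns an (m,) array of log-densities."""
    n, d = samples.shape
    diff = data[:, None, :] - samples[None, :, :]            # (m, n, d)
    log_kernel = -np.sum(diff**2, axis=-1) / (2 * sigma**2)  # (m, n)
    # log of the mean kernel value, computed stably (log-sum-exp)
    mx = log_kernel.max(axis=1)
    log_mean = mx + np.log(np.exp(log_kernel - mx[:, None]).sum(axis=1)) - np.log(n)
    # subtract the Gaussian normalization constant
    return log_mean - 0.5 * d * np.log(2 * np.pi * sigma**2)

rng = np.random.default_rng(0)
model_samples = rng.standard_normal((500, 2))
test_points = rng.standard_normal((10, 2))
print(parzen_log_likelihood(model_samples, test_points, sigma=0.5).mean())
```

In high dimensions the estimate is dominated by the kernel bandwidth and the number of samples, which is one reason the paper cautions against relying on it.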

Generative Image Modeling Using Spatial LSTMs

no code implementations · NeurIPS 2015 · Lucas Theis, Matthias Bethge

Modeling the distribution of natural images is challenging, partly because of strong statistical dependencies which can extend over hundreds of pixels.

Ranked #54 on Image Generation on CIFAR-10 (bits/dimension metric)

Image Generation · Texture Synthesis

A trust-region method for stochastic variational inference with applications to streaming data

no code implementations · 28 May 2015 · Lucas Theis, Matthew D. Hoffman

However, the algorithm is prone to local optima which can make the quality of the posterior approximation sensitive to the choice of hyperparameters and initialization.

Variational Inference

A Generative Model of Natural Texture Surrogates

no code implementations · 28 May 2015 · Niklas Ludtke, Debapriya Das, Lucas Theis, Matthias Bethge

To model this variability, we first applied the parametric texture algorithm of Portilla and Simoncelli to 64×64-pixel patches from a large database of natural images, so that each patch is described by 655 texture parameters specifying statistics such as variances and covariances of wavelet coefficients, or of coefficient magnitudes, within that patch.

Image Compression

Supervised learning sets benchmark for robust spike detection from calcium imaging signals

no code implementations · 28 Feb 2015 · Lucas Theis, Philipp Berens, Emmanouil Froudarakis, Jacob Reimer, Miroslav Román Rosón, Tom Baden, Thomas Euler, Andreas Tolias, Matthias Bethge

A fundamental challenge in calcium imaging has been to infer the timing of action potentials from the measured noisy calcium fluorescence traces.

Deep Gaze I: Boosting Saliency Prediction with Feature Maps Trained on ImageNet

1 code implementation · 4 Nov 2014 · Matthias Kümmerer, Lucas Theis, Matthias Bethge

Recent results suggest that state-of-the-art saliency models perform far from optimal in predicting fixations.

Object Recognition · Point Processes +1

Training sparse natural image models with a fast Gibbs sampler of an extended state space

no code implementations · NeurIPS 2012 · Lucas Theis, Jascha Sohl-Dickstein, Matthias Bethge

We present a new learning strategy based on an efficient blocked Gibbs sampler for sparse overcomplete linear models.
