Search Results for author: Aaron B. Wagner

Found 7 papers, 1 paper with code

The Rate-Distortion-Perception Trade-off: The Role of Private Randomness

no code implementations • 1 Apr 2024 • Yassine Hamdi, Aaron B. Wagner, Deniz Gündüz

The per-symbol near-perfect realism constraint requires that the total variation distance (TVD) between the distribution of each output symbol $Y_t$ and the source distribution be arbitrarily small, uniformly in the index $t$. We characterize the corresponding asymptotic rate-distortion trade-off and show that encoder private randomness is not useful when the compression rate is below the entropy of the source, however limited the available common randomness and decoder private randomness may be. (A TVD sketch follows this entry.)

Image Compression
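As a concrete illustration of the realism constraint above: the TVD between two pmfs on a finite alphabet is half their $\ell_1$ gap. A minimal Python sketch, where the `tvd` helper and the binary-source numbers are illustrative and not taken from the paper:

```python
import numpy as np

def tvd(p, q):
    """Total variation distance between two finite pmfs: 0.5 * sum |p - q|."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return 0.5 * float(np.abs(p - q).sum())

# Hypothetical check of a per-symbol realism constraint: every output
# symbol Y_t should have a marginal pmf close (in TVD) to the source pmf.
source_pmf = np.array([0.5, 0.5])            # fair binary source (illustrative)
output_marginals = [np.array([0.49, 0.51]),  # hypothetical decoder marginals
                    np.array([0.52, 0.48])]
epsilon = 0.05
assert all(tvd(source_pmf, m) <= epsilon for m in output_marginals)
```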

Wasserstein Distortion: Unifying Fidelity and Realism

no code implementations • 5 Oct 2023 • Yang Qiu, Aaron B. Wagner, Johannes Ballé, Lucas Theis

We introduce a distortion measure for images, Wasserstein distortion, that simultaneously generalizes pixel-level fidelity on the one hand and realism, or perceptual quality, on the other. (A one-dimensional sketch follows this entry.)

Texture Synthesis
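In one dimension, Wasserstein distances between equal-size empirical samples reduce to comparing order statistics, which is the basic building block behind distribution-matching distortions. A minimal sketch of that reduction; the feature extraction and spatial pooling that the paper composes with it are omitted, and the names and numbers are illustrative:

```python
import numpy as np

def wasserstein2_1d(x, y):
    """Wasserstein-2 distance between equal-size 1-D empirical samples.

    In 1-D the optimal coupling matches order statistics, so W2^2 is the
    mean squared gap between the sorted values of the two samples.
    """
    x, y = np.sort(np.asarray(x, float)), np.sort(np.asarray(y, float))
    return float(np.sqrt(np.mean((x - y) ** 2)))

rng = np.random.default_rng(0)
ref = rng.normal(0.0, 1.0, 1000)   # "reference" feature sample (illustrative)
rec = rng.normal(0.1, 1.2, 1000)   # "reconstruction" feature sample
print(wasserstein2_1d(ref, rec))   # small value => distributions match well
```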

Do Neural Networks Compress Manifolds Optimally?

no code implementations • 17 May 2022 • Sourbh Bhadane, Aaron B. Wagner, Johannes Ballé

Artificial Neural-Network-based (ANN-based) lossy compressors have recently obtained striking results on several sources.

On One-Bit Quantization

no code implementations • 10 Feb 2022 • Sourbh Bhadane, Aaron B. Wagner

We consider the one-bit quantizer that minimizes the mean squared error for a source living in a real Hilbert space. (A scalar numerical sketch follows this entry.)

Quantization
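For a symmetric scalar source, the Lloyd optimality conditions place the decision threshold at zero and the reconstruction points at the conditional means of the two halves; for a standard Gaussian this gives levels $\pm\sqrt{2/\pi}$ and MSE $1 - 2/\pi$. A minimal numerical check of the scalar case only; the Hilbert-space generality of the paper is not captured here:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)      # standard Gaussian source

# Lloyd conditions for a symmetric source: threshold at 0, reconstruction
# points at the conditional means +/- E[X | X > 0].
level = x[x > 0].mean()                 # ~ sqrt(2/pi) for N(0, 1)
x_hat = np.where(x > 0, level, -level)  # one-bit quantizer output

mse = np.mean((x - x_hat) ** 2)
print(level, np.sqrt(2 / np.pi))        # empirical level vs. closed form
print(mse, 1 - 2 / np.pi)               # empirical MSE vs. 1 - 2/pi
```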

The Rate-Distortion-Perception Tradeoff: The Role of Common Randomness

no code implementations • 8 Feb 2022 • Aaron B. Wagner

A rate-distortion-perception (RDP) tradeoff has recently been proposed by Blau and Michaeli, and separately by Matsumoto.

Principal Bit Analysis: Autoencoding with Schur-Concave Loss

1 code implementation • 5 Jun 2021 • Sourbh Bhadane, Aaron B. Wagner, Jayadev Acharya

As one application, we consider a strictly Schur-concave constraint that estimates the number of bits needed to represent the latent variables under fixed-rate encoding, a setup that we call Principal Bit Analysis (PBA). (A sketch of the Schur-concavity idea follows below.)
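A coordinatewise sum of a strictly concave function is strictly Schur-concave, so minimizing such a penalty over latent variances with a fixed total rewards concentrating variance in a few coordinates, the PCA-like behavior PBA builds on. A minimal sketch using a sqrt-sum penalty, which is an illustrative Schur-concave choice and not the paper's actual fixed-rate bit estimate:

```python
import numpy as np

def schur_concave_penalty(variances):
    """A strictly Schur-concave function of the latent variances: a sum of
    a strictly concave function (sqrt) applied coordinatewise.
    (Illustrative choice, not the paper's fixed-rate bit estimate.)"""
    return float(np.sum(np.sqrt(np.asarray(variances, dtype=float))))

# Schur-concavity in action: for the same total variance, the concentrated
# allocation majorizes the spread one, so its penalty is smaller.
spread       = [1.0, 1.0, 1.0, 1.0]   # total variance 4, spread out
concentrated = [4.0, 0.0, 0.0, 0.0]   # total variance 4, concentrated
print(schur_concave_penalty(spread))        # 4.0
print(schur_concave_penalty(concentrated))  # 2.0
# Minimizing such a penalty thus favors packing variance into few latent
# coordinates, the PCA-like behavior that PBA exploits.
```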

A coding theorem for the rate-distortion-perception function

no code implementations • ICLR Workshop Neural_Compression 2021 • Lucas Theis, Aaron B. Wagner

The rate-distortion-perception function (RDPF; Blau and Michaeli, 2019) has emerged as a useful tool for thinking about realism and distortion of reconstructions in lossy compression.
