Parallel Neural Local Lossless Compression

13 Jan 2022 · Mingtian Zhang, James Townsend, Ning Kang, David Barber

The recently proposed Neural Local Lossless Compression (NeLLoC), which is based on a local autoregressive model, has achieved state-of-the-art (SOTA) out-of-distribution (OOD) generalization performance in the image compression task. Besides encouraging OOD generalization, the local model also allows parallel inference in the decoding stage. In this paper, we propose two parallelization schemes for local autoregressive models. We discuss the practicalities of implementing the schemes and provide experimental evidence of significant gains in compression runtime compared to the previous, non-parallel implementation.
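The core observation behind such parallel schemes is that, in a local autoregressive model, each pixel depends only on a small neighbourhood of previously decoded pixels, so pixels whose contexts contain no not-yet-decoded pixels can be decoded simultaneously. The sketch below illustrates one such wavefront schedule; it is not the authors' implementation, and the image size, receptive-field convention, and `local_model` are illustrative assumptions.

```python
# Minimal sketch of wavefront-parallel decoding with a local
# autoregressive model. Shapes, the receptive-field convention and
# `local_model` are illustrative assumptions, not the paper's code.
import numpy as np

H, W = 8, 8   # image size (assumed)
h, w = 2, 2   # context: h rows above, w pixels to each side (assumed)

def wavefront_schedule(H, W, w):
    """Group pixel coordinates into mutually independent waves.

    With a local context (h rows above, w columns to either side,
    plus the w pixels to the left in the current row), pixel (i, j)
    only depends on pixels with a strictly smaller index
    t = i * (w + 1) + j, so all pixels sharing the same t can be
    decoded in parallel.
    """
    waves = {}
    for i in range(H):
        for j in range(W):
            waves.setdefault(i * (w + 1) + j, []).append((i, j))
    return [waves[t] for t in sorted(waves)]

def local_model(ctx):
    # Stand-in for the local autoregressive predictor (hypothetical);
    # a real model would return distribution parameters for the
    # entropy coder rather than a pixel value.
    return ctx.mean() if ctx.size else 0.0

image = np.zeros((H, W))
for wave in wavefront_schedule(H, W, w):
    # In practice each wave would be one batched forward pass (or a
    # set of threads/GPU streams); the inner loop is just for clarity.
    for i, j in wave:
        above = image[max(0, i - h):i, max(0, j - w):min(W, j + w + 1)]
        left = image[i, max(0, j - w):j]
        ctx = np.concatenate([above.ravel(), left])
        image[i, j] = local_model(ctx)
```

The schedule is valid because every pixel in the assumed context of (i, j) has a strictly smaller index t: a same-row neighbour (i, j') with j' < j has t' = t - (j - j'), and a previous-row neighbour (i', j') with i' < i and j' <= j + w has t' <= t - 1.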
