no code implementations • 5 Nov 2022 • Junqi Shi, Ming Lu, Zhan Ma
Quantizing a floating-point neural network to its fixed-point representation is crucial for Learned Image Compression (LIC) because it ensures consistent decoding across platforms for interoperability and reduces the space-time complexity of implementation.
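The idea of fixed-point quantization can be sketched as follows. This is a minimal illustration of uniform signed fixed-point quantization in general, not the paper's specific method; the function name, bit widths, and example values are hypothetical.

```python
import numpy as np

def quantize_to_fixed_point(w, frac_bits=8, total_bits=16):
    # Hypothetical helper: map float weights to signed fixed-point
    # integers with `frac_bits` fractional bits (Q-format), clipping
    # to the representable range of `total_bits`-bit signed values.
    scale = 2 ** frac_bits
    qmin = -(2 ** (total_bits - 1))
    qmax = 2 ** (total_bits - 1) - 1
    q = np.clip(np.round(w * scale), qmin, qmax).astype(np.int64)
    # Dequantized values show the rounding error introduced.
    return q, q.astype(np.float64) / scale

w = np.array([0.1234, -0.5, 1.75])
q, w_hat = quantize_to_fixed_point(w)
# Every dequantized value is within half a quantization step (1/512)
# of the original, and integer arithmetic on `q` is bit-exact across
# platforms, which is what makes decoding reproducible.
```

Because both encoder and decoder operate on the same integers `q`, the arithmetic is deterministic across hardware, unlike floating-point inference whose rounding can differ between platforms.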