lattice-transform-coding

This is the official code for the ICLR 2025 (spotlight) paper Approaching Rate-Distortion Limits in Neural Compression with Lattice Transform Coding.

Installation

In a Python 3.11 environment, run

pip install -r requirements.txt

Data

The synthetic sources are generated in LTC/data.py. For the Physics source:

wget https://github.com/mandt-lab/RD-sandwich/raw/refs/heads/master/data/physics/ppzee-split=train.npy -P data/physics
wget https://github.com/mandt-lab/RD-sandwich/raw/refs/heads/master/data/physics/ppzee-split=test.npy -P data/physics
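The synthetic sources themselves are defined in LTC/data.py. As a hypothetical stand-in (the function name and covariance structure below are illustrative assumptions, not the repo's API), a correlated Gaussian source of the kind commonly used in neural-compression toy experiments can be sampled like this:

```python
import numpy as np

def sample_gaussian_source(n, dim, rho=0.9, seed=0):
    """Illustrative correlated Gaussian source (NOT the repo's code).

    Uses an AR(1)-style covariance: cov[i, j] = rho ** |i - j|.
    """
    rng = np.random.default_rng(seed)
    idx = np.arange(dim)
    cov = rho ** np.abs(idx[:, None] - idx[None, :])
    return rng.multivariate_normal(np.zeros(dim), cov, size=n)

samples = sample_gaussian_source(1000, 4)
```

For the sources actually used in the paper's experiments, refer to LTC/data.py.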

For the Speech source, follow this script to generate the stft-split=train.npy and stft-split=test.npy files. Put these in data/speech.

For large-scale image sources, we use the Vimeo-90k dataset for training, following the CompressAI library:

# Vimeo-90k, ~82GB
wget http://data.csail.mit.edu/tofu/dataset/vimeo_septuplet.zip -P data
cd data
unzip vimeo_septuplet.zip

For testing, the Kodak dataset is available here. Place all .png files in data/Kodak/1.

Running experiments

The implementation of Lattice Transform Coding (LTC) is contained in the LTC/ directory.
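The core primitive in LTC is nearest-neighbor quantization of a latent vector onto a lattice. As a standalone illustration (not the repo's implementation), here is the classic Conway–Sloane nearest-point algorithm for the checkerboard lattice D_n, the set of integer vectors with even coordinate sum:

```python
import numpy as np

def quantize_Dn(x):
    """Nearest point of D_n = {z in Z^n : sum(z) even} (Conway & Sloane).

    Illustrative sketch only; the repo's quantizers live in LTC/.
    """
    f = np.round(x)  # nearest point of the integer lattice Z^n
    if int(f.sum()) % 2 == 0:
        return f
    # Parity is odd: re-round the coordinate with the largest rounding
    # error in the opposite direction, restoring even parity at minimal cost.
    err = x - f
    i = int(np.argmax(np.abs(err)))
    f[i] += 1.0 if err[i] >= 0 else -1.0
    return f
```

For example, `quantize_Dn(np.array([0.6, 0.2]))` returns `[0., 0.]`, since `[1., 0.]` has odd parity and is not a D_2 point.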

The scripts to launch training and evaluations for all sources are contained in scripts/.

To visualize the quantizer regions generated by a 2-d model, see vis_quantizers.ipynb.
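For intuition before running the notebook, the Voronoi cells of a fixed 2-d lattice can be computed directly with SciPy. This sketch (independent of the repo's code) builds a patch of the hexagonal A2 lattice from its generator matrix; the interior Voronoi cells come out as hexagons:

```python
import numpy as np
from scipy.spatial import Voronoi

# Generator matrix of the hexagonal (A2) lattice; each lattice point is an
# integer combination of its rows.
G = np.array([[1.0, 0.0], [0.5, np.sqrt(3.0) / 2.0]])

# A 7x7 patch of integer coefficients centered on the origin.
coeffs = np.array([[i, j] for i in range(-3, 4) for j in range(-3, 4)])
points = coeffs @ G

vor = Voronoi(points)

# The cell around the origin (coefficient (0, 0) is entry 24): a hexagon.
origin_region = vor.regions[vor.point_region[24]]
```

With matplotlib installed, `scipy.spatial.voronoi_plot_2d(vor)` renders the cells; the notebook in the repo shows the regions of the learned quantizer instead.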

Citation

@inproceedings{lei2025approaching,
    title={Approaching Rate-Distortion Limits in Neural Compression with Lattice Transform Coding},
    author={Eric Lei and Hamed Hassani and Shirin Saeedi Bidokhti},
    booktitle={The Thirteenth International Conference on Learning Representations},
    year={2025},
    url={https://openreview.net/forum?id=Tv36j85SqR}
}
