no code implementations • 3 Oct 2024 • Mikhail Persiianov, Arip Asadulaev, Nikita Andreev, Nikita Starodubcev, Dmitry Baranchuk, Anastasis Kratsios, Evgeny Burnaev, Alexander Korotin
To tackle this issue, we propose a new learning paradigm that integrates both paired and unpaired data $\textbf{seamlessly}$ through data likelihood maximization techniques.
no code implementations • 20 Jun 2024 • Nikita Starodubcev, Mikhail Khoroshikh, Artem Babenko, Dmitry Baranchuk
Diffusion distillation represents a highly promising direction for achieving faithful text-to-image generation in a few sampling steps.
1 code implementation • CVPR 2024 • Nikita Starodubcev, Artem Fedorov, Artem Babenko, Dmitry Baranchuk
While several powerful distillation methods have recently been proposed, the overall quality of student samples is typically lower than that of the teacher's, which hinders their practical usage.
1 code implementation • 10 Apr 2023 • Nikita Starodubcev, Dmitry Baranchuk, Valentin Khrulkov, Artem Babenko
Finally, we show that our approach can adapt the pretrained model to a user-specified image and text description on the fly in just 4 seconds.