1 code implementation • 31 Jan 2025 • Alina Shutova, Vladimir Malinovskii, Vage Egiazarian, Denis Kuznedelev, Denis Mazur, Nikita Surkov, Ivan Ermakov, Dan Alistarh
Efficient real-world deployments of large language models (LLMs) rely on Key-Value (KV) caching to process and generate long outputs, reducing the need for repeated computation.
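For context, a minimal NumPy sketch of how a KV cache avoids recomputation during autoregressive decoding; the `attention_step` helper, shapes, and random inputs are illustrative assumptions, not this paper's compression method:

```python
# Minimal sketch: incremental decoding with a KV cache. Keys/values of past
# tokens are stored, so each new token only adds one row to the attention
# instead of recomputing the whole prefix.
import numpy as np

def attention_step(q, k_cache, v_cache, k_new, v_new):
    """Append the new token's key/value to the cache and attend over the prefix."""
    k_cache = np.concatenate([k_cache, k_new[None]], axis=0)  # (t, d)
    v_cache = np.concatenate([v_cache, v_new[None]], axis=0)  # (t, d)
    scores = k_cache @ q / np.sqrt(q.shape[-1])               # (t,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                                  # softmax
    out = weights @ v_cache                                   # (d,)
    return out, k_cache, v_cache

d = 8
k_cache, v_cache = np.empty((0, d)), np.empty((0, d))
for _ in range(4):                          # generate 4 tokens
    q, k_new, v_new = np.random.randn(3, d)
    out, k_cache, v_cache = attention_step(q, k_cache, v_cache, k_new, v_new)
```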
no code implementations • 17 Oct 2024 • Arip Asadulaev, Rostislav Korst, Alexander Korotin, Vage Egiazarian, Andrey Filchenkov, Evgeny Burnaev
We propose a novel algorithm for offline reinforcement learning using optimal transport.
no code implementations • 31 Aug 2024 • Vage Egiazarian, Denis Kuznedelev, Anton Voronov, Ruslan Svirschevski, Michael Goin, Daniil Pavlov, Dan Alistarh, Dmitry Baranchuk
Specifically, we tailor vector-based PTQ methods to recent billion-scale text-to-image models (SDXL and SDXL-Turbo), and show that diffusion models with 2B+ parameters compressed to around 3 bits using VQ exhibit image quality and textual alignment similar to previous 4-bit compression techniques.
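As a rough illustration of what vector quantization of weights means (a generic k-means sketch, not this paper's PTQ pipeline; `vq_quantize`, the group size, and the codebook size are assumptions):

```python
# Minimal sketch: vector quantization of a weight matrix. Weights are split
# into small groups, each group is replaced by its nearest codebook vector,
# and only codebook indices plus the codebook itself are stored.
# Index cost is roughly codebook_bits / group_size bits per weight,
# before accounting for the codebook.
import numpy as np

def vq_quantize(W, group_size=8, codebook_bits=8, iters=10, seed=0):
    rng = np.random.default_rng(seed)
    groups = W.reshape(-1, group_size)                 # (n_groups, group_size)
    n_codes = 2 ** codebook_bits
    codebook = groups[rng.choice(len(groups), n_codes, replace=False)].copy()
    for _ in range(iters):                             # plain k-means
        dists = ((groups[:, None] - codebook[None]) ** 2).sum(-1)
        idx = dists.argmin(1)
        for c in range(n_codes):
            mask = idx == c
            if mask.any():
                codebook[c] = groups[mask].mean(0)
    # final assignments against the updated codebook
    dists = ((groups[:, None] - codebook[None]) ** 2).sum(-1)
    return dists.argmin(1).astype(np.uint16), codebook

W = np.random.randn(256, 64).astype(np.float32)
idx, codebook = vq_quantize(W)
W_hat = codebook[idx].reshape(W.shape)                 # dequantized weights
```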
1 code implementation • 11 Jan 2024 • Vage Egiazarian, Andrei Panferov, Denis Kuznedelev, Elias Frantar, Artem Babenko, Dan Alistarh
The emergence of accurate open large language models (LLMs) has led to a race towards performant quantization techniques that can enable their execution on end-user devices.
1 code implementation • 5 Jun 2023 • Tim Dettmers, Ruslan Svirschevski, Vage Egiazarian, Denis Kuznedelev, Elias Frantar, Saleh Ashkboos, Alexander Borzunov, Torsten Hoefler, Dan Alistarh
Recent advances in large language model (LLM) pretraining have led to high-quality LLMs with impressive abilities.
1 code implementation • 30 May 2022 • Arip Asadulaev, Alexander Korotin, Vage Egiazarian, Petr Mokrov, Evgeny Burnaev
We introduce a novel neural network-based algorithm to compute optimal transport (OT) plans for general cost functionals.
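For reference, in standard notation (not necessarily the paper's): classical OT minimizes a pointwise cost integrated over transport plans, while the general setting replaces that integral with an arbitrary cost functional of the plan.

```latex
% Classical optimal transport between measures \mu and \nu with pointwise cost c:
\mathrm{Cost}(\mu, \nu) = \inf_{\pi \in \Pi(\mu, \nu)} \int c(x, y) \, d\pi(x, y)
% General cost functional F over transport plans:
\mathrm{Cost}(\mu, \nu) = \inf_{\pi \in \Pi(\mu, \nu)} F(\pi)
```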
1 code implementation • 28 Jan 2022 • Alexander Korotin, Vage Egiazarian, Lingxiao Li, Evgeny Burnaev
Wasserstein barycenters have become popular due to their ability to represent the average of probability measures in a geometrically meaningful way.
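For reference, the Wasserstein-2 barycenter of measures $\mu_1,\dots,\mu_N$ with weights $\lambda_i$ is the minimizer

```latex
\bar{\mu} = \arg\min_{\nu} \sum_{i=1}^{N} \lambda_i \, \mathbb{W}_2^2(\nu, \mu_i),
\qquad \lambda_i \ge 0, \quad \sum_{i=1}^{N} \lambda_i = 1
```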
1 code implementation • 30 Nov 2020 • Albert Matveev, Ruslan Rakhimov, Alexey Artemov, Gleb Bobrovskikh, Vage Egiazarian, Emil Bogomolov, Daniele Panozzo, Denis Zorin, Evgeny Burnaev
We propose Deep Estimators of Features (DEFs), a learning-based framework for predicting sharp geometric features in sampled 3D shapes.
1 code implementation • ECCV 2020 • Vage Egiazarian, Oleg Voynov, Alexey Artemov, Denis Volkhonskiy, Aleksandr Safin, Maria Taktasheva, Denis Zorin, Evgeny Burnaev
We present a new method for vectorization of technical line drawings, such as floor plans, architectural drawings, and 2D CAD images.
1 code implementation • 13 Dec 2019 • Vage Egiazarian, Savva Ignatyev, Alexey Artemov, Oleg Voynov, Andrey Kravchenko, Youyi Zheng, Luiz Velho, Evgeny Burnaev
Constructing high-quality generative models for 3D shapes is a fundamental task in computer vision with diverse applications in geometry processing, engineering, and design.
1 code implementation • NeurIPS 2019 • Denis Mazur, Vage Egiazarian, Stanislav Morozov, Artem Babenko
Our main contribution is PRODIGE: a method that learns a weighted graph representation of data end-to-end by gradient descent.
4 code implementations • ICLR 2021 • Alexander Korotin, Vage Egiazarian, Arip Asadulaev, Alexander Safin, Evgeny Burnaev
We propose a novel end-to-end non-minimax algorithm for training optimal transport mappings for the quadratic cost (Wasserstein-2 distance).
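For reference, the quadratic-cost optimal transport problem defining the Wasserstein-2 distance (written here without the conventional factor of 1/2 that some works include) is

```latex
\mathbb{W}_2^2(\mu, \nu) = \inf_{\pi \in \Pi(\mu, \nu)} \int \|x - y\|^2 \, d\pi(x, y)
```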
1 code implementation • ICCV 2019 • Oleg Voynov, Alexey Artemov, Vage Egiazarian, Alexander Notchenko, Gleb Bobrovskikh, Denis Zorin, Evgeny Burnaev
RGBD images, combining high-resolution color and lower-resolution depth from various types of depth sensors, are increasingly common.