Search Results for author: Vage Egiazarian

Found 13 papers, 11 papers with code

Cache Me If You Must: Adaptive Key-Value Quantization for Large Language Models

1 code implementation • 31 Jan 2025 • Alina Shutova, Vladimir Malinovskii, Vage Egiazarian, Denis Kuznedelev, Denis Mazur, Nikita Surkov, Ivan Ermakov, Dan Alistarh

Efficient real-world deployments of large language models (LLMs) rely on Key-Value (KV) caching for processing and generating long outputs, reducing the need for repetitive computation.

Quantization
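
For orientation, a minimal sketch of what quantizing a KV cache involves, assuming plain symmetric per-channel int8 rounding; the paper above proposes an *adaptive* scheme, which this toy baseline does not reproduce, and all names below are illustrative.

```python
import numpy as np

def quantize_per_channel(x: np.ndarray, bits: int = 8):
    """Naive symmetric per-channel quantization of a (tokens, channels) tensor."""
    qmax = 2 ** (bits - 1) - 1                       # 127 for int8
    scale = np.abs(x).max(axis=0, keepdims=True) / qmax
    scale[scale == 0] = 1.0                          # avoid division by zero for all-zero channels
    q = np.clip(np.round(x / scale), -qmax - 1, qmax).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: np.ndarray) -> np.ndarray:
    return q.astype(np.float32) * scale

# Toy "KV cache": keys for already-generated tokens are stored in int8 instead
# of float32, shrinking cache memory roughly 4x at the cost of a small error.
rng = np.random.default_rng(0)
keys = rng.standard_normal((128, 64)).astype(np.float32)   # (tokens, head_dim)
q_keys, k_scale = quantize_per_channel(keys)
keys_restored = dequantize(q_keys, k_scale)
print("max abs reconstruction error:", float(np.abs(keys - keys_restored).max()))
```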

Accurate Compression of Text-to-Image Diffusion Models via Vector Quantization

no code implementations • 31 Aug 2024 • Vage Egiazarian, Denis Kuznedelev, Anton Voronov, Ruslan Svirschevski, Michael Goin, Daniil Pavlov, Dan Alistarh, Dmitry Baranchuk

Specifically, we tailor vector-based PTQ methods to recent billion-scale text-to-image models (SDXL and SDXL-Turbo) and show that diffusion models with 2B+ parameters compressed to around 3 bits using VQ exhibit image quality and textual alignment similar to previous 4-bit compression techniques.

Image Generation • Quantization
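
A rough illustration of what vector quantization (VQ) of weights means and how sub-4-bit budgets arise: groups of weights are replaced by indices into a shared codebook. The group length, codebook size, and plain k-means fitting below are illustrative assumptions, not the configuration used for SDXL.

```python
import numpy as np

def vq_compress(weights: np.ndarray, group: int = 4, codebook_size: int = 256, iters: int = 10):
    """Replace each group of `group` consecutive weights with an index into a
    shared k-means codebook.  Toy budget: log2(256) / 4 = 2 bits per weight,
    ignoring the (amortized) cost of storing the codebook itself."""
    flat = weights.reshape(-1, group)
    rng = np.random.default_rng(0)
    codebook = flat[rng.choice(len(flat), codebook_size, replace=False)].copy()
    for _ in range(iters):                 # plain k-means; real methods fit codebooks far more carefully
        d = ((flat[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
        idx = d.argmin(1)
        for c in range(codebook_size):
            members = flat[idx == c]
            if len(members):
                codebook[c] = members.mean(0)
    # final assignment against the updated codebook
    idx = ((flat[:, None, :] - codebook[None, :, :]) ** 2).sum(-1).argmin(1)
    return idx.astype(np.uint8), codebook

W = np.random.default_rng(1).standard_normal((256, 256)).astype(np.float32)
idx, cb = vq_compress(W)
W_hat = cb[idx].reshape(W.shape)
print("bits/weight ~", np.log2(len(cb)) / 4, " reconstruction MSE:", float(((W - W_hat) ** 2).mean()))
```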

Extreme Compression of Large Language Models via Additive Quantization

1 code implementation • 11 Jan 2024 • Vage Egiazarian, Andrei Panferov, Denis Kuznedelev, Elias Frantar, Artem Babenko, Dan Alistarh

The emergence of accurate open large language models (LLMs) has led to a race towards performant quantization techniques which can enable their execution on end-user devices.

Information Retrieval • Quantization
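
The title refers to additive quantization, in which a group of weights is reconstructed as the sum of codewords drawn from several codebooks. A toy decode step under that definition (shapes and sizes below are made-up illustrative values, not the paper's setup):

```python
import numpy as np

# Additive quantization in one line: a group of weights is reconstructed as the
# SUM of several codewords, one per codebook, so storage is M * log2(K) bits
# per group instead of 32 * group bits (plus the shared codebooks).
M, K, group = 2, 256, 8                     # 2 codebooks, 256 codewords each, groups of 8 weights
rng = np.random.default_rng(0)
codebooks = rng.standard_normal((M, K, group)).astype(np.float32)

def decode_group(indices: np.ndarray) -> np.ndarray:
    """indices: shape (M,), one code index per codebook."""
    return sum(codebooks[m, indices[m]] for m in range(M))

w_group = decode_group(np.array([17, 203]))            # 8 reconstructed weights
bits_per_weight = M * np.log2(K) / group               # 2 * 8 / 8 = 2 bits per weight
print(w_group.shape, bits_per_weight)
```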

Neural Optimal Transport with General Cost Functionals

1 code implementation • 30 May 2022 • Arip Asadulaev, Alexander Korotin, Vage Egiazarian, Petr Mokrov, Evgeny Burnaev

We introduce a novel neural network-based algorithm to compute optimal transport (OT) plans for general cost functionals.
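
For context, the textbook formulation the title builds on; this states the problem, not the paper's algorithm:

```latex
% Classical (Kantorovich) optimal transport with a pointwise cost c(x, y):
\[
  \mathrm{OT}_c(\mu,\nu) \;=\; \inf_{\pi \in \Pi(\mu,\nu)} \int c(x,y)\, d\pi(x,y),
\]
% where \Pi(\mu,\nu) is the set of couplings (transport plans) with marginals \mu and \nu.
% A "general cost functional" replaces the linear integral term with an arbitrary
% functional F of the whole plan, recovering the classical case for F(\pi) = \int c\, d\pi:
\[
  \inf_{\pi \in \Pi(\mu,\nu)} F(\pi).
\]
```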

Wasserstein Iterative Networks for Barycenter Estimation

1 code implementation • 28 Jan 2022 • Alexander Korotin, Vage Egiazarian, Lingxiao Li, Evgeny Burnaev

Wasserstein barycenters have become popular due to their ability to represent the average of probability measures in a geometrically meaningful way.
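
The standard definition behind that claim (background, not the paper's estimator): the barycenter is the measure minimizing the weighted sum of squared Wasserstein-2 distances to the inputs.

```latex
% Wasserstein-2 barycenter of measures \mu_1, ..., \mu_N with weights w_i:
\[
  \bar{\mu} \;=\; \operatorname*{arg\,min}_{\mu}\; \sum_{i=1}^{N} w_i \, \mathbb{W}_2^2(\mu, \mu_i),
  \qquad w_i \ge 0, \quad \sum_{i=1}^{N} w_i = 1.
\]
```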

DEF: Deep Estimation of Sharp Geometric Features in 3D Shapes

1 code implementation • 30 Nov 2020 • Albert Matveev, Ruslan Rakhimov, Alexey Artemov, Gleb Bobrovskikh, Vage Egiazarian, Emil Bogomolov, Daniele Panozzo, Denis Zorin, Evgeny Burnaev

We propose Deep Estimators of Features (DEFs), a learning-based framework for predicting sharp geometric features in sampled 3D shapes.

Deep Vectorization of Technical Drawings

1 code implementation • ECCV 2020 • Vage Egiazarian, Oleg Voynov, Alexey Artemov, Denis Volkhonskiy, Aleksandr Safin, Maria Taktasheva, Denis Zorin, Evgeny Burnaev

We present a new method for vectorization of technical line drawings, such as floor plans, architectural drawings, and 2D CAD images.

Latent-Space Laplacian Pyramids for Adversarial Representation Learning with 3D Point Clouds

1 code implementation • 13 Dec 2019 • Vage Egiazarian, Savva Ignatyev, Alexey Artemov, Oleg Voynov, Andrey Kravchenko, Youyi Zheng, Luiz Velho, Evgeny Burnaev

Constructing high-quality generative models for 3D shapes is a fundamental task in computer vision with diverse applications in geometry processing, engineering, and design.

Generating 3D Point Clouds • Representation Learning

Wasserstein-2 Generative Networks

4 code implementations • ICLR 2021 • Alexander Korotin, Vage Egiazarian, Arip Asadulaev, Alexander Safin, Evgeny Burnaev

We propose a novel end-to-end non-minimax algorithm for training optimal transport mappings for the quadratic cost (Wasserstein-2 distance).

Domain Adaptation • Style Transfer
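
Background on why the quadratic cost admits a non-minimax treatment (textbook facts, not a description of the paper's training procedure):

```latex
% Wasserstein-2 distance (quadratic transport cost):
\[
  \mathbb{W}_2^2(\mu,\nu) \;=\; \inf_{\pi \in \Pi(\mu,\nu)} \int \|x - y\|^2 \, d\pi(x,y).
\]
% Brenier's theorem: for absolutely continuous \mu, the optimal map is the gradient
% of a convex potential, T^{*} = \nabla\psi.  This structure is what allows the map
% to be parameterized directly rather than found through an adversarial (minimax) game.
```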

Perceptual deep depth super-resolution

1 code implementation • ICCV 2019 • Oleg Voynov, Alexey Artemov, Vage Egiazarian, Alexander Notchenko, Gleb Bobrovskikh, Denis Zorin, Evgeny Burnaev

RGBD images, combining high-resolution color and lower-resolution depth from various types of depth sensors, are increasingly common.

Super-Resolution
