Search Results for author: Anton Voronov

Found 6 papers, 5 papers with code

Pixel-Level BPE for Auto-Regressive Image Generation

no code implementations MMMPIE (COLING) 2022 Anton Razzhigaev, Anton Voronov, Andrey Kaznacheev, Andrey Kuznetsov, Denis Dimitrov, Alexander Panchenko

Pixel-level autoregression with Transformer models (Image GPT, or iGPT) is a recent approach to image generation that has received little attention and elaboration due to the quadratic complexity of self-attention, which imposes large memory requirements and thus restricts the resolution of the generated images.

Image Generation

Mind Your Format: Towards Consistent Evaluation of In-Context Learning Improvements

1 code implementation 12 Jan 2024 Anton Voronov, Lena Wolf, Max Ryabinin

Large language models demonstrate a remarkable capability for learning to solve new tasks from a few examples.

In-Context Learning

Is This Loss Informative? Faster Text-to-Image Customization by Tracking Objective Dynamics

1 code implementation NeurIPS 2023 Anton Voronov, Mikhail Khoroshikh, Artem Babenko, Max Ryabinin

Text-to-image generation models represent the next step of evolution in image synthesis, offering a natural way to achieve flexible yet fine-grained control over the result.

Text-to-Image Generation

Many Heads but One Brain: Fusion Brain -- a Competition and a Single Multimodal Multitask Architecture

1 code implementation 22 Nov 2021 Daria Bakshandaeva, Denis Dimitrov, Vladimir Arkhipkin, Alex Shonenkov, Mark Potanin, Denis Karachev, Andrey Kuznetsov, Anton Voronov, Vera Davydova, Elena Tutubalina, Aleksandr Petiushko

Supporting the current trend in the AI community, we present the AI Journey 2021 Challenge called Fusion Brain, the first competition targeted at building a universal architecture that can process different modalities (in this case, images, texts, and code) and solve multiple tasks for vision and language.

Handwritten Text Recognition, Object Detection, +4
