Search Results for author: Aydar Bulatov

Found 4 papers, 4 papers with code

Recurrent Memory Transformer

3 code implementations • 14 Jul 2022 • Aydar Bulatov, Yuri Kuratov, Mikhail S. Burtsev

We implement a memory mechanism with no changes to the Transformer model by adding special memory tokens to the input or output sequence.

Language Modelling
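The abstract describes augmenting an unmodified Transformer with special memory tokens. Below is a minimal, hedged sketch of that idea: trainable memory embeddings are concatenated with the token embeddings before the forward pass, and the corresponding output positions are read back as updated memory. The wrapper name, the number of memory tokens, and the assumption that the backbone consumes embeddings directly are illustrative choices, not the authors' API.

```python
# Sketch only: trainable memory tokens prepended to the input of an
# otherwise unchanged Transformer backbone (assumed to map
# (batch, seq_len, hidden) -> (batch, seq_len, hidden)).
import torch
import torch.nn as nn


class MemoryWrapper(nn.Module):
    def __init__(self, backbone: nn.Module, hidden_size: int, num_mem: int = 10):
        super().__init__()
        self.backbone = backbone  # any off-the-shelf Transformer
        self.memory = nn.Parameter(torch.randn(num_mem, hidden_size) * 0.02)
        self.num_mem = num_mem

    def forward(self, token_embeds: torch.Tensor):
        batch = token_embeds.size(0)
        mem = self.memory.unsqueeze(0).expand(batch, -1, -1)
        # Prepend memory tokens; the backbone treats them like ordinary tokens.
        hidden = self.backbone(torch.cat([mem, token_embeds], dim=1))
        # Split outputs back into updated memory and token representations.
        return hidden[:, : self.num_mem], hidden[:, self.num_mem :]
```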

Scaling Transformer to 1M tokens and beyond with RMT

3 code implementations • 19 Apr 2023 • Aydar Bulatov, Yuri Kuratov, Yermek Kapushev, Mikhail S. Burtsev

A major limitation for the broader scope of problems solvable by transformers is the quadratic scaling of computational complexity with input size.

Language Modelling • Natural Language Understanding • +1
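The quadratic-cost limitation mentioned in the abstract is what segment-level recurrence sidesteps: a long input is processed in fixed-size chunks, with the memory produced by one chunk fed into the next, so attention never spans the full sequence. The sketch below reuses the hypothetical `MemoryWrapper` from the previous example; `segment_len` is an arbitrary illustrative value, not a reported hyperparameter.

```python
# Sketch of segment-level recurrence under the assumptions above:
# attention runs only within (memory + one segment), so cost grows
# linearly with the number of segments instead of quadratically
# with the total input length.
import torch


def process_long_sequence(wrapper, token_embeds: torch.Tensor, segment_len: int = 512):
    # token_embeds: (batch, total_len, hidden); total_len may be very large.
    batch = token_embeds.size(0)
    memory = wrapper.memory.unsqueeze(0).expand(batch, -1, -1)
    outputs = []
    for start in range(0, token_embeds.size(1), segment_len):
        segment = token_embeds[:, start : start + segment_len]
        hidden = wrapper.backbone(torch.cat([memory, segment], dim=1))
        memory = hidden[:, : wrapper.num_mem]  # carry memory to the next segment
        outputs.append(hidden[:, wrapper.num_mem :])
    return torch.cat(outputs, dim=1)
```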

In Search of Needles in a 11M Haystack: Recurrent Memory Finds What LLMs Miss

2 code implementations • 16 Feb 2024 • Yuri Kuratov, Aydar Bulatov, Petr Anokhin, Dmitry Sorokin, Artyom Sorokin, Mikhail Burtsev

This paper addresses the challenge of processing long documents using generative transformer models.

Better Together: Enhancing Generative Knowledge Graph Completion with Language Models and Neighborhood Information

1 code implementation • 2 Nov 2023 • Alla Chepurova, Aydar Bulatov, Yuri Kuratov, Mikhail Burtsev

In this study, we propose to include node neighborhoods as additional information to improve knowledge graph completion (KGC) methods based on language models.

Imputation • World Knowledge
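One way to picture "including node neighborhoods" in a language-model KGC setup is to verbalize the query triple together with the head entity's neighboring facts in a single text input. The sketch below is purely illustrative and not the paper's exact prompt format; the function name, separators, and example facts are assumptions.

```python
# Hypothetical verbalization of a KGC query plus neighborhood context
# for a sequence-to-sequence language model (format is illustrative).
def build_kgc_prompt(head: str, relation: str, neighbors: list[tuple[str, str]]) -> str:
    query = f"predict tail: {head} | {relation}"
    context = " ; ".join(f"{rel} -> {tail}" for rel, tail in neighbors)
    return f"{query} [neighborhood] {context}" if context else query


# Example: complete (Paris, capital_of, ?) given two neighboring facts.
prompt = build_kgc_prompt(
    "Paris",
    "capital_of",
    [("located_in", "Île-de-France"), ("has_landmark", "Eiffel Tower")],
)
print(prompt)
```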
