Search Results for author: Yurii Kuratov

Found 2 papers, 1 paper with code

Memory Representation in Transformer

no code implementations • 1 Jan 2021 • Mikhail Burtsev, Yurii Kuratov, Anton Peganov, Grigory V. Sapunov

Adding trainable memory to selectively store local as well as global representations of a sequence is a promising direction to improve the Transformer model.

Tasks: Language Modelling, Machine Translation, +1
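The abstract above mentions adding trainable memory to a Transformer to store local and global sequence representations. A minimal sketch of that general idea, prepending learnable memory tokens to the input so self-attention can read from and write to them (class name, hyperparameters, and the use of PyTorch's stock `TransformerEncoder` are all illustrative assumptions, not the paper's actual implementation):

```python
import torch
import torch.nn as nn

class MemoryAugmentedEncoder(nn.Module):
    """Sketch (hypothetical): prepend trainable memory tokens to the
    sequence so attention can accumulate global context in them."""

    def __init__(self, d_model=64, num_mem=4, nhead=4, num_layers=2):
        super().__init__()
        # Trainable memory slots, shared across all inputs.
        self.memory = nn.Parameter(torch.randn(num_mem, d_model) * 0.02)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.num_mem = num_mem

    def forward(self, x):
        # x: (batch, seq_len, d_model)
        mem = self.memory.unsqueeze(0).expand(x.size(0), -1, -1)
        h = self.encoder(torch.cat([mem, x], dim=1))
        # Split off the updated memory slots from the token outputs.
        return h[:, self.num_mem:], h[:, :self.num_mem]

model = MemoryAugmentedEncoder()
tokens, mem = model(torch.randn(2, 10, 64))
```

Here the memory participates in every attention layer alongside the ordinary tokens, so it can selectively store sequence-level information without changing the per-token output shape.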