Search Results for author: Anton Peganov

Found 2 papers, 1 paper with code

Memory Representation in Transformer

no code implementations • 1 Jan 2021 • Mikhail Burtsev, Yurii Kuratov, Anton Peganov, Grigory V. Sapunov

Adding trainable memory to selectively store local as well as global representations of a sequence is a promising direction to improve the Transformer model.

Language Modelling • Machine Translation • +2

Memory Transformer

1 code implementation • 20 Jun 2020 • Mikhail S. Burtsev, Yuri Kuratov, Anton Peganov, Grigory V. Sapunov

Adding trainable memory to selectively store local as well as global representations of a sequence is a promising direction to improve the Transformer model.

Language Modelling • Machine Translation • +5
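The abstracts above describe adding a bank of trainable memory vectors that the Transformer can read and write through ordinary self-attention. A minimal sketch of that idea in PyTorch, assuming the simplest variant where memory tokens are prepended to the input sequence (the class name `MemoryTransformer`, the parameter `num_mem`, and all hyperparameters are illustrative, not the authors' actual implementation):

```python
# Hypothetical sketch: trainable memory tokens prepended to the input so a
# standard Transformer encoder can attend to (and update) them jointly with
# the sequence. Not the authors' code; names and sizes are assumptions.
import torch
import torch.nn as nn

class MemoryTransformer(nn.Module):
    def __init__(self, d_model=64, num_mem=4, nhead=4, num_layers=2):
        super().__init__()
        # Trainable memory: learned with the model weights, shared across inputs.
        self.memory = nn.Parameter(torch.randn(num_mem, d_model))
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.num_mem = num_mem

    def forward(self, x):
        # x: (batch, seq_len, d_model) token embeddings.
        mem = self.memory.unsqueeze(0).expand(x.size(0), -1, -1)
        h = self.encoder(torch.cat([mem, x], dim=1))
        # Split the updated memory slots from the token representations.
        return h[:, self.num_mem:], h[:, :self.num_mem]

tokens, mem = MemoryTransformer()(torch.randn(2, 10, 64))
```

Because the memory occupies ordinary sequence positions, every token can store into or retrieve from it via self-attention, giving the model a small global scratchpad alongside its local representations.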
