no code implementations • 26 Jan 2025 • Alexey Rukhovich, Alexander Podolskiy, Irina Piontkovskaya
In multi-domain learning, a single model is trained on diverse data domains to leverage shared knowledge and improve generalization.
no code implementations • 20 Nov 2023 • Andrey Bout, Alexander Podolskiy, Sergey Nikolenko, Irina Piontkovskaya
Progress in neural grammatical error correction (GEC) is hindered by the lack of annotated training data.
1 code implementation • 14 Nov 2023 • Konstantin Yakovlev, Alexander Podolskiy, Andrey Bout, Sergey Nikolenko, Irina Piontkovskaya
Grammatical error correction (GEC) is an important NLP task that is currently usually solved with autoregressive sequence-to-sequence models.
no code implementations • 14 Nov 2023 • Konstantin Yakovlev, Gregory Polyakov, Ilseyar Alimova, Alexander Podolskiy, Andrey Bout, Sergey Nikolenko, Irina Piontkovskaya
A recent trend in multimodal retrieval involves postprocessing test-set results via the dual-softmax loss (DSL).
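As a rough illustration of the postprocessing step mentioned above: DSL-style inference re-weights a query–candidate similarity matrix by the softmax taken along each axis, so that each score is modulated by how strongly the pair prefers each other in both retrieval directions. This is a minimal numpy sketch of that idea, not the authors' implementation; the temperature parameter and function names are assumptions.

```python
import numpy as np

def softmax(x, axis):
    # numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def dual_softmax(sim, temp=1.0):
    """Re-weight a (queries x candidates) similarity matrix by the
    product of softmaxes over both axes (a DSL-style postprocessing sketch)."""
    s = sim / temp
    return softmax(s, axis=0) * softmax(s, axis=1)

# Toy example: 2 text queries vs 2 video candidates.
sim = np.array([[2.0, 1.0],
                [1.0, 3.0]])
scores = dual_softmax(sim)
# Retrieval is then done by argmax over the re-weighted scores.
```

Because the re-weighting mixes information across the whole test batch, it only applies when scores for all candidates are available jointly, which is the test-set dependence the paper examines.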
no code implementations • 20 Mar 2023 • Xiaozhe Ren, Pingyi Zhou, Xinfan Meng, Xinjing Huang, Yadao Wang, Weichao Wang, Pengfei Li, Xiaoda Zhang, Alexander Podolskiy, Grigory Arshinov, Andrey Bout, Irina Piontkovskaya, Jiansheng Wei, Xin Jiang, Teng Su, Qun Liu, Jun Yao
In this work, we develop a system that trains a trillion-parameter language model on a cluster of Ascend 910 AI processors with the MindSpore framework, and present PanGu-Σ, a language model with 1.085T parameters.
1 code implementation • COLING 2020 • Nikolay Arefyev, Boris Sheludko, Alexander Podolskiy, Alexander Panchenko
Lexical substitution, i.e., the generation of plausible words that can replace a particular target word in a given context, is an extremely powerful technology that can serve as a backbone of various NLP applications, including word sense induction and disambiguation, lexical relation extraction, data augmentation, etc.
1 code implementation • 11 Jan 2021 • Alexander Podolskiy, Dmitry Lipin, Andrey Bout, Ekaterina Artemova, Irina Piontkovskaya
In turn, the Mahalanobis distance to in-domain class centroids captures this disparity easily.
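To make the scoring concrete: a standard Mahalanobis-distance detector fits class-conditional means and a shared covariance on in-domain features, then flags an input as out-of-domain when its minimum distance to any class mean is large. The sketch below is a generic numpy version of that recipe under these assumptions, not the paper's exact code; function names are hypothetical.

```python
import numpy as np

def fit_mahalanobis(features, labels):
    """Fit class means and a shared precision matrix on in-domain features."""
    classes = np.unique(labels)
    means = {c: features[labels == c].mean(axis=0) for c in classes}
    # center each sample by its class mean, pool into one shared covariance
    centered = np.vstack([features[labels == c] - means[c] for c in classes])
    cov = centered.T @ centered / len(features)
    prec = np.linalg.pinv(cov)  # pseudo-inverse guards against singular cov
    return means, prec

def ood_score(x, means, prec):
    """Minimum squared Mahalanobis distance to any class mean.
    Larger scores indicate the input is more likely out-of-domain."""
    return min(float((x - m) @ prec @ (x - m)) for m in means.values())

# Toy in-domain data: two tight clusters near (0,0) and (5,5).
feats = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1],
                  [5.0, 5.0], [5.1, 5.0], [5.0, 5.1]])
labs = np.array([0, 0, 0, 1, 1, 1])
means, prec = fit_mahalanobis(feats, labs)
```

A far-away point then scores much higher than one near a cluster, which is what makes thresholding the score a workable OOD test.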
no code implementations • 29 May 2020 • Nikolay Arefyev, Boris Sheludko, Alexander Podolskiy, Alexander Panchenko
Lexical substitution in context is an extremely powerful technology that can be used as a backbone of various NLP applications, such as word sense induction, lexical relation extraction, data augmentation, etc.