no code implementations • 15 Aug 2024 • Lukas Stankevičius, Mantas Lukoševičius
We also evaluate our representation-shaping techniques on other static models, including random token representations.
no code implementations • 29 Jul 2024 • Brigita Vileikytė, Mantas Lukoševičius, Lukas Stankevičius
Despite this, the task remains challenging because of the inherent complexity of languages and the subjective nature of sentiments.
1 code implementation • 18 Mar 2022 • Lukas Stankevičius, Mantas Lukoševičius
Everyone wants to write beautiful and correct text, yet the lack of language skills, experience, or hasty typing can result in errors.
no code implementations • 31 Jan 2022 • Lukas Stankevičius, Mantas Lukoševičius, Jurgita Kapočiūtė-Dzikienė, Monika Briedienė, Tomas Krilavičius
Our approach is also able to restore diacritics in words not seen during training with over 76% accuracy.
1 code implementation • 23 Apr 2021 • Lukas Stankevičius, Mantas Lukoševičius
In this work, we train the first monolingual Lithuanian transformer model on a relatively large corpus of Lithuanian news articles and compare various output decoding algorithms for abstractive news summarization.
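The comparison of output decoding algorithms can be illustrated with a minimal sketch. This is a hypothetical toy example, not the paper's Lithuanian model: a tiny next-token distribution stands in for the trained transformer, and greedy decoding is contrasted with beam search, two of the standard decoding strategies such a comparison would cover.

```python
import math

# Toy next-token model: probabilities depend only on the previous token.
# A stand-in for a real summarization model's softmax output (assumption).
PROBS = {
    "<s>": {"<eos>": 0.1, "a": 0.5, "b": 0.4},
    "a":   {"<eos>": 0.3, "a": 0.1, "b": 0.6},
    "b":   {"<eos>": 0.6, "a": 0.3, "b": 0.1},
}

def greedy_decode(max_len=5):
    """Pick the single most likely token at each step."""
    seq, last = [], "<s>"
    for _ in range(max_len):
        tok = max(PROBS[last], key=PROBS[last].get)
        if tok == "<eos>":
            break
        seq.append(tok)
        last = tok
    return seq

def beam_search(beam_width=2, max_len=5):
    """Keep the beam_width highest log-probability hypotheses at each step."""
    beams = [(0.0, [], False)]  # (cumulative log-prob, tokens, finished?)
    for _ in range(max_len):
        candidates = []
        for logp, toks, done in beams:
            if done:
                candidates.append((logp, toks, True))
                continue
            last = toks[-1] if toks else "<s>"
            for tok, p in PROBS[last].items():
                if tok == "<eos>":
                    candidates.append((logp + math.log(p), toks, True))
                else:
                    candidates.append((logp + math.log(p), toks + [tok], False))
        beams = sorted(candidates, key=lambda c: c[0], reverse=True)[:beam_width]
        if all(done for _, _, done in beams):
            break
    return beams[0][1]

print(greedy_decode())   # locally optimal choice at every step
print(beam_search())     # may prefer a different, globally more probable sequence
```

Even on this toy distribution the two algorithms can return different sequences, which is exactly why comparing decoding strategies matters for abstractive summarization quality.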
no code implementations • 3 Apr 2020 • Lukas Stankevičius, Mantas Lukoševičius
The recent introduction of the Transformer deep learning architecture has led to breakthroughs in various natural language processing tasks.