no code implementations • 15 Aug 2024 • Lukas Stankevičius, Mantas Lukoševičius
We also evaluate our representation-shaping techniques on other static models, including random token representations.
no code implementations • 29 Jul 2024 • Domas Grigaliūnas, Mantas Lukoševičius
It is a modified assisted-generation methodology that makes use of a smaller model's fast generation, a large model's batch prediction, and "stairs" validation in order to achieve a speed-up in prediction generation.
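As a rough illustration of the general assisted-generation idea, the following sketch has a small, fast model draft several tokens and a large model verify them in one pass, accepting the longest agreeing prefix. This is a generic speculative-decoding sketch, not the paper's "stairs" validation; the function names and toy models are hypothetical.

```python
# Hedged sketch of assisted generation: the small model drafts draft_len
# tokens cheaply; the large model checks the whole draft (batched in
# practice) and the agreeing prefix is kept.

def assisted_generate(small_next, large_next, prompt, draft_len=4, max_len=12):
    seq = list(prompt)
    while len(seq) < max_len:
        # 1) Draft tokens cheaply with the small model.
        draft = []
        for _ in range(draft_len):
            draft.append(small_next(seq + draft))
        # 2) Verify the draft against the large model's predictions.
        accepted = []
        for i, tok in enumerate(draft):
            if large_next(seq + draft[:i]) == tok:
                accepted.append(tok)
            else:
                break
        # 3) On a mismatch, keep one large-model token to guarantee progress.
        if len(accepted) < len(draft):
            accepted.append(large_next(seq + accepted))
        seq.extend(accepted)
    return seq[:max_len]

# Toy "models": the large model always predicts the position index;
# the small model agrees with it except at position 5.
large = lambda s: len(s)
small = lambda s: len(s) if len(s) != 5 else -1
print(assisted_generate(small, large, [0, 1, 2]))
```

The output matches what the large model alone would produce, but most tokens were only *verified* by it, which is where the speed-up comes from.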
no code implementations • 29 Jul 2024 • Brigita Vileikytė, Mantas Lukoševičius, Lukas Stankevičius
Despite this, the task remains challenging because of the inherent complexity of languages and the subjective nature of sentiments.
2 code implementations • 24 May 2024 • Bohdan Petryshyn, Mantas Lukoševičius
This study evaluates the OpenAPI completion performance of GitHub Copilot, a prevalent commercial code completion tool, and proposes a set of task-specific optimizations leveraging Meta's open-source model Code Llama.
1 code implementation • 11 Apr 2022 • Mantas Lukoševičius, Arnas Uselis
We propose an elegant, straightforward alternative approach in which the RNN is, in effect, resampled in time to match the timing of the data or the task at hand.
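One common way to make a recurrent unit time-aware is to scale a leaky-integration update by the (possibly irregular) time step of the data, reusing the same trained weights at any sampling rate. The sketch below illustrates that general idea; the update rule and names are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

# Hedged sketch: a leaky-integrator recurrent unit whose effective leak
# per step is scaled by the time step dt, so the same weights can be
# driven at different sampling rates.

def leaky_rnn(W_in, W, x0, inputs, dts, leak=1.0):
    x = x0.copy()
    states = []
    for u, dt in zip(inputs, dts):
        pre = np.tanh(W_in @ u + W @ x)
        a = min(leak * dt, 1.0)        # effective leak for this time step
        x = (1.0 - a) * x + a * pre    # leaky-integration update
        states.append(x.copy())
    return np.array(states)

rng = np.random.default_rng(0)
W_in, W = rng.normal(size=(3, 1)), 0.5 * rng.normal(size=(3, 3))
u = [np.array([0.1])] * 4
coarse = leaky_rnn(W_in, W, np.zeros(3), u, dts=[1.0] * 4)
fine = leaky_rnn(W_in, W, np.zeros(3), u * 2, dts=[0.5] * 8)
# Halving dt while doubling the number of steps traverses a comparable
# span of "physical" time with a finer state trajectory.
print(coarse.shape, fine.shape)
```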
1 code implementation • 18 Mar 2022 • Lukas Stankevičius, Mantas Lukoševičius
Everyone wants to write beautiful and correct text, yet the lack of language skills, experience, or hasty typing can result in errors.
no code implementations • 31 Jan 2022 • Lukas Stankevičius, Mantas Lukoševičius, Jurgita Kapočiūtė-Dzikienė, Monika Briedienė, Tomas Krilavičius
Our approach is also able to restore diacritics in words not seen during training with > 76% accuracy.
no code implementations • 5 Jul 2021 • Rokas Pečiulis, Mantas Lukoševičius, Algimantas Kriščiukaitis, Robertas Petrolis, Dovilė Buteikienė
This work investigates an automatic method for detecting Age-related Macular Degeneration (AMD) lesions in RGB eye fundus images.
1 code implementation • 23 Apr 2021 • Lukas Stankevičius, Mantas Lukoševičius
In this work, we train the first monolingual Lithuanian transformer model on a relatively large corpus of Lithuanian news articles and compare various output decoding algorithms for abstractive news summarization.
1 code implementation • 19 Jun 2020 • Mantas Lukoševičius, Arnas Uselis
The second level of optimization also keeps part (ii) constant irrespective of how large $k$ is, as long as the dimension of the output is low.
1 code implementation • 12 May 2020 • Arnas Uselis, Mantas Lukoševičius, Lukas Stasytis
They can be added to any convolutional layers, easily end-to-end trained, introduce minimal additional complexity, and let CNNs retain most of their benefits to the extent that they are needed.
no code implementations • 3 Apr 2020 • Lukas Stankevičius, Mantas Lukoševičius
The recent introduction of the Transformer deep learning architecture has led to breakthroughs in various natural language processing tasks.
1 code implementation • 22 Aug 2019 • Mantas Lukoševičius, Arnas Uselis
Thus in many situations $k$-fold cross-validation of ESNs can be done for virtually the same time complexity as a simple single split validation.
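The intuition behind near-constant-cost $k$-fold validation of a linear (e.g. ridge-regression) readout can be sketched as follows: the global matrices $X^\top X$ and $X^\top Y$ are accumulated once over all data, and each fold's training matrices are obtained by subtracting that fold's contribution. This is a generic illustration under that assumption, not the authors' exact algorithm.

```python
import numpy as np

# Hedged sketch: k-fold CV of a ridge readout where the expensive
# accumulation of X^T X and X^T Y happens once; per-fold training
# matrices are formed by subtracting the held-out fold's contribution.

def kfold_ridge(X, Y, k, reg=1e-6):
    n, d = X.shape
    folds = np.array_split(np.arange(n), k)
    XtX, XtY = X.T @ X, X.T @ Y          # global accumulation, done once
    errors = []
    for idx in folds:
        Xf, Yf = X[idx], Y[idx]
        # "Remove one fold" by subtraction, then solve the small d x d system.
        A = XtX - Xf.T @ Xf + reg * np.eye(d)
        B = XtY - Xf.T @ Yf
        W = np.linalg.solve(A, B)
        errors.append(np.mean((Xf @ W - Yf) ** 2))
    return float(np.mean(errors))

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))
Y = X @ rng.normal(size=(5, 2)) + 0.01 * rng.normal(size=(200, 2))
print(kfold_ridge(X, Y, k=10))  # small MSE: the data are nearly linear
```

Because each per-fold solve touches only $d \times d$ matrices, the cost of all $k$ folds is dominated by the single global accumulation pass.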