no code implementations • 11 Apr 2022 • Mantas Lukoševičius, Arnas Uselis
We propose an elegant alternative approach in which the RNN is, in effect, resampled in time to match the timing of the data.
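One common way to realize such time resampling is to scale the leak of a leaky-integrator reservoir by the (possibly irregular) time step between samples. The sketch below is illustrative only, assuming that mechanism; the names `W_in`, `W`, and `a` are hypothetical, not the paper's notation.

```python
import numpy as np

def leaky_update(x, u, dt, W_in, W, a=1.0):
    """One state update of a leaky-integrator reservoir where the leak
    is scaled by the time step dt, so the state is effectively
    resampled in time to match irregularly sampled data.
    All names (W_in, W, a) are illustrative assumptions."""
    x_tilde = np.tanh(W_in @ u + W @ x)   # candidate (fully updated) state
    leak = np.clip(a * dt, 0.0, 1.0)      # effective leak for this step
    return (1.0 - leak) * x + leak * x_tilde
```

With `dt = 0` the state is unchanged; with a large `dt` the update approaches a fully leaked (memoryless) step, which is the intuitive limit of resampling in time.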
1 code implementation • 18 Mar 2022 • Lukas Stankevičius, Mantas Lukoševičius
Everyone wants to write beautiful and correct text, yet a lack of language skills or experience, or simply hasty typing, can result in errors.
no code implementations • 31 Jan 2022 • Lukas Stankevičius, Mantas Lukoševičius, Jurgita Kapočiūtė-Dzikienė, Monika Briedienė, Tomas Krilavičius
Our approach is also able to restore diacritics in words not seen during training with > 76% accuracy.
no code implementations • 5 Jul 2021 • Rokas Pečiulis, Mantas Lukoševičius, Algimantas Kriščiukaitis, Robertas Petrolis, Dovilė Buteikienė
This work aims to research an automatic method for detecting Age-related Macular Degeneration (AMD) lesions in RGB eye fundus images.
1 code implementation • 23 Apr 2021 • Lukas Stankevičius, Mantas Lukoševičius
In this work, we train the first monolingual Lithuanian transformer model on a relatively large corpus of Lithuanian news articles and compare various output decoding algorithms for abstractive news summarization.
1 code implementation • 19 Jun 2020 • Mantas Lukoševičius, Arnas Uselis
The second level of optimization also keeps part (ii) constant regardless of how large $k$ is, as long as the output dimension is low.
1 code implementation • 12 May 2020 • Arnas Uselis, Mantas Lukoševičius, Lukas Stasytis
They can be added to any convolutional layers, easily end-to-end trained, introduce minimal additional complexity, and let CNNs retain most of their benefits to the extent that they are needed.
no code implementations • 3 Apr 2020 • Lukas Stankevičius, Mantas Lukoševičius
The recent introduction of the Transformer deep learning architecture has led to breakthroughs in various natural language processing tasks.
1 code implementation • 22 Aug 2019 • Mantas Lukoševičius, Arnas Uselis
Thus, in many situations, $k$-fold cross-validation of ESNs can be done at virtually the same time complexity as a simple single-split validation.
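The key idea can be sketched as follows: accumulate the Gram matrices of the ridge-regression readout once over all collected reservoir states, then obtain each fold's training matrices by subtracting that fold's contribution, so only $k$ small linear systems remain to be solved. This is a minimal illustrative sketch, not the authors' implementation; all names are assumptions.

```python
import numpy as np

def efficient_kfold_ridge(X, Y, k=5, beta=1e-6):
    """Hedged sketch of efficient k-fold CV for an ESN ridge readout.

    X: (n_features, n_samples) collected reservoir states,
    Y: (n_outputs, n_samples) targets.
    The global Gram matrices are computed in one pass; each fold
    reuses them by subtracting its own contribution."""
    n = X.shape[1]
    folds = np.array_split(np.arange(n), k)
    XXT = X @ X.T                       # computed once over all data
    YXT = Y @ X.T
    errors = []
    for idx in folds:
        Xv, Yv = X[:, idx], Y[:, idx]
        XXT_train = XXT - Xv @ Xv.T     # remove validation fold's share
        YXT_train = YXT - Yv @ Xv.T
        # small ridge solve per fold, independent of total data length
        W = YXT_train @ np.linalg.inv(XXT_train + beta * np.eye(X.shape[0]))
        errors.append(np.mean((W @ Xv - Yv) ** 2))
    return float(np.mean(errors))
```

Because the expensive pass over the data happens only once, the cost grows with $k$ only through the $k$ small matrix solves, which matches the single-split-like complexity claimed above.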