no code implementations • 11 Apr 2022 • Mantas Lukoševičius, Arnas Uselis
We propose an elegant alternative approach in which the RNN is, in effect, resampled in time to match the timing of the data.
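One way to read "resampled in time" is to scale a leaky-integrator RNN's effective leak by the actual (possibly irregular) time gap between consecutive samples. The sketch below assumes that leaky-integrator formulation; the function name and the exact update rule are illustrative assumptions, not necessarily the paper's formulation.

```python
import numpy as np

def time_adaptive_step(x, u, dt, W, W_in, leak=1.0):
    """One leaky-integrator RNN update where the effective leak is
    scaled by the time gap `dt` to the current sample.
    Hypothetical sketch: a gap of zero leaves the state unchanged,
    larger gaps move the state further toward the new target."""
    a = min(leak * dt, 1.0)                 # effective leak for this gap
    x_tilde = np.tanh(W_in @ u + W @ x)     # standard RNN pre-state
    return (1.0 - a) * x + a * x_tilde

rng = np.random.default_rng(0)
n, m = 8, 2
W = rng.normal(scale=0.3, size=(n, n))
W_in = rng.normal(scale=0.5, size=(n, m))
x = np.zeros(n)
# Irregularly sampled input: (time gap, input vector) pairs.
for dt, u in [(0.5, rng.normal(size=m)), (2.0, rng.normal(size=m))]:
    x = time_adaptive_step(x, u, dt, W, W_in, leak=0.4)
```

The per-step `dt` is all that changes relative to a standard leaky RNN, so the same trained weights serve all sampling rates.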
1 code implementation • 19 Jun 2020 • Mantas Lukoševičius, Arnas Uselis
The second level of optimization also keeps the cost of part (ii) constant even for large $k$, as long as the output dimension is low.
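One standard way to keep per-fold solves cheap is to invert the regularized global Gram matrix once and remove each validation fold's contribution with the Woodbury identity, at a cost governed by the fold size rather than the full data length. This is a sketch of that generic trick under the assumption of a ridge-regression readout; it is not necessarily the paper's exact second-level optimization.

```python
import numpy as np

def fold_downdate_inverse(A_inv, Xv):
    """Woodbury downdate: inverse of (A - Xv Xv^T) computed from a
    precomputed A_inv. Cost depends on the fold width f = Xv.shape[1],
    not on the total number of training samples."""
    S = A_inv @ Xv                           # n x f
    M = np.eye(Xv.shape[1]) - Xv.T @ S       # f x f capacitance matrix
    return A_inv + S @ np.linalg.solve(M, S.T)

rng = np.random.default_rng(2)
n, T, f = 6, 300, 30
X = rng.normal(size=(n, T))                  # collected states, n x T
beta = 1e-3
A = X @ X.T + beta * np.eye(n)               # regularized global Gram
A_inv = np.linalg.inv(A)                     # computed once for all folds
Xv = X[:, :f]                                # states of one validation fold
A_tr_inv = fold_downdate_inverse(A_inv, Xv)  # training-only inverse, no re-inversion
```

Repeating the downdate for each of the $k$ folds reuses the single `A_inv`, which is the sense in which the expensive part does not grow with $k$.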
1 code implementation • 12 May 2020 • Arnas Uselis, Mantas Lukoševičius, Lukas Stasytis
They can be added to any convolutional layer, are easily trained end-to-end, introduce minimal additional complexity, and let CNNs retain most of their benefits to the extent that they are needed.
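One minimal way such an addition could work is a trainable per-location parameter map added to a convolutional layer's output: the shared kernel keeps the usual CNN weight sharing, while the map lets the layer respond differently at different spatial positions. The sketch below is an illustrative assumption in plain NumPy, not necessarily the paper's exact layer design.

```python
import numpy as np

def conv2d_valid(x, kernel):
    """Plain 'valid' single-channel 2-D convolution, for illustration."""
    kh, kw = kernel.shape
    H, W = x.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * kernel)
    return out

def localized_conv(x, kernel, local_map):
    """Convolution plus a trainable per-location map: the extra
    parameters scale only with the output spatial size, so the
    added complexity stays minimal."""
    out = conv2d_valid(x, kernel)
    assert local_map.shape == out.shape
    return out + local_map

x = np.ones((6, 6))
kernel = np.full((3, 3), 1.0 / 9.0)          # averaging kernel
local_map = np.zeros((4, 4))
local_map[0, 0] = 5.0                        # location-specific offset
y = localized_conv(x, kernel, local_map)
```

Setting the map to zero recovers the ordinary convolution, which is one reading of "retain most of their benefits to the extent that they are needed".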
1 code implementation • 22 Aug 2019 • Mantas Lukoševičius, Arnas Uselis
Thus in many situations $k$-fold cross-validation of ESNs can be done for virtually the same time complexity as a simple single split validation.
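The claim can be illustrated with a ridge-regression readout, which is the standard way ESN outputs are trained: the Gram matrices are accumulated once over the whole dataset, and each fold only subtracts its own small contribution before solving, so the expensive pass over the data is not repeated $k$ times. Function names and the exact fold layout below are illustrative assumptions.

```python
import numpy as np

def kfold_ridge(X, Y, k, beta=1e-6):
    """k-fold CV of a linear (ESN-style) readout Y ~ W X.
    The global Gram matrices G and P are computed once (the costly
    pass over all T samples); each fold then subtracts its own
    contribution and solves a small n x n system."""
    n, T = X.shape
    G = X @ X.T                          # accumulated once, O(n^2 T)
    P = Y @ X.T
    mses = []
    for idx in np.array_split(np.arange(T), k):
        Xv, Yv = X[:, idx], Y[:, idx]
        G_tr = G - Xv @ Xv.T             # remove the validation fold
        P_tr = P - Yv @ Xv.T
        W = P_tr @ np.linalg.inv(G_tr + beta * np.eye(n))
        mses.append(np.mean((W @ Xv - Yv) ** 2))
    return float(np.mean(mses))

rng = np.random.default_rng(1)
X = rng.normal(size=(5, 200))            # stand-in for collected ESN states
Y = np.array([[1.0, 0.0, -1.0, 0.5, 0.0]]) @ X \
    + 0.01 * rng.normal(size=(1, 200))   # linear target plus small noise
err = kfold_ridge(X, Y, k=10)
```

Since the per-fold work involves only small $n \times n$ and fold-sized matrices, the total cost stays close to that of a single train/validation split.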