no code implementations • 9 Feb 2022 • Edo Cohen-Karlik, Avichai Ben David, Nadav Cohen, Amir Globerson
When using recurrent neural networks (RNNs) it is common practice to apply trained models to sequences longer than those seen in training.
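Why such extrapolation is even possible can be seen mechanically: an RNN's parameters are shared across time steps and do not depend on sequence length, so the same cell can be unrolled for arbitrarily many steps. A minimal sketch with a vanilla (untrained, randomly initialized) RNN — not the paper's model — illustrating that the same weights process sequences far longer than a nominal "training length":

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_h = 3, 4  # illustrative input and hidden sizes (assumptions)
W_x = rng.normal(scale=0.1, size=(d_h, d_in))  # input-to-hidden weights
W_h = rng.normal(scale=0.1, size=(d_h, d_h))   # hidden-to-hidden weights

def run_rnn(xs):
    """Unroll the RNN cell over a sequence of any length; return final state."""
    h = np.zeros(d_h)
    for x in xs:
        h = np.tanh(W_x @ x + W_h @ h)  # same weights at every time step
    return h

h_train = run_rnn(rng.normal(size=(10, d_in)))    # "training-length" sequence
h_long = run_rnn(rng.normal(size=(1000, d_in)))   # much longer at test time
print(h_train.shape == h_long.shape)              # state shape is length-independent
```

Whether the unrolled model *behaves* well on the longer sequences — not merely runs — is exactly the extrapolation question the paper studies.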
no code implementations • NeurIPS 2020 • Edo Cohen-Karlik, Avichai Ben David, Amir Globerson
We show that RNNs can be regularized towards permutation invariance, which can yield models that are more compact than non-recurrent architectures.
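One generic way to push an RNN toward permutation invariance (a sketch under assumptions, not necessarily the paper's regularizer) is to add a penalty that measures how much the final state changes when the input sequence is shuffled; the penalty is zero exactly when the network treats its input as a set:

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_h = 3, 4  # illustrative sizes (assumptions)
W_x = rng.normal(scale=0.1, size=(d_h, d_in))
W_h = rng.normal(scale=0.1, size=(d_h, d_h))

def rnn_state(xs):
    """Final hidden state of a vanilla RNN unrolled over xs."""
    h = np.zeros(d_h)
    for x in xs:
        h = np.tanh(W_x @ x + W_h @ h)
    return h

def permutation_penalty(xs, n_perms=8):
    """Mean squared distance between the final state on xs and on random
    permutations of xs; added to the training loss, it pushes the RNN to
    give the same output for every ordering of the inputs."""
    h = rnn_state(xs)
    diffs = [rnn_state(rng.permutation(xs)) - h for _ in range(n_perms)]
    return float(np.mean([d @ d for d in diffs]))

xs = rng.normal(size=(6, d_in))
print(permutation_penalty(xs) >= 0.0)  # nonnegative regularization term
```

In training, this term would be weighted and summed with the task loss; the names `rnn_state` and `permutation_penalty` are hypothetical, chosen here only for illustration.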