Higher Order Recurrent Neural Networks

In this paper, we study novel neural network structures to better model long-term dependencies in sequential data. We propose using more memory units to keep track of more preceding states in recurrent neural networks (RNNs), all of which are recurrently fed back to the hidden layers through different weighted paths...
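The feedback scheme the abstract describes can be sketched as a single recurrence step in which the new hidden state depends on the K most recent hidden states, each contributing through its own weight matrix. This is a minimal NumPy sketch under that assumption; the function name `hornn_step`, the dimensions, and the weight initialization are illustrative, not taken from the paper.

```python
import numpy as np

def hornn_step(x_t, h_hist, W_in, U_list, b):
    """One recurrence step where the new hidden state is fed by the
    K most recent hidden states, each through its own weight matrix U_k."""
    pre = W_in @ x_t + b
    for U_k, h_k in zip(U_list, h_hist):
        pre += U_k @ h_k  # separate weighted path for each past state
    return np.tanh(pre)

# toy dimensions (illustrative, not from the paper)
rng = np.random.default_rng(0)
d_in, d_h, K = 4, 8, 3
W_in = rng.standard_normal((d_h, d_in)) * 0.1
U_list = [rng.standard_normal((d_h, d_h)) * 0.1 for _ in range(K)]
b = np.zeros(d_h)

# run over a short sequence, keeping a window of the last K hidden states
h_hist = [np.zeros(d_h) for _ in range(K)]
for t in range(5):
    x_t = rng.standard_normal(d_in)
    h_t = hornn_step(x_t, h_hist, W_in, U_list, b)
    h_hist = [h_t] + h_hist[:-1]  # shift the history window
```

Compared with a standard RNN, the only structural change is that the single recurrent matrix is replaced by K matrices, one per remembered state, which is what gives the gradient multiple shorter paths back through time.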
