Feed-Forward Networks with Attention Can Solve Some Long-Term Memory Problems

We propose a simplified model of attention that is applicable to feed-forward neural networks. We demonstrate that the resulting model can solve the synthetic "addition" and "multiplication" long-term memory problems for sequence lengths that are both longer and more widely varying than those in the best published results for these tasks.
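
To make the idea concrete, here is a minimal NumPy sketch of one common formulation of feed-forward attention: a learned scoring function produces one scalar per timestep, a softmax over time turns the scores into weights, and the weighted average of the hidden states gives a fixed-size context vector with no recurrence. The specific scoring function `a(h_t) = tanh(h_t · w + b)` and the parameters `w` and `b` are illustrative assumptions, not necessarily the paper's exact parameterization.

```python
import numpy as np

def feed_forward_attention(h, w, b=0.0):
    """Collapse a sequence of hidden states into one context vector.

    h : (T, D) array of per-timestep hidden states
    w : (D,) weight vector of the assumed scoring function a(h_t)
    b : scalar bias of the scoring function

    Returns the context vector c (D,) and attention weights alpha (T,).
    """
    e = np.tanh(h @ w + b)        # e_t = a(h_t): one score per timestep
    alpha = np.exp(e - e.max())   # softmax over time (numerically stable)
    alpha /= alpha.sum()
    c = alpha @ h                 # c = sum_t alpha_t * h_t
    return c, alpha

# Usage: attend over a random 50-step sequence of 16-dim states.
rng = np.random.default_rng(0)
h = rng.normal(size=(50, 16))
w = rng.normal(size=16)
c, alpha = feed_forward_attention(h, w)
```

Because the context vector is an order-independent weighted average, the model can summarize arbitrarily long inputs into a fixed-size representation, which is what lets it handle longer and more variable sequence lengths than a fixed-window feed-forward network.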
