Addressing Some Limitations of Transformers with Feedback Memory

Transformers have been successfully applied to sequential, auto-regressive tasks despite being feedforward networks. Unlike recurrent neural networks, Transformers use attention to capture temporal relations while processing input tokens in parallel. Because each layer can only attend to representations from lower layers at previous timesteps, a standard Transformer never exploits the higher-level representations it has already computed for the past. The paper proposes the Feedback Transformer, which merges all layer outputs at each timestep into a single shared memory, so that even the lowest layer at the current step can attend to the highest-level representations of previous steps. The authors show that this increased representational capacity lets small, shallow models outperform comparable Transformers on language modeling, machine translation, and reinforcement learning benchmarks.
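To make the mechanism concrete, below is a minimal, illustrative sketch of the feedback-memory idea in PyTorch. It is not the authors' released implementation; the class name FeedbackMemoryBlock, the mixing parameter, and all hyperparameters are hypothetical, and details such as normalization placement and relative position encodings are omitted.

import torch
import torch.nn as nn


class FeedbackMemoryBlock(nn.Module):
    """One decoding step: every layer attends to a shared memory built from
    ALL layers of previous timesteps, not just the layer below."""

    def __init__(self, d_model: int, n_layers: int, n_heads: int = 4):
        super().__init__()
        self.layers = nn.ModuleList(
            nn.ModuleDict({
                "attn": nn.MultiheadAttention(d_model, n_heads, batch_first=True),
                "ff": nn.Sequential(nn.Linear(d_model, d_model), nn.ReLU(),
                                    nn.Linear(d_model, d_model)),
                "norm1": nn.LayerNorm(d_model),
                "norm2": nn.LayerNorm(d_model),
            })
            for _ in range(n_layers)
        )
        # Learned weights that mix all layer outputs into one memory vector per step.
        self.mix = nn.Parameter(torch.zeros(n_layers + 1))

    def step(self, x_t: torch.Tensor, memory: torch.Tensor):
        """x_t: (batch, 1, d_model) current token embedding.
        memory: (batch, t, d_model) feedback memory of the t previous timesteps."""
        states = [x_t]
        h = x_t
        for layer in self.layers:
            # Each layer attends over the shared memory plus its current state,
            # so past high-level features are visible even to the lowest layer.
            ctx = torch.cat([memory, h], dim=1) if memory.size(1) > 0 else h
            a, _ = layer["attn"](h, ctx, ctx)
            h = layer["norm1"](h + a)
            h = layer["norm2"](h + layer["ff"](h))
            states.append(h)
        # The new memory entry is a learned convex combination of all layer states.
        weights = torch.softmax(self.mix, dim=0)
        new_mem = sum(w * s for w, s in zip(weights, states))
        return h, torch.cat([memory, new_mem], dim=1)

Unrolled one token at a time (starting from an empty memory of shape (batch, 0, d_model)), this produces a single memory vector per past timestep that all layers share, which is the trade the paper makes: stronger recurrence and representation reuse at the cost of the usual fully parallel training over the time dimension.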


Results from the Paper


Language Modelling on enwik8 (Feedback Transformer):
  Bit per Character (BPC): 0.96 (global rank #3)
  Number of params: 77M (global rank #9)

Language Modelling on Penn Treebank, character level (Feedback Transformer):
  Bit per Character (BPC): 1.160 (global rank #3)
  Number of params: 10.7M (global rank #7)

Language Modelling on WikiText-103 (Feedback Transformer, 4 layers):
  Validation perplexity: 21.4 (global rank #13)
  Test perplexity: 22.4 (global rank #20)
  Number of params: 44M (global rank #17)

Language Modelling on WikiText-103 (Feedback Transformer, 8 layers):
  Validation perplexity: 17.5 (global rank #7)
  Test perplexity: 18.2 (global rank #12)
  Number of params: 139M (global rank #10)

Methods used in the Paper