Multiplicative LSTM for sequence modelling

26 Sep 2016 · Ben Krause, Liang Lu, Iain Murray, Steve Renals

We introduce multiplicative LSTM (mLSTM), a recurrent neural network architecture for sequence modelling that combines the long short-term memory (LSTM) and multiplicative recurrent neural network architectures. mLSTM is characterised by its ability to have different recurrent transition functions for each possible input, which we argue makes it more expressive for autoregressive density estimation...
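The core idea — an input-dependent recurrent transition — can be sketched in a few lines. Below is a minimal NumPy illustration of one mLSTM step, assuming the standard formulation in which a multiplicative intermediate state m (from the mRNN component) replaces the previous hidden state in the LSTM gating equations; the weight names and sizes here are illustrative, not taken from the paper's experiments.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def mlstm_step(x, h_prev, c_prev, params):
    """One mLSTM step: the intermediate state m makes the effective
    hidden-to-hidden transition a function of the current input x."""
    Wmx, Wmh, Wx, Wh, b = params
    # Multiplicative intermediate state (elementwise product of two
    # projections), as in the multiplicative RNN:
    m = (Wmx @ x) * (Wmh @ h_prev)
    # Ordinary LSTM gating, but with m in place of h_prev:
    z = Wx @ x + Wh @ m + b
    i, f, o, g = np.split(z, 4)          # input, forget, output gates + candidate
    c = sigmoid(f) * c_prev + sigmoid(i) * np.tanh(g)
    h = sigmoid(o) * np.tanh(c)
    return h, c

# Toy usage with small random weights (hypothetical dimensions):
rng = np.random.default_rng(0)
nx, nh = 5, 8
params = (rng.normal(size=(nh, nx)) * 0.1,      # Wmx
          rng.normal(size=(nh, nh)) * 0.1,      # Wmh
          rng.normal(size=(4 * nh, nx)) * 0.1,  # Wx (all four gates stacked)
          rng.normal(size=(4 * nh, nh)) * 0.1,  # Wh
          np.zeros(4 * nh))                     # bias
h, c = mlstm_step(rng.normal(size=nx), np.zeros(nh), np.zeros(nh), params)
print(h.shape)  # (8,)
```

Because m is an elementwise product involving x, each distinct input induces a different effective recurrent weight matrix, which is what makes the transition function input-dependent.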


Results from the Paper


TASK | DATASET | MODEL | METRIC | VALUE | GLOBAL RANK
Language Modelling | enwik8 | Large mLSTM | Bit per Character (BPC) | 1.24 | #20
Language Modelling | enwik8 | Large mLSTM | Number of params | 46M | #15
Language Modelling | Hutter Prize | Large mLSTM +emb +WN +VD | Bit per Character (BPC) | 1.24 | #7
Language Modelling | Hutter Prize | Large mLSTM +emb +WN +VD | Number of params | 46M | #5
Language Modelling | Text8 | Unregularised mLSTM | Bit per Character (BPC) | 1.40 | #14
Language Modelling | Text8 | Unregularised mLSTM | Number of params | 45M | #7
Language Modelling | Text8 | Large mLSTM +emb +WN +VD | Bit per Character (BPC) | 1.27 | #11
Language Modelling | Text8 | Large mLSTM +emb +WN +VD | Number of params | 45M | #7
