WS 2018 • Avery Hiebert, Cole Peterson, Alona Fyshe, Nishant Mehta
While Long Short-Term Memory networks (LSTMs) and other recurrent neural networks have been successfully applied to character-level language modeling, the hidden-state dynamics of these models can be difficult to interpret.
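For context, a character-level LSTM consumes one character at a time and updates a hidden state h and cell state c at each step; it is the trajectory of these states that the abstract calls hard to interpret. Below is a minimal pure-Python sketch of a single LSTM cell stepping over a string (illustrative only, not the authors' implementation; the toy vocabulary, hidden size, and random weights are assumptions):

```python
import math
import random

random.seed(0)

VOCAB = "abc"   # toy character vocabulary (assumption, for illustration)
H = 4           # hidden/cell state size (assumption)
X = len(VOCAB)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def rand_matrix(rows, cols):
    return [[random.uniform(-0.1, 0.1) for _ in range(cols)] for _ in range(rows)]

# One weight matrix and bias per gate: input (i), forget (f), output (o), candidate (g).
params = {name: (rand_matrix(H, X + H), [0.0] * H) for name in "ifog"}

def lstm_step(x, h, c):
    """One LSTM step: x is a one-hot character vector; returns the new (h, c)."""
    z = x + h  # concatenate input and previous hidden state
    def gate(name, act):
        W, b = params[name]
        return [act(sum(W[r][k] * z[k] for k in range(len(z))) + b[r]) for r in range(H)]
    i = gate("i", sigmoid)      # input gate
    f = gate("f", sigmoid)      # forget gate
    o = gate("o", sigmoid)      # output gate
    g = gate("g", math.tanh)    # candidate cell update
    c = [f[r] * c[r] + i[r] * g[r] for r in range(H)]
    h = [o[r] * math.tanh(c[r]) for r in range(H)]
    return h, c

def run(text):
    """Feed a string character by character; return the hidden-state trajectory."""
    h, c = [0.0] * H, [0.0] * H
    trajectory = []
    for ch in text:
        x = [1.0 if VOCAB[j] == ch else 0.0 for j in range(X)]
        h, c = lstm_step(x, h, c)
        trajectory.append(h)
    return trajectory

states = run("abba")
print(len(states), len(states[0]))  # 4 steps, each a 4-dimensional hidden state
```

Interpretability work on such models typically inspects `trajectory`: each step's hidden vector is bounded in (-1, 1), and individual coordinates can be plotted over the input string to look for units that track recognizable properties of the text.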