Residual LSTM: Design of a Deep Recurrent Architecture for Distant Speech Recognition

10 Jan 2017 · Jaeyoung Kim, Mostafa El-Khamy, Jungwon Lee

In this paper, residual LSTM, a novel architecture for a deep recurrent neural network, is introduced. A plain LSTM has an internal memory cell that can learn long-term dependencies of sequential data...
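To make the general idea concrete, here is a minimal NumPy sketch of an LSTM step extended with a shortcut path across the layer. This is an illustrative assumption, not the paper's exact formulation (the paper's residual LSTM places the shortcut on the output projection path); the function names and the simple `h + x` shortcut with matching dimensions are choices made here for clarity.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One plain LSTM step; the cell state c carries long-term information."""
    z = W @ x + U @ h_prev + b      # stacked pre-activations for the 4 gates
    H = h_prev.size
    i = sigmoid(z[0:H])             # input gate
    f = sigmoid(z[H:2*H])           # forget gate
    o = sigmoid(z[2*H:3*H])        # output gate
    g = np.tanh(z[3*H:4*H])        # candidate cell update
    c = f * c_prev + i * g          # memory cell update
    h = o * np.tanh(c)              # hidden output
    return h, c

def residual_lstm_layer(x, h_prev, c_prev, W, U, b):
    """Hypothetical residual LSTM layer: the layer input x is added to the
    LSTM output, giving a shortcut across the layer (dimensions assumed equal).
    """
    h, c = lstm_step(x, h_prev, c_prev, W, U, b)
    return h + x, c                 # residual shortcut from input to output

# Tiny usage example with random weights
rng = np.random.default_rng(0)
H = 4
W = rng.standard_normal((4 * H, H)) * 0.1
U = rng.standard_normal((4 * H, H)) * 0.1
b = np.zeros(4 * H)
x = rng.standard_normal(H)
h, c = residual_lstm_layer(x, np.zeros(H), np.zeros(H), W, U, b)
print(h.shape, c.shape)
```

The shortcut lets gradients flow past each recurrent layer, which is the property that makes deep stacks of LSTM layers trainable.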


Methods used in the Paper


METHOD              | TYPE
Sigmoid Activation  | Activation Functions
Tanh Activation     | Activation Functions
LSTM                | Recurrent Neural Networks