Learning Longer-term Dependencies in RNNs with Auxiliary Losses

ICML 2018 · Trieu Trinh, Andrew Dai, Thang Luong, Quoc Le

Despite recent advances in training recurrent neural networks (RNNs), capturing long-term dependencies in sequences remains a fundamental challenge. Most approaches use backpropagation through time (BPTT), which is difficult to scale to very long sequences...
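The title and abstract point at the core idea: attach an unsupervised auxiliary loss to the RNN's main objective so that useful gradient signal is available even when BPTT is truncated. Below is a minimal sketch of one way such an auxiliary reconstruction loss could be wired up in PyTorch; it is not the paper's exact algorithm, and all names (`RNNWithAuxLoss`, `aux_reconstruction_loss`, `segment_len`, `aux_weight`) are illustrative assumptions.

```python
# Hedged sketch: an RNN classifier with an auxiliary loss that reconstructs a
# past segment of the input from the hidden state at a randomly chosen anchor.
# This is an illustration of the general idea, not the authors' implementation.
import torch
import torch.nn as nn

class RNNWithAuxLoss(nn.Module):
    def __init__(self, vocab_size, hidden_size=256, num_classes=10):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        self.rnn = nn.LSTM(hidden_size, hidden_size, batch_first=True)
        self.classifier = nn.Linear(hidden_size, num_classes)   # main task head
        self.decoder = nn.LSTM(hidden_size, hidden_size, batch_first=True)
        self.recon_head = nn.Linear(hidden_size, vocab_size)    # auxiliary head

    def forward(self, tokens):
        # tokens: (batch, seq_len) integer ids
        states, _ = self.rnn(self.embed(tokens))
        logits = self.classifier(states[:, -1])                  # main prediction
        return logits, states

    def aux_reconstruction_loss(self, tokens, states, segment_len=64):
        """Reconstruct a past segment from the hidden state at its endpoint."""
        batch, seq_len = tokens.shape
        # pick a random anchor with at least `segment_len` tokens of history
        end = torch.randint(segment_len, seq_len, (1,)).item()
        segment = tokens[:, end - segment_len:end]               # targets to rebuild
        anchor = states[:, end - 1].unsqueeze(0).contiguous()    # (1, batch, hidden)
        dec_out, _ = self.decoder(self.embed(segment),
                                  (anchor, torch.zeros_like(anchor)))
        recon_logits = self.recon_head(dec_out)
        return nn.functional.cross_entropy(
            recon_logits.reshape(-1, recon_logits.size(-1)), segment.reshape(-1))

# Usage sketch: combine the two losses; `aux_weight` is an assumed hyperparameter.
model = RNNWithAuxLoss(vocab_size=256)
tokens = torch.randint(0, 256, (8, 512))
labels = torch.randint(0, 10, (8,))
logits, states = model(tokens)
main_loss = nn.functional.cross_entropy(logits, labels)
loss = main_loss + 0.5 * model.aux_reconstruction_loss(tokens, states)
loss.backward()
```

In a truncated-BPTT setting, the auxiliary loss is attractive because it provides a learning signal local to the anchor point, so the recurrent weights receive gradients about distant history even when the main-task gradient is cut off after a limited number of steps.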
