1 code implementation • 1 Oct 2019 • Kyu J. Han, Ramon Prieto, Kaixing Wu, Tao Ma
Self-attention has been hugely successful for many downstream tasks in NLP, which has led to exploration of applying self-attention to speech problems as well.
Ranked #24 on Speech Recognition on LibriSpeech test-clean
1 code implementation • 31 Oct 2018 • Yan Yin, Ramon Prieto, Bin Wang, Jianwei Zhou, Yiwei Gu, Yang Liu, Hui Lin
Recent research has shown that attention-based sequence-to-sequence models such as Listen, Attend and Spell (LAS) yield results comparable to state-of-the-art ASR systems on various tasks.