Towards Online End-to-end Transformer Automatic Speech Recognition

The Transformer self-attention network has recently shown promising performance as an alternative to recurrent neural networks in end-to-end (E2E) automatic speech recognition (ASR) systems. However, the Transformer has a drawback: the entire input sequence is required to compute self-attention...
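The drawback above is that standard self-attention lets every frame attend to the whole utterance, including future frames, so decoding cannot begin until the input ends. A common way to make self-attention streamable is to restrict each frame's attention to its own chunk and earlier chunks. The sketch below illustrates that idea with a chunk-wise boolean mask applied to scaled dot-product attention; it is a minimal NumPy illustration of the general technique, not the specific method proposed in this paper, and the function names (`chunkwise_mask`, `masked_self_attention`) and chunk size are assumptions for the example.

```python
import numpy as np

def chunkwise_mask(seq_len, chunk_size):
    """Boolean mask: frame i may attend to frame j only if j's chunk
    index is <= i's chunk index, i.e. no access to future chunks."""
    chunks = np.arange(seq_len) // chunk_size
    return chunks[None, :] <= chunks[:, None]

def masked_self_attention(q, k, v, mask):
    """Single-head scaled dot-product self-attention with a boolean mask."""
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)
    # Positions outside the allowed (past/current-chunk) window get a
    # large negative score, so they receive ~zero attention weight.
    scores = np.where(mask, scores, -1e9)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.standard_normal((8, 4))        # 8 frames, feature dim 4
    m = chunkwise_mask(8, chunk_size=2)
    out = masked_self_attention(x, x, x, m)
    print(out.shape)                        # (8, 4)
```

With `chunk_size=2`, frame 0 can attend to frames 0-1 but not frame 2, so attention for a chunk can be computed as soon as that chunk arrives; the trade-off is reduced right-context, which typically costs some accuracy relative to full-sequence attention.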
