Inspired by these learning patterns in humans, we suggest a simple yet generic task-aware framework that can be incorporated into existing joint learning strategies.
In this paper, we describe end-to-end simultaneous speech-to-text and text-to-text translation systems submitted to the IWSLT 2020 online translation challenge.
In this paper, we describe the system submitted to the IWSLT 2020 Offline Speech Translation Task.
Ranked #3 on Speech-to-Text Translation on MuST-C EN->DE (using extra training data)
More specifically, a time-frequency bin is masked if the filterbank energy in that bin falls below a certain energy threshold.
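A minimal sketch of this masking rule, assuming a log-mel filterbank stored as a NumPy array; the percentile-based threshold is a hypothetical choice for illustration, since the abstract does not state how the energy threshold is set:

```python
import numpy as np

def energy_threshold_mask(fbank, percentile=25.0):
    """Zero out time-frequency bins whose filterbank energy falls
    below a threshold (here: a percentile of the utterance's energies,
    an assumed heuristic for this sketch)."""
    threshold = np.percentile(fbank, percentile)
    return np.where(fbank < threshold, 0.0, fbank)

# Toy filterbank: 4 frames x 3 mel bins.
fbank = np.array([[1.0, 5.0,  9.0],
                  [2.0, 6.0, 10.0],
                  [3.0, 7.0, 11.0],
                  [4.0, 8.0, 12.0]])
masked = energy_threshold_mask(fbank, percentile=25.0)
# The three lowest-energy bins (1.0, 2.0, 3.0) are set to zero.
```

The same thresholding could equally be applied per frequency band or with an absolute energy cutoff; the per-utterance percentile above is just one simple instantiation.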
Soft-attention based Neural Machine Translation (NMT) models have achieved promising results on several translation tasks.
Machine reading comprehension helps machines learn to utilize the vast body of human knowledge written in the form of text.
Ranked #1 on Question Answering on TriviaQA
In recent years, many deep neural networks have been proposed to solve Reading Comprehension (RC) tasks.
Ranked #4 on Question Answering on NarrativeQA
Reading Comprehension (RC) of text is one of the fundamental tasks in natural language processing.
Ranked #66 on Question Answering on SQuAD1.1