Search Results for author: Gokmen Oz

Found 3 papers, 0 papers with code

Knowledge Distillation Transfer Sets and their Impact on Downstream NLU Tasks

no code implementations • 10 Oct 2022 • Charith Peris, Lizhen Tan, Thomas Gueudre, Turan Gojayev, Pan Wei, Gokmen Oz

Yet, the generic corpora used to pretrain the teacher and the corpora associated with the downstream target domain are often significantly different, which raises a natural question: should the student be distilled over the generic corpora, so as to learn from high-quality teacher predictions, or over the downstream task corpora to align with finetuning?

domain classification · intent classification · +5
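The question above — whether to distill over the generic pretraining corpora or the downstream task corpora — concerns only the choice of transfer set; the distillation objective itself is the standard temperature-scaled soft-target loss. A minimal sketch in plain Python (illustrative function names, not the paper's implementation):

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL divergence between teacher and student soft targets.

    The transfer set determines only *which inputs* produce these logits;
    the loss is computed the same way whether those inputs come from the
    generic corpora or the downstream task corpora.
    """
    p = softmax(teacher_logits, temperature)  # teacher soft targets
    q = softmax(student_logits, temperature)  # student soft predictions
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# A student that matches the teacher exactly incurs zero loss.
teacher = [2.0, 0.5, -1.0]
print(distillation_loss(teacher, teacher))              # 0.0
print(distillation_loss(teacher, [0.0, 0.0, 0.0]) > 0)  # True
```

Distilling over the downstream corpora changes the inputs fed to both models (and hence the teacher's soft targets), which is exactly the trade-off the paper studies.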

Using multiple ASR hypotheses to boost i18n NLU performance

no code implementations • ICON 2020 • Charith Peris, Gokmen Oz, Khadige Abboud, Venkata sai Varada, Prashan Wanigasekara, Haidar Khan

For the IC and NER multi-task experiments, when evaluating on the mismatched test set, we see improvements across all domains in German and in 17 out of 19 domains in Portuguese (improvements measured by the change in SeMER scores).

Abstractive Text Summarization · Automatic Speech Recognition · +10
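One simple way to feed multiple ASR hypotheses into a downstream NLU model — a hedged sketch of the general technique, not necessarily this paper's exact method — is to concatenate the n-best list into a single input with a separator token:

```python
def combine_nbest(hypotheses, n=3, sep=" [SEP] "):
    """Join the top-n ASR hypotheses into one NLU input string.

    `hypotheses` is an n-best list ordered best-first; `[SEP]` is an
    illustrative separator in the style of BERT tokenizers (an assumption,
    not taken from the paper).
    """
    return sep.join(hypotheses[:n])

nbest = [
    "play jazz music",   # 1-best hypothesis (may contain recognition errors)
    "play chess music",
    "played jazz music",
]
print(combine_nbest(nbest, n=2))  # play jazz music [SEP] play chess music
```

Exposing alternatives beyond the 1-best hypothesis gives the IC/NER model a chance to recover from recognition errors, which matters most on mismatched test sets like the one evaluated above.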
