no code implementations • ACL 2022 • Samuel Broscheit, Quynh Do, Judith Gaspers
Experiments show that a state-of-the-art BERT-based model suffers performance loss under this drift.
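To make that evaluation setup concrete, here is a toy sketch of how a drift-induced performance gap can be measured: a classifier is trained on data from one period and tested both in-distribution and on a later, shifted slice. The synthetic data, the linear model, and the `make_data`/`accuracy` helpers are illustrative assumptions, not the paper's setup.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

def make_data(n, boundary=0.0):
    # Shifting the label boundary between slices simulates concept drift.
    x = torch.randn(n, 16)
    y = (x[:, 0] > boundary).long()
    return x, y

model = nn.Linear(16, 2)
opt = torch.optim.Adam(model.parameters(), lr=0.05)
loss_fn = nn.CrossEntropyLoss()

x_tr, y_tr = make_data(2000)                  # "old" training slice
for _ in range(200):
    opt.zero_grad()
    loss_fn(model(x_tr), y_tr).backward()
    opt.step()

def accuracy(x, y):
    return (model(x).argmax(dim=1) == y).float().mean().item()

x_old, y_old = make_data(500)                 # same-period test slice
x_new, y_new = make_data(500, boundary=1.0)   # later, drifted test slice
print("in-distribution accuracy:", accuracy(x_old, y_old))
print("post-drift accuracy:     ", accuracy(x_new, y_new))
```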
no code implementations • EACL (AdaptNLP) 2021 • Judith Gaspers, Quynh Do, Tobias Röding, Melanie Bradford
This paper provides the first experimental study on the impact of using domain-specific representations on a BERT-based multi-task spoken language understanding (SLU) model for multi-domain applications.
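As a rough illustration of what a multi-task SLU model with domain-specific representations can look like, the sketch below fuses a per-domain embedding into a shared encoder before two task heads (intent classification and slot filling). The toy encoder stands in for BERT, and all names and sizes are assumptions rather than the paper's architecture.

```python
import torch
import torch.nn as nn

class MultiDomainSLU(nn.Module):
    def __init__(self, vocab=1000, dim=64, n_domains=3, n_intents=10, n_slots=20):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab, dim)          # toy stand-in for BERT
        self.dom_emb = nn.Embedding(n_domains, dim)      # domain-specific signal
        self.intent_head = nn.Linear(2 * dim, n_intents) # utterance-level task
        self.slot_head = nn.Linear(2 * dim, n_slots)     # token-level task

    def forward(self, tokens, domain):
        h = self.tok_emb(tokens)                             # (B, T, D)
        d = self.dom_emb(domain).unsqueeze(1).expand_as(h)   # (B, T, D)
        hd = torch.cat([h, d], dim=-1)                       # fuse domain info
        intent_logits = self.intent_head(hd.mean(dim=1))     # pooled utterance
        slot_logits = self.slot_head(hd)                     # per-token labels
        return intent_logits, slot_logits

model = MultiDomainSLU()
tokens = torch.randint(0, 1000, (2, 7))        # batch of 2 utterances, 7 tokens
domain = torch.tensor([0, 2])                  # each utterance tagged with a domain
intent_logits, slot_logits = model(tokens, domain)
print(intent_logits.shape, slot_logits.shape)  # (2, 10) and (2, 7, 20)
```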
no code implementations • COLING 2020 • Quynh Do, Judith Gaspers, Tobias Röding, Melanie Bradford
This paper investigates to what degree a BERT-based multilingual Spoken Language Understanding (SLU) model can transfer knowledge across languages.
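A minimal sketch of the zero-shot evaluation protocol commonly behind such a study: a task head is trained on top of a shared multilingual encoder using only source-language data, then evaluated directly on target-language data. The frozen linear encoder and the `synth` data generator are synthetic placeholders, not the paper's model or corpora.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
dim, n_intents = 32, 2

encoder = nn.Linear(dim, dim)      # frozen stand-in for a multilingual encoder
for p in encoder.parameters():
    p.requires_grad = False

head = nn.Linear(dim, n_intents)   # classifier trained on source language only
opt = torch.optim.Adam(head.parameters(), lr=0.05)
loss_fn = nn.CrossEntropyLoss()

def synth(n):
    # Synthetic features with a label rule shared across "languages".
    x = torch.randn(n, dim)
    return x, (x[:, 0] + x[:, 1] > 0).long()

x_src, y_src = synth(1000)         # e.g. English training data
for _ in range(200):
    opt.zero_grad()
    loss_fn(head(encoder(x_src)), y_src).backward()
    opt.step()

x_tgt, y_tgt = synth(300)          # e.g. German test data, unseen in training
zero_shot_acc = (head(encoder(x_tgt)).argmax(1) == y_tgt).float().mean()
print(f"zero-shot target-language accuracy: {zero_shot_acc:.2f}")
```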
no code implementations • 6 Aug 2020 • Judith Gaspers, Quynh Do, Fabian Triefenbach
Although data imbalance is increasingly common in real-world Spoken Language Understanding (SLU) applications, it has not been studied extensively in the literature.
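One standard mitigation for such imbalance, shown here purely for illustration (the paper's actual method may differ), is to weight the cross-entropy loss inversely to class frequency so that rare intents contribute proportionally more to the gradient:

```python
import torch
import torch.nn as nn

labels = torch.tensor([0] * 90 + [1] * 8 + [2] * 2)   # heavily skewed intent labels
counts = torch.bincount(labels, minlength=3).float()
weights = counts.sum() / (len(counts) * counts)       # inverse-frequency weights

loss_fn = nn.CrossEntropyLoss(weight=weights)
logits = torch.randn(len(labels), 3)                  # dummy model outputs
print("class weights:", weights)                      # rarest class weighted ~16.7x
print("weighted loss:", loss_fn(logits, labels).item())
```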
no code implementations • IJCNLP 2019 • Quynh Do, Judith Gaspers
A typical cross-lingual transfer learning approach for boosting model performance in one language is to pre-train the model on all available supervised data from another language.
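A minimal sketch of that two-stage recipe, with synthetic stand-ins for the model and both corpora: pre-train on the large source-language set, then fine-tune the same parameters on the small target-language set, typically at a lower learning rate.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 4))
loss_fn = nn.CrossEntropyLoss()

def fit(x, y, epochs, lr):
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()

def synth(n):
    # Synthetic data with a 4-way intent label shared across "languages".
    x = torch.randn(n, 32)
    return x, x[:, :4].argmax(dim=1)

x_src, y_src = synth(5000)              # large source-language corpus
x_tgt, y_tgt = synth(200)               # small target-language corpus

fit(x_src, y_src, epochs=100, lr=1e-2)  # stage 1: pre-train on source language
fit(x_tgt, y_tgt, epochs=50, lr=1e-3)   # stage 2: fine-tune on target, lower lr

x_te, y_te = synth(300)                 # fresh target-language test set
acc = (model(x_te).argmax(dim=1) == y_te).float().mean()
print(f"target-language accuracy after transfer: {acc:.2f}")
```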