no code implementations • LREC 2022 • Rob van der Goot, Max Müller-Eberstein, Barbara Plank
For low-resource syntactic tasks, we observe that both segment embeddings and the choice of multilingual BERT model impact performance.
1 code implementation • 2 Apr 2024 • Maria Barrett, Max Müller-Eberstein, Elisa Bassignana, Amalie Brogaard Pauli, Mike Zhang, Rob van der Goot
Textual domain is a crucial property within the Natural Language Processing (NLP) community due to its effects on downstream model performance.
no code implementations • 25 Oct 2023 • Max Müller-Eberstein, Rob van der Goot, Barbara Plank, Ivan Titov
We identify critical learning phases across tasks and time, during which subspaces emerge, share information, and later disentangle to specialize.
no code implementations • 9 Oct 2023 • Robert Litschko, Max Müller-Eberstein, Rob van der Goot, Leon Weber, Barbara Plank
Language understanding is a multi-faceted cognitive capability, which the Natural Language Processing (NLP) community has striven to model computationally for decades.
1 code implementation • 21 Oct 2022 • Max Müller-Eberstein, Rob van der Goot, Barbara Plank
Linguistic information is encoded at varying timescales (subwords, phrases, etc.).
1 code implementation • 20 Oct 2022 • Elisa Bassignana, Max Müller-Eberstein, Mike Zhang, Barbara Plank
With the increasing availability of large pre-trained language models (LMs) in Natural Language Processing (NLP), it becomes critical to assess their fit for a specific target task a priori, as fine-tuning the entire space of available LMs is computationally prohibitive and unsustainable.
no code implementations • NAACL 2022 • Max Müller-Eberstein, Rob van der Goot, Barbara Plank
Making an informed choice of pre-trained language model (LM) is critical for performance, yet environmentally costly, and as such widely underexplored.
1 code implementation • 13 Apr 2022 • Dennis Ulmer, Elisa Bassignana, Max Müller-Eberstein, Daniel Varab, Mike Zhang, Rob van der Goot, Christian Hardmeier, Barbara Plank
The field of Deep Learning (DL) has undergone explosive growth during the last decade, with a substantial impact on Natural Language Processing (NLP) as well.
1 code implementation • ACL 2022 • Max Müller-Eberstein, Rob van der Goot, Barbara Plank
Probing has become an important tool for analyzing representations in Natural Language Processing (NLP).
1 code implementation • ACL (TLT, SyntaxFest) 2021 • Max Müller-Eberstein, Rob van der Goot, Barbara Plank
This work provides the first in-depth analysis of genre in Universal Dependencies (UD).
1 code implementation • EMNLP 2021 • Max Müller-Eberstein, Rob van der Goot, Barbara Plank
Recent work has shown that monolingual masked language models learn to represent data-driven notions of language variation which can be used for domain-targeted training data selection.