Search Results for author: Max Müller-Eberstein

Found 11 papers, 7 papers with code

Can Humans Identify Domains?

1 code implementation • 2 Apr 2024 • Maria Barrett, Max Müller-Eberstein, Elisa Bassignana, Amalie Brogaard Pauli, Mike Zhang, Rob van der Goot

Textual domain is a crucial property within the Natural Language Processing (NLP) community due to its effects on downstream model performance.

Sentence

Subspace Chronicles: How Linguistic Information Emerges, Shifts and Interacts during Language Model Training

no code implementations • 25 Oct 2023 • Max Müller-Eberstein, Rob van der Goot, Barbara Plank, Ivan Titov

We identify critical learning phases across tasks and time, during which subspaces emerge, share information, and later disentangle to specialize.

Language Modelling • Multi-Task Learning

Establishing Trustworthiness: Rethinking Tasks and Model Evaluation

no code implementations • 9 Oct 2023 • Robert Litschko, Max Müller-Eberstein, Rob van der Goot, Leon Weber, Barbara Plank

Language understanding is a multi-faceted cognitive capability, which the Natural Language Processing (NLP) community has striven to model computationally for decades.

Spectral Probing

1 code implementation • 21 Oct 2022 • Max Müller-Eberstein, Rob van der Goot, Barbara Plank

Linguistic information is encoded at varying timescales (subwords, phrases, etc.).

Informativeness

Evidence > Intuition: Transferability Estimation for Encoder Selection

1 code implementation • 20 Oct 2022 • Elisa Bassignana, Max Müller-Eberstein, Mike Zhang, Barbara Plank

With the increasing availability of large pre-trained language models (LMs) in Natural Language Processing (NLP), it becomes critical to assess their fit for a specific target task a priori, as fine-tuning the entire space of available LMs is computationally prohibitive and unsustainable.

Structured Prediction

Sort by Structure: Language Model Ranking as Dependency Probing

no code implementations • NAACL 2022 • Max Müller-Eberstein, Rob van der Goot, Barbara Plank

Making an informed choice of pre-trained language model (LM) is critical for performance, yet environmentally costly, and as such widely underexplored.

Language Modelling • Structured Prediction

Experimental Standards for Deep Learning in Natural Language Processing Research

1 code implementation • 13 Apr 2022 • Dennis Ulmer, Elisa Bassignana, Max Müller-Eberstein, Daniel Varab, Mike Zhang, Rob van der Goot, Christian Hardmeier, Barbara Plank

The field of Deep Learning (DL) has undergone explosive growth during the last decade, with a substantial impact on Natural Language Processing (NLP) as well.

Probing for Labeled Dependency Trees

1 code implementation • ACL 2022 • Max Müller-Eberstein, Rob van der Goot, Barbara Plank

Probing has become an important tool for analyzing representations in Natural Language Processing (NLP).

Dependency Parsing • Informativeness

Genre as Weak Supervision for Cross-lingual Dependency Parsing

1 code implementation • EMNLP 2021 • Max Müller-Eberstein, Rob van der Goot, Barbara Plank

Recent work has shown that monolingual masked language models learn to represent data-driven notions of language variation which can be used for domain-targeted training data selection.

Dependency Parsing • Sentence
