Search Results for author: Matthias Lindemann

Found 11 papers, 5 papers with code

Automatically Discarding Straplines to Improve Data Quality for Abstractive News Summarization

no code implementations nlppower (ACL) 2022 Amr Keleg, Matthias Lindemann, Danyang Liu, Wanqiu Long, Bonnie L. Webber

Automatic evaluation indicates that removing straplines and noise from the training data of a news summarizer results in higher-quality summaries, with improvements of up to 7 ROUGE points.

News Summarization
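
The snippet does not spell out how straplines are detected; below is a minimal Python sketch of the idea, dropping short, headline-like leading lines from articles before they are used to train the summarizer. The word-count threshold and capitalisation heuristic are illustrative assumptions, not the paper's actual criteria.

    def looks_like_strapline(line: str, max_words: int = 12) -> bool:
        """Heuristic guess: short, mostly capitalised, no sentence-final punctuation."""
        words = line.split()
        if not words or len(words) > max_words:
            return False
        no_final_punct = not line.rstrip().endswith((".", "!", "?"))
        capitalised = sum(w[0].isupper() for w in words if w[0].isalpha()) >= len(words) / 2
        return no_final_punct and capitalised

    def clean_article(article: str) -> str:
        """Strip strapline-like lines from the top of an article, keeping the body."""
        lines = article.splitlines()
        while lines and looks_like_strapline(lines[0]):
            lines.pop(0)
        return "\n".join(lines)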

Cache & Distil: Optimising API Calls to Large Language Models

no code implementations 20 Oct 2023 Guillem Ramírez, Matthias Lindemann, Alexandra Birch, Ivan Titov

To curtail the frequency of these calls, one can employ a smaller language model -- a student -- which is continuously trained on the responses of the LLM.

Active Learning Language Modelling +1
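
To make the setup above concrete, here is a hedged sketch of such a caching loop: the student answers when it is confident, otherwise the query goes to the LLM and the response is kept as distillation data. The student and LLM interfaces, the confidence threshold, and the retraining schedule are assumptions for illustration, not the paper's specific policy (the paper studies active-learning-style selection policies).

    def answer(query, student, llm, buffer, threshold=0.9, retrain_every=100):
        """Route a query to the student if it is confident, otherwise to the LLM."""
        prediction, confidence = student.predict(query)   # assumed student interface
        if confidence >= threshold:
            return prediction                             # cheap path: no API call
        response = llm.generate(query)                    # paid API call
        buffer.append((query, response))                  # keep as distillation data
        if len(buffer) % retrain_every == 0:
            student.fit(buffer)                           # periodically refresh the student
        return response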

Injecting a Structural Inductive Bias into a Seq2Seq Model by Simulation

no code implementations 1 Oct 2023 Matthias Lindemann, Alexander Koller, Ivan Titov

Strong inductive biases enable learning from little data and help generalization outside of the training distribution.

Few-Shot Learning Inductive Bias +1

Compositional Generalization without Trees using Multiset Tagging and Latent Permutations

1 code implementation 26 May 2023 Matthias Lindemann, Alexander Koller, Ivan Titov

Our model outperforms pretrained seq2seq models and prior work on realistic semantic parsing tasks that require generalization to longer examples.

Inductive Bias Semantic Parsing +1

Compositional Generalisation with Structured Reordering and Fertility Layers

1 code implementation 6 Oct 2022 Matthias Lindemann, Alexander Koller, Ivan Titov

Seq2seq models have been shown to struggle with compositional generalisation, i.e., generalising to new and potentially more complex structures than those seen during training.

Semantic Parsing

Fast semantic parsing with well-typedness guarantees

1 code implementation EMNLP 2020 Matthias Lindemann, Jonas Groschwitz, Alexander Koller

AM dependency parsing is a linguistically principled method for neural semantic parsing with high accuracy across multiple graphbanks.

Dependency Parsing Semantic Parsing

Normalizing Compositional Structures Across Graphbanks

1 code implementation COLING 2020 Lucia Donatelli, Jonas Groschwitz, Alexander Koller, Matthias Lindemann, Pia Weißenhorn

The emergence of a variety of graph-based meaning representations (MRs) has sparked an important conversation about how to adequately represent semantic structure.

Multi-Task Learning Semantic Parsing

Saarland at MRP 2019: Compositional parsing across all graphbanks

no code implementations CoNLL 2019 Lucia Donatelli, Meaghan Fowlie, Jonas Groschwitz, Alexander Koller, Matthias Lindemann, Mario Mina, Pia Weißenhorn

We describe the Saarland University submission to the shared task on Cross-Framework Meaning Representation Parsing (MRP) at the 2019 Conference on Computational Natural Language Learning (CoNLL).

Compositional Semantic Parsing Across Graphbanks

1 code implementation ACL 2019 Matthias Lindemann, Jonas Groschwitz, Alexander Koller

Most semantic parsers that map sentences to graph-based meaning representations are hand-designed for specific graphbanks.

Multi-Task Learning Semantic Parsing

Verb-Second Effect on Quantifier Scope Interpretation

no code implementations WS 2019 Asad Sayeed, Matthias Lindemann, Vera Demberg

Sentences like "Every child climbed a tree" have at least two interpretations depending on the precedence order of the universal quantifier and the indefinite.

World Knowledge
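
For reference, the two readings of the example sentence can be written as first-order formulas (standard renditions of the scope ambiguity, not notation taken from the paper):

    % Surface scope (every > a): each child climbed a possibly different tree
    \forall x.\; \mathrm{child}(x) \rightarrow \exists y.\; \mathrm{tree}(y) \land \mathrm{climb}(x, y)
    % Inverse scope (a > every): there is one tree that every child climbed
    \exists y.\; \mathrm{tree}(y) \land \forall x.\; \mathrm{child}(x) \rightarrow \mathrm{climb}(x, y)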

AMR Dependency Parsing with a Typed Semantic Algebra

no code implementations ACL 2018 Jonas Groschwitz, Matthias Lindemann, Meaghan Fowlie, Mark Johnson, Alexander Koller

We present a semantic parser for Abstract Meaning Representations which learns to parse strings into tree representations of the compositional structure of an AMR graph.

Dependency Parsing
