Search Results for author: Michael A. Hedderich

Found 15 papers, 9 papers with code

Meta Self-Refinement for Robust Learning with Weak Supervision

no code implementations • 15 May 2022 • Dawei Zhu, Xiaoyu Shen, Michael A. Hedderich, Dietrich Klakow

However, labels from weak supervision can be rather noisy, and the high capacity of DNNs makes them prone to overfitting the noisy labels.
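A minimal illustrative sketch (not the paper's method) of the overfitting problem this snippet describes: a high-capacity network trained on synthetic data with 30% of the labels randomly flipped will drive its training accuracy on the corrupted labels toward 100%, i.e. it memorizes the noise. All data here is synthetic.

```python
# Toy demonstration that a high-capacity DNN memorizes noisy labels.
import torch
import torch.nn as nn

torch.manual_seed(0)
X = torch.randn(512, 20)                  # toy features
y = (X[:, 0] > 0).long()                  # clean binary labels
flip = torch.rand(512) < 0.3              # flip 30% of the labels
y_noisy = torch.where(flip, 1 - y, y)

model = nn.Sequential(nn.Linear(20, 256), nn.ReLU(), nn.Linear(256, 2))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for step in range(2000):
    opt.zero_grad()
    loss = loss_fn(model(X), y_noisy)
    loss.backward()
    opt.step()

# Accuracy on the *noisy* labels approaches 1.0: the network has
# memorized the corrupted annotations rather than the true signal.
acc_noisy = (model(X).argmax(1) == y_noisy).float().mean()
acc_clean = (model(X).argmax(1) == y).float().mean()
print(f"fit to noisy labels: {acc_noisy:.2f}, agreement with clean labels: {acc_clean:.2f}")
```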

Proceedings of the First Workshop on Weakly Supervised Learning (WeaSuL)

no code implementations • 8 Jul 2021 • Michael A. Hedderich, Benjamin Roth, Katharina Kann, Barbara Plank, Alex Ratner, Dietrich Klakow

Welcome to WeaSuL 2021, the First Workshop on Weakly Supervised Learning, co-located with ICLR 2021.

ANEA: Distant Supervision for Low-Resource Named Entity Recognition

1 code implementation • 25 Feb 2021 • Michael A. Hedderich, Lukas Lange, Dietrich Klakow

Distant supervision allows obtaining labeled training corpora for low-resource settings where only limited hand-annotated data exists.

Low Resource Named Entity Recognition · Named Entity Recognition
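A hedged sketch of the general idea behind lexicon-based distant supervision for NER, as the snippet above describes (this is not ANEA's actual implementation): token spans matching an entity gazetteer receive BIO tags automatically, so a labeled corpus is produced without hand annotation. The gazetteer entries here are hypothetical examples.

```python
# Greedy longest-match distant labeling against a (hypothetical) gazetteer.
gazetteer = {
    ("new", "york"): "LOC",
    ("google",): "ORG",
}

def distant_label(tokens):
    """Assign BIO tags by matching lowercased token spans, longest first."""
    tags = ["O"] * len(tokens)
    i = 0
    while i < len(tokens):
        matched = False
        for length in range(min(3, len(tokens) - i), 0, -1):
            span = tuple(t.lower() for t in tokens[i:i + length])
            if span in gazetteer:
                label = gazetteer[span]
                tags[i] = f"B-{label}"
                for j in range(i + 1, i + length):
                    tags[j] = f"I-{label}"
                i += length
                matched = True
                break
        if not matched:
            i += 1
    return tags

print(distant_label(["She", "moved", "from", "New", "York", "to", "Google"]))
# ['O', 'O', 'O', 'B-LOC', 'I-LOC', 'O', 'B-ORG']
```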

Analysing the Noise Model Error for Realistic Noisy Label Data

3 code implementations • 24 Jan 2021 • Michael A. Hedderich, Dawei Zhu, Dietrich Klakow

Distant and weak supervision make it possible to obtain large amounts of labeled training data quickly and cheaply, but these automatic annotations tend to contain many errors.
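A minimal sketch of the kind of noise model whose estimation error the paper analyses: a confusion matrix p(noisy label | clean label) estimated from a small subset that carries both automatic and hand-checked labels. The data below is synthetic for illustration.

```python
# Estimate a row-normalized label-noise matrix from a small clean subset.
import numpy as np

rng = np.random.default_rng(0)
num_classes = 3
clean = rng.integers(0, num_classes, size=200)    # gold labels on a small subset
# simulate noisy automatic annotation: 25% of labels resampled uniformly
flip = rng.random(200) < 0.25
noisy = np.where(flip, rng.integers(0, num_classes, size=200), clean)

# entry [c, n] approximates p(noisy = n | clean = c)
counts = np.zeros((num_classes, num_classes))
np.add.at(counts, (clean, noisy), 1)
noise_matrix = counts / counts.sum(axis=1, keepdims=True)
print(np.round(noise_matrix, 2))
```

With only a small clean subset, these estimated probabilities can deviate substantially from the true noise process, which is the error the paper studies.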

On the Interplay Between Fine-tuning and Sentence-level Probing for Linguistic Knowledge in Pre-trained Transformers

no code implementations • EMNLP (BlackboxNLP) 2020 • Marius Mosbach, Anna Khokhlova, Michael A. Hedderich, Dietrich Klakow

Our analysis reveals that while fine-tuning indeed changes the representations of a pre-trained model, and these changes are typically larger for higher layers, only in very few cases does fine-tuning have a positive effect on probing accuracy that exceeds simply using the pre-trained model with a strong pooling method.
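A hedged sketch of sentence-level probing as referred to above: per-token hidden states from a frozen encoder are pooled into a sentence vector, and a small linear probe is trained on top. The hidden states below are random stand-ins so the sketch runs without downloading a pre-trained model.

```python
# Sentence-level probing: pool frozen representations, fit a linear probe.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# stand-in for frozen per-token hidden states: (n_sentences, n_tokens, dim)
hidden_states = rng.normal(size=(300, 12, 64))
labels = rng.integers(0, 2, size=300)        # a binary linguistic property

# a simple pooling method: mean over tokens
pooled = hidden_states.mean(axis=1)

probe = LogisticRegression(max_iter=1000).fit(pooled[:200], labels[:200])
print("probing accuracy:", probe.score(pooled[200:], labels[200:]))
```

Comparing this probe's accuracy before and after fine-tuning the encoder is the kind of measurement the paper's conclusion rests on.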

Learning Functions to Study the Benefit of Multitask Learning

no code implementations • 9 Jun 2020 • Gabriele Bettgenhäuser, Michael A. Hedderich, Dietrich Klakow

Although multitask learning has achieved improved performance on some problems, there are also tasks that lose performance when trained together.

Mathematical Proofs
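A hedged sketch of hard parameter sharing, the standard multitask setup the snippet above refers to (not the paper's specific construction): two task heads sit on one shared trunk, so both task losses backpropagate through the same parameters. Whether those shared gradients help or interfere is exactly the benefit question at stake.

```python
# Hard parameter sharing: one shared trunk, one head per task.
import torch
import torch.nn as nn

class MultitaskNet(nn.Module):
    def __init__(self, in_dim=16, hidden=64):
        super().__init__()
        self.shared = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.head_a = nn.Linear(hidden, 1)   # regression head for task A
        self.head_b = nn.Linear(hidden, 1)   # regression head for task B

    def forward(self, x):
        h = self.shared(x)
        return self.head_a(h), self.head_b(h)

model = MultitaskNet()
x = torch.randn(8, 16)
ya, yb = model(x)
# both task losses flow through the shared trunk
loss = nn.functional.mse_loss(ya, torch.randn(8, 1)) + \
       nn.functional.mse_loss(yb, torch.randn(8, 1))
loss.backward()
```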

Feature-Dependent Confusion Matrices for Low-Resource NER Labeling with Noisy Labels

1 code implementation • IJCNLP 2019 • Lukas Lange, Michael A. Hedderich, Dietrich Klakow

In low-resource settings, the performance of supervised labeling models can be improved with automatically annotated or distantly supervised data, which is cheap to create but often noisy.

Low Resource Named Entity Recognition · Named Entity Recognition · +3
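A hedged sketch of the confusion-matrix idea in this line of work, simplified rather than the paper's exact architecture: a base classifier's clean-label distribution is passed through a noise channel whose confusion matrix is predicted from the input features, so the combined model can be trained directly on noisy labels while the base classifier is used at test time.

```python
# Feature-dependent noise channel on top of a base classifier.
import torch
import torch.nn as nn

class NoisyChannelClassifier(nn.Module):
    def __init__(self, in_dim=32, num_classes=5, hidden=64):
        super().__init__()
        self.base = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(),
                                  nn.Linear(hidden, num_classes))
        # maps features to a (num_classes x num_classes) confusion matrix
        self.noise = nn.Linear(in_dim, num_classes * num_classes)
        self.num_classes = num_classes

    def forward(self, x):
        p_clean = self.base(x).softmax(dim=-1)                  # (B, C)
        conf = self.noise(x).view(-1, self.num_classes, self.num_classes)
        conf = conf.softmax(dim=-1)                             # rows: p(noisy|clean)
        # marginalize over the latent clean label: p(noisy) = p_clean @ conf
        p_noisy = torch.bmm(p_clean.unsqueeze(1), conf).squeeze(1)
        return p_clean, p_noisy

model = NoisyChannelClassifier()
x = torch.randn(4, 32)
p_clean, p_noisy = model(x)
# train against the *noisy* labels; use p_clean for prediction at test time
loss = nn.functional.nll_loss(p_noisy.clamp_min(1e-8).log(),
                              torch.randint(0, 5, (4,)))
loss.backward()
```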

Using Multi-Sense Vector Embeddings for Reverse Dictionaries

1 code implementation • WS 2019 • Michael A. Hedderich, Andrew Yates, Dietrich Klakow, Gerard de Melo

However, they typically cannot serve as a drop-in replacement for conventional single-sense embeddings, because the correct sense vector needs to be selected for each word.
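A hedged sketch of the sense-selection problem the snippet describes (one common strategy, not necessarily the paper's): each word carries several sense vectors, and one is chosen per occurrence by cosine similarity to a context vector. All vectors below are random stand-ins.

```python
# Select the sense vector most similar to the context.
import numpy as np

rng = np.random.default_rng(0)
dim = 50
# hypothetical multi-sense embedding table: word -> (n_senses, dim)
sense_vectors = {"bank": rng.normal(size=(2, dim))}

def select_sense(word, context_vec):
    """Pick the sense with the highest cosine similarity to the context."""
    senses = sense_vectors[word]
    sims = senses @ context_vec / (
        np.linalg.norm(senses, axis=1) * np.linalg.norm(context_vec))
    return senses[np.argmax(sims)]

context = rng.normal(size=dim)   # e.g. the mean of surrounding word vectors
vec = select_sense("bank", context)
print(vec.shape)                 # (50,)
```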

Training a Neural Network in a Low-Resource Setting on Automatically Annotated Noisy Data

1 code implementation • WS 2018 • Michael A. Hedderich, Dietrich Klakow

Manually labeled corpora are expensive to create and often not available for low-resource languages or domains.

NER
