no code implementations • RepL4NLP (ACL) 2022 • Hassan Soliman, Heike Adel, Mohamed H. Gad-Elrab, Dragan Milchevski, Jannik Strötgen
In particular, we represent the entities of different KGs in a joint vector space and address the questions of which data is best suited for creating and fine-tuning that space, and whether fine-tuning harms performance on the general domain.
1 code implementation • EMNLP (Eval4NLP) 2020 • Hanna Wecker, Annemarie Friedrich, Heike Adel
This paper adds to the ongoing discussion in the natural language processing community on how to choose a good development set.
no code implementations • 3 Oct 2024 • Mingyang Wang, Lukas Lange, Heike Adel, Jannik Strötgen, Hinrich Schütze
Evaluations on three model editing benchmarks show that SAUL is a practical and reliable solution for model editing, outperforming state-of-the-art methods while maintaining generation quality and reducing computational overhead.
1 code implementation • 12 Sep 2024 • Zihang Peng, Daria Stepanova, Vinh Thinh Ho, Heike Adel, Alessandra Russo, Simon Ott
In this work, our goal is to verify to what extent exploiting LMs helps improve the quality of rule learning systems.
no code implementations • 26 Jun 2024 • Mingyang Wang, Heike Adel, Lukas Lange, Jannik Strötgen, Hinrich Schütze
In real-world environments, continual learning is essential for machine learning models, as they need to acquire new knowledge incrementally without forgetting what they have already learned.
2 code implementations • 29 Apr 2024 • Wei Zhou, Mohsen Mesgar, Heike Adel, Annemarie Friedrich
To investigate these aspects, we create and publish a novel TQA evaluation benchmark in English.
1 code implementation • 31 Mar 2024 • Mingyang Wang, Heike Adel, Lukas Lange, Jannik Strötgen, Hinrich Schütze
Continual learning aims at incrementally acquiring new knowledge while not forgetting existing knowledge.
no code implementations • 8 Mar 2024 • Wei Zhou, Heike Adel, Hendrik Schuff, Ngoc Thang Vu
Attribution scores indicate the importance of different input parts and can, thus, explain model behaviour.
no code implementations • 23 Oct 2023 • Mingyang Wang, Heike Adel, Lukas Lange, Jannik Strötgen, Hinrich Schütze
However, not all languages positively influence each other, and it is an open research question how to select the most suitable set of languages for multilingual training and avoid negative interference among languages whose characteristics or data distributions are not compatible.
1 code implementation • 4 May 2023 • Alon Jacovi, Hendrik Schuff, Heike Adel, Ngoc Thang Vu, Yoav Goldberg
Word-level saliency explanations ("heat maps over words") are often used to communicate feature-attribution in text-based models.
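The entry above concerns word-level saliency explanations ("heat maps over words"). As a minimal illustration of the concept, the following sketch normalizes raw attribution scores and renders a crude text heat map; all function names here are hypothetical and this is not the paper's method.

```python
def normalize_saliency(scores):
    """Scale raw attribution scores to [0, 1] so they can be mapped to colors."""
    lo, hi = min(scores), max(scores)
    if hi == lo:
        return [0.0] * len(scores)
    return [(s - lo) / (hi - lo) for s in scores]

def render_heatmap(words, scores, levels=" .:*#"):
    """Crude text rendering: an intensity character after each word."""
    norm = normalize_saliency(scores)
    return " ".join(f"{w}[{levels[int(round(n * (len(levels) - 1)))]}]"
                    for w, n in zip(words, norm))
```

In practice the scores would come from an attribution method such as gradient saliency, and the rendering would be a color overlay rather than characters.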
no code implementations • 28 Apr 2023 • Mingyang Wang, Heike Adel, Lukas Lange, Jannik Strötgen, Hinrich Schütze
In this work, we propose to leverage language-adaptive and task-adaptive pretraining on African texts and study transfer learning with source language selection on top of an African language-centric pretrained language model.
1 code implementation • 14 Feb 2023 • Koustava Goswami, Lukas Lange, Jun Araki, Heike Adel
Prompting pre-trained language models leads to promising results across natural language processing tasks but is less effective when applied in low-resource domains, due to the domain gap between the pre-training data and the downstream task.
no code implementations • 13 Oct 2022 • Hendrik Schuff, Heike Adel, Peng Qi, Ngoc Thang Vu
This approach assumes that explanations which reach higher proxy scores will also provide a greater benefit to human users.
1 code implementation • 20 May 2022 • Lukas Lange, Jannik Strötgen, Heike Adel, Dietrich Klakow
The detection and normalization of temporal expressions is an important task and preprocessing step for many applications.
1 code implementation • 27 Jan 2022 • Hendrik Schuff, Alon Jacovi, Heike Adel, Yoav Goldberg, Ngoc Thang Vu
In this work, we focus on this question through a study of saliency-based explanations over textual data.
1 code implementation • 16 Dec 2021 • Lukas Lange, Heike Adel, Jannik Strötgen, Dietrich Klakow
The field of natural language processing (NLP) has recently seen a large change towards using pre-trained language models for solving almost any task.
no code implementations • 17 Sep 2021 • Lukas Lange, Heike Adel, Jannik Strötgen
In this paper, we explore possible improvements of transformer models in a low-resource setting.
1 code implementation • EMNLP (BlackboxNLP) 2021 • Hendrik Schuff, Hsiu-Yu Yang, Heike Adel, Ngoc Thang Vu
For this, we investigate different sources of external knowledge and evaluate the performance of our models on in-domain data as well as on special transfer datasets that are designed to assess fine-grained reasoning capabilities.
no code implementations • 26 Jul 2021 • Hendrik Schuff, Heike Adel, Ngoc Thang Vu
In addition, we conduct a qualitative analysis of thought flow correction patterns and explore how thought flow predictions affect human users within a crowdsourcing study.
no code implementations • 22 Apr 2021 • Heike Adel, Jannik Strötgen
The performance of relation extraction models has increased considerably with the rise of neural networks.
1 code implementation • EMNLP 2021 • Lukas Lange, Jannik Strötgen, Heike Adel, Dietrich Klakow
For this, we study the effects of model transfer on sequence labeling across various domains and tasks, and show that our methods based on model similarity and support vector machines are able to predict promising sources, resulting in performance increases of up to 24 F1 points.
1 code implementation • NAACL 2021 • Michael A. Hedderich, Lukas Lange, Heike Adel, Jannik Strötgen, Dietrich Klakow
Deep neural networks and huge language models are becoming omnipresent in natural language applications.
no code implementations • 23 Oct 2020 • Lukas Lange, Xiang Dai, Heike Adel, Jannik Strötgen
The recognition and normalization of clinical information, such as tumor morphology mentions, is an important, but complex process consisting of multiple subtasks.
1 code implementation • EMNLP 2021 • Lukas Lange, Heike Adel, Jannik Strötgen, Dietrich Klakow
Combining several embeddings typically improves performance in downstream tasks as different embeddings encode different information.
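The entry above notes that combining several embeddings tends to help because different embeddings encode different information. As a hedged sketch of the simplest combination strategies (the paper itself may use a learned combination), concatenation and dimension-padded averaging look like this; all names are illustrative.

```python
import numpy as np

def concat_meta_embedding(embs):
    """Combine multiple embeddings of the same token by concatenation."""
    return np.concatenate(embs)

def avg_meta_embedding(embs):
    """Average embeddings after zero-padding them to a common dimension."""
    d = max(e.shape[0] for e in embs)
    padded = [np.pad(e, (0, d - e.shape[0])) for e in embs]
    return np.mean(padded, axis=0)
```

Concatenation preserves all information but grows the input dimension; averaging keeps it fixed at the cost of mixing the spaces.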
3 code implementations • COLING 2020 • Xiang Dai, Heike Adel
Simple yet effective data augmentation techniques have been proposed for sentence-level and sentence-pair natural language processing tasks.
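The entry above concerns data augmentation for token-level tasks, where the challenge is keeping token labels aligned after modifying the sentence. A minimal sketch of one such technique, synonym replacement with label preservation, is shown below; the function and dictionary are hypothetical illustrations, not the paper's exact methods.

```python
import random

def replace_tokens(tokens, labels, synonyms, p=0.5, seed=0):
    """Token-level augmentation: with probability p, replace a token that
    has known synonyms, keeping the label sequence unchanged."""
    rng = random.Random(seed)
    out = []
    for tok in tokens:
        if tok in synonyms and rng.random() < p:
            out.append(rng.choice(synonyms[tok]))
        else:
            out.append(tok)
    return out, list(labels)
```

Because each token is replaced one-for-one, the labels stay valid, which is what makes such techniques applicable to sequence labeling rather than only sentence-level tasks.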
1 code implementation • EMNLP 2020 • Hendrik Schuff, Heike Adel, Ngoc Thang Vu
The user study shows that our models increase the ability of the users to judge the correctness of the system and that scores like F1 are not enough to estimate the usefulness of a model in a practical setting with human users.
no code implementations • 2 Jul 2020 • Lukas Lange, Heike Adel, Jannik Strötgen
Natural language processing has huge potential in the medical domain, which has recently led to a lot of research in this field.
no code implementations • WS 2019 • Lukas Lange, Heike Adel, Jannik Strötgen
Named entity recognition has been extensively studied on English news texts.
1 code implementation • ACL 2020 • Annemarie Friedrich, Heike Adel, Federico Tomazic, Johannes Hingerl, Renou Benteau, Anika Maruscyk, Lukas Lange
With this paper, we publish our annotation guidelines, as well as our SOFC-Exp corpus consisting of 45 open-access scholarly articles annotated by domain experts.
no code implementations • WS 2020 • Lukas Lange, Anastasiia Iurshina, Heike Adel, Jannik Strötgen
Although temporal tagging is still dominated by rule-based systems, there have been recent attempts at neural temporal taggers.
Ranked #1 on Temporal Tagging on Catalan TimeBank 1.0
no code implementations • WS 2020 • Lukas Lange, Heike Adel, Jannik Strötgen
Recent work showed that embeddings from related languages can improve the performance of sequence tagging, even for monolingual models.
1 code implementation • ACL 2020 • Lukas Lange, Heike Adel, Jannik Strötgen
Exploiting natural language processing in the clinical domain requires de-identification, i.e., anonymization of personal information in texts.
no code implementations • 1 Oct 2019 • Heike Adel, Hinrich Schütze
In particular, we explore different ways of integrating the named entity types of the relation arguments into a neural network for relation classification, including a joint training and a structured prediction approach.
no code implementations • NAACL 2019 • Robert McHardy, Heike Adel, Roman Klinger
We therefore propose a novel model for satire detection with an adversarial component to control for the confounding variable of publication source.
no code implementations • 6 Nov 2018 • Heike Adel, Hinrich Schütze
In particular, it focuses on the coreference and classification components.
no code implementations • EMNLP 2018 • Heike Adel, Laura Ana Maria Bostan, Sean Papay, Sebastian Padó, Roman Klinger
As a result, comparability of models across tasks is missing and their applicability to new tasks is limited.
no code implementations • 14 Aug 2018 • Heike Adel, Anton Bryl, David Weiss, Aliaksei Severyn
We study cross-lingual sequence tagging with little or no labeled data in the target language.
no code implementations • NAACL 2019 • Apostolos Kemos, Heike Adel, Hinrich Schütze
Character-level models of tokens have been shown to be effective at dealing with within-token noise and out-of-vocabulary words.
no code implementations • 26 Oct 2017 • Heike Adel, Hinrich Schütze
In this paper, we demonstrate the importance of coreference resolution for natural language processing on the example of the TAC Slot Filling shared task.
no code implementations • 4 Oct 2017 • Heike Adel, Ngoc Thang Vu, Katrin Kirchhoff, Dominic Telaar, Tanja Schultz
The experimental results reveal that Brown word clusters, part-of-speech tags and open-class words are the most effective at reducing the perplexity of factored language models on the Mandarin-English Code-Switching corpus SEAME.
no code implementations • 7 Aug 2017 • Yadollah Yaghoobzadeh, Heike Adel, Hinrich Schütze
This paper addresses the problem of corpus-level entity typing, i.e., inferring from a large corpus that an entity is a member of a class such as "food" or "artist".
no code implementations • EMNLP 2017 • Heike Adel, Hinrich Schütze
We introduce globally normalized convolutional neural networks for joint entity classification and relation extraction.
no code implementations • EACL 2017 • Heike Adel, Francine Chen, Yan-Ying Chen
Users often use social media to share their interest in products.
no code implementations • EACL 2017 • Yadollah Yaghoobzadeh, Heike Adel, Hinrich Schütze
For the second noise type, we propose ways to improve the integration of noisy entity type predictions into relation extraction.
no code implementations • EACL 2017 • Heike Adel, Hinrich Schütze
Neural networks with attention have proven effective for many natural language processing tasks.
no code implementations • 3 Oct 2016 • Hinrich Schuetze, Heike Adel, Ehsaneddin Asgari
We introduce the first generic text representation model that is completely nonsymbolic, i.e., it does not require the availability of a segmentation or tokenization method that attempts to identify words or other symbolic units in text.
no code implementations • NAACL 2016 • Ngoc Thang Vu, Heike Adel, Pankaj Gupta, Hinrich Schütze
This paper investigates two different neural architectures for the task of relation classification: convolutional neural networks and recurrent neural networks.
no code implementations • NAACL 2016 • Heike Adel, Benjamin Roth, Hinrich Schütze
We address relation classification in the context of slot filling, the task of finding and evaluating fillers like "Steve Jobs" for the slot X in "X founded Apple".