no code implementations • EACL (AdaptNLP) 2021 • Sanjeev Kumar Karn, Francine Chen, Yan-Ying Chen, Ulli Waltinger, Hinrich Schütze
Interleaved texts, where posts belonging to different threads occur in a sequence, are common in online chat, making it time-consuming to obtain an overview of the discussions.
1 code implementation • Findings of the Association for Computational Linguistics 2020 • Nina Poerner, Ulli Waltinger, Hinrich Schütze
Domain adaptation of Pretrained Language Models (PTLMs) is typically achieved by unsupervised pretraining on target-domain text.
no code implementations • 15 Jan 2020 • Florian Buettner, John Piorkowski, Ian McCulloh, Ulli Waltinger
To facilitate the widespread acceptance of AI systems guiding decision-making in real-world applications, it is key that solutions comprise trustworthy, integrated human-AI systems.
1 code implementation • Findings of the Association for Computational Linguistics 2020 • Nina Poerner, Ulli Waltinger, Hinrich Schütze
We present a novel way of injecting factual knowledge about entities into the pretrained BERT model (Devlin et al., 2019): We align Wikipedia2Vec entity vectors (Yamada et al., 2016) with BERT's native wordpiece vector space and use the aligned entity vectors as if they were wordpiece vectors.
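The alignment step described above can be sketched as a linear least-squares fit: given vectors for words that exist in both spaces, fit a linear map from the entity-vector space into the wordpiece space, then project entity vectors through it. The dimensions, data, and variable names below are illustrative toy values, not the paper's actual setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins: a "source" space (e.g. Wikipedia2Vec) and a "target"
# space (e.g. BERT wordpieces), linked by vectors for a shared vocabulary.
d_src, d_tgt, n_shared = 8, 6, 100
X = rng.normal(size=(n_shared, d_src))   # source vectors for shared words
W_true = rng.normal(size=(d_src, d_tgt))
Y = X @ W_true                           # target vectors for the same words

# Fit W minimizing ||X W - Y||_F (ordinary least squares).
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# An entity vector from the source space can now be projected into the
# target space and used as if it were a wordpiece vector.
entity_vec = rng.normal(size=d_src)
aligned = entity_vec @ W                 # lives in the target space
print(aligned.shape)                     # (6,)
```

A linear map is the simplest choice here; it keeps the aligned vectors usable by the target model without retraining it.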
no code implementations • ACL 2020 • Nina Poerner, Ulli Waltinger, Hinrich Schütze
We address the task of unsupervised Semantic Textual Similarity (STS) by ensembling diverse pre-trained sentence encoders into sentence meta-embeddings.
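One simple way to ensemble sentence encoders into a meta-embedding is concatenation of the normalized outputs of each encoder. The sketch below uses two trivial hand-rolled "encoders" (character counts and word-length statistics) purely as stand-ins for pretrained models; none of these functions come from the paper.

```python
import numpy as np

def normalize(v):
    n = np.linalg.norm(v)
    return v / n if n > 0 else v

# Stand-in "encoders" — real systems would use pretrained sentence encoders.
def char_encoder(sent):
    v = np.zeros(26)
    for ch in sent.lower():
        if ch.isalpha():
            v[ord(ch) - ord('a')] += 1
    return v

def word_encoder(sent):
    words = sent.split()
    return np.array([len(words), np.mean([len(w) for w in words])])

def meta_embed(sent):
    # Concatenation meta-embedding: normalize each encoder's output first
    # so no single encoder dominates the similarity score.
    return np.concatenate([normalize(char_encoder(sent)),
                           normalize(word_encoder(sent))])

def sts_score(a, b):
    # Unsupervised STS: cosine similarity of the meta-embeddings.
    return float(np.dot(normalize(meta_embed(a)), normalize(meta_embed(b))))
```

Normalizing per encoder before concatenating puts encoders with very different output scales on an equal footing, which is the usual motivation for this construction.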
no code implementations • 25 Sep 2019 • Sanjeev Kumar Karn, Francine Chen, Yan-Ying Chen, Ulli Waltinger, Hinrich Schütze
The interleaved posts are encoded hierarchically, i.e., word-to-word (words in a post) followed by post-to-post (posts in a channel).
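The two-level scheme above can be sketched with mean pooling standing in for the learned word-to-word and post-to-post encoders; the deterministic hash-based word vectors below are a placeholder for real embeddings, and the channel contents are invented for illustration.

```python
import zlib
import numpy as np

DIM = 16

def embed_word(word):
    # Deterministic pseudo-embedding keyed on a stable hash of the word
    # (a stand-in for a trained embedding table).
    rng = np.random.default_rng(zlib.crc32(word.encode()))
    return rng.normal(size=DIM)

def encode_post(post):
    # Word-to-word level: pool the word vectors within one post.
    return np.mean([embed_word(w) for w in post.split()], axis=0)

def encode_channel(posts):
    # Post-to-post level: pool the post vectors across the channel.
    return np.mean([encode_post(p) for p in posts], axis=0)

channel = ["deploy failed again", "which service", "the auth gateway"]
print(encode_channel(channel).shape)  # (16,)
```

The point of the hierarchy is that the post-level encoder only ever sees one vector per post, so thread structure across posts is modeled separately from word order within a post.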
no code implementations • 5 Jun 2019 • Sanjeev Kumar Karn, Francine Chen, Yan-Ying Chen, Ulli Waltinger, Hinrich Schütze
Interleaved texts, where posts belonging to different threads occur in one sequence, are a common occurrence, e.g., in online chat conversations.
2 code implementations • NAACL 2019 • Sanjeev Kumar Karn, Mark Buckley, Ulli Waltinger, Hinrich Schütze
In this work, we define the task of teaser generation and provide an evaluation benchmark and baseline systems for the process of generating teasers.
no code implementations • EACL 2017 • Sanjeev Karn, Ulli Waltinger, Hinrich Schütze
We address fine-grained entity classification and propose a novel attention-based recurrent neural network (RNN) encoder-decoder that generates paths in the type hierarchy and can be trained end-to-end.
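Generating a path in a type hierarchy can be illustrated with a greedy top-down decode: at each step, score only the children of the current node and descend to the best one. The toy hierarchy and the fixed scores below (standing in for the attention-RNN's output at each step) are invented for illustration.

```python
# Toy type hierarchy: each node maps to its children.
hierarchy = {
    "entity": ["person", "organization"],
    "person": ["artist", "athlete"],
    "organization": [],
    "artist": [], "athlete": [],
}

# Fixed scores standing in for the decoder's per-step output distribution.
scores = {"person": 0.9, "organization": 0.1,
          "artist": 0.3, "athlete": 0.7}

def decode_path(root="entity"):
    # Greedy decode: at each node, pick the highest-scoring child,
    # stopping when a leaf is reached.
    path, node = [root], root
    while hierarchy[node]:
        node = max(hierarchy[node], key=lambda c: scores[c])
        path.append(node)
    return path

print(decode_path())  # ['entity', 'person', 'athlete']
```

Restricting each step's candidates to the current node's children is what guarantees the decoder emits a valid path rather than an arbitrary label set.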
no code implementations • LREC 2014 • Artem Ostankov, Florian Röhrbein, Ulli Waltinger
This paper presents Linked Health Answers, a natural language question answering system that utilizes health data drawn from the Linked Data Cloud.
no code implementations • LREC 2012 • Simon Clematide, Stefan Gindl, Manfred Klenner, Stefanos Petrakis, Robert Remus, Josef Ruppenhofer, Ulli Waltinger, Michael Wiegand
The construction of the corpus is based on the manual annotation of 270 German-language sentences considering three different layers of granularity.