Search Results for author: Ulli Waltinger

Found 11 papers, 3 papers with code

LinkedHealthAnswers: Towards Linked Data-driven Question Answering for the Health Care Domain

no code implementations LREC 2014 Artem Ostankov, Florian Röhrbein, Ulli Waltinger

This paper presents Linked Health Answers, a natural language question answering system that utilizes health data drawn from the Linked Data Cloud.

Predicate Detection · Question Answering +1

End-to-End Trainable Attentive Decoder for Hierarchical Entity Classification

no code implementations EACL 2017 Sanjeev Karn, Ulli Waltinger, Hinrich Schütze

We address fine-grained entity classification and propose a novel attention-based recurrent neural network (RNN) encoder-decoder that generates paths in the type hierarchy and can be trained end-to-end.

Classification · General Classification +1
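The idea of generating paths in a type hierarchy can be sketched with toy data: at each level, pick the child with the highest score, mimicking how a decoder emits one hierarchy level per step. The hierarchy, scores, and greedy choice below are illustrative stand-ins, not the paper's trained attentive decoder.

```python
# Hedged sketch: greedy root-to-leaf decoding over a toy type hierarchy.
# Both the hierarchy and the per-type scores are made-up example data.
hierarchy = {
    "entity": ["person", "organization"],
    "person": ["artist", "politician"],
    "organization": ["company", "ngo"],
}
scores = {"person": 0.8, "organization": 0.2,
          "artist": 0.3, "politician": 0.7,
          "company": 0.5, "ngo": 0.5}

def decode_path(root):
    """Follow the highest-scoring child until reaching a leaf type."""
    path = [root]
    while path[-1] in hierarchy:
        children = hierarchy[path[-1]]
        path.append(max(children, key=lambda c: scores[c]))
    return path

print(decode_path("entity"))  # ['entity', 'person', 'politician']
```

In the actual model the scores at each step would come from an attention-based RNN decoder conditioned on the mention context, which is what makes the pipeline trainable end-to-end.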

News Article Teaser Tweets and How to Generate Them

2 code implementations NAACL 2019 Sanjeev Kumar Karn, Mark Buckley, Ulli Waltinger, Hinrich Schütze

In this work, we define the task of teaser generation and provide an evaluation benchmark and baseline systems for the process of generating teasers.

A Hierarchical Decoder with Three-level Hierarchical Attention to Generate Abstractive Summaries of Interleaved Texts

no code implementations 5 Jun 2019 Sanjeev Kumar Karn, Francine Chen, Yan-Ying Chen, Ulli Waltinger, Hinrich Schütze

Interleaved texts, where posts belonging to different threads occur in one sequence, are a common occurrence, e.g., online chat conversations.

Generating Multi-Sentence Abstractive Summaries of Interleaved Texts

no code implementations 25 Sep 2019 Sanjeev Kumar Karn, Francine Chen, Yan-Ying Chen, Ulli Waltinger, Hinrich Schütze

The interleaved posts are encoded hierarchically, i.e., word-to-word (words in a post) followed by post-to-post (posts in a channel).

Disentanglement · Sentence
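The two-level encoding described above can be sketched as follows: a word-level step pools each post's word vectors into one post vector, then a post-level step runs a recurrence over the post vectors. Mean pooling and a bare tanh recurrence are simplified stand-ins for the trained RNN encoders; all weights and inputs are random, illustrative data.

```python
import numpy as np

# Hedged sketch of hierarchical (word-to-word, then post-to-post) encoding.
rng = np.random.default_rng(0)
d = 64  # assumed embedding size, for illustration only

def encode_post(word_vecs):
    # Word-to-word level: collapse a post's word vectors into one post vector
    # (mean pooling stands in for a word-level RNN).
    return word_vecs.mean(axis=0)

def encode_channel(post_vecs, W_h, W_x):
    # Post-to-post level: a simple recurrence over the sequence of post vectors.
    h = np.zeros(d)
    states = []
    for x in post_vecs:
        h = np.tanh(W_h @ h + W_x @ x)
        states.append(h)
    return np.stack(states)

# Three toy posts with 5, 8, and 3 "words" each.
posts = [rng.normal(size=(n_words, d)) for n_words in (5, 8, 3)]
post_vecs = [encode_post(p) for p in posts]
W_h = rng.normal(size=(d, d)) * 0.1
W_x = rng.normal(size=(d, d)) * 0.1
H = encode_channel(post_vecs, W_h, W_x)
print(H.shape)  # (3, 64)
```

The resulting per-post states `H` are what a downstream decoder would attend over when generating a summary.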

E-BERT: Efficient-Yet-Effective Entity Embeddings for BERT

1 code implementation Findings of the Association for Computational Linguistics 2020 Nina Poerner, Ulli Waltinger, Hinrich Schütze

We present a novel way of injecting factual knowledge about entities into the pretrained BERT model (Devlin et al., 2019): We align Wikipedia2Vec entity vectors (Yamada et al., 2016) with BERT's native wordpiece vector space and use the aligned entity vectors as if they were wordpiece vectors.

Entity Embeddings · Entity Linking +3

Sentence Meta-Embeddings for Unsupervised Semantic Textual Similarity

no code implementations ACL 2020 Nina Poerner, Ulli Waltinger, Hinrich Schütze

We address the task of unsupervised Semantic Textual Similarity (STS) by ensembling diverse pre-trained sentence encoders into sentence meta-embeddings.

Dimensionality Reduction · Semantic Textual Similarity +2
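The ensembling idea can be sketched by concatenating per-encoder normalized embeddings into a meta-embedding, reducing its dimensionality with a truncated SVD, and scoring sentence pairs by cosine similarity. The "encoders" below are random projections standing in for real pre-trained sentence encoders; everything here is illustrative, not the paper's exact method.

```python
import numpy as np

# Hedged sketch of concatenation-based sentence meta-embeddings for
# unsupervised STS. All encoders and inputs are synthetic stand-ins.
rng = np.random.default_rng(1)

def fake_encoder(dim, seed):
    # Stand-in "encoder": a fixed random projection of a 300-d input vector.
    r = np.random.default_rng(seed)
    proj = r.normal(size=(300, dim))
    return lambda x: x @ proj

encoders = [fake_encoder(256, 2), fake_encoder(512, 3)]  # two diverse "encoders"

def meta_embed(inputs):
    parts = []
    for enc in encoders:
        e = enc(inputs)
        e = e / np.linalg.norm(e, axis=1, keepdims=True)  # per-encoder normalization
        parts.append(e)
    return np.concatenate(parts, axis=1)                  # concatenation meta-embedding

inputs = rng.normal(size=(10, 300))  # 10 toy "sentence" representations
M = meta_embed(inputs)

# Truncated SVD as the dimensionality-reduction step.
U, S, Vt = np.linalg.svd(M - M.mean(axis=0), full_matrices=False)
reduced = U[:, :8] * S[:8]

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

sim = cosine(reduced[0], reduced[1])  # unsupervised STS score for a pair
```

Normalizing each encoder's output before concatenation keeps any single encoder from dominating the meta-embedding purely through vector magnitude.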

AAAI FSS-19: Human-Centered AI: Trustworthiness of AI Models and Data Proceedings

no code implementations 15 Jan 2020 Florian Buettner, John Piorkowski, Ian McCulloh, Ulli Waltinger

To facilitate the widespread acceptance of AI systems guiding decision-making in real-world applications, it is key that solutions comprise trustworthy, integrated human-AI systems.

Autonomous Driving · Decision Making +1

Few-Shot Learning of an Interleaved Text Summarization Model by Pretraining with Synthetic Data

no code implementations EACL (AdaptNLP) 2021 Sanjeev Kumar Karn, Francine Chen, Yan-Ying Chen, Ulli Waltinger, Hinrich Schütze

Interleaved texts, where posts belonging to different threads occur in a sequence, are common in online chats, making it time-consuming to obtain an overview of the discussions.

Disentanglement · Few-Shot Learning +1
