Search Results for author: Vera Demberg

Found 59 papers, 1 paper with code

Entity Enhancement for Implicit Discourse Relation Classification in the Biomedical Domain

no code implementations ACL 2021 Wei Shi, Vera Demberg

Implicit discourse relation classification is a challenging task, in particular when the text domain differs from that of the standard Penn Discourse Treebank (PDTB; Prasad et al., 2008) training corpus (Wall Street Journal articles from the 1990s).

Implicit Discourse Relation Classification

Exploring the Potential of Lexical Paraphrases for Mitigating Noise-Induced Comprehension Errors

no code implementations18 Jul 2021 Anupama Chingacham, Vera Demberg, Dietrich Klakow

We evaluate the intelligibility of synonyms in context and find that choosing the lexical unit that is less likely to be misheard than its synonym yields an average comprehension gain of 37% at SNR -5 dB and 21% at SNR 0 dB in babble noise.

Speech Synthesis
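The selection strategy described in this abstract can be sketched as follows; the scoring function and the risk values are illustrative assumptions, not the paper's actual model:

```python
# Illustrative sketch: pick the paraphrase predicted to be most
# intelligible in noise. The risk scores here are made up; the paper
# estimates mishearing risk empirically per noise condition.
def choose_paraphrase(candidates, mishearing_risk):
    """Return the synonym with the lowest predicted risk of being misheard."""
    return min(candidates, key=lambda w: mishearing_risk.get(w, 1.0))

risk = {"large": 0.4, "big": 0.7}  # hypothetical risk-of-mishearing scores
word = choose_paraphrase(["large", "big"], risk)
```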

Time-Aware Ancient Chinese Text Translation and Inference

1 code implementation7 Jul 2021 Ernie Chang, Yow-Ting Shiue, Hui-Syuan Yeh, Vera Demberg

In this paper, we aim to address the challenges surrounding the translation of ancient Chinese text: (1) The linguistic gap due to the difference in eras results in translations that are poor in quality, and (2) most translations are missing the contextual information that is often very crucial to understanding the text.


Jointly Improving Language Understanding and Generation with Quality-Weighted Weak Supervision of Automatic Labeling

no code implementations EACL 2021 Ernie Chang, Vera Demberg, Alex Marin

Neural natural language generation (NLG) and understanding (NLU) models are data-hungry and require massive amounts of annotated data to be competitive.

Text Generation

Neural Data-to-Text Generation with LM-based Text Augmentation

no code implementations EACL 2021 Ernie Chang, Xiaoyu Shen, Dawei Zhu, Vera Demberg, Hui Su

Our approach automatically augments the data available for training by (i) generating new text samples based on replacing specific values by alternative ones from the same category, (ii) generating new text samples based on GPT-2, and (iii) proposing an automatic method for pairing the new text samples with data samples.

Data-to-Text Generation Text Augmentation
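Step (i) of the augmentation recipe above — swapping slot values for alternatives from the same category — can be sketched as below; the function name and category inventory are illustrative, not the paper's actual API:

```python
import random

# Hypothetical category inventory: alternative values per slot type.
CATEGORY_VALUES = {
    "name": ["The Eagle", "The Mill", "Blue Spice"],
    "food": ["Italian", "French", "Japanese"],
}

def augment_by_replacement(text, slots, rng=random):
    """Create a new (data, text) pair by replacing each slot value
    with a random alternative from the same category (step i)."""
    new_slots = {}
    for slot, value in slots.items():
        alternatives = [v for v in CATEGORY_VALUES.get(slot, [value]) if v != value]
        new_value = rng.choice(alternatives) if alternatives else value
        text = text.replace(value, new_value)
        new_slots[slot] = new_value
    return new_slots, text

slots = {"name": "The Eagle", "food": "Italian"}
text = "The Eagle serves Italian food."
new_slots, new_text = augment_by_replacement(text, slots)
```

Steps (ii) and (iii) would add free-text samples from a language model and then pair them back to structured data, which this sketch omits.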

Does the Order of Training Samples Matter? Improving Neural Data-to-Text Generation with Curriculum Learning

no code implementations EACL 2021 Ernie Chang, Hui-Syuan Yeh, Vera Demberg

Efforts have been dedicated to improving text generation systems by changing the order of training samples in a process known as curriculum learning.

Curriculum Learning Data-to-Text Generation
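The core idea — reordering training samples from easy to hard — can be sketched with a simple length-based difficulty measure (one of many possible heuristics; the paper compares several ordering strategies):

```python
# Minimal curriculum-learning sketch: sort samples easy-to-hard by a
# difficulty measure (here, text length), then batch in that order.
def curriculum_batches(samples, batch_size, difficulty=len):
    """Yield training batches ordered from easiest to hardest sample."""
    ordered = sorted(samples, key=difficulty)
    for i in range(0, len(ordered), batch_size):
        yield ordered[i:i + batch_size]

batches = list(curriculum_batches(
    ["a long sentence here", "short", "mid one"], batch_size=2))
```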

Story Generation with Rich Details

no code implementations COLING 2020 Fangzhou Zhai, Vera Demberg, Alexander Koller

Automatically generated stories need to be not only coherent, but also interesting.

Story Generation

Diverse and Relevant Visual Storytelling with Scene Graph Embeddings

no code implementations CoNLL 2020 Xudong Hong, Rakshith Shetty, Asad Sayeed, Khushboo Mehra, Vera Demberg, Bernt Schiele

A problem in automatically generated stories for image sequences is that they use overly generic vocabulary and phrase structure and fail to match the distributional characteristics of human-generated text.

Story Generation Visual Storytelling

DART: A Lightweight Quality-Suggestive Data-to-Text Annotation Tool

no code implementations COLING 2020 Ernie Chang, Jeriah Caplinger, Alex Marin, Xiaoyu Shen, Vera Demberg

We present a lightweight annotation tool, the Data AnnotatoR Tool (DART), for the general task of labeling structured data with textual descriptions.

Active Learning

Unsupervised Pidgin Text Generation By Pivoting English Data and Self-Training

no code implementations18 Mar 2020 Ernie Chang, David Ifeoluwa Adelani, Xiaoyu Shen, Vera Demberg

In this work, we develop techniques targeted at bridging the gap between Pidgin English and English in the context of natural language generation.

Data-to-Text Generation Machine Translation +1

Improving Language Generation from Feature-Rich Tree-Structured Data with Relational Graph Convolutional Encoders

no code implementations WS 2019 Xudong Hong, Ernie Chang, Vera Demberg

The Multilingual Surface Realization Shared Task 2019 focuses on generating sentences from lemmatized sets of universal dependency parses with rich features.

Data Augmentation Text Generation

Crowdsourcing Discourse Relation Annotations by a Two-Step Connective Insertion Task

no code implementations WS 2019 Frances Yung, Vera Demberg, Merel Scholman

The perspective of being able to crowd-source coherence relations bears the promise of acquiring annotations for new texts quickly, which could then increase the size and variety of discourse-annotated corpora.

A Hybrid Model for Globally Coherent Story Generation

no code implementations WS 2019 Fangzhou Zhai, Vera Demberg, Pavel Shkadzko, Wei Shi, Asad Sayeed

The model exploits a symbolic text planning module to produce text plans, thus reducing the demand of data; a neural surface realization module then generates fluent text conditioned on the text plan.

Story Generation
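The plan-then-realize pipeline described above can be sketched as follows; all function names and the rule set are illustrative stand-ins, not the paper's components (the actual surface realizer is a learned neural model):

```python
# Sketch of a hybrid pipeline: a symbolic planner produces a text plan,
# and a (here rule-based stand-in for a neural) realizer renders it.
def symbolic_planner(events):
    """Map story events to an ordered text plan via simple rules."""
    return [{"subj": e["agent"], "verb": e["action"], "obj": e.get("patient")}
            for e in events]

def surface_realizer(step):
    """Stand-in for the learned realizer: render one plan step as text."""
    parts = [step["subj"], step["verb"]]
    if step["obj"]:
        parts.append(step["obj"])
    return " ".join(parts) + "."

def generate_story(events):
    return " ".join(surface_realizer(s) for s in symbolic_planner(events))

story = generate_story(
    [{"agent": "The knight", "action": "found", "patient": "a sword"}])
```

Keeping planning symbolic reduces how much the neural component must learn from data, which is the data-efficiency argument in the abstract.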

Verb-Second Effect on Quantifier Scope Interpretation

no code implementations WS 2019 Asad Sayeed, Matthias Lindemann, Vera Demberg

Sentences like "Every child climbed a tree" have at least two interpretations depending on the precedence order of the universal quantifier and the indefinite.

Toward Bayesian Synchronous Tree Substitution Grammars for Sentence Planning

no code implementations WS 2018 David M. Howcroft, Dietrich Klakow, Vera Demberg

Developing conventional natural language generation systems requires extensive attention from human experts in order to craft complex sets of sentence planning rules.

Text Generation

Acquiring Annotated Data with Cross-lingual Explicitation for Implicit Discourse Relation Classification

no code implementations WS 2019 Wei Shi, Frances Yung, Vera Demberg

Implicit discourse relation classification is one of the most challenging and important tasks in discourse parsing, due to the absence of connectives, which serve as strong linguistic cues.

Discourse Parsing General Classification +2

Do Speakers Produce Discourse Connectives Rationally?

no code implementations WS 2018 Frances Yung, Vera Demberg

A number of different discourse connectives can be used to mark the same discourse relation, but it is unclear what factors affect connective choice.

Learning distributed event representations with a multi-task approach

no code implementations SEMEVAL 2018 Xudong Hong, Asad Sayeed, Vera Demberg

Human world knowledge contains information about prototypical events and their participants and locations.

Multi-Task Learning

Improving Variational Encoder-Decoders in Dialogue Generation

no code implementations6 Feb 2018 Xiaoyu Shen, Hui Su, Shuzi Niu, Vera Demberg

Variational encoder-decoders (VEDs) have shown promising results in dialogue generation.

Dialogue Generation

G-TUNA: a corpus of referring expressions in German, including duration information

no code implementations WS 2017 David Howcroft, Jorrig Vogels, Vera Demberg

Corpora of referring expressions elicited from human participants in a controlled environment are an important resource for research on automatic referring expression generation.

Referring expression generation Text Generation

How compatible are our discourse annotations? Insights from mapping RST-DT and PDTB annotations

no code implementations28 Apr 2017 Vera Demberg, Fatemeh Torabi Asr, Merel Scholman

Discourse-annotated corpora are an important resource for the community, but they are often annotated according to different frameworks.

Psycholinguistic Models of Sentence Processing Improve Sentence Readability Ranking

no code implementations EACL 2017 David M. Howcroft, Vera Demberg

While previous research on readability has typically focused on document-level measures, recent work in areas such as natural language generation has pointed out the need of sentence-level readability measures.

Document-level Information Retrieval +2

On the Need of Cross Validation for Discourse Relation Classification

no code implementations EACL 2017 Wei Shi, Vera Demberg

The task of implicit discourse relation classification has received increased attention in recent years, including two CoNLL shared tasks on the topic.

Classification General Classification +4
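The paper's methodological point — that a single train/test split can be misleading, so results should be aggregated across folds — can be sketched with a plain k-fold split (fold construction and the example scores are illustrative):

```python
import statistics

def kfold_indices(n, k):
    """Yield (train, test) index lists for k-fold cross-validation."""
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    start = 0
    for size in fold_sizes:
        test = list(range(start, start + size))
        train = list(range(0, start)) + list(range(start + size, n))
        yield train, test
        start += size

# Report the mean and spread across folds rather than one split's score;
# these accuracy values are made up for the example.
scores = [0.41, 0.38, 0.44, 0.39, 0.42]
mean_score = round(statistics.mean(scores), 3)
std_score = round(statistics.stdev(scores), 3)
```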

A Systematic Study of Neural Discourse Models for Implicit Discourse Relation

no code implementations EACL 2017 Attapol Rutherford, Vera Demberg, Nianwen Xue

Here, we propose neural network models that are based on feedforward and long-short term memory architecture and systematically study the effects of varying structures.

Discourse Parsing

From OpenCCG to AI Planning: Detecting Infeasible Edges in Sentence Generation

no code implementations COLING 2016 Maximilian Schwenger, Álvaro Torralba, Joerg Hoffmann, David M. Howcroft, Vera Demberg

The search space in grammar-based natural language generation tasks can get very large, which is particularly problematic when generating long utterances or paragraphs.

Text Generation

Annotating Discourse Relations in Spoken Language: A Comparison of the PDTB and CCR Frameworks

no code implementations LREC 2016 Ines Rehbein, Merel Scholman, Vera Demberg

In discourse relation annotation, there is currently a variety of different frameworks being used, and most of them have been developed and employed mostly on written data.

German and English Treebanks and Lexica for Tree-Adjoining Grammars

no code implementations LREC 2012 Miriam Kaeshammer, Vera Demberg

We present a treebank and lexicon for German and English, which have been developed for PLTAG parsing.

Language Modelling
