Improving Quality and Efficiency in Plan-based Neural Data-to-Text Generation

WS 2019 AmitMY/chimera

We follow the step-by-step approach to neural data-to-text generation we proposed in Moryossef et al. (2019), in which the generation process is divided into a text-planning stage followed by a plan-realization stage.
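The two-stage pipeline described above can be sketched as follows. This is a hypothetical illustration only, not the implementation in AmitMY/chimera: the sort-based planner and template realizer stand in for the learned components.

```python
# Illustrative sketch of a plan-based data-to-text pipeline
# (hypothetical; not the authors' implementation).

def text_plan(triples):
    """Planning stage: order the input (subject, predicate, object)
    triples into a linear plan. Sorting by subject then predicate is
    a stand-in for a learned or search-based text planner."""
    return sorted(triples, key=lambda t: (t[0], t[1]))

def realize(plan):
    """Realization stage: verbalize each planned triple with a simple
    template, a stand-in for a neural surface realizer."""
    return " ".join(f"{s} {p} {o}." for s, p, o in plan)

triples = [("John", "works for", "Acme"), ("Acme", "is based in", "Berlin")]
text = realize(text_plan(triples))
```

Separating planning from realization makes the ordering decisions inspectable before any neural decoding happens.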


A Stable Variational Autoencoder for Text Modelling

WS 2019 ruizheliUOA/HR-VAE

The Variational Autoencoder (VAE) is a powerful method for learning representations of high-dimensional data.
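For context, the VAE objective combines a reconstruction term with a KL regularizer on the approximate posterior, sampled via the reparameterization trick. A minimal sketch of those two standard ingredients (not code from ruizheliUOA/HR-VAE):

```python
import math
import random

def kl_diag_gaussian(mu, logvar):
    """Closed-form KL( N(mu, diag(exp(logvar))) || N(0, I) ),
    the regularization term in the VAE objective."""
    return 0.5 * sum(math.exp(lv) + m * m - 1.0 - lv
                     for m, lv in zip(mu, logvar))

def reparameterize(mu, logvar, rng=random):
    """Sample z = mu + sigma * eps with eps ~ N(0, I), so the sampling
    step stays differentiable with respect to mu and logvar."""
    return [m + math.exp(0.5 * lv) * rng.gauss(0.0, 1.0)
            for m, lv in zip(mu, logvar)]
```

When the posterior matches the standard normal prior (mu = 0, logvar = 0), the KL term is exactly zero; instability of this term during text-VAE training (posterior collapse) is the problem the paper addresses.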

KPTimes: A Large-Scale Dataset for Keyphrase Generation on News Documents

WS 2019 ygorg/KPTimes

Keyphrase generation is the task of predicting a set of lexical units that conveys the main content of a source text.

Rethinking Text Attribute Transfer: A Lexical Analysis

WS 2019 FranxYao/pivot_analysis

We apply this framework to existing datasets and models and show that: (1) pivot words are strong features for classifying sentence attributes; (2) to change the attribute of a sentence, many datasets require changing only certain pivot words; (3) consequently, many transfer models perform only lexical-level modification, leaving higher-level sentence structure unchanged.
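A minimal sketch of the idea of pivot words, assuming a simple definition as words that occur (almost) exclusively in one attribute class; the threshold and counting scheme here are illustrative, not the paper's exact pivot analysis:

```python
from collections import Counter

def pivot_words(pos_sents, neg_sents, threshold=0.9):
    """Return words strongly associated with one attribute class.
    A word is a pivot if its occurrences are concentrated (>= threshold)
    in a single class -- a simplified stand-in for pivot analysis."""
    pos = Counter(w for s in pos_sents for w in s.split())
    neg = Counter(w for s in neg_sents for w in s.split())
    pivots = {}
    for w in set(pos) | set(neg):
        p, n = pos[w], neg[w]
        if max(p, n) / (p + n) >= threshold:
            pivots[w] = "positive" if p > n else "negative"
    return pivots

pos = ["the food was great", "great service"]
neg = ["the food was awful", "awful wait"]
piv = pivot_words(pos, neg)  # e.g. "great" -> positive, "awful" -> negative
```

Swapping only such pivot words ("great" for "awful") changes the predicted attribute while leaving the sentence structure intact, which is exactly the lexical-level behavior the paper measures.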


Automatic Quality Estimation for Natural Language Generation: Ranting (Jointly Rating and Ranking)

WS 2019 tuetschek/ratpred

We present a recurrent neural network based system for automatic quality estimation of natural language generation (NLG) outputs, which jointly learns to assign numerical ratings to individual outputs and to provide pairwise rankings of two different outputs.
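The joint rating-and-ranking objective can be sketched as a regression loss on the numerical ratings plus a pairwise hinge term on pairs of outputs. The margin and equal weighting below are assumptions for illustration, not the loss from tuetschek/ratpred:

```python
def joint_loss(pred_a, pred_b, gold_a, gold_b, margin=0.5):
    """Sketch of a joint rating + ranking objective: MSE on the two
    predicted ratings, plus a hinge term asking the output with the
    higher gold rating to be scored higher by at least `margin`.
    (Weighting and margin are illustrative assumptions.)"""
    mse = 0.5 * ((pred_a - gold_a) ** 2 + (pred_b - gold_b) ** 2)
    if gold_a == gold_b:
        rank = 0.0
    else:
        better, worse = (pred_a, pred_b) if gold_a > gold_b else (pred_b, pred_a)
        rank = max(0.0, margin - (better - worse))
    return mse + rank
```

Training on both terms lets one network serve absolute quality estimation and pairwise comparison at once.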


Revisiting Challenges in Data-to-Text Generation with Fact Grounding

WS 2019 wanghm92/rw_fg

Data-to-text generation models face challenges in ensuring data fidelity by referring to the correct input source.


Let's FACE it. Finnish Poetry Generation with Aesthetics and Framing

WS 2019 mikahama/finmeter

We present a creative poem generator for the morphologically rich Finnish language.

A Good Sample is Hard to Find: Noise Injection Sampling and Self-Training for Neural Language Generation Models

WS 2019 kedz/noiseylg

Deep neural networks (DNNs) are quickly becoming the de facto standard modeling method for many natural language generation (NLG) tasks.