GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding

WS 2018 jsalt18-sentence-repl/jiant

For natural language understanding (NLU) technology to be maximally useful, both practically and as a scientific object of study, it must be general: it must be able to process language in a way that is not exclusively tailored to any one specific task or dataset.

NATURAL LANGUAGE INFERENCE TRANSFER LEARNING

Findings of the E2E NLG Challenge

WS 2018 UFAL-DSG/tgen

This paper summarises the experimental setup and results of the first shared task on end-to-end (E2E) natural language generation (NLG) in spoken dialogue systems.

DATA-TO-TEXT GENERATION SPOKEN DIALOGUE SYSTEMS

Context-Free Transductions with Neural Stacks

WS 2018 viking-sudo-rm/StackNN

This paper analyzes the behavior of stack-augmented recurrent neural network (RNN) models.

LANGUAGE MODELLING
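
The models analyzed in this paper pair an RNN controller with a differentiable stack that is read, pushed, and popped with continuous strengths. A minimal sketch of such a soft stack is given below; the class name and method signatures are illustrative, not the StackNN API.

```python
# Minimal sketch of a differentiable ("soft") stack of the kind used by
# stack-augmented RNN controllers. Push/pop strengths are scalars in [0, 1],
# so the whole structure stays differentiable end to end.
import torch

class SoftStack:
    def __init__(self, dim):
        self.dim = dim
        self.values = []     # list of (dim,) tensors
        self.strengths = []  # list of scalar tensors in [0, 1]

    def update(self, push, pop, value):
        """Pop with strength `pop`, then push `value` with strength `push`."""
        remaining = pop
        new_strengths = []
        for s in reversed(self.strengths):       # consume strength from the top down
            new_strengths.append(torch.relu(s - remaining))
            remaining = torch.relu(remaining - s)
        self.strengths = list(reversed(new_strengths))
        self.values.append(value)
        self.strengths.append(push)

    def read(self):
        """Soft read of the top of the stack (total read weight <= 1)."""
        read = torch.zeros(self.dim)
        remaining = torch.tensor(1.0)
        for v, s in zip(reversed(self.values), reversed(self.strengths)):
            weight = torch.minimum(s, remaining)
            read = read + weight * v
            remaining = remaining - weight
        return read
```

In use, an RNN cell would emit the push strength, pop strength, and pushed vector at each time step, and receive the stack's `read()` output as part of its next input.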

Deep Graph Convolutional Encoders for Structured Data to Text Generation

WS 2018 diegma/graph-2-text

Most previous work on neural text generation from graph-structured data relies on standard sequence-to-sequence methods.

DATA-TO-TEXT GENERATION GRAPH-TO-SEQUENCE
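
In contrast to linearizing the input graph for a sequence encoder, a graph convolutional encoder updates each node from its neighbours. A minimal sketch of one such layer follows; the layer name and mean-over-neighbours normalization are assumptions for illustration, not the paper's exact architecture.

```python
# Minimal sketch of a graph convolutional encoder layer: each node's new
# state is a transformed average of its neighbours' states (self-loops
# included via the adjacency matrix).
import torch
import torch.nn as nn

class GCNLayer(nn.Module):
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, node_feats, adj):
        # node_feats: (n, in_dim); adj: (n, n) adjacency with self-loops.
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
        messages = adj @ node_feats / deg        # mean over neighbours
        return torch.relu(self.linear(messages))

# Toy usage: encode a 3-node graph; the resulting node states could feed
# an attention-based text decoder instead of a linearized input sequence.
adj = torch.eye(3) + torch.tensor([[0., 1., 0.],
                                   [1., 0., 1.],
                                   [0., 1., 0.]])
feats = torch.randn(3, 16)
layer1, layer2 = GCNLayer(16, 32), GCNLayer(32, 32)
hidden = layer2(layer1(feats, adj), adj)         # (3, 32) node encodings
```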

Grammar Induction with Neural Language Models: An Unusual Replication

WS 2018 nyu-mll/PRPN-Analysis

A substantial thread of recent work on latent tree learning has attempted to develop neural network models with parse-valued latent variables and train them on non-parsing tasks, in the hope of having them discover interpretable tree structure.

CONSTITUENCY PARSING LANGUAGE MODELLING

Interpreting Neural Networks With Nearest Neighbors

WS 2018 Eric-Wallace/deep-knn

The confidence of neural networks is not a robust measure of model uncertainty; this paper instead interprets and scores predictions via nearest neighbors in representation space.

FEATURE IMPORTANCE TEXT CLASSIFICATION
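
The nearest-neighbour idea can be sketched in a few lines: score a test example by the label agreement of its nearest training examples in representation space, rather than by the softmax confidence. In the sketch below, `encode` is a placeholder for any trained text encoder, and the function name and `k` value are illustrative assumptions.

```python
# Minimal sketch: nearest-neighbour prediction and conformity score over
# hidden representations, as an alternative to softmax confidence.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def knn_confidence(encode, train_texts, train_labels, test_text, k=25):
    train_reps = np.stack([encode(t) for t in train_texts])
    index = NearestNeighbors(n_neighbors=k).fit(train_reps)
    _, idx = index.kneighbors(encode(test_text).reshape(1, -1))
    neighbour_labels = np.asarray(train_labels)[idx[0]]
    values, counts = np.unique(neighbour_labels, return_counts=True)
    prediction = values[np.argmax(counts)]
    conformity = counts.max() / k   # fraction of neighbours that agree
    return prediction, conformity
```

The retrieved neighbours themselves double as an interpretation: they show which training examples the prediction is grounded in.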

Enriching the WebNLG corpus

WS 2018 ThiagoCF05/webnlg

This paper describes the enrichment of the WebNLG corpus, with the aim of further extending its usefulness as a resource for evaluating common NLG tasks, including Discourse Ordering, Lexicalization and Referring Expression Generation.

MACHINE TRANSLATION TEXT GENERATION

Improving Context Modelling in Multimodal Dialogue Generation

WS 2018 shubhamagarwal92/mmd

In this work, we investigate the task of textual response generation in a multimodal task-oriented dialogue system.

DIALOGUE GENERATION

E2E NLG Challenge: Neural Models vs. Templates

WS 2018 UKPLab/e2e-nlg-challenge-2017

The E2E NLG Challenge is a shared task on generating restaurant descriptions from sets of key-value pairs.

DATA-TO-TEXT GENERATION
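
The template side of the comparison can be illustrated with a toy baseline that fills a fixed sentence pattern from the key-value meaning representation. The slot names follow the public E2E dataset; the template itself is a hypothetical example, not the one used in the paper.

```python
# Minimal sketch of a template-style realiser for E2E-style meaning
# representations (dicts of slot -> value).
def realise(mr):
    """mr: dict such as {'name': 'The Eagle', 'food': 'French', 'area': 'riverside'}."""
    parts = [f"{mr['name']} is a restaurant"]
    if 'food' in mr:
        parts.append(f"serving {mr['food']} food")
    if 'area' in mr:
        parts.append(f"in the {mr['area']} area")
    return " ".join(parts) + "."

print(realise({'name': 'The Eagle', 'food': 'French', 'area': 'riverside'}))
# The Eagle is a restaurant serving French food in the riverside area.
```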