1 code implementation • INLG (ACL) 2021 • Aleksandre Maskharashvili, Symon Stevens-Guille, Xintong Li, Michael White
Recent developments in natural language generation (NLG) have bolstered arguments in favor of re-introducing explicit coding of discourse relations in the input to neural models.
2 code implementations • INLG (ACL) 2021 • Xintong Li, Symon Stevens-Guille, Aleksandre Maskharashvili, Michael White
Neural approaches to natural language generation in task-oriented dialogue have typically required large amounts of annotated training data to achieve satisfactory performance, especially when generating from compositional inputs.
no code implementations • WS (NoDaLiDa) 2019 • Jean-Philippe Bernardy, Rasmus Blanck, Stergios Chatzikyriakidis, Shalom Lappin, Aleksandre Maskharashvili
In this way, we construct a model by specifying boxes for the predicates.
1 code implementation • ACL (WebNLG, INLG) 2020 • Xintong Li, Aleksandre Maskharashvili, Symon Jory Stevens-Guille, Michael White
In this paper, we report experiments on fine-tuning large pretrained models to realize resource description framework (RDF) triples as natural language.
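A common way to feed RDF triples to a fine-tuned seq2seq model is to linearize them into a single flat string. The sketch below is illustrative only; the separator tokens and example triples are assumptions, not the paper's exact format.

```python
# Hypothetical linearization of RDF triples for a text-to-text model.
# The " | " and " && " separators are illustrative conventions, not
# necessarily those used in the reported experiments.
def linearize_triples(triples):
    """Join (subject, predicate, object) triples into one input string."""
    return " && ".join(" | ".join(t) for t in triples)

triples = [
    ("Alan_Bean", "occupation", "Test_pilot"),
    ("Alan_Bean", "mission", "Apollo_12"),
]
print(linearize_triples(triples))
```

The resulting string would then serve as the source sequence during fine-tuning, with the reference sentence as the target.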
1 code implementation • INLG (ACL) 2020 • Symon Stevens-Guille, Aleksandre Maskharashvili, Amy Isard, Xintong Li, Michael White
While classic NLG systems typically made use of hierarchically structured content plans that included discourse relations as central components, more recent neural approaches have mostly mapped simple, flat inputs to texts without representing discourse relations explicitly.
no code implementations • GWC 2019 • Jean-Philippe Bernardy, Aleksandre Maskharashvili
The first one leverages an existing mapping of words to feature vectors (fastText), and attempts to classify such vectors as within or outside of each class.
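One simple way to decide whether a word vector falls within a class, in the spirit described above, is to compare it against a class centroid. This is a minimal sketch assuming precomputed embeddings (tiny hand-made vectors stand in for fastText ones here); the vectors, class, and threshold are all illustrative.

```python
# Hypothetical sketch: classify a word vector as within or outside a
# class by cosine similarity to the class centroid. Real fastText
# vectors would replace the toy 3-dimensional ones below.
import math

def centroid(vectors):
    """Component-wise mean of a list of equal-length vectors."""
    dim = len(vectors[0])
    return [sum(v[i] for v in vectors) / len(vectors) for i in range(dim)]

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def in_class(vec, class_vectors, threshold=0.8):
    """True if vec is close enough to the centroid of class_vectors."""
    return cosine(vec, centroid(class_vectors)) >= threshold

# Toy vectors standing in for embeddings of words in one class.
class_vecs = [[0.9, 0.1, 0.0], [0.8, 0.2, 0.1], [0.85, 0.15, 0.05]]
print(in_class([0.88, 0.12, 0.02], class_vecs))  # a nearby vector
print(in_class([0.0, 0.1, 0.95], class_vecs))    # an unrelated vector
```

The actual classifier in the paper may be learned rather than centroid-based; this only illustrates the within/outside decision over embedding vectors.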
1 code implementation • SIGDIAL (ACL) 2022 • Symon Stevens-Guille, Aleksandre Maskharashvili, Xintong Li, Michael White
Our results suggest that including discourse relation information in the input of the model significantly improves the consistency with which it produces a correctly realized discourse relation in the output.
no code implementations • ICLR 2019 • Jean-Philippe Bernardy, Aleksandre Maskharashvili
The first one leverages an existing mapping of words to feature vectors (fastText), and attempts to classify such vectors as within or outside of each class.