Search Results for author: Zdeněk Kasner

Found 6 papers, 4 papers with code

Text-in-Context: Token-Level Error Detection for Table-to-Text Generation

1 code implementation · INLG (ACL) 2021 · Zdeněk Kasner, Simon Mille, Ondřej Dušek

Our system can detect the errors automatically using a combination of a rule-based natural language generation (NLG) system and pretrained language models (LMs).

Language Modelling · Pretrained Language Models · +3
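
The entry above describes the approach only at a high level. As a loose illustration of the pretrained-LM half of the idea (not a reproduction of the paper's system), the sketch below flags output tokens that an off-the-shelf GPT-2 model assigns very low probability; the model choice, threshold, and example sentence are assumptions:

```python
# Sketch: flag tokens a generic pretrained LM finds very unlikely.
# This is NOT the paper's error-detection system, only an illustration
# of using LM token probabilities as an error signal.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def flag_unlikely_tokens(text: str, threshold: float = 1e-4):
    """Return (token, probability) pairs whose LM probability falls below `threshold`."""
    enc = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        logits = model(**enc).logits                   # (1, seq_len, vocab_size)
    probs = torch.softmax(logits[0, :-1], dim=-1)      # predictions for positions 1..n-1
    targets = enc.input_ids[0, 1:]                     # tokens actually present there
    token_probs = probs[torch.arange(targets.size(0)), targets]
    tokens = tokenizer.convert_ids_to_tokens(targets)
    return [(tok, p.item()) for tok, p in zip(tokens, token_probs) if p.item() < threshold]

# Hypothetical table-to-text output with an implausible token:
print(flag_unlikely_tokens("The stadium seats 45,000 bananas."))
```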

Neural Pipeline for Zero-Shot Data-to-Text Generation

1 code implementation · ACL 2022 · Zdeněk Kasner, Ondřej Dušek

In data-to-text (D2T) generation, training on in-domain data leads to overfitting to the data representation and repeating training data noise.

Data-to-Text Generation · Pretrained Language Models

Evaluating Semantic Accuracy of Data-to-Text Generation with Natural Language Inference

1 code implementation · INLG (ACL) 2020 · Ondřej Dušek, Zdeněk Kasner

A major challenge in evaluating data-to-text (D2T) generation is measuring the semantic accuracy of the generated text, i.e. checking if the output text contains all and only facts supported by the input data.

Data-to-Text Generation · Natural Language Inference
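
As a minimal sketch of the NLI-based check described above (not the paper's exact setup), an off-the-shelf MNLI cross-encoder can test whether the generated text entails every input fact (omission check) and whether the verbalized facts entail the generated text (hallucination check). The checkpoint name and the E2E-style example below are assumptions:

```python
# Sketch: checking D2T semantic accuracy with an off-the-shelf NLI model.
# The facts and generated text below are invented examples, not data from the paper.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

MODEL = "roberta-large-mnli"   # assumed checkpoint; any MNLI cross-encoder works
tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForSequenceClassification.from_pretrained(MODEL)
model.eval()
ENTAIL = model.config.label2id.get("ENTAILMENT", 2)

def entails(premise: str, hypothesis: str) -> bool:
    enc = tokenizer(premise, hypothesis, return_tensors="pt", truncation=True)
    with torch.no_grad():
        probs = torch.softmax(model(**enc).logits, dim=-1)[0]
    return probs.argmax().item() == ENTAIL

facts = ["Blue Spice is a restaurant.", "Blue Spice serves French food."]
generated = "Blue Spice is a French restaurant in the city centre."

omitted = [f for f in facts if not entails(generated, f)]   # facts the text fails to convey
hallucinated = not entails(" ".join(facts), generated)      # text not supported by the facts
print("omitted facts:", omitted, "| possible hallucination:", hallucinated)
```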

Data-to-Text Generation with Iterative Text Editing

1 code implementation · INLG (ACL) 2020 · Zdeněk Kasner, Ondřej Dušek

Our approach maximizes the completeness and semantic accuracy of the output text while leveraging the abilities of recent pre-trained models for text editing (LaserTagger) and language modeling (GPT-2) to improve the text fluency.

Data-to-Text Generation · Domain Adaptation · +2
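
The LaserTagger editing loop is too involved for a short snippet, but the GPT-2 fluency-scoring step mentioned above can be sketched roughly as follows, assuming the Hugging Face GPT-2 checkpoint; the candidate sentences are invented:

```python
# Sketch: ranking candidate realizations by GPT-2 perplexity (lower = more fluent).
# Only illustrates the LM-scoring step, not the iterative editing procedure itself.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def perplexity(text: str) -> float:
    enc = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        loss = model(**enc, labels=enc.input_ids).loss   # mean token-level cross-entropy
    return torch.exp(loss).item()

candidates = [
    "Blue Spice is a restaurant serving French food.",
    "Blue Spice restaurant French food serving is.",
]
print(min(candidates, key=perplexity))   # picks the more fluent candidate
```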

Improving Fluency of Non-Autoregressive Machine Translation

no code implementations · 7 Apr 2020 · Zdeněk Kasner, Jindřich Libovický, Jindřich Helcl

Non-autoregressive (nAR) models for machine translation (MT) manifest superior decoding speed when compared to autoregressive (AR) models, at the expense of impaired fluency of their outputs.

Machine Translation · Translation
