Search Results for author: Massimo Nicosia

Found 16 papers, 1 paper with code

mmT5: Modular Multilingual Pre-Training Solves Source Language Hallucinations

no code implementations · 23 May 2023 · Jonas Pfeiffer, Francesco Piccinno, Massimo Nicosia, Xinyi Wang, Machel Reid, Sebastian Ruder

Multilingual sequence-to-sequence models perform poorly with increased language coverage and fail to consistently generate text in the correct target language in few-shot settings.

Hallucination · Natural Language Understanding

Translate & Fill: Improving Zero-Shot Multilingual Semantic Parsing with Synthetic Data

no code implementations · Findings (EMNLP) 2021 · Massimo Nicosia, Zhongdi Qu, Yasemin Altun

While multilingual pretrained language models (LMs) fine-tuned on a single language have shown substantial cross-lingual task transfer capabilities, there is still a wide performance gap in semantic parsing tasks compared to when target language supervision is available.

Data Augmentation · Semantic Parsing

Answering Conversational Questions on Structured Data without Logical Forms

no code implementations · IJCNLP 2019 · Thomas Müller, Francesco Piccinno, Massimo Nicosia, Peter Shaw, Yasemin Altun

We present a novel approach to answering sequential questions based on structured objects such as knowledge bases or tables without using a logical form as an intermediate representation.

Question Answering
