Search Results for author: Denis Paperno

Found 35 papers, 5 papers with code

On Learning Interpreted Languages with Recurrent Models

no code implementations CL (ACL) 2022 Denis Paperno

Can recurrent neural nets, inspired by human sequential data processing, learn to understand language?

Chinese Long and Short Form Choice Exploiting Neural Network Language Modeling Approaches

no code implementations CCL 2020 Lin Li, Kees Van Deemter, Denis Paperno

This paper presents our work on long and short form choice, a significant question of lexical choice that plays an important role in many Natural Language Understanding tasks.

Language Modelling Natural Language Understanding
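
As a rough illustration of the lexical-choice task described above, the sketch below shows one way a language model can be used for it: score each candidate form in context and keep the more probable one. The toy character-bigram model and the example sentences are illustrative stand-ins, not the paper's actual model or data.

```python
from collections import Counter
import math

def train_bigram_lm(corpus):
    """Count character bigrams with add-one smoothing over a tiny corpus."""
    bigrams, unigrams = Counter(), Counter()
    for sent in corpus:
        chars = ["<s>"] + list(sent)
        unigrams.update(chars)
        bigrams.update(zip(chars, chars[1:]))
    vocab_size = len(unigrams)

    def logprob(sent):
        chars = ["<s>"] + list(sent)
        return sum(
            math.log((bigrams[(a, b)] + 1) / (unigrams[a] + vocab_size))
            for a, b in zip(chars, chars[1:])
        )

    return logprob

# Illustrative training sentences (not the paper's data).
corpus = ["动物园里有一只老虎", "山中无老虎", "老虎是猫科动物"]
score = train_bigram_lm(corpus)

def choose_form(template, long_form, short_form):
    """Insert each candidate form into the context; keep the higher-scoring one.

    Note: raw log-probability favours shorter strings; a real system would
    length-normalise or compare conditional probabilities more carefully.
    """
    candidates = {form: score(template.format(form)) for form in (long_form, short_form)}
    return max(candidates, key=candidates.get)

print(choose_form("公园里来了一只{}", "老虎", "虎"))
```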

Grounded and Well-rounded: A Methodological Approach to the Study of Cross-modal and Cross-lingual Grounding

no code implementations18 Oct 2023 Timothee Mickus, Elaine Zosa, Denis Paperno

Grounding has been argued to be a crucial component in the development of more complete and truly semantically competent artificial intelligence systems.

The Scenario Refiner: Grounding subjects in images at the morphological level

no code implementations20 Sep 2023 Claudia Tagliaferri, Sofia Axioti, Albert Gatt, Denis Paperno

Derivationally related words, such as "runner" and "running", exhibit semantic differences which also elicit different visual scenarios.

Leverage Points in Modality Shifts: Comparing Language-only and Multimodal Word Representations

1 code implementation4 Jun 2023 Aleksey Tikhonov, Lisa Bylinina, Denis Paperno

Multimodal embeddings aim to enrich the semantic information in neural representations of language compared to text-only models.

Visual Grounding Word Embeddings
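
One simple way to compare language-only and multimodal word representations is to check how much a word's cosine nearest-neighbour sets overlap across the two spaces, as in the sketch below. The embeddings are random stand-ins, not outputs of the models studied in the paper.

```python
# Compare two embedding spaces by nearest-neighbour overlap per word.
import numpy as np

rng = np.random.default_rng(0)
words = ["cat", "dog", "banana", "violin", "red"]
text_space = {w: rng.normal(size=50) for w in words}  # stand-in text-only vectors
mm_space = {w: rng.normal(size=50) for w in words}    # stand-in multimodal vectors

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def neighbours(space, target, k=3):
    """Rank all other words by cosine similarity to the target."""
    sims = {w: cosine(space[target], v) for w, v in space.items() if w != target}
    return sorted(sims, key=sims.get, reverse=True)[:k]

# Overlap of neighbour sets is one crude signal of how a modality shift
# changes a word's representation.
for w in words:
    overlap = set(neighbours(text_space, w)) & set(neighbours(mm_space, w))
    print(w, len(overlap) / 3)
```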

Towards leveraging latent knowledge and Dialogue context for real-world conversational question answering

no code implementations17 Dec 2022 Shaomu Tan, Denis Paperno

In many real-world scenarios, the absence of an external knowledge source like Wikipedia restricts question answering systems to relying on the latent internal knowledge in limited dialogue data.

Conversational Question Answering Retrieval

Generating image captions with external encyclopedic knowledge

no code implementations10 Oct 2022 Sofia Nikiforova, Tejaswini Deoskar, Denis Paperno, Yoad Winter

Our approach includes a novel way of using image location to identify relevant open-domain facts in an external knowledge base, with their subsequent integration into the captioning pipeline at both the encoding and decoding stages.

Caption Generation Image Captioning +1
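
The retrieval step described in the abstract can be illustrated with a minimal sketch: filter a knowledge base by great-circle distance from the image's GPS coordinates. The knowledge-base entries, radius, and function names below are hypothetical, not the paper's actual data or pipeline.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in kilometres."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * 6371 * math.asin(math.sqrt(a))

# Hypothetical open-domain facts keyed by location.
knowledge_base = [
    {"fact": "The Eiffel Tower was completed in 1889.", "lat": 48.8584, "lon": 2.2945},
    {"fact": "The Louvre is the world's largest art museum.", "lat": 48.8606, "lon": 2.3376},
    {"fact": "Big Ben is the bell of the Westminster clock tower.", "lat": 51.5007, "lon": -0.1246},
]

def retrieve_facts(img_lat, img_lon, radius_km=5.0):
    """Keep facts whose location falls within radius_km of the image."""
    return [e["fact"] for e in knowledge_base
            if haversine_km(img_lat, img_lon, e["lat"], e["lon"]) <= radius_km]

# An image geotagged near the Eiffel Tower retrieves the Paris facts only.
print(retrieve_facts(48.858, 2.294))
```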

How to Dissect a Muppet: The Structure of Transformer Embedding Spaces

no code implementations7 Jun 2022 Timothee Mickus, Denis Paperno, Mathieu Constant

Pretrained embeddings based on the Transformer architecture have taken the NLP community by storm.

A Game Interface to Study Semantic Grounding in Text-Based Models

no code implementations17 Aug 2021 Timothee Mickus, Mathieu Constant, Denis Paperno

Can language models learn grounded representations from text distribution alone?

What Meaning-Form Correlation Has to Compose With

1 code implementation7 Dec 2020 Timothee Mickus, Timothée Bernard, Denis Paperno

Compositionality is a widely discussed property of natural languages, although its exact definition has been elusive.
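
One common way to quantify meaning-form correlation is to correlate pairwise form distances with pairwise meaning distances, as in the sketch below. The embeddings are random stand-ins, and a plain Pearson correlation here stands in for more careful statistical treatment.

```python
from itertools import combinations
import numpy as np
from scipy.stats import pearsonr

def edit_distance(a, b):
    """Standard Levenshtein distance via dynamic programming."""
    dp = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        prev, dp[0] = dp[0], i
        for j, cb in enumerate(b, 1):
            prev, dp[j] = dp[j], min(dp[j] + 1, dp[j - 1] + 1, prev + (ca != cb))
    return dp[-1]

rng = np.random.default_rng(0)
words = ["walk", "walked", "walking", "run", "ran"]
meanings = {w: rng.normal(size=20) for w in words}  # stand-in semantic vectors

form_d, meaning_d = [], []
for a, b in combinations(words, 2):
    form_d.append(edit_distance(a, b))                       # form distance
    u, v = meanings[a], meanings[b]
    meaning_d.append(1 - u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

r, p = pearsonr(form_d, meaning_d)
print(f"meaning-form correlation r={r:.3f} (p={p:.3f})")
```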

Geo-Aware Image Caption Generation

no code implementations COLING 2020 Sofia Nikiforova, Tejaswini Deoskar, Denis Paperno, Yoad Winter

Standard image caption generation systems produce generic descriptions of images and do not utilize any contextual information or world knowledge.

Caption Generation Image Captioning +1

Génération automatique de définitions pour le français (Definition Modeling in French)

no code implementations JEPTALNRECITAL 2020 Timothee Mickus, Mathieu Constant, Denis Paperno

Definition generation is a recent task that aims to produce lexicographic definitions from word embeddings.

What do you mean, BERT? Assessing BERT as a Distributional Semantics Model

no code implementations13 Nov 2019 Timothee Mickus, Denis Paperno, Mathieu Constant, Kees Van Deemter

Contextualized word embeddings, i.e. vector representations for words in context, are naturally seen as an extension of previous noncontextual distributional semantic models.

Position Sentence +1
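
A minimal illustration of this "extension" view, using the Hugging Face transformers API (not the paper's experimental setup): unlike a static distributional model, BERT assigns the same word different vectors in different contexts.

```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def word_vector(sentence, word):
    """Return the last-layer hidden state of the first occurrence of `word`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    idx = inputs.input_ids[0].tolist().index(tokenizer.convert_tokens_to_ids(word))
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]
    return hidden[idx]

v1 = word_vector("He sat by the river bank.", "bank")
v2 = word_vector("She works at the bank.", "bank")
# Similarity is below 1: the representation is context-dependent, unlike
# a single static vector per word type.
print(torch.cosine_similarity(v1, v2, dim=0).item())
```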

Mark my Word: A Sequence-to-Sequence Approach to Definition Modeling

no code implementations WS 2019 Timothee Mickus, Denis Paperno, Mathieu Constant

Defining words in a textual context is a useful task both for practical purposes and for gaining insight into distributed word representations.
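
One way to frame this as sequence-to-sequence is to mark the definiendum in its context and pair the marked sentence with its gloss, as in the sketch below; the marker convention and the example are illustrative, not necessarily the paper's exact format.

```python
def make_example(context, target_word, gloss):
    """Mark the target word in its context; pair it with the definition."""
    src = context.replace(target_word, f"<def> {target_word} </def>", 1)
    return {"src": src, "tgt": gloss}

example = make_example(
    context="The runner crossed the finish line first.",
    target_word="runner",
    gloss="a person who runs, especially in a race",
)
print(example["src"])  # The <def> runner </def> crossed the finish line first.
print(example["tgt"])  # a person who runs, especially in a race
```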

Choosing between Long and Short Word Forms in Mandarin

no code implementations WS 2019 Lin Li, Kees Van Deemter, Denis Paperno, Jingyu Fan

Between 80% and 90% of all Chinese words have long and short forms, such as 老虎/虎 (lao-hu/hu, tiger) (Duanmu, 2013).

On learning an interpreted language with recurrent models

1 code implementation WS 2018 Denis Paperno

Can recurrent neural nets, inspired by human sequential data processing, learn to understand language?

RUSSE: The First Workshop on Russian Semantic Similarity

no code implementations15 Mar 2018 Alexander Panchenko, Natalia Loukachevitch, Dmitry Ustalov, Denis Paperno, Christian Meyer, Natalia Konstantinova

The paper gives an overview of the Russian Semantic Similarity Evaluation (RUSSE) shared task held in conjunction with the Dialogue 2015 conference.

Semantic Similarity Semantic Textual Similarity
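
Shared tasks of this kind are typically scored by rank-correlating system scores with human similarity judgements; the sketch below shows that generic protocol with invented numbers, not RUSSE data.

```python
from scipy.stats import spearmanr

pairs = [("кот", "кошка"), ("кот", "собака"), ("кот", "стол")]
human = [0.9, 0.6, 0.1]   # gold similarity judgements (invented)
model = [0.85, 0.7, 0.05] # hypothetical system scores

rho, p = spearmanr(human, model)
print(f"Spearman rho = {rho:.2f}")
```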

Typology of Adjectives Benchmark for Compositional Distributional Models

no code implementations LREC 2016 Daria Ryzhova, Maria Kyuseva, Denis Paperno

In this paper we present a novel application of compositional distributional semantic models (CDSMs): prediction of lexical typology.

Semantic Similarity Semantic Textual Similarity

Deriving Boolean structures from distributional vectors

no code implementations TACL 2015 German Kruszewski, Denis Paperno, Marco Baroni

Corpus-based distributional semantic models capture degrees of semantic relatedness among the words of very large vocabularies, but have problems with logical phenomena such as entailment, that are instead elegantly handled by model-theoretic approaches, which, in turn, do not scale up.
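
The tension described above can be made concrete: cosine similarity is symmetric, so degrees of relatedness alone cannot encode directional entailment, whereas a Boolean feature structure can. The vectors and features below are illustrative, not the paper's learned representations.

```python
import numpy as np

dog = np.array([0.9, 0.8, 0.1])
animal = np.array([0.7, 0.6, 0.2])

def cosine(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

# Symmetric: the same number in both directions, so no entailment direction.
print(cosine(dog, animal) == cosine(animal, dog))  # True

# Boolean structures: treat words as sets of active features; x entails y
# iff y's features are a subset of x's. This relation is directional.
features = {"dog": {"animate", "canine", "mammal"}, "animal": {"animate"}}

def entails(x, y):
    return features[y] <= features[x]

print(entails("dog", "animal"))  # True
print(entails("animal", "dog"))  # False
```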
