no code implementations • SemEval (NAACL) 2022 • Timothee Mickus, Kees Van Deemter, Mathieu Constant, Denis Paperno
Word embeddings have advanced the state of the art in NLP across numerous tasks.
no code implementations • 25 Mar 2024 • Shaoxiong Ji, Timothee Mickus, Vincent Segonne, Jörg Tiedemann
We furthermore provide evidence through similarity measures and investigation of parameters that this lack of positive influence is due to output separability, which we argue is useful for machine translation but detrimental elsewhere.
1 code implementation • 12 Mar 2024 • Timothee Mickus, Stig-Arne Grönroos, Joseph Attieh, Michele Boggia, Ona de Gibert, Shaoxiong Ji, Niki Andreas Lopi, Alessandro Raganato, Raúl Vázquez, Jörg Tiedemann
NLP in the age of monolithic large language models is approaching its limits in terms of size and information that can be handled.
no code implementations • 12 Mar 2024 • Timothee Mickus, Elaine Zosa, Raúl Vázquez, Teemu Vahtola, Jörg Tiedemann, Vincent Segonne, Alessandro Raganato, Marianna Apidianaki
This paper presents the results of the SHROOM, a shared task focused on detecting hallucinations: outputs from natural language generation (NLG) systems that are fluent, yet inaccurate.
no code implementations • 5 Feb 2024 • Timothee Mickus, Stig-Arne Grönroos, Joseph Attieh
Whether embedding spaces use all their dimensions equally, i.e., whether they are isotropic, has been a recent subject of discussion.
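To illustrate what isotropy means here, the following sketch compares a roughly isotropic point cloud against one with a single dominant direction, using the ratio of smallest to largest singular value as a crude uniformity measure. This is an illustrative heuristic on synthetic data, not the metric used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def isotropy_ratio(E):
    # Ratio of smallest to largest singular value of the centered
    # embedding matrix: values near 1.0 mean all directions carry
    # comparable variance (isotropy); values near 0 mean a few
    # directions dominate (anisotropy).
    E = E - E.mean(axis=0)
    s = np.linalg.svd(E, compute_uv=False)
    return s[-1] / s[0]

iso = rng.normal(size=(1000, 16))            # roughly isotropic cloud
aniso = iso * np.array([10.0] + [0.1] * 15)  # one dominant direction

print(isotropy_ratio(iso) > isotropy_ratio(aniso))  # True
```

Real contextual embedding matrices tend to look much more like the second case than the first, which is what motivates the discussion.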
no code implementations • 18 Oct 2023 • Timothee Mickus, Elaine Zosa, Denis Paperno
Grounding has been argued to be a crucial component towards the development of more complete and truly semantically competent artificial intelligence systems.
1 code implementation • 10 Oct 2023 • Timothee Mickus, Raúl Vázquez
A recent body of work has demonstrated that Transformer embeddings can be linearly decomposed into well-defined sums of factors, that can in turn be related to specific network inputs or components.
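As a minimal illustration of the underlying linearity, the sketch below shows that a single attention operation already decomposes exactly into one well-defined additive term per input token: the output at each position is a weighted sum of per-token contributions that can be inspected in isolation. This is a toy example of the principle, not the full decomposition studied in the paper (which also accounts for feed-forward modules, biases, and layer normalization).

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 5, 8  # sequence length, model dimension

X  = rng.normal(size=(n, d))
Wq = rng.normal(size=(d, d)); Wk = rng.normal(size=(d, d))
Wv = rng.normal(size=(d, d)); Wo = rng.normal(size=(d, d))

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

A = softmax((X @ Wq) @ (X @ Wk).T / np.sqrt(d))  # attention weights
attn_out = A @ (X @ Wv) @ Wo                     # standard computation

# Linear decomposition: terms[i, j] is input token j's additive
# contribution to the output embedding at position i.
terms = np.einsum('ij,jd->ijd', A, (X @ Wv) @ Wo)
print(np.allclose(attn_out, terms.sum(axis=1)))  # True
```

Because everything after the softmax is linear in the value vectors, the per-token terms sum back to the attention output exactly; decompositions of this kind let each factor be related to a specific input or network component.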
no code implementations • 14 Jun 2023 • Vincent Segonne, Timothee Mickus
Definition Modeling, the task of generating definitions, was first proposed as a means to evaluate the semantic quality of word embeddings: a coherent lexical semantic representation of a word in context should contain all the information necessary to generate its definition.
no code implementations • 7 Jun 2022 • Timothee Mickus, Denis Paperno, Mathieu Constant
Pretrained embeddings based on the Transformer architecture have taken the NLP community by storm.
1 code implementation • 27 May 2022 • Timothee Mickus, Kees Van Deemter, Mathieu Constant, Denis Paperno
Word embeddings have advanced the state of the art in NLP across numerous tasks.
no code implementations • 17 Aug 2021 • Timothee Mickus, Mathieu Constant, Denis Paperno
Can language models learn grounded representations from text distribution alone?
1 code implementation • 7 Dec 2020 • Timothee Mickus, Timothée Bernard, Denis Paperno
Compositionality is a widely discussed property of natural languages, although its exact definition has been elusive.
no code implementations • COLING 2020 • Timothee Mickus, Timothée Bernard, Denis Paperno
Compositionality is a widely discussed property of natural languages, although its exact definition has been elusive.
no code implementations • JEPTALNRECITAL 2020 • Timothee Mickus, Mathieu Constant, Denis Paperno
Definition generation is a recent task that aims to produce lexicographic definitions from word embeddings.
no code implementations • WS 2019 • Timothee Mickus, Denis Paperno, Mathieu Constant
Defining words in a textual context is a useful task both for practical purposes and for gaining insight into distributed word representations.
no code implementations • 13 Nov 2019 • Timothee Mickus, Denis Paperno, Mathieu Constant, Kees Van Deemter
Contextualized word embeddings, i.e., vector representations for words in context, are naturally seen as an extension of previous noncontextual distributional semantic models.