Search Results for author: Timothee Mickus

Found 18 papers, 4 papers with code

Can Machine Translation Bridge Multilingual Pretraining and Cross-lingual Transfer Learning?

no code implementations25 Mar 2024 Shaoxiong Ji, Timothee Mickus, Vincent Segonne, Jörg Tiedemann

We furthermore provide evidence, through similarity measures and an investigation of parameters, that this lack of positive influence is due to output separability, which we argue is useful for machine translation but detrimental elsewhere.

Cross-Lingual Transfer, Machine Translation, +5
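As an illustration of what a representation-similarity measure looks like in practice (the abstract does not name the specific measures used, so linear CKA here is an assumption for exposition), the following sketch compares two sets of encoder representations of the same inputs:

```python
# Minimal linear-CKA sketch (illustrative; not necessarily the paper's measure).
# X, Y: (n_examples, dim) representations of the same inputs from two models.
import numpy as np

def linear_cka(X: np.ndarray, Y: np.ndarray) -> float:
    X = X - X.mean(axis=0, keepdims=True)  # center column-wise
    Y = Y - Y.mean(axis=0, keepdims=True)
    cross = np.linalg.norm(Y.T @ X, "fro") ** 2
    return cross / (np.linalg.norm(X.T @ X, "fro") * np.linalg.norm(Y.T @ Y, "fro"))

rng = np.random.default_rng(0)
reps_a = rng.normal(size=(128, 768))                  # e.g., MT-pretrained encoder
reps_b = reps_a + 0.1 * rng.normal(size=(128, 768))   # e.g., after transfer
print(linear_cka(reps_a, reps_b))                     # near 1.0: similar spaces
```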

SemEval-2024 Shared Task 6: SHROOM, a Shared-task on Hallucinations and Related Observable Overgeneration Mistakes

no code implementations12 Mar 2024 Timothee Mickus, Elaine Zosa, Raúl Vázquez, Teemu Vahtola, Jörg Tiedemann, Vincent Segonne, Alessandro Raganato, Marianna Apidianaki

This paper presents the results of SHROOM, a shared task focused on detecting hallucinations: outputs from natural language generation (NLG) systems that are fluent, yet inaccurate.

Machine Translation, Paraphrase Generation
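To make the detection setting concrete, here is a deliberately naive token-overlap baseline (a hypothetical illustration, not an official SHROOM baseline): flag a hypothesis whose lexical overlap with the reference falls below a threshold.

```python
# Toy hallucination-flagging baseline (hypothetical; not the shared task's own).
def overlap_score(hypothesis: str, reference: str) -> float:
    hyp, ref = set(hypothesis.lower().split()), set(reference.lower().split())
    return len(hyp & ref) / max(len(hyp), 1)

def is_hallucination(hypothesis: str, reference: str, threshold: float = 0.5) -> bool:
    return overlap_score(hypothesis, reference) < threshold

print(is_hallucination("the cat sat on the mat", "the cat sat on a mat"))        # False
print(is_hallucination("stocks rallied sharply today", "the cat sat on a mat"))  # True
```

Stronger detectors would rely on richer signals than surface overlap; this only fixes the input/output contract of the task.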

Isotropy, Clusters, and Classifiers

no code implementations5 Feb 2024 Timothee Mickus, Stig-Arne Grönroos, Joseph Attieh

Whether embedding spaces use all their dimensions equally, i.e., whether they are isotropic, has been a recent subject of discussion.
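A common diagnostic for this question, sketched below as an illustration (not necessarily the measure the paper adopts), is the mean pairwise cosine similarity of the embeddings: values near 0 suggest an isotropic space, values near 1 a narrow cone.

```python
import numpy as np

def mean_pairwise_cosine(emb: np.ndarray) -> float:
    emb = emb / np.linalg.norm(emb, axis=1, keepdims=True)  # unit-normalize rows
    sims = emb @ emb.T                                      # all pairwise cosines
    mask = ~np.eye(len(emb), dtype=bool)                    # drop self-similarity
    return float(sims[mask].mean())

rng = np.random.default_rng(0)
print(mean_pairwise_cosine(rng.normal(size=(500, 768))))        # ~0: isotropic
print(mean_pairwise_cosine(rng.normal(size=(500, 768)) + 5.0))  # ~1: shared direction
```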

Grounded and Well-rounded: A Methodological Approach to the Study of Cross-modal and Cross-lingual Grounding

no code implementations18 Oct 2023 Timothee Mickus, Elaine Zosa, Denis Paperno

Grounding has been argued to be a crucial component in the development of more complete and truly semantically competent artificial intelligence systems.

Why bother with geometry? On the relevance of linear decompositions of Transformer embeddings

1 code implementation10 Oct 2023 Timothee Mickus, Raúl Vázquez

A recent body of work has demonstrated that Transformer embeddings can be linearly decomposed into well-defined sums of factors, which can in turn be related to specific network inputs or components.

Machine Translation, Sentence
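The additive structure in question can be made concrete with a toy numpy layer (a simplified sketch: single attention head, no biases, LayerNorm omitted, whereas the paper's decomposition also accounts for normalization): the layer's output is recovered exactly as residual + attention + feed-forward factors.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 16
Wq, Wk, Wv = (0.1 * rng.normal(size=(d, d)) for _ in range(3))
W1, W2 = 0.1 * rng.normal(size=(d, 4 * d)), 0.1 * rng.normal(size=(4 * d, d))

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def attn(h):   # single-head self-attention, no biases
    return softmax((h @ Wq) @ (h @ Wk).T / np.sqrt(d)) @ (h @ Wv)

def ffn(h):    # two-layer ReLU MLP
    return np.maximum(h @ W1, 0.0) @ W2

def layer(h):  # simplified Transformer layer, LayerNorm omitted
    a = attn(h)
    return h + a + ffn(h + a)

x = rng.normal(size=(8, d))   # eight token embeddings entering the layer
a = attn(x)
# The layer output decomposes exactly into three well-defined factors.
assert np.allclose(layer(x), x + a + ffn(x + a))
```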

"Definition Modeling: To model definitions." Generating Definitions With Little to No Semantics

no code implementations14 Jun 2023 Vincent Segonne, Timothee Mickus

Definition Modeling, the task of generating definitions, was first proposed as a means to evaluate the semantic quality of word embeddings: a coherent lexical semantic representation of a word in context should contain all the information necessary to generate its definition.

Word Embeddings
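The task's input/output contract is easy to sketch with an off-the-shelf seq2seq model (a hypothetical setup: t5-small stands in for a model actually fine-tuned on definition data, which would be needed for sensible output):

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("t5-small")   # placeholder checkpoint
model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")

# The definiendum is given together with a context of use.
prompt = 'Define "bank" in: She sat on the bank of the river.'
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```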

How to Dissect a Muppet: The Structure of Transformer Embedding Spaces

no code implementations7 Jun 2022 Timothee Mickus, Denis Paperno, Mathieu Constant

Pretrained embeddings based on the Transformer architecture have taken the NLP community by storm.

A Game Interface to Study Semantic Grounding in Text-Based Models

no code implementations17 Aug 2021 Timothee Mickus, Mathieu Constant, Denis Paperno

Can language models learn grounded representations from text distribution alone?

What Meaning-Form Correlation Has to Compose With

1 code implementation7 Dec 2020 Timothee Mickus, Timothée Bernard, Denis Paperno

Compositionality is a widely discussed property of natural languages, although its exact definition has been elusive.
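Meaning-form correlation itself is straightforward to operationalize; the sketch below (illustrative only: random vectors stand in for real embeddings, and a plain rank correlation stands in for a full Mantel test) correlates pairwise form distances with pairwise meaning distances.

```python
import itertools
import numpy as np
from scipy.stats import spearmanr

def edit_distance(a: str, b: str) -> int:
    # Dynamic-programming Levenshtein distance.
    dp = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        prev, dp[0] = dp[0], i
        for j, cb in enumerate(b, 1):
            prev, dp[j] = dp[j], min(dp[j] + 1, dp[j - 1] + 1, prev + (ca != cb))
    return dp[-1]

words = ["cat", "cats", "dog", "dogs"]
rng = np.random.default_rng(0)
vec = {w: rng.normal(size=50) for w in words}   # stand-in for real embeddings

pairs = list(itertools.combinations(words, 2))
form = [edit_distance(a, b) for a, b in pairs]
meaning = [1 - vec[a] @ vec[b] / (np.linalg.norm(vec[a]) * np.linalg.norm(vec[b]))
           for a, b in pairs]
rho, p = spearmanr(form, meaning)
print(f"meaning-form correlation: rho={rho:.2f} (p={p:.2f})")
```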

Génération automatique de définitions pour le français (Definition Modeling in French)

no code implementations JEPTALNRECITAL 2020 Timothee Mickus, Mathieu Constant, Denis Paperno

Definition generation is a recent task that aims to produce lexicographic definitions from word embeddings.

Mark my Word: A Sequence-to-Sequence Approach to Definition Modeling

no code implementations WS 2019 Timothee Mickus, Denis Paperno, Mathieu Constant

Defining words in a textual context is a useful task both for practical purposes and for gaining insight into distributed word representations.

What do you mean, BERT? Assessing BERT as a Distributional Semantics Model

no code implementations13 Nov 2019 Timothee Mickus, Denis Paperno, Mathieu Constant, Kees Van Deemter

Contextualized word embeddings, i.e., vector representations for words in context, are naturally seen as an extension of previous noncontextual distributional semantic models.

Position, Sentence, +1
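The contrast with static models is easy to see in code (an illustrative sketch, not the paper's evaluation protocol): the same word receives different vectors in different contexts.

```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def word_vector(sentence: str, word: str) -> torch.Tensor:
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]       # (seq_len, 768)
    # Locate the (single-token) target word in the input.
    idx = inputs["input_ids"][0].tolist().index(tokenizer.convert_tokens_to_ids(word))
    return hidden[idx]

v1 = word_vector("she deposited money at the bank", "bank")
v2 = word_vector("she sat on the bank of the river", "bank")
print(torch.cosine_similarity(v1, v2, dim=0))  # < 1: the vectors differ by context
```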
