Search Results for author: Mathieu Constant

Found 12 papers, 0 papers with code

Comparing linear and neural models for competitive MWE identification

no code implementations WS (NoDaLiDa) 2019 Hazem Al Saied, Marie Candito, Mathieu Constant

In this paper, we compare the use of linear versus neural classifiers in a greedy transition system for MWE identification.

Évaluation de méthodes et d’outils pour la lemmatisation automatique du français médiéval (Evaluation of methods and tools for automatic lemmatization in Old French)

no code implementations JEP/TALN/RECITAL 2021 Cristina Holgado, Alexei Lavrentiev, Mathieu Constant

For non-standardized historical languages such as Medieval French, automatic lemmatization still presents challenges, as the language exhibits strong spelling variation.

Lemmatization

A Game Interface to Study Semantic Grounding in Text-Based Models

no code implementations17 Aug 2021 Timothee Mickus, Mathieu Constant, Denis Paperno

Can language models learn grounded representations from text distribution alone?

Génération automatique de définitions pour le français (Definition Modeling in French)

no code implementations JEP/TALN/RECITAL 2020 Timothee Mickus, Mathieu Constant, Denis Paperno

Definition generation is a recent task that aims to produce lexicographic definitions from word embeddings.

Rigor Mortis: Annotating MWEs with a Gamified Platform

no code implementations LREC 2020 Karën Fort, Bruno Guillaume, Yann-Alan Pilatte, Mathieu Constant, Nicolas Lefèbvre

We present Rigor Mortis, a gamified crowdsourcing platform designed to evaluate speakers' intuition and then train them to annotate multiword expressions (MWEs) in French corpora.

Mark my Word: A Sequence-to-Sequence Approach to Definition Modeling

no code implementations WS 2019 Timothee Mickus, Denis Paperno, Mathieu Constant

Defining words in a textual context is a useful task both for practical purposes and for gaining insight into distributed word representations.

What do you mean, BERT? Assessing BERT as a Distributional Semantics Model

no code implementations13 Nov 2019 Timothee Mickus, Denis Paperno, Mathieu Constant, Kees Van Deemter

Contextualized word embeddings, i.e. vector representations for words in context, are naturally seen as an extension of previous noncontextual distributional semantic models.

Word Embeddings

Multiword Expression Processing: A Survey

no code implementations CL 2017 Mathieu Constant, Gülşen Eryiğit, Johanna Monti, Lonneke van der Plas, Carlos Ramisch, Michael Rosner, Amalia Todirascu

The structure of linguistic processing, which depends on a clear distinction between words and phrases, has to be rethought to accommodate MWEs.

Machine Translation

Annotation d'expressions polylexicales verbales en français (Annotation of verbal multiword expressions in French)

no code implementations JEP/TALN/RECITAL 2017 Marie Candito, Mathieu Constant, Carlos Ramisch, Agata Savary, Yannick Parmentier, Caroline Pasquer, Jean-Yves Antoine

We describe the French portion of the data produced within the multilingual PARSEME campaign on the identification of verbal multiword expressions (Savary et al., 2017).
