Search Results for author: Ludovic Tanguy

Found 20 papers, 0 papers with code

Impact de la structure logique des documents sur les modèles distributionnels : expérimentations sur le corpus TALN (Impact of document structure on distributional semantics models: a case study on NLP research articles)

no code implementations JEPTALNRECITAL 2020 Ludovic Tanguy, Cécile Fabre, Yoann Bard

We present an experiment aimed at measuring how the logical structure of a document affects lexical representations in distributional semantics models.

Which Dependency Parser to Use for Distributional Semantics in a Specialized Domain?

no code implementations LREC 2020 Pauline Brunet, Olivier Ferret, Ludovic Tanguy

We present a study comparing several English dependency parsers applied to a specialized corpus, with the goal of building count-based distributional models from syntactic dependencies.
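As an illustration of what a count-based distributional model built from syntactic dependencies involves, here is a minimal Python sketch; the (head, relation, dependent) triple format and the context scheme are assumptions for illustration only, not the exact setup compared in the paper.

```python
from collections import Counter, defaultdict

# Hypothetical dependency triples as a parser might output them.
triples = [
    ("parser", "nsubj", "produces"),
    ("produces", "obj", "dependencies"),
    ("dependencies", "amod", "syntactic"),
]

# Each word is described by the syntactic contexts it occurs in:
# heads get (relation, dependent) contexts, dependents get the inverse.
contexts = defaultdict(Counter)
for head, rel, dep in triples:
    contexts[head][(rel, dep)] += 1
    contexts[dep][(rel + "^-1", head)] += 1

# Words with similar context count vectors are distributionally similar;
# the choice of parser changes which triples (and thus which contexts) exist.
print(contexts["produces"])
```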

Collecting Tweets to Investigate Regional Variation in Canadian English

no code implementations LREC 2020 Filip Miletic, Anne Przewozny-Desriaux, Ludovic Tanguy

We present a 78.8-million-tweet, 1.3-billion-word corpus aimed at studying regional variation in Canadian English with a specific focus on the dialect regions of Toronto, Montreal, and Vancouver.

Extrinsic Evaluation of French Dependency Parsers on a Specialized Corpus: Comparison of Distributional Thesauri

no code implementations LREC 2020 Ludovic Tanguy, Pauline Brunet, Olivier Ferret

We present a study in which we compare 11 different French dependency parsers on a specialized corpus (consisting of research articles on NLP from the proceedings of the TALN conference).

Investigating the Stability of Concrete Nouns in Word Embeddings

no code implementations WS 2019 Bénédicte Pierrejean, Ludovic Tanguy

Word embeddings trained using neural methods (such as word2vec SGNS) are known to be sensitive to stability problems: across two models trained with the exact same set of parameters, the nearest neighbors of a word are likely to change (see the sketch below).

Word Embeddings
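The instability described above can be observed directly by training two models on the same data with identical hyperparameters and comparing nearest neighbors. The following is a minimal sketch assuming gensim's word2vec implementation and a toy corpus; the corpus and hyperparameters are illustrative only.

```python
from gensim.models import Word2Vec

# Toy corpus; in practice this would be a large tokenized corpus.
sentences = [["the", "cat", "sat", "on", "the", "mat"],
             ["the", "dog", "sat", "on", "the", "rug"]] * 500

def train():
    # SGNS settings (sg=1, negative sampling). With several workers, the
    # asynchronous updates make each run slightly different even though
    # the hyperparameters are identical.
    return Word2Vec(sentences, vector_size=50, sg=1, negative=5,
                    min_count=1, workers=4, epochs=5)

m1, m2 = train(), train()

# The nearest neighbors of the same word may differ between the two runs.
n1 = [w for w, _ in m1.wv.most_similar("cat", topn=5)]
n2 = [w for w, _ in m2.wv.most_similar("cat", topn=5)]
print(n1)
print(n2)
```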

Predicting Word Embeddings Variability

no code implementations SEMEVAL 2018 Bénédicte Pierrejean, Ludovic Tanguy

Neural word embedding models (such as those built with word2vec) are known to have stability problems: when retraining a model with the exact same hyperparameters, word neighborhoods may change (a simple way to quantify this is sketched below).

Word Embeddings
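One simple way to quantify this variability is to measure how much the nearest-neighbor lists of a word overlap across two training runs; the overlap measure and the example neighbor lists below are illustrative, not necessarily those used in the paper.

```python
def neighbor_overlap(neighbors_a, neighbors_b):
    """Proportion of shared items between two nearest-neighbor lists (Jaccard)."""
    a, b = set(neighbors_a), set(neighbors_b)
    return len(a & b) / max(len(a | b), 1)

# Hypothetical top-5 neighbors of the same word from two training runs.
run1 = ["feline", "kitten", "dog", "pet", "tabby"]
run2 = ["kitten", "feline", "puppy", "pet", "animal"]

print(neighbor_overlap(run1, run2))  # 3 shared / 7 distinct ≈ 0.43
```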

Étude de la reproductibilité des word embeddings : repérage des zones stables et instables dans le lexique (Reproducibility of word embeddings: identifying stable and unstable zones in the semantic space)

no code implementations JEPTALNRECITAL 2018 Bénédicte Pierrejean, Ludovic Tanguy

Distributional semantics vector models (word embeddings), in particular those produced by neural methods, raise reproducibility issues and yield different representations on each run, even when their parameters are left unchanged.

Word Embeddings

Analyse d'une tâche de substitution lexicale : quelles sont les sources de difficulté ? (Difficulty analysis for a lexical substitution task)

no code implementations JEPTALNRECITAL 2016 Ludovic Tanguy, Cécile Fabre, Camille Mercier

In this article, we propose an analysis of the results of the SemDis 2014 campaign, which featured a lexical substitution task in French.
