Search Results for author: Loïc Vial

Found 5 papers, 2 papers with code

The LIG system for the English-Czech Text Translation Task of IWSLT 2019

no code implementations EMNLP (IWSLT) 2019 Loïc Vial, Benjamin Lecouteux, Didier Schwab, Hang Le, Laurent Besacier

Therefore, we implemented a Transformer-based encoder-decoder neural system which is able to use the output of a pre-trained language model as input embeddings, and we compared its performance under three configurations: 1) without any pre-trained language model (constrained), 2) using a language model trained on the monolingual parts of the allowed English-Czech data (constrained), and 3) using a language model trained on a large quantity of external monolingual data (unconstrained).
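The idea of feeding a pretrained language model's output states to a translation model in place of learned input embeddings can be sketched as follows. This is a minimal illustration, not the LIG system itself: the toy LM, the dimensions, and all module names are assumptions for the sketch, with the pretrained LM kept frozen and its hidden states projected into the Transformer's model dimension.

```python
import torch
import torch.nn as nn

class ToyLM(nn.Module):
    """Stand-in for a real pretrained language model (illustrative only)."""
    d_out = 32

    def __init__(self, vocab=1000):
        super().__init__()
        self.emb = nn.Embedding(vocab, self.d_out)
        self.rnn = nn.GRU(self.d_out, self.d_out, batch_first=True)

    def forward(self, ids):
        # Return one contextual hidden state per input token.
        return self.rnn(self.emb(ids))[0]

class LMEmbeddedTranslator(nn.Module):
    """Transformer encoder-decoder whose source-side 'embeddings'
    are the frozen pretrained LM's hidden states."""

    def __init__(self, lm, d_model=256, tgt_vocab=1000):
        super().__init__()
        self.lm = lm
        for p in self.lm.parameters():   # freeze the pretrained LM
            p.requires_grad = False
        self.proj = nn.Linear(lm.d_out, d_model)   # LM states -> model dim
        self.tgt_embed = nn.Embedding(tgt_vocab, d_model)
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=4,
            num_encoder_layers=2, num_decoder_layers=2,
            batch_first=True)
        self.out = nn.Linear(d_model, tgt_vocab)

    def forward(self, src_ids, tgt_ids):
        src = self.proj(self.lm(src_ids))   # LM output used as input embeddings
        tgt = self.tgt_embed(tgt_ids)
        return self.out(self.transformer(src, tgt))

model = LMEmbeddedTranslator(ToyLM())
logits = model(torch.randint(0, 1000, (2, 7)), torch.randint(0, 1000, (2, 5)))
```

Swapping the `ToyLM` for a model trained on constrained or external monolingual data corresponds to the three configurations compared in the paper.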

Language Modelling · Machine Translation +1

Sense Vocabulary Compression through the Semantic Knowledge of WordNet for Neural Word Sense Disambiguation

2 code implementations GWC 2019 Loïc Vial, Benjamin Lecouteux, Didier Schwab

In this article, we tackle the issue of the limited quantity of manually sense annotated corpora for the task of word sense disambiguation, by exploiting the semantic relationships between senses such as synonymy, hypernymy and hyponymy, in order to compress the sense vocabulary of Princeton WordNet, and thus reduce the number of different sense tags that must be observed to disambiguate all words of the lexical database.
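The compression idea can be illustrated with a toy sketch: replace each sense tag by its most general hypernym that still separates it from the word's other candidate senses, so that senses which never compete for the same word can share a tag. The tiny hypernym hierarchy and sense inventory below are made-up stand-ins for Princeton WordNet, and this is a simplified reading of the method, not the authors' exact algorithm.

```python
# Made-up miniature hypernym graph: child sense -> parent sense.
hypernym = {
    "mouse.animal": "rodent", "rodent": "animal",
    "rat.animal": "rodent",
    "mouse.device": "device",
}

# Candidate senses per word: the ambiguity WSD must still resolve.
candidates = {
    "mouse": ["mouse.animal", "mouse.device"],
    "rat": ["rat.animal"],
}

def ancestors(sense):
    """Chain from a sense up to its root along hypernymy links."""
    chain = [sense]
    while chain[-1] in hypernym:
        chain.append(hypernym[chain[-1]])
    return chain

def compress(word, sense):
    """Most general ancestor of `sense` that no competing sense
    of `word` can also reach, so the word stays disambiguable."""
    reachable_by_others = set()
    for other in candidates[word]:
        if other != sense:
            reachable_by_others.update(ancestors(other))
    for tag in reversed(ancestors(sense)):   # most general first
        if tag not in reachable_by_others:
            return tag
    return sense

print(compress("mouse", "mouse.animal"))  # "animal": merged with rat.animal
print(compress("rat", "rat.animal"))      # "animal": same compressed tag
print(compress("mouse", "mouse.device"))  # "device": still distinct for "mouse"
```

Here "mouse (animal)" and "rat (animal)" collapse to one tag, shrinking the sense vocabulary, while the two senses of "mouse" remain distinguishable.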

Word Sense Disambiguation

Improving the Coverage and the Generalization Ability of Neural Word Sense Disambiguation through Hypernymy and Hyponymy Relationships

no code implementations 2 Nov 2018 Loïc Vial, Benjamin Lecouteux, Didier Schwab

Our method leads to state of the art results on most WSD evaluation tasks, while improving the coverage of supervised systems, reducing the training time and the size of the models, without additional training data.

Word Sense Disambiguation

Comparison of Global Algorithms in Word Sense Disambiguation

no code implementations 7 Apr 2017 Loïc Vial, Andon Tchechmedjiev, Didier Schwab

We find that CSA, GA and SA all eventually converge to similar results (0.98 F1 score), but CSA gets there faster, reaching up to 0.95 F1 in fewer scorer calls than SA.

Word Sense Disambiguation
