Search Results for author: Benjamin Muller

Found 11 papers, 4 papers with code

Cross-Lingual GenQA: Open-Domain Question Answering with Answer Sentence Generation

no code implementations 14 Oct 2021 Benjamin Muller, Luca Soldaini, Rik Koncel-Kedziorski, Eric Lind, Alessandro Moschitti

Recent approaches for question answering systems have achieved impressive performance on English by combining document-level retrieval with answer generation.

Answer Generation · Generative Question Answering +1

First Align, then Predict: Understanding the Cross-Lingual Ability of Multilingual BERT

1 code implementation EACL 2021 Benjamin Muller, Yanai Elazar, Benoît Sagot, Djamé Seddah

Such cross-lingual transfer emerges by fine-tuning on a task of interest in one language and evaluating on a distinct language not seen during fine-tuning.

Language Modelling · Pretrained Language Models +1

Les modèles de langue contextuels Camembert pour le français : impact de la taille et de l'hétérogénéité des données d'entrainement (CamemBERT Contextual Language Models for French: Impact of Training Data Size and Heterogeneity)

no code implementations JEPTALNRECITAL 2020 Louis Martin, Benjamin Muller, Pedro Javier Ortiz Suárez, Yoann Dupont, Laurent Romary, Éric Villemonte de la Clergerie, Benoît Sagot, Djamé Seddah

The practical use of these models, in all languages except English, was therefore limited. The recent release of several monolingual BERT-based models (Devlin et al., 2019), notably for French, has demonstrated the value of such models by improving the state of the art on all evaluated tasks.

SENTS

Enhancing BERT for Lexical Normalization

no code implementations WS 2019 Benjamin Muller, Benoît Sagot, Djamé Seddah

In this article, focusing on User Generated Content (UGC), we study the ability of BERT to perform lexical normalisation.

Language Modelling · Lexical Normalization

ELMoLex: Connecting ELMo and Lexicon Features for Dependency Parsing

no code implementations CONLL 2018 Ganesh Jawahar, Benjamin Muller, Amal Fethi, Louis Martin, Éric Villemonte de la Clergerie, Benoît Sagot, Djamé Seddah

We augment the deep Biaffine (BiAF) parser (Dozat and Manning, 2016) with novel features to perform competitively: we utilize an in-domain version of ELMo features (Peters et al., 2018), which provide context-dependent word representations, and we utilize disambiguated, embedded morphosyntactic features from lexicons (Sagot, 2018), which complement the existing feature set.

Dependency Parsing · Language Modelling
