Search Results for author: Hicham El Boukkouri

Found 5 papers, 2 papers with code

Re-train or Train from Scratch? Comparing Pre-training Strategies of BERT in the Medical Domain

no code implementations LREC 2022 Hicham El Boukkouri, Olivier Ferret, Thomas Lavergne, Pierre Zweigenbaum

BERT models used in specialized domains all seem to be the result of a simple strategy: initializing with the original BERT and then resuming pre-training on a specialized corpus.
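In concrete terms, this default strategy amounts to continued masked-language-model pre-training. Below is a minimal sketch of it using the Hugging Face transformers library; the model name, corpus file, and hyperparameters are illustrative assumptions, not the paper's actual setup.

```python
# Sketch: initialize from the original BERT, then resume MLM pre-training
# on a specialized corpus. All names and hyperparameters are illustrative.
from transformers import (
    AutoTokenizer,
    AutoModelForMaskedLM,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)
from datasets import load_dataset

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")  # start from original BERT

# Hypothetical specialized corpus: one document per line in a plain-text file.
corpus = load_dataset("text", data_files={"train": "medical_corpus.txt"})["train"]

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

corpus = corpus.map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="bert-medical", num_train_epochs=1),
    train_dataset=corpus,
    # Standard BERT-style masked-language-modeling objective (15% masking).
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm_probability=0.15),
)
trainer.train()
```

The alternative the paper compares against, training from scratch, differs only in that the model (and possibly the tokenizer) is initialized randomly from a config rather than from the released BERT checkpoint.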

CharacterBERT: Reconciling ELMo and BERT for Word-Level Open-Vocabulary Representations From Characters

2 code implementations COLING 2020 Hicham El Boukkouri, Olivier Ferret, Thomas Lavergne, Hiroshi Noji, Pierre Zweigenbaum, Junichi Tsujii

Due to the compelling improvements brought by BERT, many recent representation models have adopted the Transformer architecture as their main building block, consequently inheriting the wordpiece tokenization system even though it is not intrinsically tied to Transformers.

Tasks: Clinical Concept Extraction, Drug–drug Interaction Extraction, +3 more
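For illustration, here is a toy sketch of the character-level word encoder idea behind CharacterBERT, which replaces wordpiece embeddings with an ELMo-style Character-CNN that builds a single vector per word from its characters, keeping the vocabulary open. This is a simplified stand-in, not the paper's implementation (which reuses ELMo's character module, including its highway layers); all sizes and filter settings are illustrative.

```python
# Toy Character-CNN word encoder in the spirit of CharacterBERT:
# each word is encoded from its characters, so no wordpiece vocabulary
# is needed. Dimensions and filters are illustrative, not the paper's.
import torch
import torch.nn as nn

class CharacterCNN(nn.Module):
    def __init__(self, n_chars=262, char_dim=16, out_dim=768,
                 filters=((1, 32), (2, 32), (3, 64), (4, 128))):
        super().__init__()
        self.char_emb = nn.Embedding(n_chars, char_dim)
        # One 1-D convolution per (kernel width, num filters) pair.
        self.convs = nn.ModuleList(
            nn.Conv1d(char_dim, n, kernel_size=w) for w, n in filters
        )
        self.proj = nn.Linear(sum(n for _, n in filters), out_dim)

    def forward(self, char_ids):
        # char_ids: (batch, words, chars) -> one vector per word.
        b, w, c = char_ids.shape
        x = self.char_emb(char_ids).view(b * w, c, -1).transpose(1, 2)
        # Max-pool each convolution's activations over the character axis.
        pooled = [torch.relu(conv(x)).max(dim=-1).values for conv in self.convs]
        return self.proj(torch.cat(pooled, dim=-1)).view(b, w, -1)

words = torch.randint(0, 262, (2, 5, 10))  # 2 sentences, 5 words, 10 chars each
print(CharacterCNN()(words).shape)  # torch.Size([2, 5, 768])
```

The resulting word-level vectors can then be fed to a standard Transformer encoder in place of wordpiece embeddings, which is what lets the model handle out-of-vocabulary words such as rare medical terms.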

Ré-entraîner ou entraîner soi-même ? Stratégies de pré-entraînement de BERT en domaine médical (Re-train or train from scratch? Pre-training strategies for BERT in the medical domain)

no code implementations JEP/TALN/RECITAL 2020 Hicham El Boukkouri

(Translated from French:) BERT models used in specialized domains all seem to derive from a fairly simple strategy: using the original BERT model as the initialization and then continuing its pre-training on a specialized corpus.

Tasks: NER
