Search Results for author: Mihai Dascălu

Found 1 paper, 1 paper with code

Distilling the Knowledge of Romanian BERTs Using Multiple Teachers

1 code implementation · LREC 2022 · Andrei-Marius Avram, Darius Catrina, Dumitru-Clementin Cercel, Mihai Dascălu, Traian Rebedea, Vasile Păiş, Dan Tufiş

In this work, we introduce three lightweight and fast distilled BERT models for the Romanian language: Distil-BERT-base-ro, Distil-RoBERT-base, and DistilMulti-BERT-base-ro.

Tasks: Dialect Identification · Knowledge Distillation · +9
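
To give a rough sense of what distillation with multiple teachers typically involves, here is a minimal sketch in PyTorch: the student is trained against the averaged softened predictions of several teacher models plus a standard hard-label loss. The temperature, the simple averaging of teacher distributions, the loss weighting, and all function names are assumptions for illustration, not the authors' exact setup.

```python
# Hypothetical multi-teacher knowledge distillation loss (not the paper's exact recipe).
import torch
import torch.nn.functional as F

def multi_teacher_distillation_loss(student_logits, teacher_logits_list,
                                    labels, temperature=2.0, alpha=0.5):
    """Blend a hard-label loss with a soft-label loss against the
    averaged teacher distribution (illustrative formulation)."""
    # Average the teachers' softened probability distributions.
    teacher_probs = torch.stack(
        [F.softmax(t / temperature, dim=-1) for t in teacher_logits_list]
    ).mean(dim=0)

    # KL divergence between the student's softened distribution and the ensemble.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        teacher_probs,
        reduction="batchmean",
    ) * (temperature ** 2)

    # Standard cross-entropy on the gold labels.
    hard_loss = F.cross_entropy(student_logits, labels)

    return alpha * soft_loss + (1.0 - alpha) * hard_loss

# Toy usage: batch of 4 examples, 3 classes, two teachers.
student_logits = torch.randn(4, 3, requires_grad=True)
teacher_logits = [torch.randn(4, 3), torch.randn(4, 3)]
labels = torch.randint(0, 3, (4,))
loss = multi_teacher_distillation_loss(student_logits, teacher_logits, labels)
loss.backward()
```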
