RobBERT: a Dutch RoBERTa-based Language Model

17 Jan 2020 · Pieter Delobelle, Thomas Winters, Bettina Berendt

Pre-trained language models have been dominating the field of natural language processing in recent years, and have led to significant performance gains for various complex natural language tasks. One of the most prominent pre-trained language models is BERT (Bidirectional Encoder Representations from Transformers), which was released as an English as well as a multilingual version...
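RobBERT was publicly released as a standard RoBERTa checkpoint, so it can be loaded with the Hugging Face transformers library. Below is a minimal sketch; the model ID pdelobelle/robbert-v2-dutch-base is taken from the public release and is an assumption here, not something stated on this page.

```python
# A minimal sketch: loading RobBERT with Hugging Face transformers and filling
# a masked token in a Dutch sentence. The model ID is an assumption based on
# the authors' public release; substitute the checkpoint you actually use.
from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

model_id = "pdelobelle/robbert-v2-dutch-base"  # assumed Hub ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# RoBERTa-style models use "<mask>" as the mask token.
fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)
print(fill_mask("Er staat een <mask> in mijn tuin."))
```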


Evaluation Results from the Paper


| Task | Dataset | Model | Metric | Value | Global Rank |
|------|---------|-------|--------|-------|-------------|
| Sentiment Analysis | DBRD | RobBERT | Accuracy | 94.422% | #1 |
| Sentiment Analysis | DBRD | RobBERT | F1 | 94.422% | #1 |
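For context on how such results are obtained, the following is a hedged sketch of running sentiment classification over DBRD-style Dutch book reviews with a fine-tuned RobBERT. The checkpoint path is a hypothetical placeholder; this page does not name a released fine-tuned sentiment model.

```python
# A hedged sketch of sentiment inference on Dutch book reviews (DBRD-style).
# "path/to/robbert-dbrd-finetuned" is a hypothetical checkpoint placeholder,
# not an official release named on this page.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="path/to/robbert-dbrd-finetuned",  # hypothetical fine-tuned RobBERT
)
print(classifier("Dit boek is fantastisch geschreven!"))
```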