BERTweet: A pre-trained language model for English Tweets

We present BERTweet, the first public large-scale pre-trained language model for English Tweets. BERTweet is trained using the RoBERTa pre-training procedure (Liu et al., 2019), with the same model configuration as BERT-base (Devlin et al., 2019)...
