CodeBERT: A Pre-Trained Model for Programming and Natural Languages

We present CodeBERT, a bimodal pre-trained model for programming language (PL) and natural language (NL). CodeBERT learns general-purpose representations that support downstream NL-PL applications such as natural language code search and code documentation generation. We develop CodeBERT with a Transformer-based neural architecture and train it with a hybrid objective function that incorporates the pre-training task of replaced token detection, which is to detect plausible alternatives sampled from generators. This enables us to utilize both bimodal data of NL-PL pairs and unimodal data, where the former provides input tokens for model training while the latter helps to learn better generators. We evaluate CodeBERT on two NL-PL applications by fine-tuning model parameters. Results show that CodeBERT achieves state-of-the-art performance on both natural language code search and code documentation generation. Furthermore, to investigate what type of knowledge is learned in CodeBERT, we construct a dataset for NL-PL probing and evaluate in a zero-shot setting where parameters of pre-trained models are fixed. Results show that CodeBERT performs better than previous pre-trained models on NL-PL probing.
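The replaced token detection objective mentioned above can be illustrated with a toy sketch: a generator proposes plausible substitutes at masked positions, and the discriminator is trained to label each token of the corrupted sequence as original or replaced. The helper names and the trivial random generator below are illustrative assumptions, not the paper's implementation.

```python
# Toy sketch of replaced-token-detection (RTD) training-example construction.
# `make_rtd_example` and the stand-in generator are hypothetical names.
import random

def make_rtd_example(tokens, mask_positions, generator):
    """Corrupt `tokens` at `mask_positions` with generator samples.

    Returns (corrupted_tokens, labels), where labels[i] == 1 iff the
    token at position i was replaced by the generator.
    """
    corrupted = list(tokens)
    labels = [0] * len(tokens)
    for i in mask_positions:
        sampled = generator(tokens, i)   # plausible alternative for position i
        if sampled != tokens[i]:         # sampling the original counts as "not replaced"
            corrupted[i] = sampled
            labels[i] = 1
    return corrupted, labels

# Trivial stand-in generator: sample from a tiny vocabulary (deterministic per position).
vocab = ["def", "return", "sum", "print", "x", "y"]
def gen(toks, i):
    return random.Random(i).choice(vocab)

tokens = ["def", "add", "(", "x", ",", "y", ")", ":", "return", "x", "+", "y"]
corrupted, labels = make_rtd_example(tokens, [1, 8], gen)
```

In the full model, a language-model generator (trained with masked language modeling on the unimodal data) replaces the random sampler here, and the discriminator — CodeBERT — predicts the 0/1 labels for every position.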

Published in Findings of EMNLP 2020.
Code Search (CodeSearchNet, model: CodeBERT)

Language   Metric Value   Global Rank
Overall    76.0           #4
Go         69.3           #4
Ruby       70.6           #4
Python     84.0           #4
Java       86.8           #3
JS         74.8           #3
PHP        70.6           #4
Code Documentation Generation (CodeSearchNet, Smoothed BLEU-4)

Model                     BLEU-4   Global Rank
CodeBERT (MLM+RTD)        15.99    #1
CodeBERT (MLM)            15.55    #2
pre-train w/ code only    15.15    #3
CodeBERT (RTD)            15.03    #4
RoBERTa                   14.52    #5
Transformer               14.31    #6
seq2seq                   13.36    #7
Code Documentation Generation by language (CodeSearchNet, Smoothed BLEU-4; global rank in parentheses, "-" = not listed)

Model                     Go           Java         JavaScript   PHP          Python       Ruby
seq2seq                   23.48 (#6)   11.42 (#8)    6.88 (#7)   18.40 (#7)   13.04 (#7)   6.96 (#7)
Transformer               -            12.57 (#7)   25.61 (#1)   18.25 (#8)   13.44 (#6)   7.87 (#4)
RoBERTa                   26.09 (#4)   13.20 (#4)    5.72 (#8)   19.90 (#6)   14.92 (#5)   7.26 (#6)
pre-train w/ code only    26.39 (#3)   13.07 (#5)    8.30 (#6)   20.71 (#4)   15.05 (#4)   7.39 (#5)
CodeBERT (RTD)            26.02 (#5)   12.72 (#6)    8.73 (#4)   20.25 (#5)   -            -
CodeBERT (MLM)            26.79 (#1)   13.59 (#3)    8.51 (#5)   21.00 (#3)   15.48 (#2)   7.95 (#3)
CodeBERT (MLM+RTD)        26.66 (#2)   14.56 (#2)    9.54 (#3)   21.32 (#2)   15.41 (#3)   8.46 (#2)
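The Smoothed BLEU-4 scores above are sentence-level BLEU in which missing higher-order n-gram matches are smoothed so they do not zero out the whole score. A minimal sketch follows, assuming add-one smoothing in the style of Lin and Och (2004); the benchmark's actual evaluation script may smooth differently, so treat this as an illustration of the metric rather than a reimplementation.

```python
# Minimal sentence-level smoothed BLEU-4 (single reference, add-one smoothing).
import math
from collections import Counter

def ngrams(tokens, n):
    """Multiset of n-grams of `tokens`."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def smoothed_bleu4(candidate, reference):
    cand, ref = candidate.split(), reference.split()
    log_prec = 0.0
    for n in range(1, 5):
        cand_ngrams = ngrams(cand, n)
        overlap = sum((cand_ngrams & ngrams(ref, n)).values())  # clipped matches
        total = max(sum(cand_ngrams.values()), 1)
        # add-one smoothing keeps the log defined when an order has no match
        log_prec += math.log((overlap + 1) / (total + 1)) / 4
    # brevity penalty: penalize candidates shorter than the reference
    bp = 1.0 if len(cand) >= len(ref) else math.exp(1 - len(ref) / max(len(cand), 1))
    return bp * math.exp(log_prec)
```

With this smoothing, an exact match scores 1.0 and an unrelated candidate still receives a small positive score, which is why short generated docstrings remain comparable on BLEU-4.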
Type Prediction (ManyTypes4TypeScript, model: CodeBERT)

Metric              Value   Global Rank
Average Accuracy    61.72   #2
Average Precision   59.34   #2
Average Recall      59.80   #2
Average F1          59.57   #2
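The averaged precision/recall/F1 figures for type prediction can be reproduced in spirit by averaging per-type scores. Whether the leaderboard uses macro or weighted averaging is an assumption here; the sketch below shows plain macro averaging over predicted type labels.

```python
# Sketch: macro-averaged precision, recall, and F1 over type labels.
# The macro averaging scheme is an assumption, not the benchmark's spec.
from collections import Counter

def macro_prf(gold, pred):
    labels = set(gold) | set(pred)
    tp, fp, fn = Counter(), Counter(), Counter()
    for g, p in zip(gold, pred):
        if g == p:
            tp[g] += 1
        else:
            fp[p] += 1   # predicted label p where it was wrong
            fn[g] += 1   # missed the true label g
    precs, recs, f1s = [], [], []
    for lab in labels:
        prec = tp[lab] / (tp[lab] + fp[lab]) if tp[lab] + fp[lab] else 0.0
        rec = tp[lab] / (tp[lab] + fn[lab]) if tp[lab] + fn[lab] else 0.0
        f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
        precs.append(prec)
        recs.append(rec)
        f1s.append(f1)
    n = len(labels)
    return sum(precs) / n, sum(recs) / n, sum(f1s) / n

gold = ["number", "string", "number", "boolean"]
pred = ["number", "string", "string", "boolean"]
p, r, f = macro_prf(gold, pred)
```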
