Table Search Using a Deep Contextualized Language Model

19 May 2020 · Zhiyu Chen, Mohamed Trabelsi, Jeff Heflin, Yinan Xu, Brian D. Davison

Pretrained contextualized language models such as BERT have achieved impressive results on various natural language processing benchmarks. Benefiting from multiple pretraining tasks and large-scale training corpora, pretrained models can capture complex syntactic word relations...
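As an illustration of the general setting the paper addresses (ad hoc table retrieval with a pretrained contextualized model), the sketch below scores a query against a flattened table using a BERT cross-encoder from the Hugging Face `transformers` library. This is not the authors' pipeline: the model name, the table linearization strategy, and the single-logit scoring head are assumptions, and the head would need fine-tuning on labeled query-table pairs before the scores are meaningful.

```python
# Minimal sketch (not the paper's exact method): score a query against a
# linearized table with a pretrained BERT cross-encoder.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
# num_labels=1 gives a single relevance logit; this head is randomly initialized
# and must be fine-tuned on query-table relevance labels before use.
model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=1)
model.eval()

def linearize_table(caption, headers, rows):
    """Flatten a table into one text sequence (one simple strategy of many)."""
    cells = " ".join(" ".join(str(c) for c in row) for row in rows)
    return f"{caption} {' '.join(headers)} {cells}"

def score(query, table_text):
    """Relevance score for a (query, table) pair from the [CLS]-based head."""
    inputs = tokenizer(query, table_text, truncation=True,
                       max_length=512, return_tensors="pt")
    with torch.no_grad():
        return model(**inputs).logits.squeeze().item()

# Hypothetical example table and query.
table = linearize_table(
    caption="List of tallest buildings",
    headers=["Name", "City", "Height (m)"],
    rows=[["Burj Khalifa", "Dubai", 828], ["Shanghai Tower", "Shanghai", 632]],
)
print(score("tallest building in the world", table))
```

At inference time, one would score the query against every candidate table (or a pre-filtered subset) and rank tables by the resulting scores.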
