# Semi-Supervised Sequence Modeling with Cross-View Training

Kevin Clark, Minh-Thang Luong, Christopher D. Manning, Quoc V. Le

Unsupervised representation learning algorithms such as word2vec and ELMo improve the accuracy of many supervised NLP models, mainly because they can take advantage of large amounts of unlabeled text. However, the supervised models only learn from task-specific labeled data during the main training phase...

