Building Language Models for Text with Named Entities

Text in many domains involves a significant number of named entities. Predicting entity names is often challenging for a language model because they appear infrequently in the training corpus. In this paper, we propose a novel and effective approach to building a discriminative language model that can learn entity names by leveraging their entity type information. We also introduce two benchmark datasets, based on recipes and Java programming code, on which we evaluate the proposed model. Experimental results show that our model achieves 52.2% better perplexity in recipe generation and 22.06% better perplexity in code generation than state-of-the-art language models.
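The abstract's key observation is that entity names are rare while their types are frequent. As an illustration only (not the paper's released code), the sketch below shows the kind of preprocessing such an approach can rely on: replacing entity mentions with type placeholders so the language model learns over frequent type tokens. The type lexicon and example sentence are invented for the demo.

```python
# Minimal sketch (assumed, not the paper's code): map rare entity names in a
# recipe to their entity types so a language model can learn over type tokens.

# Hypothetical type lexicon; the paper derives types from its own datasets.
ENTITY_TYPES = {
    "basil": "HERB",
    "mozzarella": "CHEESE",
    "penne": "PASTA",
}

def typify(text: str, lexicon: dict) -> str:
    """Replace each known entity mention with its type placeholder."""
    tokens = text.lower().split()
    return " ".join(lexicon.get(tok, tok) for tok in tokens)

sentence = "toss the penne with mozzarella and basil"
print(typify(sentence, ENTITY_TYPES))
# -> "toss the PASTA with CHEESE and HERB"
```

A type-level language model trained on such sequences then only needs a second step that chooses a concrete entity name for each type slot.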

ACL 2018

Datasets


Two benchmark datasets are introduced in this paper: a recipe corpus (Now You're Cooking!) and a Java code corpus (Android Repos).

Results from the Paper


Task              | Dataset             | Model             | Metric     | Value | Global Rank
Code Generation   | Android Repos       | Entity Type Model | Perplexity | 2.65  | #1
Recipe Generation | Now You're Cooking! | Entity Type Model | Perplexity | 9.67  | #2
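Perplexity, the metric reported above, is the exponential of the average negative log-likelihood per token. The snippet below is a generic illustration of that computation; the token probabilities are placeholders, not values from the paper.

```python
# Generic perplexity computation: exp of the mean negative log-likelihood
# per token. The probabilities below are placeholders for illustration only.
import math

def perplexity(token_probs):
    """token_probs: model probabilities assigned to each reference token."""
    nll = -sum(math.log(p) for p in token_probs) / len(token_probs)
    return math.exp(nll)

print(perplexity([0.4, 0.25, 0.5, 0.3]))  # lower is better
```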

Methods


Entity Type Model: a discriminative language model that leverages entity type information to predict entity names (see the abstract and the results above).
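As a rough sketch of what such a model might look like (an assumed architecture, not the paper's implementation), the following PyTorch module pairs a standard LSTM language model over a vocabulary containing type placeholders with a small head that fills each type slot with a concrete entity name.

```python
# Illustrative sketch only (assumed architecture, not the paper's): an LSTM
# language model over a vocabulary that contains entity-type placeholders,
# plus a head that picks a concrete entity name for a type slot.
import torch
import torch.nn as nn

class TypeAwareLM(nn.Module):
    def __init__(self, vocab_size, n_entities, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        self.next_token = nn.Linear(hid_dim, vocab_size)   # predicts words/types
        self.entity_fill = nn.Linear(hid_dim, n_entities)  # fills a type slot

    def forward(self, token_ids):
        hidden, _ = self.lstm(self.embed(token_ids))
        return self.next_token(hidden), self.entity_fill(hidden)

# Usage with toy sizes; real vocabularies come from the recipe/code corpora.
model = TypeAwareLM(vocab_size=1000, n_entities=200)
logits_tokens, logits_entities = model(torch.randint(0, 1000, (2, 10)))
print(logits_tokens.shape, logits_entities.shape)
# torch.Size([2, 10, 1000]) torch.Size([2, 10, 200])
```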