Search Results for author: Ji-Ping Wang

Found 3 papers, 2 papers with code

Using Large Corpus N-gram Statistics to Improve Recurrent Neural Language Models

no code implementations • NAACL 2019 • Yiben Yang, Ji-Ping Wang, Doug Downey

Recurrent neural network language models (RNNLMs) form a valuable foundation for many NLP systems, but training the models can be computationally expensive and may take days on a large corpus.
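
The title refers to n-gram statistics collected over a large corpus. As an illustration only (the snippet does not describe the paper's actual method), a minimal Python sketch of gathering such counts and deriving a maximum-likelihood bigram probability might look like this:

```python
from collections import Counter
from itertools import islice

def ngram_counts(tokens, n):
    """Count all n-grams of order n in a token sequence."""
    return Counter(zip(*(islice(tokens, i, None) for i in range(n))))

# Toy corpus for illustration; a real setting would stream a large corpus.
corpus = "the cat sat on the mat the cat slept".split()
unigrams = ngram_counts(corpus, 1)
bigrams = ngram_counts(corpus, 2)

def bigram_prob(w1, w2):
    """Maximum-likelihood estimate of P(w2 | w1) from the raw counts."""
    return bigrams[(w1, w2)] / unigrams[(w1,)]

print(bigram_prob("the", "cat"))  # 2/3 in this toy corpus
```

Statistics like these are cheap to compute even at corpus scale, in contrast to the expensive RNNLM training described above.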
