no code implementations • EMNLP 2016 • Youssef Oualil, Mittul Singh, Clayton Greenberg, Dietrich Klakow
The goal of language modeling techniques is to capture the statistical and structural properties of natural languages from training corpora.
no code implementations • 23 Mar 2017 • Youssef Oualil, Clayton Greenberg, Mittul Singh, Dietrich Klakow
Feedforward Neural Network (FNN)-based language models estimate the probability of the next word from the history of the last N words, whereas Recurrent Neural Network (RNN)-based models perform the same task using only the last word together with context information that cycles through the network.
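The conditioning difference described above can be sketched in a toy example. This is not the authors' implementation; all layer sizes, weights, and function names here are illustrative assumptions. The FNN model consumes a fixed-size window of the last N word embeddings, while the RNN consumes one word at a time and carries context forward in a hidden state.

```python
import numpy as np

rng = np.random.default_rng(0)
V, D, H, N = 10, 4, 8, 3   # vocab size, embedding dim, hidden dim, FNN window (toy values)

E = rng.normal(size=(V, D))            # shared word-embedding table

def softmax(z):
    z = z - z.max()                    # numerical stability
    e = np.exp(z)
    return e / e.sum()

# --- FNN LM: p(w_t | w_{t-N}, ..., w_{t-1}) from a concatenated window ---
W_fnn = rng.normal(size=(N * D, H))
U_fnn = rng.normal(size=(H, V))

def fnn_step(history):
    """history: word ids; only the last N are used (fixed-size input)."""
    x = np.concatenate([E[w] for w in history[-N:]])
    h = np.tanh(x @ W_fnn)
    return softmax(h @ U_fnn)

# --- RNN LM: p(w_t | w_{t-1}, h_{t-1}); context cycles in the hidden state ---
W_x = rng.normal(size=(D, H))
W_h = rng.normal(size=(H, H))
U_rnn = rng.normal(size=(H, V))

def rnn_step(word, h_prev):
    """One recurrence: consumes only the last word plus the previous state."""
    h = np.tanh(E[word] @ W_x + h_prev @ W_h)
    return softmax(h @ U_rnn), h

sentence = [1, 5, 2, 7]
p_fnn = fnn_step(sentence)             # conditions on the last N = 3 words

h = np.zeros(H)
for w in sentence:                     # earlier words persist only through h
    p_rnn, h = rnn_step(w, h)
```

Both `p_fnn` and `p_rnn` are valid next-word distributions over the vocabulary; the structural difference is where the history lives — explicitly in the FNN's input window versus implicitly in the RNN's recurrent state.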
no code implementations • COLING 2016 • Mittul Singh, Clayton Greenberg, Youssef Oualil, Dietrich Klakow
We augmented pre-trained word embeddings with these novel embeddings and evaluated them on a rare-word similarity task, obtaining up to a threefold improvement in correlation over the original set of embeddings.