Search Results for author: Youssef Oualil

Found 8 papers, 1 paper with code

Connecting and Comparing Language Model Interpolation Techniques

no code implementations · 26 Aug 2019 · Ernest Pusateri, Christophe Van Gysel, Rami Botros, Sameer Badaskar, Mirko Hannemann, Youssef Oualil, Ilya Oparin

In this work, we uncover a theoretical connection between two language model interpolation techniques, count merging and Bayesian interpolation.

Language Modelling
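Count merging and Bayesian interpolation are the specific weighting schemes the paper connects; as background, the plain linear interpolation both generalize can be sketched as follows (function and variable names are illustrative, not from the paper):

```python
def interpolate(lm_probs, weights):
    """Linearly interpolate word probabilities from several language models.

    lm_probs: list of dicts mapping word -> probability under each model.
    weights: mixture weights, assumed to be non-negative and sum to 1.
    """
    vocab = set().union(*lm_probs)
    return {w: sum(lam * p.get(w, 0.0) for lam, p in zip(weights, lm_probs))
            for w in vocab}

# Toy unigram models with made-up probabilities, for illustration only.
p1 = {"the": 0.6, "cat": 0.4}
p2 = {"the": 0.5, "dog": 0.5}
mix = interpolate([p1, p2], [0.7, 0.3])
```

Because the weights sum to 1 and each component distribution sums to 1, the mixture is again a valid distribution over the union vocabulary.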

A Neural Network Approach for Mixing Language Models

no code implementations · 23 Aug 2017 · Youssef Oualil, Dietrich Klakow

The performance of Neural Network (NN)-based language models is steadily improving due to the emergence of new architectures, which are able to learn different natural language characteristics.

Text Compression

Long-Short Range Context Neural Networks for Language Modeling

no code implementations · EMNLP 2016 · Youssef Oualil, Mittul Singh, Clayton Greenberg, Dietrich Klakow

The goal of language modeling techniques is to capture the statistical and structural properties of natural languages from training corpora.

Language Modelling · Text Compression

A Batch Noise Contrastive Estimation Approach for Training Large Vocabulary Language Models

1 code implementation · 20 Aug 2017 · Youssef Oualil, Dietrich Klakow

Training large vocabulary Neural Network Language Models (NNLMs) is a difficult task due to the explicit requirement of the output layer normalization, which typically involves the evaluation of the full softmax function over the complete vocabulary.

Text Compression
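The contrast the abstract draws can be sketched in a few lines: exact softmax normalization sums over the whole vocabulary, while Noise Contrastive Estimation (NCE) replaces it with binary classification against sampled noise. This is a generic per-example NCE sketch with made-up scores, not the paper's batch variant (which additionally shares noise samples across a training batch):

```python
import math
import random

def full_softmax_logprob(scores, target):
    # Exact normalization: sums over the whole vocabulary, O(|V|) per word.
    log_z = math.log(sum(math.exp(s) for s in scores))
    return scores[target] - log_z

def nce_loss(scores, target, noise_prob, k, rng):
    # NCE: classify the true word against k noise samples; no full softmax.
    def log_sigmoid(x):
        return -math.log1p(math.exp(-x))
    def logit(w):
        # Treat scores[w] as an unnormalized log-probability.
        return scores[w] - math.log(k * noise_prob)
    noise = [rng.randrange(len(scores)) for _ in range(k)]
    return -(log_sigmoid(logit(target)) +
             sum(log_sigmoid(-logit(w)) for w in noise))

scores = [1.0, 2.0, 3.0, 0.5]  # toy unnormalized output-layer scores
lp = full_softmax_logprob(scores, target=2)
loss = nce_loss(scores, target=2, noise_prob=0.25,
                k=2, rng=random.Random(0))
```

The point of the sketch: `nce_loss` touches only the target and `k` sampled rows, so its cost is independent of vocabulary size.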

Sequential Recurrent Neural Networks for Language Modeling

no code implementations · 23 Mar 2017 · Youssef Oualil, Clayton Greenberg, Mittul Singh, Dietrich Klakow

Feedforward Neural Network (FNN)-based language models estimate the probability of the next word based on the history of the last N words, whereas Recurrent Neural Networks (RNNs) perform the same task based only on the last word and some context information that cycles in the network.

Language Modelling · Text Compression
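The two conditioning schemes the abstract contrasts can be illustrated with a toy sketch; the scalar state and the weights here are made up for illustration (real models use learned vectors and matrices):

```python
import math

def fnn_context(history, n):
    # FNN LM: the model conditions on a fixed window of the last n words.
    return tuple(history[-n:])

def rnn_state(history, w_h=0.5, w_x=0.5):
    # RNN LM: the entire history is compressed into one recurrent state
    # that cycles through the network (a scalar here for illustration).
    h = 0.0
    for x in history:
        h = math.tanh(w_h * h + w_x * x)
    return h

window = fnn_context([4, 7, 1, 9], n=2)  # only the last two words survive
state = rnn_state([4, 7, 1, 9])          # all four words influence the state
```

The window discards everything beyond the last `n` words, while the recurrent state is a (bounded) function of the full history — the long-range/short-range trade-off the paper's sequential architecture targets.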

Sub-Word Similarity based Search for Embeddings: Inducing Rare-Word Embeddings for Word Similarity Tasks and Language Modelling

no code implementations · COLING 2016 · Mittul Singh, Clayton Greenberg, Youssef Oualil, Dietrich Klakow

We augmented pre-trained word embeddings with these novel embeddings and evaluated on a rare-word similarity task, obtaining up to a threefold improvement in correlation over the original set of embeddings.

Language Modelling · Morphological Analysis +2
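One simple way to induce an embedding for a rare word from sub-word similarity, in the spirit of the title, is to average the embeddings of known words that share character n-grams with it. The n-gram size, padding symbols, and plain averaging below are my assumptions for illustration, not the paper's exact search procedure:

```python
def char_ngrams(word, n=3):
    # Character n-grams with boundary markers, e.g. "<ca", "cat", "at>".
    padded = f"<{word}>"
    return {padded[i:i + n] for i in range(len(padded) - n + 1)}

def induce_embedding(rare_word, known_embeddings, n=3):
    # Average the vectors of known words sharing at least one n-gram
    # with the rare word (assumed scheme, for illustration).
    target = char_ngrams(rare_word, n)
    neighbors = [vec for w, vec in known_embeddings.items()
                 if target & char_ngrams(w, n)]
    if not neighbors:
        return None
    dim = len(neighbors[0])
    return [sum(v[i] for v in neighbors) / len(neighbors)
            for i in range(dim)]

# "stalking" shares n-grams with both known words, so its induced
# vector is their average; toy 2-d vectors, illustrative only.
rare_vec = induce_embedding(
    "stalking", {"walking": [1.0, 0.0], "talking": [0.0, 1.0]})
```

The induced vector can then be added alongside the pre-trained embeddings, which is the kind of augmentation the abstract evaluates on rare-word similarity.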
