Search Results for author: Oren Melamud

Found 16 papers, 1 paper with code

Combining Unsupervised Pre-training and Annotator Rationales to Improve Low-shot Text Classification

no code implementations · IJCNLP 2019 · Oren Melamud, Mihaela Bornea, Ken Barker

In this work, we combine these two approaches to improve low-shot text classification with two novel methods: a simple bag-of-words embedding approach; and a more complex context-aware method, based on the BERT model.

General Classification · Text Classification · +2
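The "simple bag-of-words embedding approach" the abstract mentions can be sketched as averaging pre-trained word vectors over a document. This is a minimal illustration under assumptions: the `word_vectors` dict, the plain averaging, and the function name are hypothetical, not the paper's code.

```python
import numpy as np

def bow_embedding(tokens, word_vectors, dim):
    # Average the vectors of a document's in-vocabulary words
    # (hypothetical sketch of a bag-of-words document embedding).
    vecs = [word_vectors[t] for t in tokens if t in word_vectors]
    if not vecs:
        return np.zeros(dim)  # document has no known words
    return np.mean(vecs, axis=0)

# Toy usage: two known words plus one out-of-vocabulary token
word_vectors = {"good": np.array([1.0, 0.0]), "bad": np.array([0.0, 1.0])}
doc_vec = bow_embedding(["good", "bad", "unk"], word_vectors, dim=2)
```

In a low-shot setting, such document vectors could then be compared against per-class centroids built from the few labeled examples.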

Self-Normalization Properties of Language Modeling

no code implementations · COLING 2018 · Jacob Goldberger, Oren Melamud

Self-normalizing discriminative models approximate the normalized probability of a class without having to compute the partition function.

Language Modelling
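The self-normalization idea in the abstract above can be illustrated with a toy comparison: an exact softmax log-probability requires summing over the whole vocabulary (the partition function Z), while a self-normalized model reads the raw logit as the log-probability. The sketch below is an assumption-laden illustration of that shortcut, not the paper's training procedure.

```python
import numpy as np

def normalized_logprob(logits, idx):
    # Exact log-probability: needs the full partition function Z.
    return logits[idx] - np.log(np.exp(logits).sum())

def self_normalized_logprob(logits, idx):
    # A self-normalizing model skips Z and treats the raw logit as the
    # log-probability; this is accurate only if training has pushed
    # log Z toward zero (hypothetical sketch).
    return logits[idx]
```

When the logits already sum to one in probability space (log Z = 0), the two agree exactly; otherwise the shortcut is off by log Z.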

A Simple Language Model based on PMI Matrix Approximations

no code implementations · EMNLP 2017 · Oren Melamud, Ido Dagan, Jacob Goldberger

Specifically, we show that with minor modifications to word2vec's algorithm, we get principled language models that are closely related to the well-established Noise Contrastive Estimation (NCE) based language models.

Language Modelling · Word Embeddings
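For context on the PMI connection in the abstract above: word2vec's skip-gram with negative sampling is known to implicitly factorize a word-context PMI matrix. The sketch below builds such a matrix from a co-occurrence table; the zeroing of unseen pairs is a common convention and, like the function itself, an assumption rather than this paper's method.

```python
import numpy as np

def pmi_matrix(counts):
    # PMI(w, c) = log[ P(w, c) / (P(w) P(c)) ] from a co-occurrence
    # table counts[w, c] (hypothetical sketch).
    p_wc = counts / counts.sum()
    p_w = p_wc.sum(axis=1, keepdims=True)   # word marginals
    p_c = p_wc.sum(axis=0, keepdims=True)   # context marginals
    with np.errstate(divide="ignore", invalid="ignore"):
        pmi = np.log(p_wc / (p_w * p_c))
    pmi[counts == 0] = 0.0  # common convention for unseen pairs
    return pmi
```

If words and contexts co-occur independently, every PMI entry is zero; systematic co-occurrence produces positive entries.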

Information-Theory Interpretation of the Skip-Gram Negative-Sampling Objective Function

no code implementations · ACL 2017 · Oren Melamud, Jacob Goldberger

In this paper we define a measure of dependency between two random variables, based on the Jensen-Shannon (JS) divergence between their joint distribution and the product of their marginal distributions.

Dependency Parsing · Entity Extraction using GAN · +1
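The dependency measure described in the abstract above can be sketched directly: take the JS divergence between a joint distribution and the product of its marginals. The base-2 (bit-valued) form and the discrete-table representation below are assumptions for illustration, not taken from the paper.

```python
import numpy as np

def _kl(p, q):
    # KL divergence in bits, skipping zero-probability cells of p.
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

def js_dependency(joint):
    # JS divergence between joint[i, j] and the product of its
    # marginals (hypothetical sketch of the dependency measure).
    prod = joint.sum(axis=1, keepdims=True) * joint.sum(axis=0, keepdims=True)
    m = 0.5 * (joint + prod)
    return 0.5 * _kl(joint, m) + 0.5 * _kl(prod, m)
```

The measure is zero exactly when the two variables are independent (the joint equals the product of marginals) and grows with dependence, bounded above by 1 bit.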

PMI Matrix Approximations with Applications to Neural Language Modeling

no code implementations · 5 Sep 2016 · Oren Melamud, Ido Dagan, Jacob Goldberger

The obtained language model is closely related to NCE language models but is based on a simplified objective function.

Language Modelling

The Negochat Corpus of Human-agent Negotiation Dialogues

no code implementations · LREC 2016 · Vasily Konovalov, Ron Artstein, Oren Melamud, Ido Dagan

In this work, we introduce an annotated natural language human-agent dialogue corpus in the negotiation domain.

Natural Language Understanding

The Role of Context Types and Dimensionality in Learning Word Embeddings

no code implementations · NAACL 2016 · Oren Melamud, David McClosky, Siddharth Patwardhan, Mohit Bansal

We provide the first extensive evaluation of how using different types of context to learn skip-gram word embeddings affects performance on a wide range of intrinsic and extrinsic NLP tasks.

Learning Word Embeddings
