Search Results for author: Mike Gartrell

Found 18 papers, 7 papers with code

Unifying GANs and Score-Based Diffusion as Generative Particle Models

1 code implementation NeurIPS 2023 Jean-Yves Franceschi, Mike Gartrell, Ludovic Dos Santos, Thibaut Issenhuth, Emmanuel de Bézenac, Mickaël Chen, Alain Rakotomamonjy

Particle-based deep generative models, such as gradient flows and score-based diffusion models, have recently gained traction thanks to their striking performance.

Scalable MCMC Sampling for Nonsymmetric Determinantal Point Processes

1 code implementation 1 Jul 2022 Insu Han, Mike Gartrell, Elvis Dohmatob, Amin Karbasi

In this work, we develop a scalable MCMC sampling algorithm for $k$-NDPPs with low-rank kernels, thus enabling runtime that is sublinear in $n$.

Point Processes
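The abstract above concerns MCMC sampling for size-constrained DPPs. As a hedged illustration of the general idea only (a basic Metropolis swap chain for a symmetric $k$-DPP, not the paper's sublinear nonsymmetric algorithm), each step proposes exchanging one in-set item for one out-of-set item and accepts with the determinant ratio:

```python
import numpy as np

def kdpp_swap_step(L, S, rng):
    """One Metropolis swap step targeting P(S) proportional to det(L_S)
    over size-k subsets. Proposes replacing a uniformly chosen item in S
    with a uniformly chosen item outside S, accepting with probability
    min(1, det(L_S') / det(L_S)). Assumes L is symmetric PSD."""
    n = L.shape[0]
    S = list(S)
    i = rng.integers(len(S))                        # position to swap out
    outside = [j for j in range(n) if j not in S]
    S_new = S.copy()
    S_new[i] = outside[rng.integers(len(outside))]  # candidate item to swap in
    d_old = np.linalg.det(L[np.ix_(S, S)])
    d_new = np.linalg.det(L[np.ix_(S_new, S_new)])
    if d_old <= 0 or rng.random() < min(1.0, d_new / d_old):
        return S_new
    return S
```

Running many such steps yields approximate samples from the $k$-DPP; the paper's contribution is making analogous chains tractable (sublinear in $n$) for nonsymmetric low-rank kernels.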

Combining Reward and Rank Signals for Slate Recommendation

no code implementations 26 Jul 2021 Imad Aouali, Sergey Ivanov, Mike Gartrell, David Rohde, Flavian Vasile, Victor Zaytsev, Diego Legrand

In this paper, we formulate several Bayesian models that incorporate the reward signal (Reward model), the rank signal (Rank model), or both (Full model), for non-personalized slate recommendation.

Recommendation Systems

Wasserstein Learning of Determinantal Point Processes

no code implementations NeurIPS Workshop LMCA 2020 Lucas Anquetil, Mike Gartrell, Alain Rakotomamonjy, Ugo Tanielian, Clément Calauzènes

Through an evaluation on a real-world dataset, we show that our Wasserstein learning approach provides significantly improved predictive performance on a generative task compared to DPPs trained using MLE.

Point Processes

Scalable Learning and MAP Inference for Nonsymmetric Determinantal Point Processes

2 code implementations ICLR 2021 Mike Gartrell, Insu Han, Elvis Dohmatob, Jennifer Gillenwater, Victor-Emmanuel Brunel

Determinantal point processes (DPPs) have attracted significant attention in machine learning for their ability to model subsets drawn from a large item collection.

Point Processes

Embedding models for recommendation under contextual constraints

no code implementations 21 Jun 2019 Syrine Krichene, Mike Gartrell, Clement Calauzenes

For example, applying constraints a posteriori can result in incomplete recommendations or low-quality results for the tail of the distribution (i.e., less popular items).

Recommendation Systems, Retrieval

Learning Nonsymmetric Determinantal Point Processes

1 code implementation NeurIPS 2019 Mike Gartrell, Victor-Emmanuel Brunel, Elvis Dohmatob, Syrine Krichene

Our method imposes a particular decomposition of the nonsymmetric kernel that enables such tractable learning algorithms, which we analyze both theoretically and experimentally.

Information Retrieval, Point Processes +2

Partially Mutual Exclusive Softmax for Positive and Unlabeled data

no code implementations ICLR 2019 Ugo Tanielian, Flavian Vasile, Mike Gartrell

This is often the case for applications such as language modeling, next event prediction and matrix factorization, where many of the potential outcomes are not mutually exclusive, but are more likely to be independent conditionally on the state.

Language Modelling

GEVR: An Event Venue Recommendation System for Groups of Mobile Users

no code implementations 25 Mar 2019 Jason Shuo Zhang, Mike Gartrell, Richard Han, Qin Lv, Shivakant Mishra

In this paper, we present GEVR, the first Group Event Venue Recommendation system that incorporates mobility via individual location traces and context information into a "social-based" group decision model to provide venue recommendations for groups of mobile users.

Deep Determinantal Point Processes

no code implementations 17 Nov 2018 Mike Gartrell, Elvis Dohmatob, Jon Alberdi

While DPPs have substantial expressive power, they are fundamentally limited by the parameterization of the kernel matrix and their inability to capture nonlinear interactions between items within sets.

Point Processes

Multi-Task Determinantal Point Processes for Recommendation

no code implementations 24 May 2018 Romain Warlop, Jérémie Mary, Mike Gartrell

Determinantal point processes (DPPs) have received significant attention in recent years as an elegant model for a variety of machine learning tasks, thanks to their ability to model both set diversity and item quality or popularity.

General Classification, Multi-class Classification +2

Adversarial Training of Word2Vec for Basket Completion

no code implementations 22 May 2018 Ugo Tanielian, Mike Gartrell, Flavian Vasile

In recent years, the Word2Vec model trained with the Negative Sampling loss function has shown state-of-the-art results in a number of machine learning tasks, including language modeling tasks such as word analogy and word similarity, and, through Prod2Vec, an extension that models user shopping activity and preferences, recommendation tasks.

Language Modelling, Word Similarity

Learning Determinantal Point Processes by Corrective Negative Sampling

no code implementations 15 Feb 2018 Zelda Mariet, Mike Gartrell, Suvrit Sra

To address this issue, which reduces the quality of the learned model, we introduce a novel optimization problem, Contrastive Estimation (CE), which encodes information about "negative" samples into the basic learning model.

Language Modelling, Point Processes

The Bayesian Low-Rank Determinantal Point Process Mixture Model

no code implementations 15 Aug 2016 Mike Gartrell, Ulrich Paquet, Noam Koenigstein

Determinantal point processes (DPPs) are an elegant model for encoding probabilities over subsets, such as shopping baskets, of a ground set, such as an item catalog.

Point Processes, Product Recommendation

Low-Rank Factorization of Determinantal Point Processes for Recommendation

1 code implementation 17 Feb 2016 Mike Gartrell, Ulrich Paquet, Noam Koenigstein

In this work we present a new method for learning the DPP kernel from observed data using a low-rank factorization of this kernel.

Point Processes, Product Recommendation
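The abstract above describes learning a low-rank factorization $L = V V^T$ of the DPP kernel. One concrete payoff of low rank (a sketch of the standard determinant-lemma trick, not the paper's code) is that the likelihood normalizer $\det(L + I)$ can be computed from a $K \times K$ matrix rather than the full $n \times n$ kernel:

```python
import numpy as np

def dpp_log_normalizer(V):
    """log det(V V^T + I_n) computed as log det(V^T V + I_K).

    For n items with a rank-K factor V (shape n x K), this costs
    O(n K^2) instead of the O(n^3) needed for the full n x n
    determinant, by the matrix determinant lemma."""
    K = V.shape[1]
    _, logdet = np.linalg.slogdet(V.T @ V + np.eye(K))
    return logdet
```

Since item catalogs can have millions of items while $K$ is small, this identity is what makes maximum-likelihood training of low-rank DPPs practical.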
