Search Results for author: Olivier Chapelle

Found 8 papers, 3 papers with code

Field-aware Factorization Machines in a Real-world Online Advertising System

4 code implementations • 15 Jan 2017 • Yuchin Juan, Damien Lefortier, Olivier Chapelle

Predicting user response is one of the core machine learning tasks in computational advertising.
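The field-aware factorization machine named in the title models pairwise feature interactions, where each feature keeps a separate latent vector per field of the partner feature. A minimal sketch of the interaction term (the data layout and names here are illustrative, not the paper's implementation):

```python
import numpy as np

def ffm_score(x, V):
    """Field-aware factorization machine interaction term.

    x: list of (field, feature, value) triples for one example.
    V: latent vectors indexed as V[feature][other_field] -> k-dim array.
    """
    score = 0.0
    for i in range(len(x)):
        for j in range(i + 1, len(x)):
            f1, j1, v1 = x[i]
            f2, j2, v2 = x[j]
            # feature j1 uses its vector for field f2, and vice versa
            score += np.dot(V[j1][f2], V[j2][f1]) * v1 * v2
    return score
```

In a CTR setting this score would be passed through a sigmoid to give a click probability; the field-aware indexing is what distinguishes FFMs from plain factorization machines, which keep one latent vector per feature.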

Cost-sensitive Learning for Utility Optimization in Online Advertising Auctions

no code implementations • 11 Mar 2016 • Flavian Vasile, Damien Lefortier, Olivier Chapelle

One of the most challenging problems in computational advertising is the prediction of click-through and conversion rates for bidding in online advertising auctions.

An Empirical Evaluation of Thompson Sampling

no code implementations • NeurIPS 2011 • Olivier Chapelle, Lihong Li

Thompson sampling is one of the oldest heuristics for addressing the exploration/exploitation trade-off, yet it is surprisingly unpopular in the literature.
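The heuristic itself is simple: maintain a posterior over each arm's reward, sample from each posterior, and pull the arm with the highest sample. A minimal Beta-Bernoulli sketch (function and variable names are illustrative):

```python
import random

def thompson_sampling(successes, failures, n_rounds, true_probs):
    """Beta-Bernoulli Thompson sampling.

    Each round, sample a win-rate estimate for every arm from its
    Beta(successes + 1, failures + 1) posterior, then pull the arm
    with the highest sampled estimate and update its counts.
    """
    for _ in range(n_rounds):
        samples = [random.betavariate(s + 1, f + 1)
                   for s, f in zip(successes, failures)]
        arm = samples.index(max(samples))
        if random.random() < true_probs[arm]:  # simulated reward
            successes[arm] += 1
        else:
            failures[arm] += 1
    return successes, failures
```

Because arms with uncertain posteriors occasionally produce high samples, the algorithm keeps exploring, while play concentrates on the empirically best arm as its posterior sharpens.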

A Reliable Effective Terascale Linear Learning System

2 code implementations • 19 Oct 2011 • Alekh Agarwal, Olivier Chapelle, Miroslav Dudik, John Langford

We present a system and a set of techniques for learning linear predictors with convex losses on terascale datasets with trillions of features (the number of features here refers to the number of non-zero entries in the data matrix).

Improved Preconditioner for Hessian Free Optimization

1 code implementation • NIPS 2010 • Olivier Chapelle, Dumitru Erhan

One of the critical components in that algorithm is the choice of the preconditioner.

Large Margin Taxonomy Embedding for Document Categorization

no code implementations • NeurIPS 2008 • Kilian Q. Weinberger, Olivier Chapelle

The optimization of the semantic space incorporates large margin constraints that ensure that for each instance the correct class prototype is closer than any other.
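The large-margin condition described in the snippet can be written as a set of distance constraints (notation here is illustrative: $\phi$ is the learned embedding of instance $x_i$, $p_c$ the prototype of class $c$, and $y_i$ the correct class):

```latex
\| \phi(x_i) - p_{y_i} \|^2 + 1 \le \| \phi(x_i) - p_c \|^2 \qquad \forall\, c \ne y_i
```

That is, the correct class prototype must be closer to the embedded instance than any other prototype by a unit margin.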


A General Boosting Method and its Application to Learning Ranking Functions for Web Search

no code implementations • NeurIPS 2007 • Zhaohui Zheng, Hongyuan Zha, Tong Zhang, Olivier Chapelle, Keke Chen, Gordon Sun

We present a general boosting method extending functional gradient boosting to optimize complex loss functions that are encountered in many machine learning problems.
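Functional gradient boosting builds an additive model by repeatedly fitting a weak learner to the negative gradient of the loss at the current predictions. A minimal squared-loss sketch with decision stumps (a generic illustration of the technique, not the paper's method):

```python
import numpy as np

def fit_stump(x, r):
    """Best single-threshold stump on 1-D inputs minimizing squared error."""
    best_err, best_stump = np.inf, None
    for t in np.unique(x):
        left, right = r[x <= t], r[x > t]
        if len(left) == 0 or len(right) == 0:
            continue
        err = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if err < best_err:
            best_err, best_stump = err, (t, left.mean(), right.mean())
    return best_stump

def gradient_boost(x, y, n_rounds=50, lr=0.1):
    """Functional gradient boosting for squared loss: each round fits a
    stump to the residuals (the negative gradient of the loss with
    respect to the current predictions) and adds a damped step."""
    pred = np.full_like(y, y.mean(), dtype=float)
    for _ in range(n_rounds):
        residual = y - pred  # negative gradient of squared loss
        t, left_val, right_val = fit_stump(x, residual)
        pred += lr * np.where(x <= t, left_val, right_val)
    return pred
```

For complex losses such as ranking objectives, only the residual computation changes: the weak learner is fit to whatever the negative gradient of the chosen loss happens to be.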
