no code implementations • NeurIPS 2021 • Reda Ouhamma, Odalric Maillard, Vianney Perchet
We consider the problem of online linear regression in the stochastic setting.
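The abstract does not spell out the algorithm, so as illustration only, here is a minimal sketch of sequential ridge regression, a standard baseline for stochastic online linear regression: at each round the learner predicts with the current regularized least-squares estimate, then updates it with the revealed pair (x_t, y_t). The function name and the choice of ridge regularization are assumptions, not the paper's method.

```python
import numpy as np

def online_ridge(stream, d, lam=1.0):
    """Sequential ridge regression (illustrative baseline, not the paper's
    algorithm): predict with the current estimate, then update with (x_t, y_t).

    `stream` yields (x_t, y_t) pairs in dimension d; `lam` is the ridge weight.
    Returns the list of per-round predictions.
    """
    A = lam * np.eye(d)          # regularized Gram matrix: sum x x^T + lam I
    b = np.zeros(d)              # running sum of y * x
    preds = []
    for x, y in stream:
        theta = np.linalg.solve(A, b)   # current ridge estimate
        preds.append(theta @ x)         # predict before observing y
        A += np.outer(x, x)             # then incorporate the new sample
        b += y * x
    return preds
```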
no code implementations • NeurIPS 2010 • Mohammad Ghavamzadeh, Alessandro Lazaric, Odalric Maillard, Rémi Munos
We provide a thorough theoretical analysis of the LSTD with random projections and derive performance bounds for the resulting algorithm.
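The setting above can be sketched in a few lines: compress the feature vectors with a random matrix, then solve the usual LSTD linear system in the compressed space. The Gaussian projection and the function signature below are illustrative assumptions; the paper's analysis, not reproduced here, is what bounds the resulting error.

```python
import numpy as np

def lstd_random_projection(Phi, Phi_next, rewards, m, gamma=0.95, seed=0):
    """LSTD on randomly projected features (illustrative sketch).

    Phi, Phi_next: (T, N) feature matrices for states s_t and s_{t+1};
    m: projected dimension. The Gaussian projection is an assumption —
    any Johnson-Lindenstrauss-style random matrix plays the same role.
    """
    rng = np.random.default_rng(seed)
    N = Phi.shape[1]
    P = rng.standard_normal((N, m)) / np.sqrt(m)   # random projection
    Psi, Psi_next = Phi @ P, Phi_next @ P          # compressed features
    A = Psi.T @ (Psi - gamma * Psi_next)           # LSTD system A w = b
    b = Psi.T @ rewards
    w = np.linalg.lstsq(A, b, rcond=None)[0]
    return P, w      # value estimate: V(s) ≈ (phi(s) @ P) @ w
```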
no code implementations • NeurIPS 2010 • Odalric Maillard, Rémi Munos
We consider least-squares regression using a randomly generated subspace G_P ⊂ F of finite dimension P, where F is a function space of infinite dimension, e.g. L_2([0, 1]^d).
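A concrete way to instantiate this setting is least squares over a span of P randomly drawn basis functions. The cosine features below (random frequencies and phases) are one illustrative choice of random subspace of L_2([0, 1]); the paper's construction may differ.

```python
import numpy as np

def random_subspace_regression(x_train, y_train, P=100, lam=1e-3, seed=0):
    """Least squares in a P-dimensional random subspace (illustrative).

    Basis functions cos(w_p * x + b_p) with random (w_p, b_p) span the
    subspace; `lam` adds a small ridge term for numerical stability.
    Returns the fitted function as a callable.
    """
    rng = np.random.default_rng(seed)
    w = rng.normal(0.0, 10.0, P)            # random frequencies
    b = rng.uniform(0.0, 2 * np.pi, P)      # random phases

    def features(x):
        return np.cos(np.outer(x, w) + b)   # (n, P) design matrix

    G = features(x_train)
    alpha = np.linalg.solve(G.T @ G + lam * np.eye(P), G.T @ y_train)
    return lambda x: features(x) @ alpha
```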
no code implementations • NeurIPS 2009 • Odalric Maillard, Rémi Munos
We consider the problem of learning a regression function from K input samples, in a function space of high dimension N, using projections onto a random subspace of lower dimension M. For any linear approximation algorithm based on empirical risk minimization (possibly penalized), we bound the excess risk of the estimate computed in the projected subspace (the compressed domain) in terms of the excess risk of the estimate built in the high-dimensional space (the initial domain).
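The compressed-domain estimate described above can be sketched as follows: project the N-dimensional inputs through a random N × M matrix, then run ordinary (here lightly ridge-penalized) least squares on the M-dimensional compressed features. The Gaussian projection and small ridge term are illustrative assumptions.

```python
import numpy as np

def compressed_least_squares(X, y, M, lam=1e-6, seed=0):
    """Empirical risk minimization in the compressed domain (sketch).

    X: (K, N) inputs; the random N x M Gaussian matrix is an assumed
    JL-style projection. Returns the projection and the compressed-domain
    estimate; predict on new x with (x @ Proj) @ theta.
    """
    rng = np.random.default_rng(seed)
    N = X.shape[1]
    Proj = rng.standard_normal((N, M)) / np.sqrt(M)   # random projection
    Z = X @ Proj                                      # compressed features
    theta = np.linalg.solve(Z.T @ Z + lam * np.eye(M), Z.T @ y)
    return Proj, theta
```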