Search Results for author: Dávid Pál

Found 9 papers, 1 paper with code

Parameter-free Stochastic Optimization of Variationally Coherent Functions

no code implementations • 30 Jan 2021 • Francesco Orabona, Dávid Pál

We design and analyze an algorithm for first-order stochastic optimization of a large class of functions on $\mathbb{R}^d$.

Stochastic Optimization

Bandit Multiclass Linear Classification: Efficient Algorithms for the Separable Case

no code implementations • 6 Feb 2019 • Alina Beygelzimer, Dávid Pál, Balázs Szörényi, Devanathan Thiruvenkatachari, Chen-Yu Wei, Chicheng Zhang

Under the more challenging weak linear separability condition, we design an efficient algorithm with a mistake bound of $\min (2^{\widetilde{O}(K \log^2 (1/\gamma))}, 2^{\widetilde{O}(\sqrt{1/\gamma} \log K)})$.

Classification • General Classification

Adaptive Feature Selection: Computationally Efficient Online Sparse Linear Regression under RIP

no code implementations • ICML 2017 • Satyen Kale, Zohar Karnin, Tengyuan Liang, Dávid Pál

Online sparse linear regression is an online problem where an algorithm repeatedly chooses a subset of coordinates to observe in an adversarially chosen feature vector, makes a real-valued prediction, receives the true label, and incurs the squared loss.

Feature Selection • Regression
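The interaction protocol described in this abstract can be made concrete with a short loop. The sketch below is purely illustrative; the `learner` and `env` interfaces (`choose_coordinates`, `predict`, `update`, `next_example`) are hypothetical names, not from the paper.

```python
def online_sparse_regression(learner, env, rounds, k):
    """Illustrative interaction loop for online sparse linear regression.

    Each round the learner picks at most k coordinates of an adversarially
    chosen feature vector to observe, makes a real-valued prediction, then
    receives the true label and incurs the squared loss. The `learner` and
    `env` interfaces are hypothetical, not from the paper.
    """
    total_loss = 0.0
    for _ in range(rounds):
        x, y = env.next_example()             # adversary fixes (x_t, y_t); both stay hidden
        S = learner.choose_coordinates(k)     # subset of at most k coordinate indices
        observed = {i: x[i] for i in S}       # learner sees only these entries of x_t
        y_hat = learner.predict(observed)     # real-valued prediction
        loss = (y_hat - y) ** 2               # squared loss
        learner.update(observed, y)           # true label is revealed after predicting
        total_loss += loss
    return total_loss
```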

Coin Betting and Parameter-Free Online Learning

1 code implementation • NeurIPS 2016 • Francesco Orabona, Dávid Pál

We present a new intuitive framework to design parameter-free algorithms for \emph{both} online linear optimization over Hilbert spaces and for learning with expert advice, based on reductions to betting on outcomes of adversarial coins.
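As a flavor of the betting reduction, the sketch below runs a Krichevsky-Trofimov-style coin-betting strategy for one-dimensional online linear optimization with gradients in [-1, 1]. It is a minimal sketch of the general idea rather than the paper's exact algorithms, and the initial wealth `eps` is an arbitrary choice.

```python
def kt_coin_betting(gradients, eps=1.0):
    """Minimal sketch of coin betting for one-dimensional online linear optimization.

    The negative gradient c_t in [-1, 1] is treated as a coin outcome, a signed
    Krichevsky-Trofimov fraction of the current wealth is bet on it, and the bet
    itself is played as the iterate x_t. Illustration only, not the paper's
    exact algorithms; eps is an arbitrary initial endowment.
    """
    wealth = eps
    coin_sum = 0.0                    # running sum of past coin outcomes
    iterates = []
    for t, g in enumerate(gradients, start=1):
        beta = coin_sum / t           # KT betting fraction, always in (-1, 1)
        x = beta * wealth             # signed bet, played as the iterate
        iterates.append(x)
        c = -g                        # coin outcome revealed after playing, assumed in [-1, 1]
        wealth += c * x               # wealth update
        coin_sum += c
    return iterates
```

Note that no step size or bound on the comparator is tuned anywhere in this loop; roughly, lower bounds on the final wealth translate into regret guarantees, which is the sense in which such betting schemes are parameter-free.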

Scale-Free Online Learning

no code implementations • 8 Jan 2016 • Francesco Orabona, Dávid Pál

We design and analyze algorithms for online linear optimization that have optimal regret and at the same time do not need to know any upper or lower bounds on the norm of the loss vectors.
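One simple way to obtain scale-free behavior is to divide the step size by the square root of the accumulated squared norms of the loss vectors, so that multiplying every loss by a constant leaves the played points unchanged and scales the regret proportionally. The sketch below illustrates this principle with unconstrained online gradient descent; it is not the specific algorithms analyzed in the paper.

```python
import math

def scale_free_ogd(loss_grads, dim):
    """Illustrative scale-free online gradient descent (not the paper's algorithms).

    The step size at round t is 1 / sqrt(sum of squared norms of the loss
    vectors seen so far), so rescaling every loss vector by c > 0 leaves the
    played points unchanged, scales the regret by c, and requires no prior
    upper or lower bound on the loss norms.
    """
    x = [0.0] * dim
    sq_norm_sum = 0.0
    plays = []
    for g in loss_grads:                     # g is the loss vector revealed at round t
        plays.append(list(x))                # play the current point, then observe g
        sq_norm_sum += sum(gi * gi for gi in g)
        if sq_norm_sum > 0:
            eta = 1.0 / math.sqrt(sq_norm_sum)
            x = [xi - eta * gi for xi, gi in zip(x, g)]
    return plays
```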

Hardness of Online Sleeping Combinatorial Optimization Problems

no code implementations • NeurIPS 2016 • Satyen Kale, Chansoo Lee, Dávid Pál

We show that several online combinatorial optimization problems that admit efficient no-regret algorithms become computationally hard in the sleeping setting where a subset of actions becomes unavailable in each round.

Combinatorial Optimization • PAC learning

Improved Algorithms for Linear Stochastic Bandits

no code implementations • NeurIPS 2011 • Yasin Abbasi-Yadkori, Dávid Pál, Csaba Szepesvári

We improve the theoretical analysis and empirical performance of algorithms for the stochastic multi-armed bandit problem and the linear stochastic multi-armed bandit problem.
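For context, a standard optimism-based template for the linear stochastic bandit maintains a regularized least-squares estimate and plays the arm with the highest upper confidence bound. The sketch below shows that generic template with an unspecified confidence width `beta`; it does not reproduce the paper's improved confidence sets or analysis.

```python
import numpy as np

def linucb_step(V, b, arms, beta=1.0):
    """One round of a generic LinUCB-style rule for linear stochastic bandits.

    V is the regularized design matrix (e.g., lambda * I plus the sum of x x^T
    over past pulls), b the sum of reward-weighted arm vectors, and arms a list
    of candidate feature vectors. beta is a confidence width left unspecified
    here; choosing it well is part of what the paper's analysis addresses and
    is not reproduced in this sketch.
    """
    theta_hat = np.linalg.solve(V, b)        # regularized least-squares estimate
    V_inv = np.linalg.inv(V)
    best_arm, best_ucb = None, -np.inf
    for x in arms:
        x = np.asarray(x, dtype=float)
        ucb = x @ theta_hat + beta * np.sqrt(x @ V_inv @ x)   # optimistic value
        if ucb > best_ucb:
            best_arm, best_ucb = x, ucb
    return best_arm

# After pulling arm x and observing reward r, the statistics are updated as:
#   V += np.outer(x, x);  b += r * x
```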

Estimation of Rényi Entropy and Mutual Information Based on Generalized Nearest-Neighbor Graphs

no code implementations • NeurIPS 2010 • Dávid Pál, Barnabás Póczos, Csaba Szepesvári

We present simple and computationally efficient nonparametric estimators of Rényi entropy and mutual information based on an i.i.d. sample.
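A rough flavor of nearest-neighbor-graph estimation: sum powers of nearest-neighbor distances in the sample and apply a log-transform. The sketch below follows that generic recipe with a normalizing constant `gamma` left as an input; it is an illustration of the idea only, not the paper's generalized nearest-neighbor-graph estimator.

```python
import numpy as np

def knn_renyi_entropy(X, alpha, gamma, k=1):
    """Rough sketch of a nearest-neighbor-based Rényi entropy estimate (alpha != 1).

    Sums the p-th powers of distances to the k-th nearest neighbor with
    p = d * (1 - alpha) and applies a log-transform. gamma is a normalizing
    constant that is NOT derived here; illustration only, not the paper's
    estimator.
    """
    X = np.asarray(X, dtype=float)
    n, d = X.shape
    p = d * (1.0 - alpha)
    # pairwise Euclidean distances with self-distances masked out
    diffs = X[:, None, :] - X[None, :, :]
    dists = np.sqrt((diffs ** 2).sum(axis=-1))
    np.fill_diagonal(dists, np.inf)
    knn = np.sort(dists, axis=1)[:, k - 1]   # distance to each point's k-th nearest neighbor
    stat = np.sum(knn ** p)                  # nearest-neighbor graph statistic
    return np.log(stat / (gamma * n ** (1.0 - p / d))) / (1.0 - alpha)
```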
