Search Results for author: Neha Gupta

Found 5 papers, 0 papers with code

Estimating decision tree learnability with polylogarithmic sample complexity

no code implementations NeurIPS 2020 Guy Blanc, Neha Gupta, Jane Lange, Li-Yang Tan

We show that top-down decision tree learning heuristics are amenable to highly efficient learnability estimation: for monotone target functions, the error of the decision tree hypothesis constructed by these heuristics can be estimated with polylogarithmically many labeled examples, exponentially fewer than the number necessary to run these heuristics, and indeed exponentially fewer than the information-theoretic minimum required to learn a good decision tree.
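The generic principle behind such estimation is that the error of a *fixed* hypothesis can be estimated to within $\epsilon$ from only $O(\log(1/\delta)/\epsilon^2)$ labeled examples (Hoeffding's bound), independent of how expensive the hypothesis was to construct. A minimal sketch of that textbook estimator (the names `estimate_error` and `h` are illustrative, not from the paper, whose contribution is estimating the error without even building the tree):

```python
import numpy as np

def estimate_error(h, X, y):
    """Empirical 0/1 error of hypothesis h on a labeled sample.

    By Hoeffding's inequality this is within eps of the true error with
    probability 1 - delta once the sample has m = O(log(1/delta) / eps^2)
    points -- independent of the complexity of h itself.
    """
    preds = np.array([h(x) for x in X])
    return float(np.mean(preds != y))
```

For example, a constant-0 hypothesis on a balanced sample has empirical error 0.5.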

Universal guarantees for decision tree induction via a higher-order splitting criterion

no code implementations NeurIPS 2020 Guy Blanc, Neha Gupta, Jane Lange, Li-Yang Tan

We propose a simple extension of top-down decision tree learning heuristics such as ID3, C4.5, and CART.
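For context, the heuristics being extended all share the same top-down template: greedily pick the split that maximizes some purity criterion. A minimal sketch of the standard information-gain rule used by ID3 on binary features (this illustrates the baseline template, not the paper's higher-order criterion; `best_split` and `entropy` are illustrative names):

```python
import numpy as np

def entropy(y):
    # Empirical binary entropy of a 0/1 label vector.
    p = float(np.mean(y))
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def best_split(X, y):
    """Return the binary feature index maximizing information gain, and the gain."""
    base = entropy(y)
    best_j, best_gain = None, 0.0
    for j in range(X.shape[1]):
        mask = X[:, j] == 1
        if mask.all() or not mask.any():
            continue  # split puts every point on one side: no information
        gain = base - (mask.mean() * entropy(y[mask])
                       + (1 - mask.mean()) * entropy(y[~mask]))
        if gain > best_gain:
            best_j, best_gain = j, gain
    return best_j, best_gain
```

A feature that perfectly predicts the label achieves the maximum gain of 1 bit.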

Active Local Learning

no code implementations 31 Aug 2020 Arturs Backurs, Avrim Blum, Neha Gupta

In particular, the number of label queries should be independent of the complexity of $H$, and the function $h$ should be well-defined, independent of $x$.

Implicit regularization for deep neural networks driven by an Ornstein-Uhlenbeck like process

no code implementations 19 Apr 2019 Guy Blanc, Neha Gupta, Gregory Valiant, Paul Valiant

We characterize the behavior of the training dynamics near any parameter vector that achieves zero training error, in terms of an implicit regularization term: the sum, over the data points, of the squared $\ell_2$ norm of the gradient of the model with respect to the parameter vector, evaluated at each data point.
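Concretely, the regularizer described above is $R(\theta) = \sum_i \lVert \nabla_\theta f(\theta, x_i) \rVert_2^2$. A minimal sketch computing it by central finite differences for a toy model (the `tanh` model and all names here are illustrative assumptions, not the paper's architecture):

```python
import numpy as np

def model(theta, x):
    # Hypothetical toy model for illustration (not from the paper).
    return np.tanh(theta @ x)

def implicit_reg(theta, X, eps=1e-6):
    """Sum over data points of ||grad_theta f(theta, x)||^2, via central differences."""
    total = 0.0
    for x in X:
        g = np.zeros_like(theta)
        for j in range(len(theta)):
            tp = theta.copy(); tp[j] += eps
            tm = theta.copy(); tm[j] -= eps
            g[j] = (model(tp, x) - model(tm, x)) / (2 * eps)
        total += g @ g
    return total
```

At $\theta = 0$ the tanh has unit slope, so $R$ reduces to $\sum_i \lVert x_i \rVert_2^2$, which gives a quick sanity check.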

Exploiting Numerical Sparsity for Efficient Learning: Faster Eigenvector Computation and Regression

no code implementations NeurIPS 2018 Neha Gupta, Aaron Sidford

This running time improves upon the previous best unaccelerated running time of $\tilde{O}(nd + n L d / \mu)$.
