Search Results for author: James L. Sharpnack

Found 2 papers, 0 papers with code

Higher-Order Total Variation Classes on Grids: Minimax Theory and Trend Filtering Methods

NeurIPS 2017 · no code implementations · Veeranjaneyulu Sadhanala, Yu-Xiang Wang, James L. Sharpnack, Ryan J. Tibshirani

To move past this, we define two new higher-order TV classes, based on two ways of compiling the discrete derivatives of a parameter across the nodes.
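As a rough illustration of penalizing discrete derivatives across grid nodes, the sketch below builds the first-difference (incidence) operator of a 2-d grid graph and forms a first-order TV penalty plus one common higher-order variant using the graph Laplacian. This follows the standard graph trend filtering convention and is an assumption for illustration, not necessarily the paper's exact class definitions; all function names here are hypothetical.

```python
import numpy as np

def grid_diff_operator(n1, n2):
    """First-difference (incidence) operator D of the n1 x n2 grid graph.
    Each row corresponds to an edge; (D @ theta)[e] = theta[j] - theta[i]."""
    idx = np.arange(n1 * n2).reshape(n1, n2)
    edges = []
    for i in range(n1):
        for j in range(n2):
            if i + 1 < n1:                      # vertical neighbor
                edges.append((idx[i, j], idx[i + 1, j]))
            if j + 1 < n2:                      # horizontal neighbor
                edges.append((idx[i, j], idx[i, j + 1]))
    D = np.zeros((len(edges), n1 * n2))
    for e, (a, b) in enumerate(edges):
        D[e, a] = -1.0
        D[e, b] = 1.0
    return D

n1 = n2 = 8
D = grid_diff_operator(n1, n2)
L = D.T @ D                              # graph Laplacian of the grid
theta = np.linspace(0.0, 1.0, n1 * n2)   # a smooth parameter over the nodes

tv1 = np.abs(D @ theta).sum()            # first-order TV: |differences across edges|
tv2 = np.abs(L @ theta).sum()            # a second-order variant: |Laplacian at each node|
```

Higher orders follow by alternating applications of D and L; the penalty then measures smoothness of successively higher discrete derivatives of the parameter.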

A Sharp Error Analysis for the Fused Lasso, with Application to Approximate Changepoint Screening

NeurIPS 2017 · no code implementations · Kevin Lin, James L. Sharpnack, Alessandro Rinaldo, Ryan J. Tibshirani

In the 1-dimensional multiple changepoint detection problem, we derive a new fast error rate for the fused lasso estimator, under the assumption that the mean vector has a sparse number of changepoints.
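To make the setting concrete, here is a minimal sketch of the 1-d fused lasso and a simple screening step that reads changepoints off the fitted piecewise-constant estimate. The solver is a generic ADMM routine (a standard optimization method, not the paper's contribution, which is theoretical), and the function name and tuning values are illustrative assumptions.

```python
import numpy as np

def fused_lasso_admm(y, lam, rho=1.0, n_iter=200):
    """Solve the 1-d fused lasso via ADMM (a standard generic solver):
    minimize 0.5 * ||y - theta||^2 + lam * sum_i |theta[i+1] - theta[i]|."""
    n = len(y)
    D = np.diff(np.eye(n), axis=0)          # first-difference operator, (n-1) x n
    A_inv = np.linalg.inv(np.eye(n) + rho * D.T @ D)
    z = np.zeros(n - 1)
    u = np.zeros(n - 1)
    for _ in range(n_iter):
        theta = A_inv @ (y + rho * D.T @ (z - u))       # quadratic theta-update
        v = D @ theta + u
        z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)  # soft-threshold
        u += D @ theta - z                              # dual update
    return theta

rng = np.random.default_rng(0)
# Sparse changepoints: a mean vector with a single jump at index 50
mean = np.concatenate([np.zeros(50), 2.0 * np.ones(50)])
y = mean + 0.3 * rng.standard_normal(100)

theta_hat = fused_lasso_admm(y, lam=1.0)
# Approximate changepoint screening: keep locations with large fitted jumps
jumps = np.where(np.abs(np.diff(theta_hat)) > 0.5)[0]
```

Because the mean has few changepoints, the fused lasso estimate is nearly piecewise constant, and thresholding its first differences recovers the jump location up to a small error, which is the regime the paper's fast error rate addresses.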
