Search Results for author: Clayton D. Scott

Found 8 papers, 2 papers with code

On Classification-Calibration of Gamma-Phi Losses

no code implementations • 14 Feb 2023 • Yutong Wang, Clayton D. Scott

Gamma-Phi losses constitute a family of multiclass classification loss functions that generalize the logistic and other common losses, and have found application in the boosting literature.

Classification
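
A minimal sketch of the Gamma-Phi construction, assuming losses of the form L(y, v) = gamma(sum_{j != y} phi(v_y - v_j)); the function and value choices below are illustrative only. Taking gamma(t) = log(1 + t) and phi(s) = exp(-s) recovers the multiclass logistic (cross-entropy) loss:

    import numpy as np

    def gamma_phi_loss(v, y, gamma, phi):
        # Gamma-Phi loss: gamma applied to the sum of phi over the
        # margins v[y] - v[j] for all classes j != y.
        margins = v[y] - np.delete(v, y)
        return gamma(np.sum(phi(margins)))

    v = np.array([2.0, 0.5, -1.0])  # score vector, true class y = 0
    loss = gamma_phi_loss(v, 0, gamma=np.log1p, phi=lambda s: np.exp(-s))
    cross_entropy = -np.log(np.exp(v[0]) / np.exp(v).sum())
    assert np.isclose(loss, cross_entropy)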

Consistent Interpolating Ensembles via the Manifold-Hilbert Kernel

no code implementations • 19 May 2022 • Yutong Wang, Clayton D. Scott

Recent research in the theory of overparametrized learning has sought to establish generalization guarantees in the interpolating regime.

VC dimension of partially quantized neural networks in the overparametrized regime

1 code implementation • ICLR 2022 • Yutong Wang, Clayton D. Scott

Indeed, existing applications of VC theory to large networks obtain upper bounds on VC dimension that are proportional to the number of weights, and for a large class of networks, these upper bounds are known to be tight.
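
To make "proportional to the number of weights" concrete: for piecewise-linear (e.g. ReLU) networks with W weights and L layers, the nearly-tight bounds of Bartlett et al. (2019) take roughly the form

    \mathrm{VCdim} = O(W L \log W), \qquad \mathrm{VCdim} = \Omega\big(W L \log(W/L)\big),

so once W exceeds the sample size, as in the overparametrized regime, bounds of this shape say nothing useful about generalization.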

An Exact Solver for the Weston-Watkins SVM Subproblem

1 code implementation • 10 Feb 2021 • Yutong Wang, Clayton D. Scott

Recent empirical evidence suggests that the Weston-Watkins support vector machine is among the best performing multiclass extensions of the binary SVM.

Weston-Watkins Hinge Loss and Ordered Partitions

no code implementations • NeurIPS 2020 • Yutong Wang, Clayton D. Scott

A recent empirical comparison of nine such formulations [Doğan et al. 2016] recommends the variant proposed by Weston and Watkins (WW), despite the fact that the WW-hinge loss is not calibrated with respect to the 0-1 loss.
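
For reference, a minimal sketch of the WW hinge loss in one standard form, L(y, v) = sum_{j != y} max(0, 1 - (v_y - v_j)); the test values are illustrative:

    import numpy as np

    def ww_hinge_loss(v, y):
        # Weston-Watkins hinge: one hinge term per wrong class,
        # penalizing pairwise margins v[y] - v[j] smaller than 1.
        margins = v[y] - np.delete(v, y)
        return np.maximum(0.0, 1.0 - margins).sum()

    v = np.array([1.0, 0.2, -0.5])
    print(ww_hinge_loss(v, y=0))  # 0.2: only class 1 violates the unit margin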

An Operator Theoretic Approach to Nonparametric Mixture Models

no code implementations • 30 Jun 2016 • Robert A. Vandermeulen, Clayton D. Scott

In this work, we make no distributional assumptions on the mixture components and instead assume that observations from the mixture model are grouped, such that observations in the same group are known to be drawn from the same mixture component.
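
A small sketch of this grouped sampling scheme may help; the Gaussian components below are purely illustrative, since the paper makes no distributional assumptions:

    import numpy as np

    rng = np.random.default_rng(0)

    def sample_grouped(n_groups, group_size, weights, components):
        # Each group first draws one latent mixture component; every
        # observation in the group is then drawn i.i.d. from it.
        groups = []
        for _ in range(n_groups):
            k = rng.choice(len(weights), p=weights)
            groups.append(components[k](group_size))
        return groups

    components = [lambda n: rng.normal(-2.0, 1.0, n),
                  lambda n: rng.normal(+2.0, 1.0, n)]
    data = sample_grouped(100, 5, weights=[0.3, 0.7], components=components)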

On The Identifiability of Mixture Models from Grouped Samples

no code implementations • 23 Feb 2015 • Robert A. Vandermeulen, Clayton D. Scott

In such models it is assumed that data are drawn from random probability measures, called mixture components, which are themselves drawn from a probability measure P over probability measures.
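
One way to formalize the grouped-sample setting (the notation here is an assumption, not necessarily the paper's): a group of size n is observable only through the n-fold product mixture

    V_n(P) = \int \mu^{\otimes n} \, dP(\mu),

and identifiability asks for which n the map P \mapsto V_n(P) is injective.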

Robust Kernel Density Estimation by Scaling and Projection in Hilbert Space

no code implementations • NeurIPS 2014 • Robert A. Vandermeulen, Clayton D. Scott

As with other estimators, a robust version of the kernel density estimator (KDE) is useful, since sample contamination is a common issue in real-world datasets.

Density Estimation
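
To illustrate the contamination issue, here is a vanilla Gaussian KDE (not the scaling-and-projection estimator proposed in the paper); the data and bandwidth are illustrative:

    import numpy as np

    def kde(x_query, sample, bandwidth):
        # Standard (non-robust) Gaussian kernel density estimate.
        z = (x_query[:, None] - sample[None, :]) / bandwidth
        return np.exp(-0.5 * z**2).mean(axis=1) / (bandwidth * np.sqrt(2 * np.pi))

    rng = np.random.default_rng(0)
    clean = rng.normal(0.0, 1.0, 450)
    outliers = rng.uniform(8.0, 12.0, 50)      # 10% contamination
    sample = np.concatenate([clean, outliers])
    xs = np.linspace(-4.0, 14.0, 200)
    density = kde(xs, sample, bandwidth=0.5)   # mass leaks to the outlier region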
