Search Results for author: Lingxiao Huang

Found 13 papers, 7 papers with code

Coresets for Clustering in Graphs of Bounded Treewidth

no code implementations ICML 2020 Daniel Baker, Vladimir Braverman, Lingxiao Huang, Shaofeng H. -C. Jiang, Robert Krauthgamer, Xuan Wu

We initiate the study of coresets for clustering in graph metrics, i.e., the shortest-path metric of edge-weighted graphs.

Data Visualization

Coresets for Time Series Clustering

no code implementations NeurIPS 2021 Lingxiao Huang, K. Sudhir, Nisheeth K. Vishnoi

In particular, we consider the setting where the time series data on $N$ entities is generated from a Gaussian mixture model with autocorrelations over $k$ clusters in $\mathbb{R}^d$.

Time Series, Time Series Clustering
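The generative setting above can be illustrated with a short sketch: each of $N$ entities is assigned one of $k$ clusters and then emits a length-$T$ series in $\mathbb{R}^d$ that follows an AR(1) process around its cluster mean. This is an illustrative data generator only, not the paper's code; the function name and parameters (`rho`, `sigma`) are our own.

```python
import numpy as np

def sample_autocorrelated_gmm(N, T, k, d, rho=0.5, sigma=1.0, seed=0):
    """Draw N entities, each a length-T time series in R^d: entity i is
    assigned a cluster, then follows an AR(1) process around that
    cluster's mean (illustrative sketch; parameters are assumptions)."""
    rng = np.random.default_rng(seed)
    means = rng.normal(0.0, 5.0, size=(k, d))      # cluster centers in R^d
    labels = rng.integers(0, k, size=N)            # cluster assignment per entity
    X = np.zeros((N, T, d))
    for i in range(N):
        mu = means[labels[i]]
        eps = rng.normal(0.0, sigma, size=(T, d))  # i.i.d. Gaussian noise
        X[i, 0] = mu + eps[0]
        for t in range(1, T):
            # AR(1): the deviation from the cluster mean decays with rho
            X[i, t] = mu + rho * (X[i, t - 1] - mu) + eps[t]
    return X, labels
```

A coreset for this setting would be a small weighted subset of the $N$ series whose clustering cost approximates that of the full data.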

Don't Just Divide; Polarize and Conquer!

1 code implementation 23 Feb 2021 Shivin Srivastava, Siddharth Bhatia, Lingxiao Huang, Lim Jun Heng, Kenji Kawaguchi, Vaibhav Rajan

In data containing heterogeneous subpopulations, classification performance benefits from incorporating the knowledge of cluster structure in the classifier.

Classification, General Classification

Revocable Deep Reinforcement Learning with Affinity Regularization for Outlier-Robust Graph Matching

no code implementations 16 Dec 2020 Chang Liu, Runzhong Wang, Zetian Jiang, Junchi Yan, Lingxiao Huang, Pinyan Lu

We propose a deep reinforcement learning (RL) based approach RGM for weighted graph matching, whose sequential node matching scheme naturally fits with the strategy for selective inlier matching against outliers, and supports seed graph matching.

Combinatorial Optimization, Computer Vision +3

Coresets for Regressions with Panel Data

1 code implementation NeurIPS 2020 Lingxiao Huang, K. Sudhir, Nisheeth K. Vishnoi

We first define coresets for several variants of regression problems with panel data and then present efficient algorithms to construct coresets whose size depends polynomially on $1/\varepsilon$ (where $\varepsilon$ is the error parameter) and on the number of regression parameters, independent of the number of individuals in the panel data or the number of time units each individual is observed for.

Fair Classification with Noisy Protected Attributes: A Framework with Provable Guarantees

1 code implementation 8 Jun 2020 L. Elisa Celis, Lingxiao Huang, Vijay Keswani, Nisheeth K. Vishnoi

We present an optimization framework for learning a fair classifier in the presence of noisy perturbations in the protected attributes.

Fairness, General Classification

Coresets for Clustering with Fairness Constraints

1 code implementation NeurIPS 2019 Lingxiao Huang, Shaofeng H. -C. Jiang, Nisheeth K. Vishnoi

Our approach is based on novel constructions of coresets: for the $k$-median objective, we construct an $\varepsilon$-coreset of size $O(\Gamma k^2 \varepsilon^{-d})$ where $\Gamma$ is the number of distinct collections of groups that a point may belong to, and for the $k$-means objective, we show how to construct an $\varepsilon$-coreset of size $O(\Gamma k^3\varepsilon^{-d-1})$.
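The parameter $\Gamma$ in the bounds above counts the distinct collections of groups that a point may belong to. A minimal sketch of how $\Gamma$ could be computed from group memberships (the attribute names and values here are hypothetical, for illustration only):

```python
# Gamma: the number of distinct collections of groups a point may belong to.
# Points carry (hypothetical) group memberships across sensitive attributes.
points = [
    {"gender": "F", "age": "young"},
    {"gender": "F", "age": "young"},
    {"gender": "M", "age": "young"},
    {"gender": "F", "age": "old"},
]
# Each point's "collection of groups" is the set of groups it belongs to;
# Gamma is the number of distinct such sets across the data.
collections = {frozenset(p.items()) for p in points}
gamma = len(collections)
print(gamma)  # 3 distinct collections in this toy example
```

With disjoint groups $\Gamma$ is just the number of groups; it grows only when points can belong to overlapping combinations of groups.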


Stable and Fair Classification

1 code implementation 21 Feb 2019 Lingxiao Huang, Nisheeth K. Vishnoi

Theoretically, we prove a stability guarantee, that was lacking in fair classification algorithms, and also provide an accuracy guarantee for our extended framework.

Classification, Decision Making +2

Classification with Fairness Constraints: A Meta-Algorithm with Provable Guarantees

4 code implementations 15 Jun 2018 L. Elisa Celis, Lingxiao Huang, Vijay Keswani, Nisheeth K. Vishnoi

The main contribution of this paper is a new meta-algorithm for classification that takes as input a large class of fairness constraints, with respect to multiple non-disjoint sensitive attributes, and which comes with provable guarantees.

Classification, Fairness +1

Multiwinner Voting with Fairness Constraints

1 code implementation 27 Oct 2017 L. Elisa Celis, Lingxiao Huang, Nisheeth K. Vishnoi

Multiwinner voting rules are used to select a small representative subset of candidates or items from a larger set given the preferences of voters.


SVM via Saddle Point Optimization: New Bounds and Distributed Algorithms

no code implementations 20 May 2017 Yifei Jin, Lingxiao Huang, Jian Li

Our algorithms achieve $(1-\epsilon)$-approximations with running time $\tilde{O}(nd+n\sqrt{d / \epsilon})$ for both variants, where $n$ is the number of points and $d$ is the dimensionality.
