Search Results for author: Chudi Zhong

Found 9 papers, 8 papers with code

OKRidge: Scalable Optimal k-Sparse Ridge Regression

1 code implementation • NeurIPS 2023 • Jiachang Liu, Sam Rosen, Chudi Zhong, Cynthia Rudin

We consider an important problem in scientific discovery, namely identifying sparse governing equations for nonlinear dynamical systems.

regression
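
For context, the optimization problem behind the title can be written in standard notation (this is the generic k-sparse ridge formulation, not necessarily the exact objective or scaling used in the paper):

\[
\min_{\beta \in \mathbb{R}^p} \ \tfrac{1}{2}\|y - X\beta\|_2^2 + \lambda\|\beta\|_2^2 \quad \text{s.t.} \quad \|\beta\|_0 \le k,
\]

where the \(\ell_0\) constraint caps the number of nonzero coefficients and the ridge term stabilizes the fit; OKRidge aims to solve this problem to optimality at scale.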

Exploring and Interacting with the Set of Good Sparse Generalized Additive Models

1 code implementation • NeurIPS 2023 • Chudi Zhong, Zhi Chen, Jiachang Liu, Margo Seltzer, Cynthia Rudin

In real applications, interaction between machine learning models and domain experts is critical; however, the classical machine learning paradigm that usually produces only a single model does not facilitate such interaction.

Additive models

FasterRisk: Fast and Accurate Interpretable Risk Scores

1 code implementation • 12 Oct 2022 • Jiachang Liu, Chudi Zhong, Boxuan Li, Margo Seltzer, Cynthia Rudin

Specifically, our approach produces a pool of almost-optimal sparse continuous solutions, each with a different support set, using a beam-search algorithm.
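
A minimal sketch of the beam-search idea described above, assuming binary 0/1 labels and scikit-learn's LogisticRegression (illustrative only; `beam_search_supports` and its parameters are hypothetical names, and the actual FasterRisk pipeline additionally rounds the continuous solutions to integer coefficients to obtain risk scores):

```python
# Toy beam search over support sets for sparse logistic regression.
# Illustrative sketch, not the FasterRisk algorithm itself.
import numpy as np
from sklearn.linear_model import LogisticRegression

def beam_search_supports(X, y, k, beam_width=10):
    """Return up to `beam_width` near-optimal supports of size k (y is 0/1)."""
    p = X.shape[1]
    beam = [frozenset()]                       # candidate support sets
    for _ in range(k):
        candidates = {s | {j} for s in beam for j in range(p) if j not in s}
        scored = []
        for support in candidates:
            cols = sorted(support)
            clf = LogisticRegression(max_iter=1000).fit(X[:, cols], y)
            proba = clf.predict_proba(X[:, cols])[:, 1]
            eps = 1e-12                        # avoid log(0)
            nll = -np.mean(y * np.log(proba + eps) + (1 - y) * np.log(1 - proba + eps))
            scored.append((nll, support))
        scored.sort(key=lambda t: t[0])        # keep the best-scoring supports
        beam = [s for _, s in scored[:beam_width]]
    return beam
```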

Exploring the Whole Rashomon Set of Sparse Decision Trees

2 code implementations • 16 Sep 2022 • Rui Xin, Chudi Zhong, Zhi Chen, Takuya Takagi, Margo Seltzer, Cynthia Rudin

We show three applications of the Rashomon set: 1) it can be used to study variable importance for the set of almost-optimal trees (as opposed to a single tree), 2) the Rashomon set for accuracy enables enumeration of the Rashomon sets for balanced accuracy and F1-score, and 3) the Rashomon set for a full dataset can be used to produce Rashomon sets constructed with only subsets of the data set.
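
For readers unfamiliar with the term, the Rashomon set referenced above is, roughly, the set of near-optimal models; using notation assumed here rather than taken from the paper, for a tolerance \(\epsilon \ge 0\) it is

\[
R(\epsilon) = \{\, t \in \mathcal{T} : \mathrm{Obj}(t) \le (1+\epsilon)\,\mathrm{Obj}(t^{*}) \,\},
\]

where \(\mathcal{T}\) is the space of sparse decision trees, \(\mathrm{Obj}\) is the (regularized) training objective, and \(t^{*}\) is an optimal tree.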

Fast Sparse Classification for Generalized Linear and Additive Models

2 code implementations • 23 Feb 2022 • Jiachang Liu, Chudi Zhong, Margo Seltzer, Cynthia Rudin

For fast sparse logistic regression, our computational speed-up over other best-subset search techniques comes from linear and quadratic surrogate cuts for the logistic loss, which let us efficiently screen features for elimination, and from a priority queue that favors a more uniform exploration of features.

Additive models • Classification
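
As a point of reference for the "quadratic surrogate" language above, the logistic loss admits a standard quadratic upper bound because each per-example term is 1/4-smooth in the margin: for labels \(y_i \in \{-1,+1\}\) and a single-coordinate step \(\beta \mapsto \beta + \delta e_j\),

\[
\mathcal{L}(\beta + \delta e_j) \le \mathcal{L}(\beta) + \delta\,\nabla_j \mathcal{L}(\beta) + \frac{\delta^2}{8}\,\|X_{\cdot j}\|_2^2,
\qquad
\mathcal{L}(\beta) = \sum_{i=1}^{n} \log\!\bigl(1 + e^{-y_i x_i^{\top}\beta}\bigr).
\]

This is only meant to illustrate the kind of bound involved; the linear and quadratic cuts in the paper are constructed specifically for safe feature elimination and may differ.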

Fast Sparse Decision Tree Optimization via Reference Ensembles

3 code implementations • 1 Dec 2021 • Hayden McTavish, Chudi Zhong, Reto Achermann, Ilias Karimalis, Jacques Chen, Cynthia Rudin, Margo Seltzer

We show that by using these guesses, we can reduce the run time by multiple orders of magnitude, while providing bounds on how far the resulting trees can deviate from the black box's accuracy and expressive power.

Interpretable Machine Learning
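
A minimal sketch of the "guessing" idea, assuming a gradient-boosted reference ensemble and scikit-learn APIs (illustrative placeholder code, not the authors' implementation; `guessed_thresholds` and `binarize_with_guesses` are hypothetical names):

```python
# Harvest split thresholds from a reference ensemble, then binarize the data
# with only those thresholds before running an optimal decision tree solver.
# Illustrative sketch, not the authors' implementation.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

def guessed_thresholds(X, y, n_estimators=40, max_depth=1):
    """Collect the (feature, threshold) pairs actually used by the ensemble."""
    gbm = GradientBoostingClassifier(n_estimators=n_estimators,
                                     max_depth=max_depth).fit(X, y)
    pairs = set()
    for stage in gbm.estimators_:           # one row of regression trees per stage
        for tree in stage:
            t = tree.tree_
            for feat, thresh in zip(t.feature, t.threshold):
                if feat >= 0:               # negative feature ids mark leaves
                    pairs.add((int(feat), float(thresh)))
    return sorted(pairs)

def binarize_with_guesses(X, pairs):
    """Build binary features 'x[feat] <= thresh' for each guessed split."""
    return np.column_stack([X[:, f] <= t for f, t in pairs]).astype(np.uint8)
```

The reduced binary feature set can then be handed to an optimal sparse decision tree solver, which only has to consider the guessed splits rather than every possible threshold.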

Generalized and Scalable Optimal Sparse Decision Trees

2 code implementations • ICML 2020 • Jimmy Lin, Chudi Zhong, Diane Hu, Cynthia Rudin, Margo Seltzer

Decision tree optimization is notoriously difficult from a computational perspective but essential for the field of interpretable machine learning.

Interpretable Machine Learning
