no code implementations • ICML 2020 • Michal Moshkovitz, Sanjoy Dasgupta, Cyrus Rashtchian, Nave Frost
In terms of negative results, we show that popular top-down decision tree algorithms may lead to clusterings with arbitrarily large cost, and we prove that any explainable clustering must incur an $\Omega(\log k)$ approximation compared to the optimal clustering.
no code implementations • 1 Feb 2022 • Sanjoy Dasgupta, Nave Frost, Michal Moshkovitz
We study the faithfulness of an explanation system to the underlying prediction model.
no code implementations • 19 Oct 2020 • Ilya Valmianski, Nave Frost, Navdeep Sood, Yang Wang, Baodong Liu, James J. Zhu, Sunil Karumuri, Ian M. Finn, Daniel S. Zisook
Symptom checkers have emerged as an important tool for collecting symptoms and diagnosing patients, minimizing the involvement of clinical personnel.
2 code implementations • 3 Jun 2020 • Nave Frost, Michal Moshkovitz, Cyrus Rashtchian
To allow flexibility, we develop a new explainable $k$-means clustering algorithm, ExKMC, that takes an additional parameter $k' \geq k$ and outputs a decision tree with $k'$ leaves.
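The core primitive behind such threshold-tree clusterings can be sketched as follows. This is a hedged toy illustration, not the authors' ExKMC implementation: given reference cluster labels (e.g. from k-means), greedily choose an axis-aligned threshold split that minimizes the number of points separated from their cluster's majority leaf; repeating this split at each leaf grows a tree whose leaves define the explainable clusters.

```python
# Toy sketch of one step of threshold-tree clustering (NOT the ExKMC code):
# find the axis-aligned threshold that best explains given cluster labels.

def majority(labels):
    """Most frequent label in a leaf."""
    return max(set(labels), key=labels.count)

def best_split(points, labels):
    """Return (mistakes, dim, theta) for the axis-aligned split that
    minimizes points disagreeing with their side's majority label."""
    best = None
    for dim in range(len(points[0])):
        values = sorted({p[dim] for p in points})
        for lo, hi in zip(values, values[1:]):
            theta = (lo + hi) / 2.0  # candidate threshold between neighbors
            left = [l for p, l in zip(points, labels) if p[dim] <= theta]
            right = [l for p, l in zip(points, labels) if p[dim] > theta]
            mistakes = (len(left) - left.count(majority(left))
                        + len(right) - right.count(majority(right)))
            if best is None or mistakes < best[0]:
                best = (mistakes, dim, theta)
    return best

# Two well-separated 1-D clusters: a single threshold explains them exactly.
points = [(0.0,), (0.5,), (1.0,), (9.0,), (9.5,), (10.0,)]
labels = [0, 0, 0, 1, 1, 1]
mistakes, dim, theta = best_split(points, labels)
print(mistakes, dim, theta)  # → 0 0 5.0
```

With $k' > k$ leaves, more such splits can be spent refining leaves, which is what lets the tree's cost approach that of the unconstrained clustering.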
3 code implementations • 28 Feb 2020 • Sanjoy Dasgupta, Nave Frost, Michal Moshkovitz, Cyrus Rashtchian
In terms of negative results, we show, first, that popular top-down decision tree algorithms may lead to clusterings with arbitrarily large cost, and second, that any tree-induced clustering must in general incur an $\Omega(\log k)$ approximation factor compared to the optimal clustering.