1 code implementation • 7 Sep 2024 • L. Elisa Celis, Amit Kumar, Nisheeth K. Vishnoi, Andrew Xu
A central entity evaluates the utility of each candidate to the institutions, and the goal is to select candidates for each institution in a way that maximizes utility while also considering the candidates' preferences.
no code implementations • 6 Sep 2024 • Oren Mangoubi, Nisheeth K. Vishnoi
We consider the problem of sampling from a log-concave distribution $\pi(\theta) \propto e^{-f(\theta)}$ constrained to a polytope $K:=\{\theta \in \mathbb{R}^d: A\theta \leq b\}$, where $A\in \mathbb{R}^{m\times d}$ and $b \in \mathbb{R}^m$. The fastest-known algorithm \cite{mangoubi2022faster} for the setting when $f$ is $O(1)$-Lipschitz or $O(1)$-smooth runs in roughly $O(md \times md^{\omega -1})$ arithmetic operations, where the $md^{\omega -1}$ term arises because each Markov chain step requires computing a matrix inversion and determinant (here $\omega \approx 2.37$ is the matrix multiplication constant).
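For context, below is a minimal sketch of a generic Dikin-walk-style Markov chain step for this constrained setting. The step-size parameter `r`, the membership check, and the Metropolis filter follow the textbook Dikin-walk template; this is not the faster algorithm of the cited work, which in particular avoids the per-step determinant and inversion cost discussed above.

```python
# Schematic Dikin-walk-style step for sampling from pi(theta) ∝ exp(-f(theta))
# on the polytope {theta : A theta <= b}. Placeholder constants; illustrative only.
import numpy as np

def dikin_step(theta, A, b, f, r=0.5, rng=np.random.default_rng()):
    d = theta.shape[0]

    def hessian(t):
        s = b - A @ t                              # slacks of the constraints
        return (A / s[:, None] ** 2).T @ A         # sum_i a_i a_i^T / s_i^2 (log-barrier Hessian)

    def log_proposal(src, dst):
        # Log-density (up to a constant) of the Gaussian proposal centered at src.
        H = hessian(src)
        diff = dst - src
        _, logdet = np.linalg.slogdet(H)
        return 0.5 * logdet - (d / (2 * r ** 2)) * diff @ H @ diff

    H = hessian(theta)
    # Propose from a Gaussian with covariance (r^2 / d) * H^{-1}.
    prop = theta + np.linalg.cholesky(np.linalg.inv(H)) @ rng.standard_normal(d) * (r / np.sqrt(d))
    if np.any(A @ prop >= b):                      # reject proposals outside the polytope
        return theta
    log_alpha = (-f(prop) + f(theta)
                 + log_proposal(prop, theta) - log_proposal(theta, prop))
    return prop if np.log(rng.random()) < log_alpha else theta
```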
1 code implementation • NeurIPS 2023 • L. Elisa Celis, Amit Kumar, Anay Mehrotra, Nisheeth K. Vishnoi
We characterize the distributions that arise from our model and study the effect of the parameters on the observed distribution.
no code implementations • 29 Jun 2023 • Oren Mangoubi, Nisheeth K. Vishnoi
We present and analyze a complex variant of the Gaussian mechanism and show that the Frobenius norm of the difference between the matrix output by this mechanism and the best rank-$k$ approximation to $M$ is bounded by roughly $\tilde{O}(\sqrt{kd})$, whenever there is an appropriately large gap between the $k$'th and the $k+1$'th eigenvalues of $M$.
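As a point of reference, the sketch below shows the standard real-valued Gaussian-mechanism baseline for rank-$k$ approximation: add symmetrized Gaussian noise to $M$ and take the top-$k$ eigendecomposition of the result. It is not the complex variant analyzed in the paper, and the noise scale `sigma` is a placeholder rather than a calibrated privacy parameter.

```python
# Schematic real Gaussian-mechanism baseline for private rank-k approximation.
import numpy as np

def noisy_rank_k(M, k, sigma, rng=np.random.default_rng()):
    """M: d x d symmetric matrix. Returns a rank-k approximation of M + noise."""
    d = M.shape[0]
    E = rng.normal(scale=sigma, size=(d, d))
    E = (E + E.T) / np.sqrt(2)                    # symmetrize the noise
    eigvals, eigvecs = np.linalg.eigh(M + E)
    top = np.argsort(eigvals)[::-1][:k]           # indices of the k largest eigenvalues
    V, L = eigvecs[:, top], eigvals[top]
    return (V * L) @ V.T                          # rank-k reconstruction
```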
1 code implementation • 16 Jun 2023 • Niclas Boehmer, L. Elisa Celis, Lingxiao Huang, Anay Mehrotra, Nisheeth K. Vishnoi
We consider the problem of subset selection where one is given multiple rankings of items and the goal is to select the highest "quality" subset.
1 code implementation • 3 May 2023 • Anay Mehrotra, Nisheeth K. Vishnoi
In empirical evaluation, with both synthetic and real-world data, we observe that this algorithm improves the utility of the output subset for this family of submodular functions over baselines.
1 code implementation • 30 Nov 2022 • Anay Mehrotra, Nisheeth K. Vishnoi
The fair-ranking problem, which asks to rank a given set of items to maximize utility subject to group fairness constraints, has received attention in the fairness, information retrieval, and machine learning literature.
no code implementations • 11 Nov 2022 • Oren Mangoubi, Nisheeth K. Vishnoi
These equations allow us to bound the utility as the square-root of a sum-of-squares of perturbations to the eigenvectors, as opposed to a sum of perturbation bounds obtained via Davis-Kahan-type theorems.
no code implementations • 6 Jul 2022 • Oren Mangoubi, Yikai Wu, Satyen Kale, Abhradeep Guha Thakurta, Nisheeth K. Vishnoi
Consider the following optimization problem: Given $n \times n$ matrices $A$ and $\Lambda$, maximize $\langle A, U\Lambda U^*\rangle$ where $U$ varies over the unitary group $\mathrm{U}(n)$.
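In the special case where $A$ is Hermitian, the maximizer has a simple closed form: choose $U$ so that the eigenvectors of $A$ are aligned with the sorted diagonal of $\Lambda$. The sketch below illustrates only this special case; the function name and the Hermitian assumption are for illustration and are not part of the paper's differentially private setting.

```python
# Closed-form maximizer of <A, U Lambda U*> over unitary U when A is Hermitian:
# pair the largest eigenvalue of A with the largest entry of Lambda, and so on.
import numpy as np

def align_maximizer(A, lam):
    """A: n x n Hermitian matrix; lam: length-n real vector (diagonal of Lambda).
    Returns a unitary U and the value <A, U diag(lam) U*> it attains."""
    eigvals, eigvecs = np.linalg.eigh(A)          # eigenvalues in ascending order
    order_A = np.argsort(eigvals)[::-1]           # sort A's eigenvalues descending
    order_lam = np.argsort(lam)[::-1]             # sort lam descending
    U = np.zeros_like(eigvecs)
    U[:, order_lam] = eigvecs[:, order_A]         # align sorted eigenvectors with sorted lam
    value = np.real(np.trace(A.conj().T @ U @ np.diag(lam) @ U.conj().T))
    return U, value
```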
no code implementations • 19 Jun 2022 • Oren Mangoubi, Nisheeth K. Vishnoi
Given a Lipschitz or smooth convex function $\, f:K \to \mathbb{R}$ for a bounded polytope $K \subseteq \mathbb{R}^d$ defined by $m$ inequalities, we consider the problem of sampling from the log-concave distribution $\pi(\theta) \propto e^{-f(\theta)}$ constrained to $K$.
no code implementations • 3 Feb 2022 • Anay Mehrotra, Bary S. R. Pradelski, Nisheeth K. Vishnoi
Interventions such as the Rooney Rule and its generalizations, which require the decision maker to select at least a specified number of individuals from each affected group, have been proposed to mitigate the adverse effects of implicit bias in selection.
no code implementations • 24 Nov 2021 • Hortense Fong, Vineet Kumar, Anay Mehrotra, Nisheeth K. Vishnoi
We evaluate fairAUC on synthetic and real-world datasets and find that it significantly improves AUC for the disadvantaged group relative to benchmarks maximizing overall AUC and minimizing bias between groups.
no code implementations • 7 Nov 2021 • Oren Mangoubi, Nisheeth K. Vishnoi
For a $d$-dimensional log-concave distribution $\pi(\theta) \propto e^{-f(\theta)}$ constrained to a convex body $K$, the problem of outputting samples from a distribution $\nu$ which is $\varepsilon$-close in infinity-distance $\sup_{\theta \in K} |\log \frac{\nu(\theta)}{\pi(\theta)}|$ to $\pi$ arises in differentially private optimization.
no code implementations • NeurIPS 2021 • Lingxiao Huang, K. Sudhir, Nisheeth K. Vishnoi
In particular, we consider the setting where the time series data on $N$ entities is generated from a Gaussian mixture model with autocorrelations over $k$ clusters in $\mathbb{R}^d$.
no code implementations • 2 Sep 2021 • Jonathan Leake, Nisheeth K. Vishnoi
In the last few years, the notion of symmetry has provided a powerful and essential lens to view several optimization or sampling problems that arise in areas such as theoretical computer science, statistics, machine learning, quantum inference, and privacy.
no code implementations • 27 Aug 2021 • Nisheeth K. Vishnoi
The goal of this article is to introduce the Hamiltonian Monte Carlo (HMC) method -- a Hamiltonian dynamics-inspired algorithm for sampling from a Gibbs density $\pi(x) \propto e^{-f(x)}$.
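A minimal sketch of one HMC step (Gaussian momentum, leapfrog integration, Metropolis correction) is below; the target `f`, step size, and leapfrog count are placeholder choices meant only to illustrate the method the article introduces.

```python
# Minimal Hamiltonian Monte Carlo step for sampling from pi(x) ∝ exp(-f(x)).
import numpy as np

def hmc_step(x, f, grad_f, step_size=0.1, n_leapfrog=20, rng=np.random.default_rng()):
    """Sample a momentum, simulate Hamiltonian dynamics with the leapfrog
    integrator, and accept/reject with a Metropolis correction."""
    p = rng.standard_normal(x.shape)              # fresh Gaussian momentum
    x_new, p_new = x.copy(), p.copy()

    # Leapfrog integration of Hamilton's equations for H(x, p) = f(x) + |p|^2 / 2.
    p_new -= 0.5 * step_size * grad_f(x_new)
    for _ in range(n_leapfrog - 1):
        x_new += step_size * p_new
        p_new -= step_size * grad_f(x_new)
    x_new += step_size * p_new
    p_new -= 0.5 * step_size * grad_f(x_new)

    # Metropolis correction removes the discretization bias of the leapfrog steps.
    h_old = f(x) + 0.5 * p @ p
    h_new = f(x_new) + 0.5 * p_new @ p_new
    return x_new if np.log(rng.random()) < h_old - h_new else x

# Example: standard Gaussian target, f(x) = |x|^2 / 2.
f = lambda x: 0.5 * x @ x
grad_f = lambda x: x
x = np.zeros(5)
samples = [x := hmc_step(x, f, grad_f) for _ in range(1000)]
```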
1 code implementation • NeurIPS 2021 • L. Elisa Celis, Anay Mehrotra, Nisheeth K. Vishnoi
Our main contribution is an optimization framework to learn fair classifiers in this adversarial setting that comes with provable guarantees on accuracy and fairness.
1 code implementation • NeurIPS 2020 • Lingxiao Huang, K. Sudhir, Nisheeth K. Vishnoi
We first define coresets for several variants of regression problems with panel data and then present efficient algorithms to construct coresets whose size depends polynomially on $1/\varepsilon$ (where $\varepsilon$ is the error parameter) and on the number of regression parameters, independent of the number of individuals in the panel data or the number of time units each individual is observed for.
1 code implementation • 21 Oct 2020 • L. Elisa Celis, Chris Hays, Anay Mehrotra, Nisheeth K. Vishnoi
Our main result is that, when the panel is constrained by the Rooney Rule, the panel's implicit bias reduces at a rate that is roughly the inverse of the size of the shortlist, independent of the number of candidates, whereas without the Rooney Rule, the rate is inversely proportional to the number of candidates.
2 code implementations • 22 Jun 2020 • Vijay Keswani, Oren Mangoubi, Sushant Sachdeva, Nisheeth K. Vishnoi
The equilibrium point found by our algorithm depends on the proposal distribution, and when applying our algorithm to train GANs we choose the proposal distribution to be a distribution of stochastic gradients.
no code implementations • 22 Jun 2020 • Oren Mangoubi, Nisheeth K. Vishnoi
We propose an optimization model, the $\varepsilon$-greedy adversarial equilibrium, and show that it can serve as a computationally tractable alternative to the min-max optimization model.
1 code implementation • 8 Jun 2020 • L. Elisa Celis, Lingxiao Huang, Vijay Keswani, Nisheeth K. Vishnoi
We present an optimization framework for learning a fair classifier in the presence of noisy perturbations in the protected attributes.
no code implementations • 23 Jan 2020 • L. Elisa Celis, Anay Mehrotra, Nisheeth K. Vishnoi
Implicit bias is the unconscious attribution of particular qualities (or lack thereof) to a member of a particular social group (e.g., defined by gender or race).
1 code implementation • NeurIPS 2019 • Lingxiao Huang, Shaofeng H. -C. Jiang, Nisheeth K. Vishnoi
Our approach is based on novel constructions of coresets: for the $k$-median objective, we construct an $\varepsilon$-coreset of size $O(\Gamma k^2 \varepsilon^{-d})$ where $\Gamma$ is the number of distinct collections of groups that a point may belong to, and for the $k$-means objective, we show how to construct an $\varepsilon$-coreset of size $O(\Gamma k^3\varepsilon^{-d-1})$.
1 code implementation • ICML 2020 • L. Elisa Celis, Vijay Keswani, Nisheeth K. Vishnoi
Unlike prior work, it can efficiently learn distributions over large domains, controllably adjust the representation rates of protected groups and achieve target fairness metrics such as statistical parity, yet remains close to the empirical distribution induced by the given dataset.
no code implementations • 5 May 2019 • Oren Mangoubi, Nisheeth K. Vishnoi
We achieve this improvement by a novel method of computing polytope membership, where one avoids checking inequalities estimated to have a very low probability of being violated.
no code implementations • 22 Feb 2019 • Oren Mangoubi, Nisheeth K. Vishnoi
Langevin Markov chain algorithms are widely deployed methods for sampling from distributions that arise in challenging high-dimensional and non-convex statistics and machine learning applications.
1 code implementation • 21 Feb 2019 • Lingxiao Huang, Nisheeth K. Vishnoi
Theoretically, we prove a stability guarantee that was lacking in prior fair classification algorithms, and also provide an accuracy guarantee for our extended framework.
1 code implementation • NeurIPS 2019 • Holden Lee, Oren Mangoubi, Nisheeth K. Vishnoi
Given a sequence of convex functions $f_0, f_1, \ldots, f_T$, we study the problem of sampling from the Gibbs distribution $\pi_t \propto e^{-\sum_{k=0}^tf_k}$ for each epoch $t$ in an online manner.
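A schematic way to approach this online setting is to warm-start each epoch from the previous sample and run a few Langevin steps on the running sum $\sum_{k \le t} f_k$; the sketch below is such a generic baseline with placeholder step sizes, not the paper's algorithm.

```python
# Schematic online sampling baseline: unadjusted Langevin steps targeting
# pi_t ∝ exp(-sum_{k<=t} f_k), warm-started across epochs. Placeholder parameters.
import numpy as np

def online_langevin(grad_fs, d, eta=1e-3, inner_steps=50, rng=np.random.default_rng()):
    """grad_fs: list of gradient functions [grad f_0, ..., grad f_T].
    Yields one (approximate) sample from each pi_t in an online manner."""
    x = np.zeros(d)
    grads_so_far = []
    for grad_f in grad_fs:
        grads_so_far.append(grad_f)
        for _ in range(inner_steps):
            total_grad = sum(g(x) for g in grads_so_far)  # gradient of sum_{k<=t} f_k
            x = x - eta * total_grad + np.sqrt(2 * eta) * rng.standard_normal(d)
        yield x.copy()
```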
1 code implementation • 29 Jan 2019 • L. Elisa Celis, Anay Mehrotra, Nisheeth K. Vishnoi
To prevent this, we propose a constrained ad auction framework that maximizes the platform's revenue conditioned on ensuring that the audience seeing an advertiser's ad is distributed appropriately across sensitive types such as gender or race.
no code implementations • 24 Jun 2018 • Sayash Kapoor, Vijay Keswani, Nisheeth K. Vishnoi, L. Elisa Celis
We present a prototype for a news search engine that presents balanced viewpoints across liberal and conservative articles with the goal of de-polarizing content and allowing users to escape their filter bubble.
no code implementations • 17 Jun 2018 • Nisheeth K. Vishnoi
We first present a variety of notions from differential and Riemannian geometry, such as differentiation on manifolds and geodesics, and then introduce geodesic convexity.
4 code implementations • 15 Jun 2018 • L. Elisa Celis, Lingxiao Huang, Vijay Keswani, Nisheeth K. Vishnoi
The main contribution of this paper is a new meta-algorithm for classification that takes as input a large class of fairness constraints, with respect to multiple non-disjoint sensitive attributes, and which comes with provable guarantees.
no code implementations • NeurIPS 2018 • Oren Mangoubi, Nisheeth K. Vishnoi
Hamiltonian Monte Carlo (HMC) is a widely deployed method to sample from high-dimensional distributions in Statistics and Machine learning.
no code implementations • 23 Feb 2018 • L. Elisa Celis, Sayash Kapoor, Farnood Salehi, Nisheeth K. Vishnoi
Personalization is pervasive in the online space as it leads to higher efficiency and revenue by allowing the most relevant content to be served to each user.
1 code implementation • ICML 2018 • L. Elisa Celis, Vijay Keswani, Damian Straszak, Amit Deshpande, Tarun Kathuria, Nisheeth K. Vishnoi
Sampling methods that choose a subset of the data proportional to its diversity in the feature space are popular for data summarization.
no code implementations • 7 Nov 2017 • Oren Mangoubi, Nisheeth K. Vishnoi
In this paper we study the more general case when the noise has magnitude $\alpha F(x) + \beta$ for some $\alpha, \beta > 0$, and present a polynomial time algorithm that finds an approximate minimizer of $F$ for this noise model.
no code implementations • 6 Nov 2017 • Damian Straszak, Nisheeth K. Vishnoi
Our main result shows a ${\rm poly}(m, \log 1/\varepsilon)$ bound on the bit complexity of $\varepsilon$-optimal dual solutions to the maximum entropy convex program -- for very general support sets and with no restriction on the marginal vector.
1 code implementation • 27 Oct 2017 • L. Elisa Celis, Lingxiao Huang, Nisheeth K. Vishnoi
Multiwinner voting rules are used to select a small representative subset of candidates or items from a larger set given the preferences of voters.
no code implementations • 8 Aug 2017 • Damian Straszak, Nisheeth K. Vishnoi
While it is known that the stationary points of the Bethe approximation coincide with the fixed points of belief propagation, in general, the relation between the Bethe approximation and the partition function is not well understood.
no code implementations • 10 Jul 2017 • Javad B. Ebrahimi, Damian Straszak, Nisheeth K. Vishnoi
The volume of a set of vectors is used as a measure of their diversity, and partition or matroid constraints over $[m]$ are imposed in order to enforce resource or fairness requirements.
no code implementations • 7 Jul 2017 • L. Elisa Celis, Nisheeth K. Vishnoi
Personalization is pervasive in the online space as, when combined with learning, it leads to higher efficiency and revenue by allowing the most relevant content to be served to each user.
no code implementations • 8 May 2017 • L. Elisa Celis, Peter M. Krafft, Nisheeth K. Vishnoi
Finally, we observe that our infinite population dynamics is a stochastic variant of the classic multiplicative weights update (MWU) method.
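For reference, a textbook version of the MWU method in the standard experts setting is sketched below; it is the deterministic classic of which the paper's population dynamics is a stochastic variant, run here on placeholder loss data.

```python
# Textbook multiplicative weights update (MWU) in the experts setting.
import numpy as np

def mwu(losses, eta=0.1):
    """losses: T x n array, losses[t, i] in [0, 1] is expert i's loss at round t.
    Returns the sequence of weight vectors (distributions over experts)."""
    T, n = losses.shape
    w = np.ones(n)
    history = []
    for t in range(T):
        p = w / w.sum()                 # play the normalized weights
        history.append(p)
        w = w * (1 - eta * losses[t])   # multiplicative penalty for incurred losses
    return np.array(history)

# Example: 3 experts, 100 rounds; weight concentrates on the expert with the
# smallest cumulative loss.
rng = np.random.default_rng(0)
probs = mwu(rng.random((100, 3)) * np.array([0.2, 0.5, 0.9]))
```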
2 code implementations • 22 Apr 2017 • L. Elisa Celis, Damian Straszak, Nisheeth K. Vishnoi
Ranking algorithms are deployed widely to order a set of items in applications such as search engines, news feeds, and recommendation systems.
no code implementations • 23 Oct 2016 • L. Elisa Celis, Amit Deshpande, Tarun Kathuria, Nisheeth K. Vishnoi
However, in doing so, a question that seems to have been overlooked is whether it is possible to produce fair subsamples that are also adequately representative of the feature space of the data set, an important and classic requirement in machine learning.
no code implementations • 1 Aug 2016 • L. Elisa Celis, Amit Deshpande, Tarun Kathuria, Damian Straszak, Nisheeth K. Vishnoi
Consequently, we obtain a few algorithms of independent interest: 1) to count over the base polytope of regular matroids when there are additional (succinct) budget constraints, and 2) to evaluate and compute the mixed characteristic polynomials, which played a central role in the resolution of the Kadison-Singer problem, for certain special cases.
1 code implementation • 12 Jan 2016 • Damian Straszak, Nisheeth K. Vishnoi
In this paper we present a connection between two dynamical systems arising in entirely different contexts: one in signal processing and the other in biology.