5 Jul 2021 • Lin Chen, Hossein Esfandiari, Gang Fu, Vahab S. Mirrokni, Qian Yu
First, we show that no $n^{1/\log\log n}$-approximation algorithm exists for this problem unless the Exponential Time Hypothesis fails.
NeurIPS 2019 • Lin Chen, Hossein Esfandiari, Thomas Fu, Vahab S. Mirrokni
In this paper, we aim to develop LSH schemes for distance functions that measure the distance between two probability distributions, particularly for f-divergences as well as a generalization to capture mutual information loss.
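The paper's own constructions are not reproduced here, but the flavor of LSH for a distance between distributions can be sketched with a standard tool: the Hellinger distance (whose square is an f-divergence) satisfies $H(p,q) = \frac{1}{\sqrt{2}}\lVert\sqrt{p}-\sqrt{q}\rVert_2$, so classical p-stable (random-projection) Euclidean LSH applied to element-wise square roots of distributions hashes nearby distributions to the same bucket with higher probability. All names and parameters below are illustrative, not from the paper.

```python
import numpy as np

def hellinger(p, q):
    """Hellinger distance between two discrete distributions."""
    return np.linalg.norm(np.sqrt(p) - np.sqrt(q)) / np.sqrt(2.0)

class EuclideanLSH:
    """Standard p-stable (random-projection) LSH for Euclidean distance.

    Applied to the element-wise square roots of distributions, it is
    locality-sensitive for the Hellinger distance, since
    H(p, q) = ||sqrt(p) - sqrt(q)||_2 / sqrt(2).
    (Illustrative sketch; not the scheme from the paper.)
    """
    def __init__(self, dim, num_hashes=8, bucket_width=0.5, seed=0):
        rng = np.random.default_rng(seed)
        self.a = rng.standard_normal((num_hashes, dim))      # Gaussian projections
        self.b = rng.uniform(0.0, bucket_width, num_hashes)  # random offsets
        self.w = bucket_width

    def hash(self, p):
        x = np.sqrt(np.asarray(p, dtype=float))  # embed distribution in L2
        return tuple(np.floor((self.a @ x + self.b) / self.w).astype(int))

lsh = EuclideanLSH(dim=4)
p = np.array([0.4, 0.3, 0.2, 0.1])
q = np.array([0.39, 0.31, 0.2, 0.1])  # close to p in Hellinger distance
r = np.array([0.05, 0.05, 0.1, 0.8])  # far from p
```

Handling general f-divergences (or the mutual-information-loss generalization the abstract mentions) requires the paper's machinery; the square-root embedding is special to Hellinger.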
30 Apr 2019 • Mohammadhossein Bateni, Lin Chen, Hossein Esfandiari, Thomas Fu, Vahab S. Mirrokni, Afshin Rostamizadeh
To achieve this, we introduce a novel re-parametrization of the mutual information objective, which we prove is submodular, and design a data structure to query the submodular function in amortized $O(\log n)$ time (where $n$ is the input vocabulary size).
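The amortized-query data structure from the paper is not shown here, but once an objective is known to be submodular and monotone, the standard way to exploit cheap marginal-gain queries is the lazy (accelerated) greedy algorithm: submodularity guarantees gains only shrink as the selected set grows, so stale upper bounds kept in a max-heap can be re-checked lazily instead of recomputing every gain at every step. The sketch below uses a toy weighted-coverage objective; the function names and the objective are illustrative assumptions, not the paper's re-parametrized mutual information.

```python
import heapq

def lazy_greedy(ground_set, f, k):
    """Lazy greedy maximization of a monotone submodular set function f.

    Submodularity implies marginal gains are non-increasing as the
    selection grows, so a popped element's cached gain is an upper
    bound; it only needs re-evaluation against the current selection.
    """
    selected = []
    base = f(selected)
    # max-heap of (-cached_marginal_gain, element)
    heap = [(-(f([e]) - base), e) for e in ground_set]
    heapq.heapify(heap)
    while len(selected) < k and heap:
        neg_gain, e = heapq.heappop(heap)
        # refresh the stale bound against the current selection
        gain = f(selected + [e]) - f(selected)
        if not heap or gain >= -heap[0][0]:
            selected.append(e)          # still the best candidate: take it
        else:
            heapq.heappush(heap, (-gain, e))  # re-insert with updated gain
    return selected

# Toy monotone submodular objective: weighted coverage of a small universe.
universe_weights = {1: 3.0, 2: 2.0, 3: 1.0, 4: 5.0}
covers = {"a": {1, 2}, "b": {2, 3}, "c": {4}, "d": {1, 4}}

def coverage(sets):
    covered = set().union(*(covers[s] for s in sets)) if sets else set()
    return sum(universe_weights[u] for u in covered)

print(lazy_greedy(list(covers), coverage, 2))  # picks "d" first, then "b"
```

Lazy greedy returns the same solution as plain greedy with the usual $1 - 1/e$ guarantee for monotone submodular objectives, while typically evaluating far fewer marginal gains; pairing it with a fast marginal-gain oracle, as the paper's $O(\log n)$-amortized structure provides, is what makes the overall selection efficient.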