no code implementations • 2 Nov 2021 • Daria Reshetova, Yikun Bai, Xiugang Wu, Ayfer Ozgur
We show that the optimal generator can be learned to accuracy $\epsilon$ with $O(1/\epsilon^2)$ samples from the target distribution.
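Read plainly (this rephrasing is not from the paper, just an unpacking of the stated rate): $O(1/\epsilon^2)$ samples for accuracy $\epsilon$ is the parametric rate, since inverting the relation gives

$$ n = O\!\left(\epsilon^{-2}\right) \quad\Longleftrightarrow\quad \epsilon = O\!\left(n^{-1/2}\right), $$

i.e. the achievable accuracy improves like $n^{-1/2}$ in the number of target samples $n$.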
no code implementations • NeurIPS 2021 • Cem Kalkanli, Ayfer Ozgur
We show that Thompson sampling combined with an adaptive batching strategy can achieve similar performance without knowing the time horizon $T$ of the problem and without having to carefully optimize the batch structure for a given $T$ to achieve a target regret bound (i.e., problem-dependent vs. minimax regret).
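As a hedged illustration only (not the paper's exact scheme), the sketch below runs Bernoulli Thompson sampling but defers posterior updates to batch boundaries, with batch lengths growing geometrically so that no horizon needs to be fixed in advance; the growth factor, Beta(1, 1) priors, and function name are assumptions of the sketch.

```python
import numpy as np

def batched_thompson_sampling(arm_probs, horizon, growth=2.0, rng=None):
    """Bernoulli Thompson sampling with geometrically growing batches.

    Rewards collected inside a batch only update the Beta posteriors at the
    end of that batch, so the policy itself never needs the horizon; it is
    passed here only to stop the simulation loop.
    """
    rng = np.random.default_rng(rng)
    k = len(arm_probs)
    alpha, beta = np.ones(k), np.ones(k)      # Beta(1, 1) priors
    t, batch_len, total_reward = 0, 1, 0.0

    while t < horizon:
        this_batch = int(min(batch_len, horizon - t))
        wins, plays = np.zeros(k), np.zeros(k)
        for _ in range(this_batch):
            theta = rng.beta(alpha, beta)     # sample from the (stale) posteriors
            arm = int(np.argmax(theta))
            reward = float(rng.random() < arm_probs[arm])
            wins[arm] += reward
            plays[arm] += 1
            total_reward += reward
        alpha += wins                         # posterior update deferred to batch end
        beta += plays - wins
        t += this_batch
        batch_len *= growth                   # geometric batch growth
    return total_reward

# Example: two arms, the better one succeeds with probability 0.6
print(batched_thompson_sampling([0.4, 0.6], horizon=10_000, rng=0))
```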
no code implementations • 1 Oct 2021 • Cem Kalkanli, Ayfer Ozgur
We study the asymptotic performance of the Thompson sampling algorithm in the batched multi-armed bandit setting where the time horizon $T$ is divided into batches, and the agent is not able to observe the rewards of her actions until the end of each batch.
no code implementations • 6 Mar 2021 • Chuan-Zheng Lee, Leighton Pate Barnes, Ayfer Ozgur
We study schemes and lower bounds for distributed minimax statistical estimation over a Gaussian multiple-access channel (MAC) under squared error loss, in a framework combining statistical estimation and wireless communication.
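For intuition about the setting only (this is a classical baseline, not the schemes or lower bounds analyzed in the paper), uncoded "analog" aggregation has every node scale its sample by a common gain, lets the Gaussian MAC sum the transmissions and add noise, and rescales at the receiver; the power budget, noise variance, and amplitude bound below are assumptions of the sketch.

```python
import numpy as np

def analog_mac_mean_estimate(samples, power=1.0, noise_var=1.0, amp_bound=3.0, rng=None):
    """Uncoded analog aggregation over a Gaussian multiple-access channel.

    Each node applies the same known transmit gain (chosen to respect an
    average power constraint when samples are roughly bounded by amp_bound);
    the channel output is the sum of all transmissions plus Gaussian noise,
    and the receiver rescales it into a sample-mean estimate.
    """
    rng = np.random.default_rng(rng)
    samples = np.asarray(samples, dtype=float)
    scale = np.sqrt(power) / amp_bound                               # common transmit gain
    y = scale * samples.sum() + rng.normal(0.0, np.sqrt(noise_var))  # MAC output
    return y / (scale * samples.size)                                # rescaled mean estimate

# Example: 100 nodes observing theta + N(0, 1) with theta = 0.5
rng = np.random.default_rng(1)
x = 0.5 + rng.normal(size=100)
print(analog_mac_mean_estimate(x, rng=2))
```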
no code implementations • 11 Feb 2021 • Leighton Pate Barnes, Ayfer Ozgur
We consider the processing of statistical samples $X\sim P_\theta$ by a channel $p(y|x)$, and characterize how the statistical information from the samples for estimating the parameter $\theta\in\mathbb{R}^d$ can scale with the mutual information or capacity of the channel.
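Schematically (constants and exact conditions differ from the paper), bounds of this type are typically paired with the van Trees inequality: a capacity-type ceiling on the post-channel Fisher information translates into a minimax lower bound for estimation from $n$ processed samples,

$$ \operatorname{Tr} I_{Y}(\theta) \;\le\; \phi\bigl(I(X;Y)\bigr) \quad\Longrightarrow\quad \sup_\theta \mathbb{E}\,\bigl\|\hat\theta - \theta\bigr\|^2 \;\gtrsim\; \frac{d^2}{n\,\phi\bigl(I(X;Y)\bigr)}, $$

where $\phi$ is a placeholder for the scaling with the mutual information or capacity of the channel, not the specific function characterized in the paper.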
Information Theory • Statistics Theory
no code implementations • 8 Nov 2020 • Cem Kalkanli, Ayfer Ozgur
Thompson sampling has been shown to be an effective policy across a variety of online learning tasks.
no code implementations • 24 Aug 2020 • Yikun Bai, Xiugang Wu, Ayfer Ozgur
Following Marton's approach, we show that the new transportation cost inequality can be used to recover old and new concentration of measure results.
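To recall the standard mechanism being invoked (a textbook sketch of Marton's argument; the paper's new inequality and its constants may differ): if $W(Q, P) \le \sqrt{2c\,D(Q\|P)}$ for all $Q$, apply it to the conditional measures of two sets $A, B$ at distance $d(A,B) \ge t$ to get

$$ t \;\le\; W\bigl(P(\cdot\mid A),\, P(\cdot\mid B)\bigr) \;\le\; \sqrt{2c\log\tfrac{1}{P(A)}} + \sqrt{2c\log\tfrac{1}{P(B)}}. $$

Taking $B = (A_t)^c$, the complement of the $t$-enlargement of $A$, and rearranging gives Gaussian-type concentration,

$$ P\bigl((A_t)^c\bigr) \;\le\; \exp\!\left(-\frac{1}{2c}\Bigl(t - \sqrt{2c\log\tfrac{1}{P(A)}}\Bigr)^2\right) \qquad \text{for } t \ge \sqrt{2c\log\tfrac{1}{P(A)}}. $$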
no code implementations • 21 May 2020 • Surin Ahn, Ayfer Ozgur, Mert Pilanci
In the domains of dataset construction and crowdsourcing, a notable challenge is to aggregate labels from a heterogeneous set of labelers, each of whom is potentially an expert in some subset of tasks (and less reliable in others).
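As a toy illustration of the aggregation problem (not the estimator proposed in the paper), a simple baseline weights each labeler's vote by their estimated accuracy on the relevant task type; the function name, log-odds weighting, and reliability values below are assumptions of the sketch.

```python
import numpy as np

def weighted_vote(labels, reliabilities, num_classes):
    """Aggregate one item's labels by weighting each labeler's vote.

    labels        : label each labeler assigned to this item
    reliabilities : estimated accuracy of each labeler on this item's task type
    """
    scores = np.zeros(num_classes)
    for lab, rel in zip(labels, reliabilities):
        # log-odds weight: reliable labelers count more, a 50% labeler counts nothing
        weight = np.log(max(rel, 1e-6) / max(1.0 - rel, 1e-6))
        scores[lab] += weight
    return int(np.argmax(scores))

# Example: the third labeler is an expert (90% accurate) on this task type,
# so their single vote outweighs two barely-better-than-chance labelers.
print(weighted_vote(labels=[0, 0, 1], reliabilities=[0.55, 0.55, 0.9], num_classes=2))
```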
no code implementations • 21 May 2020 • Leighton Pate Barnes, Wei-Ning Chen, Ayfer Ozgur
We develop data processing inequalities that describe how Fisher information from statistical samples can scale with the privacy parameter $\varepsilon$ under local differential privacy constraints.
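For a concrete instance of the kind of $\varepsilon$-locally-differentially-private channel such inequalities apply to (standard binary randomized response, not a construction from the paper):

```python
import math
import random

def randomized_response(bit, epsilon, rng=random):
    """Binary randomized response: report the true bit with probability
    e^eps / (1 + e^eps), flip it otherwise. The likelihood ratio between
    the two inputs for any output is at most e^eps, so the mechanism is
    eps-locally differentially private."""
    keep_prob = math.exp(epsilon) / (1.0 + math.exp(epsilon))
    return bit if rng.random() < keep_prob else 1 - bit

# Example: privatize ten bits at epsilon = 1
random.seed(0)
print([randomized_response(b, epsilon=1.0) for b in [0, 1] * 5])
```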
no code implementations • 21 May 2020 • Leighton Pate Barnes, Huseyin A. Inan, Berivan Isik, Ayfer Ozgur
The statistically optimal communication scheme arising from the analysis of this model leads to a new sparsification technique for SGD, which concatenates random-$k$ and top-$k$ sparsification, two approaches that had previously been considered separately in the literature.
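A minimal sketch of one plausible reading of "concatenating random-$k$ and top-$k$" (the exact construction and any rescaling in the paper may differ): keep the $k$ largest-magnitude coordinates of the gradient, then add $k$ more coordinates drawn uniformly from the remainder.

```python
import numpy as np

def topk_plus_randomk(grad, k, rng=None):
    """Sparsify a gradient by keeping its top-k coordinates by magnitude
    plus k more coordinates chosen uniformly at random from the rest."""
    rng = np.random.default_rng(rng)
    grad = np.asarray(grad, dtype=float)
    top_idx = np.argpartition(np.abs(grad), -k)[-k:]        # top-k by magnitude
    rest = np.setdiff1d(np.arange(grad.size), top_idx)
    rand_idx = rng.choice(rest, size=min(k, rest.size), replace=False)
    keep = np.concatenate([top_idx, rand_idx])
    sparse = np.zeros_like(grad)
    sparse[keep] = grad[keep]
    return sparse

g = np.array([0.1, -3.0, 0.05, 2.0, -0.2, 0.01, 0.4, -0.6])
print(topk_plus_randomk(g, k=2, rng=0))
```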
no code implementations • 7 Feb 2019 • Leighton Pate Barnes, Yanjun Han, Ayfer Ozgur
We consider the problem of learning high-dimensional, nonparametric and structured (e.g., Gaussian) distributions in distributed networks, where each node in the network observes an independent sample from the underlying distribution and can use $k$ bits to communicate its sample to a central processor.
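As a simple baseline for this communication model (not the schemes or bounds derived in the paper), each node can send a $k$-bit uniform quantization of its scalar sample on a known range and the central processor averages the dequantized values; the range $[-B, B]$ and midpoint dequantizer are assumptions of the sketch.

```python
import numpy as np

def k_bit_mean_estimate(samples, k, B=4.0):
    """Each node sends a k-bit uniform quantization of its sample on [-B, B];
    the central processor averages the bin midpoints to estimate the mean."""
    samples = np.clip(np.asarray(samples, dtype=float), -B, B)
    levels = 2 ** k
    step = 2 * B / levels
    indices = np.clip(np.floor((samples + B) / step).astype(int), 0, levels - 1)  # k-bit messages
    dequantized = -B + (indices + 0.5) * step                                     # bin midpoints
    return dequantized.mean()

# Example: 1000 nodes each observe a N(0.5, 1) sample and send 3 bits
rng = np.random.default_rng(0)
x = 0.5 + rng.normal(size=1000)
print(k_bit_mean_estimate(x, k=3))
```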