Search Results for author: Ayfer Ozgur

Found 11 papers, 0 papers with code

Understanding Entropic Regularization in GANs

no code implementations · 2 Nov 2021 · Daria Reshetova, Yikun Bai, Xiugang Wu, Ayfer Ozgur

We show that the optimal generator can be learned to accuracy $\epsilon$ with $O(1/\epsilon^2)$ samples from the target distribution.

Batched Thompson Sampling

no code implementations · NeurIPS 2021 · Cem Kalkanli, Ayfer Ozgur

We show that Thompson sampling combined with an adaptive batching strategy can achieve similar performance without knowing the time horizon $T$ of the problem and without having to carefully optimize the batch structure to achieve a target regret bound (i.e., problem-dependent vs. minimax regret) for a given $T$.

Multi-Armed Bandits · Thompson Sampling
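The batched setting described above can be illustrated with a short sketch: a Bernoulli Thompson sampler whose Beta posteriors are updated only at batch boundaries. The doubling batch schedule below is a hypothetical simplification for illustration; the paper's adaptive batching rule is more refined.

```python
import numpy as np

rng = np.random.default_rng(0)

def batched_thompson_sampling(means, horizon):
    """Bernoulli Thompson sampling with geometrically growing batches.

    Rewards are revealed only when a batch ends, so the Beta posteriors
    stay fixed within a batch and are updated at batch boundaries.
    Returns the cumulative (pseudo-)regret over `horizon` steps.
    """
    n_arms = len(means)
    alpha = np.ones(n_arms)  # Beta posterior: 1 + observed successes
    beta = np.ones(n_arms)   # Beta posterior: 1 + observed failures
    t, batch_size, regret = 0, 1, 0.0
    best = max(means)
    while t < horizon:
        b = min(batch_size, horizon - t)
        pulls = np.zeros(n_arms)
        wins = np.zeros(n_arms)
        for _ in range(b):
            # Sample from the posterior frozen at the last batch boundary
            samples = rng.beta(alpha, beta)
            arm = int(np.argmax(samples))
            reward = rng.random() < means[arm]
            pulls[arm] += 1
            wins[arm] += reward
            regret += best - means[arm]
        # Posterior update happens only once the batch completes
        alpha += wins
        beta += pulls - wins
        t += b
        batch_size *= 2  # geometric batch growth: O(log T) batches
    return regret
```

With a doubling schedule the agent uses only logarithmically many batches in $T$, which is the kind of regime the paper analyzes.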

Asymptotic Performance of Thompson Sampling in the Batched Multi-Armed Bandits

no code implementations · 1 Oct 2021 · Cem Kalkanli, Ayfer Ozgur

We study the asymptotic performance of the Thompson sampling algorithm in the batched multi-armed bandit setting where the time horizon $T$ is divided into batches, and the agent is not able to observe the rewards of her actions until the end of each batch.

Multi-Armed Bandits · Thompson Sampling

Over-the-Air Statistical Estimation

no code implementations · 6 Mar 2021 · Chuan-Zheng Lee, Leighton Pate Barnes, Ayfer Ozgur

We study schemes and lower bounds for distributed minimax statistical estimation over a Gaussian multiple-access channel (MAC) under squared error loss, in a framework combining statistical estimation and wireless communication.

Fisher Information and Mutual Information Constraints

no code implementations · 11 Feb 2021 · Leighton Pate Barnes, Ayfer Ozgur

We consider the processing of statistical samples $X\sim P_\theta$ by a channel $p(y|x)$, and characterize how the statistical information from the samples for estimating the parameter $\theta\in\mathbb{R}^d$ can scale with the mutual information or capacity of the channel.

Information Theory · Statistics Theory

Asymptotic Convergence of Thompson Sampling

no code implementations · 8 Nov 2020 · Cem Kalkanli, Ayfer Ozgur

Thompson sampling has been shown to be an effective policy across a variety of online learning tasks.

Multi-Armed Bandits · Thompson Sampling

Information Constrained Optimal Transport: From Talagrand, to Marton, to Cover

no code implementations · 24 Aug 2020 · Yikun Bai, Xiugang Wu, Ayfer Ozgur

Following Marton's approach, we show that the new transportation cost inequality can be used to recover old and new concentration of measure results.

Global Multiclass Classification and Dataset Construction via Heterogeneous Local Experts

no code implementations · 21 May 2020 · Surin Ahn, Ayfer Ozgur, Mert Pilanci

In the domains of dataset construction and crowdsourcing, a notable challenge is to aggregate labels from a heterogeneous set of labelers, each of whom is potentially an expert in some subset of tasks (and less reliable in others).

Classification · Federated Learning · +1

Fisher information under local differential privacy

no code implementations · 21 May 2020 · Leighton Pate Barnes, Wei-Ning Chen, Ayfer Ozgur

We develop data processing inequalities that describe how Fisher information from statistical samples can scale with the privacy parameter $\varepsilon$ under local differential privacy constraints.
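The kind of scaling these data processing inequalities capture can be seen on a standard example: $\varepsilon$-LDP randomized response applied to a Bernoulli sample. The mechanism and formulas below are a textbook illustration, not the paper's construction.

```python
import numpy as np

def fisher_info_bernoulli(theta):
    # Fisher information of a raw sample X ~ Bern(theta)
    return 1.0 / (theta * (1.0 - theta))

def fisher_info_rr(theta, eps):
    """Fisher information about theta after eps-LDP randomized response.

    The mechanism keeps X with probability e^eps / (1 + e^eps) and flips
    it otherwise, so Y ~ Bern(q) with q = theta*p + (1-theta)*(1-p), and
    I_Y(theta) = (dq/dtheta)^2 / (q*(1-q)) = (2p-1)^2 / (q*(1-q)).
    """
    p = np.exp(eps) / (1.0 + np.exp(eps))      # keep probability
    q = theta * p + (1.0 - theta) * (1.0 - p)  # P(Y = 1)
    return (2.0 * p - 1.0) ** 2 / (q * (1.0 - q))
```

At $\theta = 1/2$ the privatized Fisher information behaves like $\varepsilon^2$ for small $\varepsilon$ (since $2p - 1 = \tanh(\varepsilon/2)$), while the raw sample carries constant information — the quadratic degradation that inequalities of this type quantify.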


rTop-k: A Statistical Estimation Approach to Distributed SGD

no code implementations · 21 May 2020 · Leighton Pate Barnes, Huseyin A. Inan, Berivan Isik, Ayfer Ozgur

The statistically optimal communication scheme arising from the analysis of this model leads to a new sparsification technique for SGD, which concatenates random-k and top-k, considered separately in the prior literature.
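A sparsifier combining top-k and random-k selection, as the abstract describes, can be sketched as follows. The ordering used here — keep the $r$ largest-magnitude coordinates, then transmit a uniformly random $k$-subset of them — is one plausible reading of the scheme; the paper's exact construction and parameter choices may differ.

```python
import numpy as np

def rtop_k(grad, r, k, rng):
    """Hypothetical rTop-k-style gradient sparsifier.

    Step 1 (top-r): keep the r coordinates of largest magnitude.
    Step 2 (random-k): send a uniformly random k-subset of those r.
    Returns a dense vector with exactly k nonzero entries.
    """
    assert k <= r <= grad.size
    top_r = np.argsort(np.abs(grad))[-r:]              # top-r by |value|
    chosen = rng.choice(top_r, size=k, replace=False)  # random-k among them
    sparse = np.zeros_like(grad)
    sparse[chosen] = grad[chosen]
    return sparse
```

In a distributed SGD loop each worker would apply this to its local gradient before communicating, sending only the $k$ surviving (index, value) pairs.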

Lower Bounds for Learning Distributions under Communication Constraints via Fisher Information

no code implementations · 7 Feb 2019 · Leighton Pate Barnes, Yanjun Han, Ayfer Ozgur

We consider the problem of learning high-dimensional, nonparametric, and structured (e.g., Gaussian) distributions in distributed networks, where each node in the network observes an independent sample from the underlying distribution and can use $k$ bits to communicate its sample to a central processor.
