1 code implementation • 16 Jul 2024 • Zhengxin Zhang, Ziv Goldfeld, Kristjan Greenewald, Youssef Mroueh, Bharath K. Sriperumbudur

Motivated by scenarios where the global structure of the data needs to be preserved, this work initiates the study of gradient flows and Riemannian structure in the Gromov-Wasserstein (GW) geometry, which is particularly suited for such purposes.

no code implementations • 10 Jun 2024 • Sloan Nietert, Ziv Goldfeld, Soroosh Shafiee

We characterize the optimal population-limit risk for this task and then develop an efficient finite-sample algorithm with error bounded by $\sqrt{\varepsilon k} + \rho + \tilde{O}(d\sqrt{k}n^{-1/(k \lor 2)})$ when $P$ has bounded covariance.

no code implementations • 4 Apr 2024 • Haiyun He, Christina Lee Yu, Ziv Goldfeld

This enables refining our generalization bounds to capture the contraction as a function of the network architecture parameters.

no code implementations • 3 Jul 2023 • Ziv Goldfeld, Dhrumil Patel, Sreejith Sreekumar, Mark M. Wilde

Entropy measures quantify the amount of information and correlation present in a quantum system.
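As background for the quantum entropy measures this entry concerns, a minimal sketch of the standard von Neumann entropy $S(\rho) = -\mathrm{Tr}[\rho \log_2 \rho]$ of a density matrix (the function name and tolerance threshold here are illustrative choices, not from the paper):

```python
import numpy as np

def von_neumann_entropy(rho: np.ndarray) -> float:
    """Von Neumann entropy S(rho) = -Tr[rho log2 rho], in bits."""
    eigvals = np.linalg.eigvalsh(rho)
    eigvals = eigvals[eigvals > 1e-12]  # drop zero eigenvalues: 0 log 0 = 0 by convention
    return float(-np.sum(eigvals * np.log2(eigvals)))

# Maximally mixed qubit: entropy is 1 bit.
rho_mixed = np.eye(2) / 2
# Pure state |0><0|: entropy is 0.
rho_pure = np.diag([1.0, 0.0])
```

For a diagonal density matrix this reduces to the Shannon entropy of the eigenvalue distribution, which is why the eigendecomposition suffices.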

no code implementations • 22 Jun 2023 • Theshani Nuradha, Ziv Goldfeld, Mark M. Wilde

We propose a versatile privacy framework for quantum systems, termed quantum pufferfish privacy (QPP).

1 code implementation • 2 Feb 2023 • Sloan Nietert, Rachel Cummings, Ziv Goldfeld

We study the problem of robust distribution estimation under the Wasserstein metric, a popular discrepancy measure between probability distributions rooted in optimal transport (OT) theory.

1 code implementation • 2 Jan 2023 • Dor Tsur, Ziv Aharoni, Ziv Goldfeld, Haim Permuter

Directed information (DI) is a fundamental measure for the study and analysis of sequential stochastic models.
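For reference, directed information in Massey's standard formulation (generic notation, not specific to this paper's setup) is the sum of conditional mutual information terms

```latex
\mathrm{DI}(X^n \to Y^n) \triangleq \sum_{i=1}^{n} I(X^i; Y_i \mid Y^{i-1}),
```

which, unlike mutual information, is asymmetric in $X^n$ and $Y^n$ and thereby captures the direction of causal influence in sequential models.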

1 code implementation • 17 Oct 2022 • Sloan Nietert, Ritwik Sadhu, Ziv Goldfeld, Kengo Kato

The goal of this work is to quantify this scalability from three key aspects: (i) empirical convergence rates; (ii) robustness to data contamination; and (iii) efficient computational methods.

no code implementations • 17 Jun 2022 • Ziv Goldfeld, Kristjan Greenewald, Theshani Nuradha, Galen Reeves

However, a quantitative characterization of how SMI itself and its estimation rates depend on the ambient dimension, which is crucial to understanding scalability, remains obscure.

no code implementations • 22 Nov 2021 • Zhengxin Zhang, Youssef Mroueh, Ziv Goldfeld, Bharath K. Sriperumbudur

Discrepancy measures between probability distributions are at the core of statistical inference and machine learning.

1 code implementation • 2 Nov 2021 • Sloan Nietert, Rachel Cummings, Ziv Goldfeld

The Wasserstein distance, rooted in optimal transport (OT) theory, is a popular discrepancy measure between probability distributions with various applications to statistics and machine learning.
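As a minimal illustration of the Wasserstein distance (a standard computation, not this paper's method), the one-dimensional $W_1$ distance between two empirical samples can be computed with SciPy; for two equal-variance Gaussians, $W_1$ equals the gap between their means:

```python
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=5000)  # samples from N(0, 1)
y = rng.normal(1.0, 1.0, size=5000)  # samples from N(1, 1)

# W_1 between N(0,1) and N(1,1) is exactly 1; the empirical value is close.
w1 = wasserstein_distance(x, y)
```

In one dimension, $W_1$ reduces to the $L^1$ distance between quantile functions, which is what makes this computation cheap.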

no code implementations • 7 Oct 2021 • Sreejith Sreekumar, Ziv Goldfeld

Statistical divergences (SDs), which quantify the dissimilarity between probability distributions, are a basic constituent of statistical inference and machine learning.

no code implementations • 28 Jul 2021 • Ritwik Sadhu, Ziv Goldfeld, Kengo Kato

This result is then used to derive new empirical convergence rates for classic $W_1$ in terms of the intrinsic dimension.
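A quick sketch of what "empirical convergence rate" means here, using a one-dimensional example (the helper name and sample sizes are illustrative, not from the paper): the $W_1$ distance between an empirical measure and the population shrinks as the sample size $n$ grows, roughly as $n^{-1/2}$ in one dimension.

```python
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(42)

def empirical_w1_error(n: int) -> float:
    """W_1 between an n-sample empirical measure and a large reference
    sample from the same N(0, 1) distribution."""
    sample = rng.normal(size=n)
    reference = rng.normal(size=100_000)
    return wasserstein_distance(sample, reference)

err_small = empirical_w1_error(10)
err_large = empirical_w1_error(10_000)
# err_large is much smaller than err_small: the empirical error decays with n.
```

In higher dimensions the rate degrades to $n^{-1/d}$, which is exactly the curse of dimensionality that intrinsic-dimension results such as this one aim to circumvent.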

no code implementations • 11 Mar 2021 • Sreejith Sreekumar, Zhengxin Zhang, Ziv Goldfeld

Statistical distances (SDs), which quantify the dissimilarity between probability distributions, are central to machine learning and statistics.

no code implementations • 11 Jan 2021 • Sloan Nietert, Ziv Goldfeld, Kengo Kato

Discrepancy measures between probability distributions, often termed statistical distances, are ubiquitous in probability theory, statistics and machine learning.

no code implementations • 30 Apr 2020 • Ziv Goldfeld, Yury Polyanskiy

The information bottleneck (IB) theory recently emerged as a bold information-theoretic paradigm for analyzing deep learning (DL) systems.

1 code implementation • 9 Mar 2020 • Ziv Aharoni, Dor Tsur, Ziv Goldfeld, Haim Henry Permuter

When no analytic solution exists or the channel model is unknown, there is no unified framework for computing or even approximating capacity.

no code implementations • ICLR 2019 • Ziv Goldfeld, Ewout van den Berg, Kristjan Greenewald, Brian Kingsbury, Igor Melnyk, Nam Nguyen, Yury Polyanskiy

We then develop a rigorous estimator for $I(X;T)$ in noisy DNNs and observe compression in various models.

no code implementations • 12 Oct 2018 • Ziv Goldfeld, Ewout van den Berg, Kristjan Greenewald, Igor Melnyk, Nam Nguyen, Brian Kingsbury, Yury Polyanskiy

We then develop a rigorous estimator for $I(X;T)$ in noisy DNNs and observe compression in various models.

no code implementations • 8 May 2018 • Ziv Goldfeld, Guy Bresler, Yury Polyanskiy

We first show that at zero temperature, order of $\sqrt{n}$ bits can be stored in the system indefinitely by coding over stable, striped configurations.

Information Theory • Statistical Mechanics
