Search Results for author: Mahdi Haghifam

Found 10 papers, 1 paper with code

Why Is Public Pretraining Necessary for Private Model Training?

no code implementations19 Feb 2023 Arun Ganesh, Mahdi Haghifam, Milad Nasr, Sewoong Oh, Thomas Steinke, Om Thakkar, Abhradeep Thakurta, Lun Wang

To explain this phenomenon, we hypothesize that the non-convex loss landscape of model training forces the optimization algorithm through two distinct phases.

Transfer Learning

Limitations of Information-Theoretic Generalization Bounds for Gradient Descent Methods in Stochastic Convex Optimization

no code implementations27 Dec 2022 Mahdi Haghifam, Borja Rodríguez-Gálvez, Ragnar Thobaben, Mikael Skoglund, Daniel M. Roy, Gintare Karolina Dziugaite

To date, no "information-theoretic" frameworks for reasoning about generalization error have been shown to establish minimax rates for gradient descent in the setting of stochastic convex optimization.

Generalization Bounds
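
For context on the snippet above: in stochastic convex optimization with convex, Lipschitz losses over a bounded domain, the minimax excess risk is a textbook fact (not taken from the paper itself):

\[
\mathbb{E}\big[L_\mu(\hat{w}_n)\big] - \min_{w \in \mathcal{W}} L_\mu(w) = \Theta\!\left(\frac{1}{\sqrt{n}}\right),
\]

where $L_\mu$ is the population risk and $\hat{w}_n$ is the learner's output on $n$ samples. Suitably tuned gradient descent attains this rate; the paper asks whether information-theoretic generalization bounds can certify it.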

Understanding Generalization via Leave-One-Out Conditional Mutual Information

no code implementations29 Jun 2022 Mahdi Haghifam, Shay Moran, Daniel M. Roy, Gintare Karolina Dziugaite

These leave-one-out variants of the conditional mutual information (CMI) of an algorithm (Steinke and Zakynthinou, 2020) are also seen to control the mean generalization error of learning algorithms with bounded loss functions.

Transductive Learning
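
The snippet above builds on the conditional mutual information framework of Steinke and Zakynthinou (2020). As a reference point, a standard statement of that framework (lightly paraphrased from the literature; the leave-one-out variants in this paper modify the supersample construction) is

\[
\mathrm{CMI}_\mu(\mathcal{A}) = I\big(W;\, S \,\big|\, \tilde{Z}\big), \qquad \big|\mathbb{E}[\mathrm{gen}(\mathcal{A})]\big| \le \sqrt{\frac{2\,\mathrm{CMI}_\mu(\mathcal{A})}{n}},
\]

where $\tilde{Z} \in \mathcal{Z}^{n \times 2}$ is a supersample of $2n$ i.i.d. draws from $\mu$, $S \in \{0,1\}^n$ selects one entry per row as training data, $W$ is the output of algorithm $\mathcal{A}$ on the selected half, and the loss is bounded in $[0,1]$.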

Towards a Unified Information-Theoretic Framework for Generalization

no code implementations NeurIPS 2021 Mahdi Haghifam, Gintare Karolina Dziugaite, Shay Moran, Daniel M. Roy

We further show that an inherent limitation of proper learning of VC classes rules out the existence of a proper learner with constant CMI, implying a negative resolution to an open problem of Steinke and Zakynthinou (2020).

Generalization Bounds

Information-Theoretic Generalization Bounds for Stochastic Gradient Descent

no code implementations1 Feb 2021 Gergely Neu, Gintare Karolina Dziugaite, Mahdi Haghifam, Daniel M. Roy

The key factors our bounds depend on are the variance of the gradients (with respect to the data distribution), the local smoothness of the objective function along the SGD path, and the sensitivity of the loss function to perturbations of the final output. (A toy illustration of these quantities follows below.)

Generalization Bounds, Stochastic Optimization
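
As a rough illustration of the three quantities named in the snippet above, here is a minimal, self-contained Python sketch that tracks gradient variance, a local-smoothness proxy, and an output-sensitivity proxy along a (full-batch) descent path on a synthetic least-squares problem. Everything here is hypothetical scaffolding for intuition, not the estimators or setting used in the paper.

import numpy as np

# Synthetic least-squares problem (illustrative only).
rng = np.random.default_rng(0)
n, d = 200, 5
X = rng.normal(size=(n, d))
y = X @ rng.normal(size=d) + 0.1 * rng.normal(size=n)

def per_example_grads(w):
    # Per-example gradient of the squared loss 0.5 * (x @ w - y_i)^2.
    residual = X @ w - y                  # shape (n,)
    return residual[:, None] * X          # shape (n, d)

def mean_loss(w):
    return 0.5 * np.mean((X @ w - y) ** 2)

w, lr = np.zeros(d), 0.01
for _ in range(100):
    g = per_example_grads(w)
    grad_variance = g.var(axis=0).sum()   # spread of gradients across the data
    w_next = w - lr * g.mean(axis=0)      # gradient step (stand-in for SGD)
    # Local-smoothness proxy: change in the mean gradient per unit movement.
    step = np.linalg.norm(w_next - w)
    grad_change = np.linalg.norm(per_example_grads(w_next).mean(axis=0) - g.mean(axis=0))
    local_smoothness = grad_change / max(step, 1e-12)
    w = w_next

# Sensitivity proxy: loss change under a small perturbation of the final output.
perturbation = 1e-3 * rng.normal(size=d)
sensitivity = abs(mean_loss(w + perturbation) - mean_loss(w))
print(grad_variance, local_smoothness, sensitivity)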

On the Information Complexity of Proper Learners for VC Classes in the Realizable Case

no code implementations5 Nov 2020 Mahdi Haghifam, Gintare Karolina Dziugaite, Shay Moran, Daniel M. Roy

We provide a negative resolution to a conjecture of Steinke and Zakynthinou (2020a) by showing that their bound on the conditional mutual information (CMI) of proper learners of Vapnik-Chervonenkis (VC) classes cannot be improved from $d \log n + 2$ to $O(d)$, where $n$ is the number of i.i.d. samples.
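
One way to unpack this, paraphrasing rather than quoting the papers involved: Steinke and Zakynthinou (2020a) bound the CMI of (certain) proper learners of a VC class of dimension $d$ by

\[
I\big(W;\, S \mid \tilde{Z}\big) \le d \log n + 2,
\]

and the negative resolution shows this cannot be brought down to $O(d)$, i.e., for some VC classes the CMI of every proper learner must grow with the sample size $n$.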

Sharpened Generalization Bounds based on Conditional Mutual Information and an Application to Noisy, Iterative Algorithms

no code implementations NeurIPS 2020 Mahdi Haghifam, Jeffrey Negrea, Ashish Khisti, Daniel M. Roy, Gintare Karolina Dziugaite

Finally, we apply these bounds to the study of the Langevin dynamics algorithm, showing that conditioning on the supersample allows us to exploit information in the optimization trajectory to obtain tighter bounds based on hypothesis tests. (A standard form of the Langevin update is shown below.)

Generalization Bounds
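
For reference, a standard discrete-time form of the Langevin dynamics update (a textbook formulation with step size $\eta$ and inverse temperature $\beta$; the paper's exact variant and constants may differ) is

\[
W_{t+1} = W_t - \eta\, \nabla \widehat{L}(W_t) + \sqrt{\frac{2\eta}{\beta}}\; \xi_t, \qquad \xi_t \sim \mathcal{N}(0, I_d),
\]

so the trajectory $(W_1, \ldots, W_T)$ carries information that, after conditioning on the supersample, can be leveraged as described in the snippet above.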

Sequential Classification with Empirically Observed Statistics

no code implementations3 Dec 2019 Mahdi Haghifam, Vincent Y. F. Tan, Ashish Khisti

Motivated by real-world machine learning applications, we consider a statistical classification task in which test samples arrive sequentially.

Classification, General Classification +1

Information-Theoretic Generalization Bounds for SGLD via Data-Dependent Estimates

1 code implementation NeurIPS 2019 Jeffrey Negrea, Mahdi Haghifam, Gintare Karolina Dziugaite, Ashish Khisti, Daniel M. Roy

In this work, we improve upon the stepwise analysis of noisy iterative learning algorithms initiated by Pensia, Jog, and Loh (2018) and recently extended by Bu, Zou, and Veeravalli (2019).

Generalization Bounds
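
Since the entry above concerns stochastic gradient Langevin dynamics (SGLD), here is a minimal, self-contained Python loop showing the kind of noisy iterative algorithm being analyzed: a minibatch gradient step plus Gaussian noise. The synthetic logistic-regression task, step size, and temperature are illustrative assumptions, not the paper's setup.

import numpy as np

# Synthetic logistic-regression data (illustrative only).
rng = np.random.default_rng(1)
n, d = 500, 10
X = rng.normal(size=(n, d))
y = (X @ rng.normal(size=d) > 0).astype(float)

def minibatch_grad(w, idx):
    # Gradient of the logistic loss on a minibatch.
    p = 1.0 / (1.0 + np.exp(-(X[idx] @ w)))
    return X[idx].T @ (p - y[idx]) / len(idx)

w = np.zeros(d)
eta, beta, batch = 0.05, 100.0, 32
for t in range(1000):
    idx = rng.choice(n, size=batch, replace=False)
    noise = np.sqrt(2.0 * eta / beta) * rng.normal(size=d)  # Langevin noise
    w = w - eta * minibatch_grad(w, idx) + noise            # SGLD update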
