Search Results for author: Puyu Wang

Found 8 papers, 1 paper with code

Generalization Guarantees of Gradient Descent for Multi-Layer Neural Networks

no code implementations • 26 May 2023 • Puyu Wang, Yunwen Lei, Di Wang, Yiming Ying, Ding-Xuan Zhou

This sheds light on sufficient or necessary conditions for under-parameterized and over-parameterized NNs trained by GD to attain the desired risk rate of $O(1/\sqrt{n})$.
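
To make the setting concrete, here is a minimal sketch of full-batch gradient descent on a two-layer ReLU network with a fixed output layer. The width, step size, and data are illustrative placeholders and do not reproduce the paper's under- or over-parameterization conditions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, m = 100, 5, 64                              # samples, input dim, hidden width (illustrative)
X = rng.normal(size=(n, d))
y = rng.normal(size=n)

W = rng.normal(size=(m, d)) / np.sqrt(d)          # trainable first layer
a = rng.choice([-1.0, 1.0], size=m) / np.sqrt(m)  # fixed second layer

eta = 0.01
for _ in range(500):
    H = np.maximum(X @ W.T, 0.0)                  # hidden ReLU activations, shape (n, m)
    resid = H @ a - y                             # residuals of the network output
    # gradient of the empirical risk (1/2n) * sum of squared residuals w.r.t. W
    G = ((resid[:, None] * (H > 0.0)) * a).T @ X / n
    W -= eta * G                                  # one full-batch GD step
```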

Stability and Generalization for Markov Chain Stochastic Gradient Methods

no code implementations • 16 Sep 2022 • Puyu Wang, Yunwen Lei, Yiming Ying, Ding-Xuan Zhou

To the best of our knowledge, this is the first generalization analysis of SGMs when the gradients are sampled from a Markov process.

Generalization Bounds, Learning Theory
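
The distinguishing feature here is that gradients are sampled along a Markov chain rather than i.i.d. Below is a minimal sketch of such a stochastic gradient method, with an illustrative least-squares objective and a lazy random walk on a ring of data indices standing in for the Markov process.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 10
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.1 * rng.normal(size=n)

w = np.zeros(d)
i = 0                                   # current state of the Markov chain
for t in range(1, 5001):
    # gradient of the squared loss at the sample selected by the chain
    g = (X[i] @ w - y[i]) * X[i]
    w -= (1.0 / np.sqrt(t)) * g         # decaying step size
    # Markov transition: stay put, or move to a ring neighbour
    i = (i + rng.choice([-1, 0, 1])) % n
```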

Differentially Private Stochastic Gradient Descent with Low-Noise

no code implementations • 9 Sep 2022 • Puyu Wang, Yunwen Lei, Yiming Ying, Ding-Xuan Zhou

In this paper, we focus on the privacy and utility (measured by excess risk bounds) of differentially private stochastic gradient descent (SGD) algorithms in the setting of stochastic convex optimization.

Privacy Preserving
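
A common route to differentially private SGD is gradient perturbation: clip each per-sample gradient and add Gaussian noise. The sketch below is a generic illustration, not the paper's algorithm; the clip norm C and noise multiplier sigma are placeholders, and calibrating sigma to a target (epsilon, delta) budget is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 500, 10
X = rng.normal(size=(n, d))
y = X @ rng.normal(size=d) + 0.1 * rng.normal(size=n)

w = np.zeros(d)
C, sigma, eta = 1.0, 2.0, 0.05          # clip norm, noise multiplier, step size (illustrative)
for _ in range(1000):
    i = rng.integers(n)
    g = (X[i] @ w - y[i]) * X[i]                     # per-sample gradient
    g *= min(1.0, C / (np.linalg.norm(g) + 1e-12))   # clip to norm at most C
    g += sigma * C * rng.normal(size=d)              # Gaussian gradient perturbation
    w -= eta * g
```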

Simple Stochastic and Online Gradient Descent Algorithms for Pairwise Learning

no code implementations • NeurIPS 2021 • Zhenhuan Yang, Yunwen Lei, Puyu Wang, Tianbao Yang, Yiming Ying

A popular approach to handling streaming data in pairwise learning is the online gradient descent (OGD) algorithm, in which the current instance must be paired with a sufficiently large buffer of previous instances, leading to a scalability issue.

Generalization Bounds, Metric Learning +1
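
To illustrate the buffering mechanism the abstract refers to, here is a minimal sketch of buffered OGD for pairwise learning. The pairwise squared-hinge loss and the FIFO buffer policy are illustrative choices, not necessarily those of the paper; note that the per-step cost grows with the buffer size B, which is the scalability issue in question.

```python
from collections import deque
import numpy as np

rng = np.random.default_rng(0)
d, T, B = 10, 1000, 50
w = np.zeros(d)
buffer = deque(maxlen=B)                # FIFO buffer of past (x, y) examples

for t in range(1, T + 1):
    x, y = rng.normal(size=d), rng.choice([-1.0, 1.0])   # newly arriving example
    g = np.zeros(d)
    for xp, yp in buffer:               # pair the new example with every buffered one
        if y != yp:                     # only cross-class pairs contribute
            margin = (y - yp) / 2 * (w @ (x - xp))
            if margin < 1.0:            # squared-hinge pairwise loss is active
                g += -2 * (1 - margin) * (y - yp) / 2 * (x - xp)
    if buffer:
        w -= (1.0 / np.sqrt(t)) * g / len(buffer)        # averaged OGD step
    buffer.append((x, y))
```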

Simple Stochastic and Online Gradient Descent Algorithms for Pairwise Learning

1 code implementation • 23 Nov 2021 • Zhenhuan Yang, Yunwen Lei, Puyu Wang, Tianbao Yang, Yiming Ying

A popular approach to handling streaming data in pairwise learning is the online gradient descent (OGD) algorithm, in which the current instance must be paired with a sufficiently large buffer of previous instances, leading to a scalability issue.

Generalization Bounds, Metric Learning +1

Stability and Generalization for Randomized Coordinate Descent

no code implementations • 17 Aug 2021 • Puyu Wang, Liang Wu, Yunwen Lei

Randomized coordinate descent (RCD) is a popular optimization algorithm with wide applications in solving various machine learning problems, which has motivated extensive theoretical analysis of its convergence behavior.

Generalization Bounds
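
For reference, a minimal sketch of randomized coordinate descent on an illustrative least-squares objective, with uniform coordinate sampling and an exact one-dimensional update per step:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 20
A = rng.normal(size=(n, d))
b = rng.normal(size=n)

x = np.zeros(d)
r = A @ x - b                            # maintained residual A x - b
col_sq = (A ** 2).sum(axis=0)            # per-coordinate curvature ||A_j||^2
for _ in range(5000):
    j = rng.integers(d)                  # uniformly random coordinate
    step = -(A[:, j] @ r) / col_sq[j]    # exact 1-D minimizer along e_j
    x[j] += step
    r += step * A[:, j]                  # cheap O(n) residual update
```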

Differentially Private SGD with Non-Smooth Losses

no code implementations • 22 Jan 2021 • Puyu Wang, Yunwen Lei, Yiming Ying, Hai Zhang

We significantly relax these restrictive assumptions and establish privacy and generalization (utility) guarantees for private SGD algorithms using output and gradient perturbations associated with non-smooth convex losses.
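
Of the two mechanisms mentioned, output perturbation adds noise only once, to the released solution, rather than to every gradient. Below is a minimal sketch with a non-smooth hinge loss; the noise scale sigma is a placeholder, since calibrating it properly requires a sensitivity bound for the SGD output.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 500, 10
X = rng.normal(size=(n, d))
y = np.sign(X @ rng.normal(size=d))     # synthetic binary labels

w, w_sum = np.zeros(d), np.zeros(d)
T, eta, sigma = 2000, 0.05, 0.5         # iterations, step size, noise scale (illustrative)
for t in range(T):
    i = rng.integers(n)
    if y[i] * (X[i] @ w) < 1.0:         # hinge loss: non-smooth at the kink
        w -= eta * (-y[i] * X[i])       # subgradient step
    w_sum += w

w_avg = w_sum / T                       # averaged iterate
w_private = w_avg + sigma * rng.normal(size=d)   # perturb only the released output
```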

Differential Privacy for Sparse Classification Learning

no code implementations • 2 Aug 2019 • Puyu Wang, Hai Zhang

By the post-processing property of differential privacy, the proposed approach satisfies $\epsilon$-differential privacy even when the original problem is unstable.

Classification, General Classification +1
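
For reference, the post-processing property invoked in this abstract is the standard fact that privacy cannot be degraded by data-independent computation:

```latex
% Post-processing: if a mechanism $M$ is $\epsilon$-differentially private and
% $f$ is any (possibly randomized) map that does not access the data again,
% then $f \circ M$ is also $\epsilon$-differentially private:
\[
\Pr\left[f(M(D)) \in S\right] \;\le\; e^{\epsilon}\,\Pr\left[f(M(D')) \in S\right]
\quad \text{for all neighboring datasets } D, D' \text{ and all measurable } S.
\]
```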
