no code implementations • 6 Feb 2024 • Li Guo, Keith Ross, Zifan Zhao, George Andriopoulos, Shuyang Ling, Yufeng Xu, Zixuan Dong
We first show empirically that models trained with label smoothing converge faster to neural collapse solutions and attain a stronger level of neural collapse.
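Label smoothing, the training modification this entry studies, replaces each one-hot target with a mixture of the one-hot vector and the uniform distribution. A minimal sketch (the function name and `alpha` default are illustrative, not from the paper):

```python
import numpy as np

def smooth_labels(labels, num_classes, alpha=0.1):
    """Label smoothing: replace the one-hot target with
    (1 - alpha) * one_hot + alpha / num_classes, so the true
    class keeps most of the probability mass while every class
    receives a small uniform share."""
    one_hot = np.eye(num_classes)[labels]
    return (1.0 - alpha) * one_hot + alpha / num_classes

# two examples with true classes 0 and 2 out of 3 classes
targets = smooth_labels(np.array([0, 2]), num_classes=3, alpha=0.1)
# each row still sums to 1; the true class dominates its row
```

Training against these softened targets (instead of hard one-hot labels) is the intervention whose effect on neural collapse the paper measures.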
1 code implementation • 18 Sep 2023 • Wanli Hong, Shuyang Ling
In this paper, we study the extension of the neural collapse (NC) phenomenon to imbalanced data under the cross-entropy loss function in the context of the unconstrained feature model.
no code implementations • 7 Sep 2023 • Ziliang Samuel Zhong, Shuyang Ling
As it is generally NP-hard to find a global ranking that minimizes the mismatch (known as the Kemeny optimization), we focus on the Erdős-Rényi outliers (ERO) model for this ranking problem.
no code implementations • 29 Jun 2021 • Shuyang Ling
We study the semidefinite relaxation (SDR) and an iterative method named generalized power method (GPM) to find the least squares estimator, and investigate the performance under a signal-plus-noise model.
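For the simplest instance of group synchronization (the group Z_2 = {-1, +1}), a generalized power method alternates a power step with a projection back onto the group. This is a hedged sketch of that iteration under a signal-plus-noise model, not the paper's exact algorithm or parameter choices:

```python
import numpy as np

def gpm_z2(C, iters=50, seed=0):
    """Generalized power method sketch for Z_2 synchronization:
    given C = z z^T + noise with z in {-1, +1}^n, repeat
    x <- sign(C x), i.e. one power-iteration step followed by
    entrywise projection onto the group {-1, +1}."""
    rng = np.random.default_rng(seed)
    n = C.shape[0]
    x = np.where(rng.standard_normal(n) >= 0, 1.0, -1.0)
    for _ in range(iters):
        y = C @ x
        x = np.where(y >= 0, 1.0, -1.0)  # project back onto {-1, +1}^n
    return x

# signal-plus-noise model: C = z z^T + sigma * W, W symmetric Gaussian
n, sigma = 200, 0.5
rng = np.random.default_rng(1)
z = np.where(rng.standard_normal(n) >= 0, 1.0, -1.0)
W = rng.standard_normal((n, n))
W = (W + W.T) / np.sqrt(2.0)
C = np.outer(z, z) + sigma * W
x = gpm_z2(C)
# recovery is only ever up to a global sign flip of z
```

For orthogonal or rotation groups, the entrywise sign is replaced by a per-block projection onto the group (e.g. via a polar decomposition), but the power-then-project structure is the same.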
no code implementations • 12 Aug 2020 • Shuyang Ling
In particular, for orthogonal group synchronization, we obtain a near-optimal performance bound for the group recovery in presence of additive Gaussian noise.
no code implementations • 21 Apr 2020 • Shaofeng Deng, Shuyang Ling, Thomas Strohmer
We study the performance of classical two-step spectral clustering via the graph Laplacian to learn the stochastic block model.
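The two-step procedure for a two-community stochastic block model can be sketched as: (1) form the graph Laplacian and extract the eigenvector of its second-smallest eigenvalue, then (2) cluster the nodes from that embedding. The sketch below thresholds the eigenvector by sign, a common stand-in for the k-means step when there are only two communities; the parameter values are illustrative:

```python
import numpy as np

def spectral_clustering_two_blocks(A):
    """Two-step spectral clustering sketch for a 2-block SBM:
    (1) form the unnormalized graph Laplacian L = D - A and take
    the eigenvector of the second-smallest eigenvalue (the Fiedler
    vector); (2) assign communities by the sign of its entries."""
    D = np.diag(A.sum(axis=1))
    L = D - A
    eigvals, eigvecs = np.linalg.eigh(L)  # ascending eigenvalues
    fiedler = eigvecs[:, 1]
    return (fiedler >= 0).astype(int)

# sample a 2-block SBM: edge prob p within a block, q across blocks
rng = np.random.default_rng(0)
n, p, q = 100, 0.6, 0.05
truth = np.repeat([0, 1], n // 2)
probs = np.where(truth[:, None] == truth[None, :], p, q)
A = (rng.random((n, n)) < probs).astype(float)
A = np.triu(A, 1)
A = A + A.T  # symmetric adjacency with empty diagonal
labels = spectral_clustering_two_blocks(A)
# labels match the planted partition up to swapping the two names
```

Since the eigenvector's global sign is arbitrary, accuracy is measured up to a relabeling of the two communities.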
no code implementations • 29 Jun 2018 • Shuyang Ling, Thomas Strohmer
This paper is devoted to the theoretical foundations of spectral clustering and graph cuts.
1 code implementation • 15 Jun 2016 • XiaoDong Li, Shuyang Ling, Thomas Strohmer, Ke Wei
To the best of our knowledge, our algorithm is the first blind deconvolution algorithm that is numerically efficient, robust against noise, and comes with rigorous recovery guarantees under certain subspace conditions.
Information Theory