no code implementations • 12 Oct 2022 • Yi Sui, Junfeng Wen, Yenson Lau, Brendan Leigh Ross, Jesse C. Cresswell
In the traditional federated learning setting, a central server coordinates a network of clients to train one global model.
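The coordination pattern described here can be sketched as a standard FedAvg-style round (a hypothetical illustration, not code from the paper: the server broadcasts the global weights, clients update them locally, and the server averages the results weighted by client data size):

```python
import numpy as np

def fedavg_round(global_w, client_data, local_step):
    """One communication round: each client updates the broadcast weights
    locally, then the server averages the updates weighted by data size."""
    updates, sizes = [], []
    for data in client_data:
        w = local_step(global_w.copy(), data)   # client-side local training
        updates.append(w)
        sizes.append(len(data))
    weights = np.array(sizes, dtype=float)
    weights /= weights.sum()
    return sum(s * w for s, w in zip(weights, updates))

# Toy example: each client "trains" by fitting the mean of its own data.
rng = np.random.default_rng(1)
clients = [rng.normal(loc=2.0, size=50), rng.normal(loc=2.0, size=200)]
w = np.zeros(1)
for _ in range(3):  # a few rounds coordinated by the central server
    w = fedavg_round(w, clients, lambda w, d: np.array([d.mean()]))
print(w)  # converges to the pooled mean across all clients
```

Because the averaging weights are proportional to client dataset sizes, the aggregated model here equals the estimate one would obtain from the pooled data.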
1 code implementation • ICLR 2020 • Yenson Lau, Qing Qu, Han-Wen Kuo, Pengcheng Zhou, Yuqian Zhang, John Wright
Short-and-sparse deconvolution (SaSD) is the problem of extracting localized, recurring motifs in signals with spatial or temporal structure.
1 code implementation • 28 Aug 2019 • Yenson Lau, Qing Qu, Han-Wen Kuo, Pengcheng Zhou, Yuqian Zhang, John Wright
This paper is motivated by recent theoretical advances that characterize the optimization landscape of a particular nonconvex formulation of SaSD.
no code implementations • CVPR 2017 • Yuqian Zhang, Yenson Lau, Han-Wen Kuo, Sky Cheung, Abhay Pasupathy, John Wright
Blind deconvolution is the problem of recovering a convolutional kernel $\boldsymbol a_0$ and an activation signal $\boldsymbol x_0$ from their convolution $\boldsymbol y = \boldsymbol a_0 \circledast \boldsymbol x_0$.
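This observation model can be illustrated numerically (a minimal NumPy sketch under the cyclic-convolution convention; the kernel length and sparsity level below are arbitrary choices, not values from the paper). It also shows the scaled-shift symmetry: rolling $\boldsymbol a_0$ one way and $\boldsymbol x_0$ the other reproduces the same $\boldsymbol y$, so recovery is only possible up to this ambiguity:

```python
import numpy as np

def cconv(a, x):
    """Cyclic (circular) convolution via the FFT; inputs share length n."""
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(x)))

rng = np.random.default_rng(0)
n, k = 64, 5                      # signal length, kernel length (short: k << n)

a0 = np.zeros(n)
a0[:k] = rng.standard_normal(k)   # short kernel, zero-padded to length n
a0 /= np.linalg.norm(a0)          # fix the scale ambiguity between a0 and x0

x0 = np.zeros(n)
x0[rng.choice(n, size=4, replace=False)] = rng.standard_normal(4)  # sparse spikes

y = cconv(a0, x0)                 # observation y = a0 ⊛ x0

# A shifted pair (roll a0 forward, x0 backward) yields exactly the same y,
# so blind deconvolution can recover (a0, x0) only up to this symmetry.
s = 3
y_shift = cconv(np.roll(a0, s), np.roll(x0, -s))
print(np.allclose(y, y_shift))    # → True
```

The phase factors introduced by opposite circular shifts cancel in the Fourier domain, which is why the shifted pair is indistinguishable from the true one given only $\boldsymbol y$.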
no code implementations • 2 Jan 2019 • Han-Wen Kuo, Yenson Lau, Yuqian Zhang, John Wright
We study the $\textit{Short-and-Sparse (SaS) deconvolution}$ problem of recovering a short signal $\mathbf a_0$ and a sparse signal $\mathbf x_0$ from their convolution.