no code implementations • 4 Nov 2024 • Jiaxin Zhuang, Leon Yan, Zhenwei Zhang, Ruiqi Wang, Jiawei Zhang, Yuantao Gu
Time series anomaly detection (TSAD) is becoming increasingly vital due to the rapid growth of time series data across various sectors.
1 code implementation • 11 Jun 2024 • Jiawei Zhang, Jiaxin Zhuang, Cheng Jin, Gen Li, Yuantao Gu
The proposed algorithm, termed ProjDiff, effectively harnesses the prior information and the denoising capability of a pre-trained diffusion model within the optimization framework.
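The paper defines ProjDiff precisely; as a rough sketch of the general recipe it builds on (alternating a data-consistency step with a pretrained denoiser acting as the prior), where `denoise` and the linear forward operator `A` are placeholders, not the paper's actual interface:

```python
import numpy as np

def diffusion_prior_restore(y, A, denoise, n_steps=50, step=0.5, rng=None):
    """Toy sketch: alternate a data-consistency gradient step with a
    denoising step from a pretrained model, in the spirit of using a
    diffusion prior inside an optimization loop. `denoise(x, t)` is a
    placeholder for a pretrained denoiser; A is the forward operator."""
    rng = np.random.default_rng(rng)
    x = rng.standard_normal(A.shape[1])          # random initialization
    for t in range(n_steps, 0, -1):
        # gradient step toward data consistency: min ||A x - y||^2
        x = x - step * A.T @ (A @ x - y)
        # prior step: pull the iterate toward the learned data manifold
        x = denoise(x, t)
    return x
```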
no code implementations • 3 Jan 2024 • Jiawei Zhang, Yufan Chen, Cheng Jin, Lei Zhu, Yuantao Gu
Out-of-distribution (OOD) detection plays a crucial role in ensuring the security of neural networks.
1 code implementation • 30 Sep 2023 • Zhenwei Zhang, Ruiqi Wang, Ran Ding, Yuantao Gu
Traditional Time-series Anomaly Detection (TAD) methods often struggle with the composite nature of complex time-series data and a diverse array of anomalies.
no code implementations • 10 Sep 2023 • Xiaolu Wang, Cheng Jin, Hoi-To Wai, Yuantao Gu
This paper considers a type of incremental aggregated gradient (IAG) method for large-scale distributed optimization.
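For concreteness, a minimal single-machine sketch of the classic IAG iteration (the paper's distributed setting, with communication delays, is not modeled here):

```python
import numpy as np

def iag(grad_i, x0, n, step=0.01, n_iters=1000):
    """Minimal incremental aggregated gradient (IAG) sketch for
    min_x (1/n) * sum_i f_i(x), where grad_i(i, x) returns grad f_i(x).
    A table of the most recent gradient of every component is kept; each
    iteration refreshes one (possibly stale) entry and steps along the
    aggregate, so the per-iteration cost is that of a single component."""
    x = x0.copy()
    table = np.stack([grad_i(i, x0) for i in range(n)])
    agg = table.sum(axis=0)
    for k in range(n_iters):
        i = k % n                          # cyclic refresh, as in classic IAG
        g_new = grad_i(i, x)
        agg += g_new - table[i]            # O(d) aggregate update
        table[i] = g_new
        x = x - (step / n) * agg
    return x
```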
1 code implementation • 4 Jul 2023 • Zhenwei Zhang, Linghang Meng, Yuantao Gu
To bridge this gap, this paper introduces a novel series-aware framework, explicitly designed to emphasize the significance of such dependencies.
1 code implementation • 4 Jul 2023 • Zhenwei Zhang, Xin Wang, Jingyuan Xie, Heling Zhang, Yuantao Gu
Unlocking the potential of deep learning in Peak-Hour Series Forecasting (PHSF) remains a critical yet underexplored task in various domains.
no code implementations • 29 Sep 2021 • Gen Li, Ganghua Wang, Yuantao Gu, Jie Ding
In this paper, the territory of LASSO is extended to the neural network model, a popular and powerful class of nonlinear regression models.
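A generic sketch of the idea, attaching an $\ell_1$ (LASSO-style) penalty to the weights of a small network; this illustrates the penalty itself, not the paper's specific estimator or theory:

```python
import torch
import torch.nn as nn

# Toy illustration: l1 penalty on the weights of a small MLP, encouraging
# sparse connections, in analogy with LASSO for linear regression.
torch.manual_seed(0)
X, y = torch.randn(200, 10), torch.randn(200, 1)
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
lam = 1e-3                                    # l1 regularization strength

for _ in range(500):
    opt.zero_grad()
    mse = nn.functional.mse_loss(model(X), y)
    l1 = sum(p.abs().sum() for p in model.parameters())
    (mse + lam * l1).backward()
    opt.step()
```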
no code implementations • NeurIPS 2021 • Gen Li, Yuxin Chen, Yuejie Chi, Yuantao Gu, Yuting Wei
The current paper pertains to a scenario with value-based linear representation, which postulates the linear realizability of the optimal Q-function (also called the "linear $Q^{\star}$ problem").
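For readers unfamiliar with the term, the linear realizability assumption referenced here is standard and can be written as follows (notation mirrors the abstract):

```latex
% Linear realizability of the optimal Q-function ("linear Q* problem"):
% a known feature map \phi and an unknown parameter \theta^\star satisfy
Q^\star(s,a) \;=\; \phi(s,a)^\top \theta^\star
\qquad \text{for all } (s,a) \in \mathcal{S} \times \mathcal{A},
\quad \phi : \mathcal{S} \times \mathcal{A} \to \mathbb{R}^d .
```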
no code implementations • 1 Jan 2021 • Gen Li, Yuantao Gu, Jie Ding
A crucial problem in neural networks is to select the most appropriate number of hidden neurons and obtain tight statistical risk bounds.
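As a rough illustration of selecting the hidden-layer width by a complexity-penalized criterion; the penalty below is an arbitrary stand-in, not the risk bound developed in the paper:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

# Generic width-selection sketch: fit candidate widths, score each by
# validation error plus a penalty on the parameter count.
rng = np.random.default_rng(0)
X = rng.standard_normal((400, 5))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(400)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)

def penalized_score(width, lam=1e-3):
    net = MLPRegressor(hidden_layer_sizes=(width,), max_iter=2000,
                       random_state=0).fit(X_tr, y_tr)
    mse = np.mean((net.predict(X_val) - y_val) ** 2)
    n_params = width * (X.shape[1] + 2) + 1   # weights + biases
    return mse + lam * n_params               # fit + complexity penalty

best_width = min([2, 4, 8, 16, 32], key=penalized_score)
```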
no code implementations • NeurIPS 2020 • Gen Li, Yuting Wei, Yuejie Chi, Yuantao Gu, Yuxin Chen
Focusing on a $\gamma$-discounted MDP with state space $\mathcal{S}$ and action space $\mathcal{A}$, we demonstrate that the $\ell_{\infty}$-based sample complexity of classical asynchronous Q-learning, namely the number of samples needed to yield an entrywise $\varepsilon$-accurate estimate of the Q-function, is at most on the order of $\frac{1}{\mu_{\min}(1-\gamma)^5\varepsilon^2} + \frac{t_{\mathrm{mix}}}{\mu_{\min}(1-\gamma)}$ up to some logarithmic factor, provided that a proper constant learning rate is adopted.
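For reference, a minimal tabular sketch of the asynchronous setting: a single Markovian trajectory is followed and only the visited (state, action) entry is updated, with a constant learning rate as in the abstract (the uniform behavior policy is an illustrative choice):

```python
import numpy as np

def async_q_learning(P, R, gamma=0.9, lr=0.1, n_steps=100_000, rng=None):
    """Tabular asynchronous Q-learning along one Markovian trajectory.
    P[s, a] is the next-state distribution and R[s, a] the reward; only
    the (state, action) pair actually visited is updated each step."""
    rng = np.random.default_rng(rng)
    S, A = R.shape
    Q = np.zeros((S, A))
    s = rng.integers(S)
    for _ in range(n_steps):
        a = rng.integers(A)                      # behavior policy: uniform
        s_next = rng.choice(S, p=P[s, a])
        target = R[s, a] + gamma * Q[s_next].max()
        Q[s, a] += lr * (target - Q[s, a])       # asynchronous update
        s = s_next
    return Q
```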
no code implementations • 26 Dec 2019 • Qi Zhang, Jiang Zhu, Yuantao Gu, Zhiwei Xu
This paper studies direction-of-arrival (DOA) estimation in a heteroscedastic noise (HN) environment, where the noise variance varies across both snapshots and antennas.
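In symbols, a standard way to write this measurement model (the notation is ours, matching the description above):

```latex
% Array snapshot model with heteroscedastic noise:
\mathbf{y}_t \;=\; \mathbf{A}(\boldsymbol{\theta})\,\mathbf{x}_t + \mathbf{n}_t,
\qquad n_{t,m} \sim \mathcal{CN}\!\left(0,\, \sigma_{t,m}^2\right),
% where the noise variance \sigma_{t,m}^2 may change across both the
% snapshot index t and the antenna index m.
```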
no code implementations • 25 Jul 2019 • Gen Li, Yuantao Gu
The spectral method is a commonly used scheme for Subspace Clustering, i.e., clustering data points that lie close to a union of subspaces, and works by first constructing a Random Geometry Graph.
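A minimal version of the pipeline: build an affinity graph from (normalized) inner products and run spectral clustering on it. Practical SSC variants build the affinity from sparse self-representations instead; this toy uses absolute cosines for brevity:

```python
import numpy as np
from sklearn.cluster import SpectralClustering

# Two 2-dimensional subspaces of R^10, 50 points near each.
rng = np.random.default_rng(0)
basis1, basis2 = rng.standard_normal((10, 2)), rng.standard_normal((10, 2))
pts = np.vstack([rng.standard_normal((50, 2)) @ basis1.T,
                 rng.standard_normal((50, 2)) @ basis2.T])
pts /= np.linalg.norm(pts, axis=1, keepdims=True)     # normalize to sphere

affinity = np.abs(pts @ pts.T)                        # |cos angle| affinity
labels = SpectralClustering(n_clusters=2, affinity="precomputed",
                            random_state=0).fit_predict(affinity)
```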
no code implementations • 14 Jul 2019 • Yuchen Jiao, Gen Li, Yuantao Gu
In this paper, we prove that random projection with the so-called Johnson-Lindenstrauss (JL) property approximately preserves canonical angles between subspaces with overwhelming probability.
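The claim is easy to sanity-check numerically; the sketch below compares principal angles before and after a Gaussian JL projection (SciPy's `subspace_angles` computes the canonical angles between column spans):

```python
import numpy as np
from scipy.linalg import subspace_angles

rng = np.random.default_rng(0)
d, k, m = 1000, 5, 200
U = np.linalg.qr(rng.standard_normal((d, k)))[0]   # orthonormal basis 1
V = np.linalg.qr(rng.standard_normal((d, k)))[0]   # orthonormal basis 2
P = rng.standard_normal((m, d)) / np.sqrt(m)       # Gaussian JL projection

before = subspace_angles(U, V)
after = subspace_angles(np.linalg.qr(P @ U)[0], np.linalg.qr(P @ V)[0])
print(np.max(np.abs(before - after)))              # small when m >> k
```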
no code implementations • 23 May 2019 • Xingyu Xv, Gen Li, Yuantao Gu
The Subspace Restricted Isometry Property, a newly proposed concept, has proved to be a useful tool in analyzing the effect of dimensionality reduction algorithms on subspaces.
no code implementations • 30 Jan 2018 • Gen Li, Qinghua Liu, Yuantao Gu
As an analogue of the JL lemma and the RIP for sparse vectors, this work allows the use of random projections to reduce the ambient dimension, with the theoretical guarantee that the distance between subspaces is well preserved after compression.
no code implementations • 5 Dec 2017 • Gen Li, Yuchen Jiao, Yuantao Gu
In this work, we study for the first time, without the independence assumption, the convergence behavior of the randomized Kaczmarz method for phase retrieval.
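A minimal sketch of the randomized Kaczmarz iteration for real-valued phase retrieval, where each measurement is $y_r = |\langle a_r, x \rangle|$ and the sign ambiguity is resolved with the current iterate; row sampling is proportional to squared row norms, as is standard for randomized Kaczmarz:

```python
import numpy as np

def rk_phase_retrieval(A, y, n_iters=20_000, rng=None):
    """Randomized Kaczmarz for real phase retrieval. Each step picks a
    random row and projects the iterate onto the hyperplane consistent
    with that magnitude measurement, using the current iterate's sign."""
    rng = np.random.default_rng(rng)
    m, n = A.shape
    x = rng.standard_normal(n)
    probs = (A ** 2).sum(axis=1)
    probs = probs / probs.sum()                  # row-norm sampling
    for _ in range(n_iters):
        r = rng.choice(m, p=probs)
        inner = A[r] @ x
        x += (np.sign(inner) * y[r] - inner) / (A[r] @ A[r]) * A[r]
    return x
```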
no code implementations • 16 Aug 2017 • Yanxi Chen, Gen Li, Yuantao Gu
In this letter, we propose a novel Active OMP-SSC, which improves the clustering accuracy of OMP-SSC by adaptively updating and randomly dropping data points in the OMP process, while still enjoying the low computational complexity of greedy pursuit algorithms.
no code implementations • 7 Aug 2017 • Xinyue Shen, Yuantao Gu
In this work we propose to fit a sparse logistic regression model by a weakly convex regularized nonconvex optimization problem.
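A sketch of one such formulation, using the minimax concave penalty (MCP) as a representative weakly convex regularizer and proximal gradient as the solver; the paper's specific penalty and algorithm may differ:

```python
import numpy as np

def mcp_prox(z, lam, gamma, t):
    """Proximal operator of the (weakly convex) MCP penalty at step size t
    (requires gamma > t). MCP behaves like l1 near zero but levels off,
    reducing the shrinkage bias of the l1 penalty."""
    out = np.where(np.abs(z) <= t * lam, 0.0,
                   np.sign(z) * (np.abs(z) - t * lam) / (1.0 - t / gamma))
    return np.where(np.abs(z) > gamma * lam, z, out)

def sparse_logreg_mcp(X, y, lam=0.1, gamma=3.0, t=0.1, n_iters=500):
    """Proximal-gradient sketch for logistic loss + weakly convex penalty.
    Labels y are in {0, 1}; this is an illustrative solver, not the paper's."""
    w = np.zeros(X.shape[1])
    for _ in range(n_iters):
        p = 1.0 / (1.0 + np.exp(-X @ w))         # predicted probabilities
        grad = X.T @ (p - y) / len(y)            # logistic-loss gradient
        w = mcp_prox(w - t * grad, lam, gamma, t)
    return w
```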
no code implementations • 8 May 2017 • Xiudong Wang, Yuantao Gu
This paper addresses image classification by efficiently learning a compact and discriminative dictionary.
no code implementations • 7 Apr 2017 • Gen Li, Yuantao Gu
Dimensionality reduction plays an essential role in reducing the complexity of solving large-scale problems.
3 code implementations • 12 Sep 2016 • Xinyue Shen, Steven Diamond, Madeleine Udell, Yuantao Gu, Stephen Boyd
A multi-convex optimization problem is one in which the variables can be partitioned into sets over which the problem is convex when the other variables are fixed.
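The standard heuristic for such problems is alternate convex search: cycle through the variable blocks, solving the convex subproblem in each while holding the others fixed. A minimal sketch with CVXPY, using nonnegative matrix factorization as the biconvex example (the factorization setup is illustrative, not taken from the paper):

```python
import cvxpy as cp
import numpy as np

# min ||A - W H||_F^2 over W, H >= 0 is convex in W for fixed H and
# vice versa, so we alternate between the two convex subproblems.
rng = np.random.default_rng(0)
A = np.abs(rng.standard_normal((20, 15)))
W = cp.Variable((20, 4), nonneg=True)
H = cp.Variable((4, 15), nonneg=True)
H_val = np.abs(rng.standard_normal((4, 15)))     # initialization

for _ in range(20):
    # fix H, solve the convex problem in W
    cp.Problem(cp.Minimize(cp.sum_squares(A - W @ H_val))).solve()
    # fix W, solve the convex problem in H
    cp.Problem(cp.Minimize(cp.sum_squares(A - W.value @ H))).solve()
    H_val = H.value
```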
no code implementations • 12 Nov 2015 • Mengdi Wang, Yi-Chen Chen, Jialin Liu, Yuantao Gu
Consider convex optimization problems subject to a large number of constraints.
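When projecting onto the full intersection of constraints is expensive, one family of methods in this spirit samples a single constraint per iteration. A toy sketch, assuming halfspace constraints $a_i^\top x \le b_i$ (the diminishing step size is an illustrative choice):

```python
import numpy as np

def random_projection_sgd(grad, constraints, x0, n_iters=5000, rng=None):
    """Sketch of a random constraint-projection method: each iteration
    takes a gradient step on the objective, then projects onto ONE
    randomly sampled constraint rather than the full feasible set.
    Each constraint is a halfspace a^T x <= b, given as a pair (a, b)."""
    rng = np.random.default_rng(rng)
    x = x0.copy()
    for k in range(1, n_iters + 1):
        x = x - (1.0 / np.sqrt(k)) * grad(x)     # diminishing step size
        a, b = constraints[rng.integers(len(constraints))]
        viol = a @ x - b
        if viol > 0:                             # project onto the halfspace
            x = x - viol / (a @ a) * a
    return x
```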