Search Results for author: Chaobing Song

Found 13 papers, 2 papers with code

A Stochastic Halpern Iteration with Variance Reduction for Stochastic Monotone Inclusion Problems

no code implementations17 Mar 2022 Xufeng Cai, Chaobing Song, Cristóbal Guzmán, Jelena Diakonikolas

We further show how to couple one of the proposed variants of stochastic Halpern iteration with a scheduled restart scheme to solve stochastic monotone inclusion problems with ${\mathcal{O}}(\frac{\log(1/\epsilon)}{\epsilon^2})$ stochastic operator evaluations under additional sharpness or strong monotonicity assumptions.
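As background, the deterministic Halpern iteration that these stochastic variants build on anchors every step back to the initial point with weights $\lambda_k = 1/(k+2)$. A minimal sketch on a toy contractive operator (the operator and all constants here are illustrative, not from the paper):

```python
def halpern(T, x0, num_iters=2000):
    # Deterministic Halpern iteration: x_{k+1} = l_k*x0 + (1 - l_k)*T(x_k),
    # with the classical anchoring weights l_k = 1/(k+2).
    x = x0
    for k in range(num_iters):
        lam = 1.0 / (k + 2)
        x = lam * x0 + (1.0 - lam) * T(x)
    return x

# Toy contractive (hence nonexpansive) operator with fixed point x* = 2.
x_star = halpern(lambda x: 0.5 * x + 1.0, x0=10.0)
```

The anchoring term vanishes as $O(1/k)$, which is what makes restart schemes like the one in the paper natural to couple with it.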

A Fast Scale-Invariant Algorithm for Non-negative Least Squares with Non-negative Data

no code implementations8 Mar 2022 Jelena Diakonikolas, Chenghui Li, Swati Padmanabhan, Chaobing Song

In particular, while the oracle complexity of unconstrained least squares problems necessarily scales with one of the data matrix constants (typically the spectral norm) and these problems are solved to additive error, we show that nonnegative least squares problems with nonnegative data are solvable to multiplicative error and with complexity that is independent of any matrix constants.
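To make the setting concrete, here is a classical ISRA-style multiplicative update for least squares with entrywise nonnegative data — not the paper's algorithm, just a simple baseline whose iterates stay nonnegative automatically in this regime (all data below is synthetic):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.uniform(0.1, 1.0, size=(30, 5))   # entrywise nonnegative data matrix
x_true = rng.uniform(0.5, 1.5, size=5)
b = A @ x_true                            # consistent nonnegative targets

x = np.ones(5)                            # strictly positive initialization
for _ in range(5000):
    x *= (A.T @ b) / (A.T @ (A @ x))      # multiplicative update keeps x >= 0

residual = np.linalg.norm(A @ x - b) / np.linalg.norm(b)
```

Because every factor in the update is nonnegative, no projection step is ever needed — one reason the nonnegative-data regime behaves so differently from unconstrained least squares.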

Coordinate Linear Variance Reduction for Generalized Linear Programming

1 code implementation2 Nov 2021 Chaobing Song, Cheuk Yin Lin, Stephen J. Wright, Jelena Diakonikolas

CLVR yields improved complexity results for (GLP) that depend on the max row norm of the linear constraint matrix in (GLP) rather than the spectral norm.
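The gap being exploited here can be checked directly: for an $m \times n$ matrix, the max row norm never exceeds the spectral norm, while the spectral norm can be up to a $\sqrt{m}$ factor larger. A quick numerical illustration on a random matrix (not data from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((200, 50))        # toy constraint matrix

max_row_norm = np.linalg.norm(A, axis=1).max()
spectral_norm = np.linalg.norm(A, 2)      # largest singular value

# max_i ||a_i|| = max_i ||e_i^T A|| <= ||A||_2, and
# ||A||_2 <= ||A||_F <= sqrt(m) * max_i ||a_i||.
```

For this Gaussian example the spectral norm is several times the max row norm, which is the kind of dependence the complexity bound avoids.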

Cyclic Coordinate Dual Averaging with Extrapolation for Generalized Variational Inequalities

no code implementations26 Feb 2021 Chaobing Song, Jelena Diakonikolas

This class includes composite convex optimization problems and convex-concave min-max optimization problems as special cases and has not been addressed by existing work.

Variance Reduction via Primal-Dual Accelerated Dual Averaging for Nonsmooth Convex Finite-Sums

no code implementations26 Feb 2021 Chaobing Song, Stephen J. Wright, Jelena Diakonikolas

We study structured nonsmooth convex finite-sum optimization that appears widely in machine learning applications, including support vector machines and least absolute deviation.
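For reference, the least absolute deviation objective mentioned above is $\min_x \|Ax - b\|_1$; a plain subgradient baseline (not the paper's variance-reduced primal-dual method, and with arbitrary synthetic data) looks like:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((100, 3))
b = A @ np.array([1.0, -2.0, 0.5])        # noiseless LAD targets

# Plain subgradient descent on f(x) = ||Ax - b||_1 with 1/sqrt(t) step
# sizes, tracking the best objective value seen so far.
x = np.zeros(3)
best = float("inf")
for t in range(1, 5001):
    r = A @ x - b
    best = min(best, float(np.abs(r).sum()))
    x -= 0.05 / np.sqrt(t) * (A.T @ np.sign(r))   # sign(r) gives a subgradient
```

The slow $O(1/\sqrt{t})$ behavior of this baseline is exactly what structure-exploiting variance-reduced methods aim to beat on finite sums.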

Optimistic Dual Extrapolation for Coherent Non-monotone Variational Inequalities

no code implementations NeurIPS 2020 Chaobing Song, Zhengyuan Zhou, Yichao Zhou, Yong Jiang, Yi Ma

The optimization problems associated with training generative adversarial neural networks can be largely reduced to certain non-monotone variational inequality problems (VIPs), whereas existing convergence results are mostly based on monotone or strongly monotone assumptions.
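As a point of comparison, the optimistic (past-gradient) update is easiest to see on a monotone toy VIP — the bilinear saddle $\min_x \max_y xy$, where plain simultaneous gradient descent-ascent spirals outward but the optimistic step converges. All constants below are illustrative, and this is not the coherent non-monotone setting of the paper:

```python
import numpy as np

# Operator of the bilinear saddle problem min_x max_y x*y: F(x, y) = (y, -x).
F = lambda z: np.array([z[1], -z[0]])

eta = 0.1
z_prev = z = np.array([1.0, 1.0])
for _ in range(1000):
    # Optimistic step: extrapolate with the previous operator evaluation.
    z, z_prev = z - 2 * eta * F(z) + eta * F(z_prev), z
dist = np.linalg.norm(z)    # distance to the saddle point (0, 0)
```

Plain descent-ascent on this problem has iterate norm $\sqrt{1+\eta^2}\,\|z\|$ per step and diverges for any $\eta > 0$; the single extra extrapolation term is what restores convergence.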

Variance Reduction via Accelerated Dual Averaging for Finite-Sum Optimization

no code implementations NeurIPS 2020 Chaobing Song, Yong Jiang, Yi Ma

Meanwhile, VRADA matches the lower bound of the general convex setting up to a $\log\log n$ factor and matches the lower bounds in both regimes $n\le \Theta(\kappa)$ and $n\gg \kappa$ of the strongly convex setting, where $\kappa$ denotes the condition number.

Learning Diverse and Discriminative Representations via the Principle of Maximal Coding Rate Reduction

2 code implementations NeurIPS 2020 Yaodong Yu, Kwan Ho Ryan Chan, Chong You, Chaobing Song, Yi Ma

To learn intrinsic low-dimensional structures from high-dimensional data that most discriminate between classes, we propose the principle of Maximal Coding Rate Reduction ($\text{MCR}^2$), an information-theoretic measure that maximizes the coding rate difference between the whole dataset and the sum of each individual class.

Tasks: Contrastive Learning, Image Clustering
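The coding-rate quantity behind $\text{MCR}^2$ is $R(Z, \epsilon) = \frac{1}{2}\log\det\left(I + \frac{d}{n\epsilon^2} Z Z^\top\right)$ for a $d \times n$ feature matrix $Z$, and the objective maximizes the whole-set rate minus the class-weighted rates. A small sketch on two orthogonal toy classes ($\epsilon$ and the data are arbitrary choices for illustration):

```python
import numpy as np

def coding_rate(Z, eps=0.5):
    # R(Z) = 1/2 * logdet(I + d/(n*eps^2) * Z @ Z.T), for Z of shape (d, n).
    d, n = Z.shape
    return 0.5 * np.linalg.slogdet(np.eye(d) + d / (n * eps**2) * Z @ Z.T)[1]

rng = np.random.default_rng(3)
# Two toy "classes" lying along orthogonal directions in R^4.
Z1 = np.outer([1, 0, 0, 0], rng.standard_normal(20))
Z2 = np.outer([0, 1, 0, 0], rng.standard_normal(20))
Z = np.hstack([Z1, Z2])

# Rate reduction: whole-set rate minus the class-size-weighted rates.
delta_R = coding_rate(Z) - 0.5 * (coding_rate(Z1) + coding_rate(Z2))
```

`delta_R` is strictly positive here because the two classes span different subspaces — the diverse-and-discriminative configuration the principle rewards.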

Unified Acceleration of High-Order Algorithms under Hölder Continuity and Uniform Convexity

no code implementations3 Jun 2019 Chaobing Song, Yong Jiang, Yi Ma

In this general convex setting, we propose a concise unified acceleration framework (UAF), which reconciles the two different high-order acceleration approaches, one by Nesterov and Baes [29, 3, 33] and one by Monteiro and Svaiter [25].

Fully Implicit Online Learning

no code implementations25 Sep 2018 Chaobing Song, Ji Liu, Han Liu, Yong Jiang, Tong Zhang

Regularized online learning is widely used in machine learning applications.

Tasks: Online Learning
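A fully implicit (proximal) online step solves the regularized per-round subproblem exactly rather than linearizing the loss. For squared loss it has a closed form via the Sherman-Morrison identity — a sketch of that special case only, not the paper's general algorithm, on synthetic data:

```python
import numpy as np

def implicit_step(w, x, y, eta):
    # Fully implicit (proximal) step for the squared loss:
    #   w_new = argmin_v 0.5*(v @ x - y)**2 + ||v - w||**2 / (2*eta),
    # whose closed form follows from the Sherman-Morrison identity.
    return w - eta * ((w @ x - y) / (1.0 + eta * (x @ x))) * x

rng = np.random.default_rng(4)
w_true = np.array([2.0, -1.0])
w = np.zeros(2)
for _ in range(500):
    x = rng.standard_normal(2)
    w = implicit_step(w, x, w_true @ x, eta=10.0)  # stable even at large eta
err = np.linalg.norm(w - w_true)
```

The residual is divided by $1 + \eta\|x\|^2$, so the step contracts for any step size — explicit SGD with the same $\eta = 10$ would overshoot and diverge on this stream.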

Nonextensive information theoretical machine

no code implementations21 Apr 2016 Chaobing Song, Shu-Tao Xia

In this paper, we propose a new discriminative model named \emph{nonextensive information theoretical machine (NITM)} based on nonextensive generalization of Shannon information theory.

Bayesian linear regression with Student-t assumptions

no code implementations15 Apr 2016 Chaobing Song, Shu-Tao Xia

In this paper, we propose a Bayesian linear regression model with Student-t assumptions (BLRS), which can be inferred exactly.

Unifying Decision Trees Split Criteria Using Tsallis Entropy

no code implementations25 Nov 2015 Yisen Wang, Chaobing Song, Shu-Tao Xia

In this paper, a Tsallis Entropy Criterion (TEC) algorithm is proposed to unify Shannon entropy, Gain Ratio and Gini index, which generalizes the split criteria of decision trees.
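The unification rests on the Tsallis entropy $S_q(p) = \frac{1 - \sum_i p_i^q}{q - 1}$: as $q \to 1$ it recovers Shannon entropy, and at $q = 2$ it is exactly the Gini index. A quick numerical check (the distribution is arbitrary):

```python
import numpy as np

def tsallis(p, q):
    # Tsallis entropy S_q(p) = (1 - sum_i p_i^q) / (q - 1).
    p = np.asarray(p, dtype=float)
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

p = np.array([0.5, 0.3, 0.2])
shannon = -np.sum(p * np.log(p))      # Shannon entropy in nats
gini = 1.0 - np.sum(p ** 2)           # Gini impurity

near_shannon = tsallis(p, 1.000001)   # q -> 1 limit approaches Shannon
exact_gini = tsallis(p, 2.0)          # q = 2 equals the Gini index
```

Sweeping $q$ therefore interpolates between the standard split criteria, which is what lets a single criterion subsume them.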
