Search Results for author: Chaobing Song

Found 15 papers, 3 papers with code

Learning Diverse and Discriminative Representations via the Principle of Maximal Coding Rate Reduction

2 code implementations • NeurIPS 2020 • Yaodong Yu, Kwan Ho Ryan Chan, Chong You, Chaobing Song, Yi Ma

To learn intrinsic low-dimensional structures from high-dimensional data that most discriminate between classes, we propose the principle of Maximal Coding Rate Reduction ($\text{MCR}^2$), an information-theoretic measure that maximizes the coding rate difference between the whole dataset and the sum of each individual class.

Tasks: Clustering, Contrastive Learning, +1
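The MCR² objective described in the abstract can be sketched numerically: the coding rate of the full feature matrix minus the sample-weighted coding rates of each class. A minimal NumPy sketch of that quantity, assuming the paper's rate-distortion formula; the function names and the ε value here are illustrative choices, not the authors' released code:

```python
import numpy as np

def coding_rate(Z, eps=0.5):
    """Rate-distortion coding rate of features Z (d x n):
    (1/2) * logdet(I + d / (n * eps^2) * Z @ Z.T)."""
    d, n = Z.shape
    return 0.5 * np.linalg.slogdet(np.eye(d) + d / (n * eps**2) * Z @ Z.T)[1]

def mcr2(Z, labels, eps=0.5):
    """Coding rate reduction: rate of the whole dataset minus the
    sample-fraction-weighted rates of each class."""
    d, n = Z.shape
    total = coding_rate(Z, eps)
    per_class = 0.0
    for c in np.unique(labels):
        Zc = Z[:, labels == c]
        per_class += Zc.shape[1] / n * coding_rate(Zc, eps)
    return total - per_class
```

When the two classes occupy orthogonal directions the reduction is strictly positive, and it vanishes when all samples share one label, matching the intuition that the objective rewards between-class diversity.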

Unifying Decision Trees Split Criteria Using Tsallis Entropy

no code implementations • 25 Nov 2015 • Yisen Wang, Chaobing Song, Shu-Tao Xia

In this paper, a Tsallis Entropy Criterion (TEC) algorithm is proposed to unify Shannon entropy, Gain Ratio and Gini index, which generalizes the split criteria of decision trees.
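A quick illustration of the unification the abstract describes: Tsallis entropy S_q(p) = (1 − Σᵢ pᵢ^q)/(q − 1) recovers Shannon entropy in the limit q → 1 and the Gini index at q = 2. A hedged sketch of the entropy itself (not the paper's code, and the full TEC split criterion involves more than this one formula):

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis entropy S_q(p) = (1 - sum_i p_i^q) / (q - 1).
    q -> 1 recovers Shannon entropy; q = 2 gives the Gini index."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # drop zero-probability outcomes
    if abs(q - 1.0) < 1e-12:
        return -np.sum(p * np.log(p))  # Shannon limit
    return (1.0 - np.sum(p ** q)) / (q - 1.0)
```

For a fair coin, q = 2 gives the familiar Gini impurity 0.5, and values of q just above 1 approach ln 2, the Shannon entropy.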

Nonextensive information theoretical machine

no code implementations • 21 Apr 2016 • Chaobing Song, Shu-Tao Xia

In this paper, we propose a new discriminative model, the nonextensive information theoretical machine (NITM), based on a nonextensive generalization of Shannon information theory.

Bayesian linear regression with Student-t assumptions

no code implementations • 15 Apr 2016 • Chaobing Song, Shu-Tao Xia

In this paper, we propose a Bayesian linear regression model with Student-t assumptions (BLRS), which can be inferred exactly.

Tasks: regression

Fully Implicit Online Learning

no code implementations • 25 Sep 2018 • Chaobing Song, Ji Liu, Han Liu, Yong Jiang, Tong Zhang

Regularized online learning is widely used in machine learning applications.

Unified Acceleration of High-Order Algorithms under Hölder Continuity and Uniform Convexity

no code implementations • 3 Jun 2019 • Chaobing Song, Yong Jiang, Yi Ma

In this general convex setting, we propose a concise unified acceleration framework (UAF), which reconciles the two different high-order acceleration approaches, one by Nesterov and Baes [29, 3, 33] and one by Monteiro and Svaiter [25].

Variance Reduction via Accelerated Dual Averaging for Finite-Sum Optimization

no code implementations • NeurIPS 2020 • Chaobing Song, Yong Jiang, Yi Ma

Meanwhile, VRADA matches the lower bound of the general convex setting up to a $\log\log n$ factor and matches the lower bounds in both regimes $n\le \Theta(\kappa)$ and $n\gg \kappa$ of the strongly convex setting, where $\kappa$ denotes the condition number.

Optimistic Dual Extrapolation for Coherent Non-monotone Variational Inequalities

no code implementations • NeurIPS 2020 • Chaobing Song, Zhengyuan Zhou, Yichao Zhou, Yong Jiang, Yi Ma

The optimization problems associated with training generative adversarial neural networks can be largely reduced to certain non-monotone variational inequality problems (VIPs), whereas existing convergence results are mostly based on monotone or strongly monotone assumptions.

Variance Reduction via Primal-Dual Accelerated Dual Averaging for Nonsmooth Convex Finite-Sums

no code implementations • 26 Feb 2021 • Chaobing Song, Stephen J. Wright, Jelena Diakonikolas

We study structured nonsmooth convex finite-sum optimization that appears widely in machine learning applications, including support vector machines and least absolute deviation.

Cyclic Coordinate Dual Averaging with Extrapolation

no code implementations • 26 Feb 2021 • Chaobing Song, Jelena Diakonikolas

This class includes composite convex optimization problems and convex-concave min-max optimization problems as special cases and has not been addressed by existing work.

Coordinate Linear Variance Reduction for Generalized Linear Programming

1 code implementation • 2 Nov 2021 • Chaobing Song, Cheuk Yin Lin, Stephen J. Wright, Jelena Diakonikolas

CLVR yields improved complexity results for (GLP) that depend on the max row norm of the linear constraint matrix rather than its spectral norm.
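To see why this matters: the spectral norm of a matrix always upper-bounds its maximum row norm, and for large random or dense matrices it can be far larger, so complexity bounds stated in the max row norm are typically tighter. A small numerical check (the random matrix here is purely illustrative; the paper's bounds are analytic, not empirical):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((1000, 50))  # tall random constraint matrix

max_row_norm = np.max(np.linalg.norm(A, axis=1))  # max_i ||a_i||_2
spectral_norm = np.linalg.norm(A, 2)              # largest singular value

# ||A||_2 >= max_i ||a_i||_2 always holds (apply A^T to a standard
# basis vector); for random Gaussian matrices the gap grows with size.
```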

A Fast Scale-Invariant Algorithm for Non-negative Least Squares with Non-negative Data

no code implementations • 8 Mar 2022 • Jelena Diakonikolas, Chenghui Li, Swati Padmanabhan, Chaobing Song

In particular, while the oracle complexity of unconstrained least squares problems necessarily scales with one of the data matrix constants (typically the spectral norm) and these problems are solved to additive error, we show that nonnegative least squares problems with nonnegative data are solvable to multiplicative error and with complexity that is independent of any matrix constants.
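For intuition on why nonnegative data helps: when A and b are entrywise nonnegative, a classic multiplicative update (a Lee–Seung-style iteration, not the fast scale-invariant algorithm of this paper) keeps iterates nonnegative without any projection step and without a step size tied to a matrix norm. A hedged sketch:

```python
import numpy as np

def nnls_multiplicative(A, b, iters=500):
    """Multiplicative update for min ||Ax - b||^2 subject to x >= 0.
    Well defined when A and b are entrywise nonnegative: iterates
    stay nonnegative automatically, so no projection is needed."""
    x = np.ones(A.shape[1])
    Atb = A.T @ b
    AtA = A.T @ A
    for _ in range(iters):
        # small floor avoids division by zero for vanishing coordinates
        x *= Atb / np.maximum(AtA @ x, 1e-12)
    return x
```

On a tiny well-posed instance with a positive solution, the iteration recovers the exact least-squares answer.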

Stochastic Halpern Iteration with Variance Reduction for Stochastic Monotone Inclusions

1 code implementation • 17 Mar 2022 • Xufeng Cai, Chaobing Song, Cristóbal Guzmán, Jelena Diakonikolas

We study stochastic monotone inclusion problems, which widely appear in machine learning applications, including robust regression and adversarial learning.
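The deterministic template behind the title (not the paper's variance-reduced stochastic variant) is the Halpern iteration, which anchors each step toward the starting point with a vanishing weight. A minimal sketch, assuming a monotone inclusion 0 ∈ F(x) handled through the nonexpansive map T = I − ηF for a cocoercive F; the operator and step size below are illustrative:

```python
import numpy as np

def halpern(T, x0, iters=500):
    """Halpern iteration: x_{k+1} = beta_k * x0 + (1 - beta_k) * T(x_k)
    with beta_k = 1/(k+2); converges to a fixed point of nonexpansive T."""
    x = x0.copy()
    for k in range(iters):
        beta = 1.0 / (k + 2)
        x = beta * x0 + (1 - beta) * T(x)
    return x

# Toy monotone inclusion 0 = F(x) with F(x) = A @ x, A symmetric PSD.
A = np.array([[2.0, 0.5], [0.5, 1.0]])
eta = 0.4  # below 2 / lambda_max(A), so T = I - eta*F is nonexpansive
x_sol = halpern(lambda x: x - eta * (A @ x), np.array([1.0, 1.0]))
```

Here the unique zero of F is the origin, and the iterates approach it as the anchor weight decays.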

Cyclic Block Coordinate Descent With Variance Reduction for Composite Nonconvex Optimization

no code implementations • 9 Dec 2022 • Xufeng Cai, Chaobing Song, Stephen J. Wright, Jelena Diakonikolas

Our convergence analysis is based on a gradient Lipschitz condition with respect to a Mahalanobis norm, inspired by recent progress on cyclic block coordinate methods.

Accelerated Cyclic Coordinate Dual Averaging with Extrapolation for Composite Convex Optimization

no code implementations • 28 Mar 2023 • Cheuk Yin Lin, Chaobing Song, Jelena Diakonikolas

Exploiting partial first-order information in a cyclic way is arguably the most natural strategy to obtain scalable first-order methods.
