Search Results for author: Tianxiang Gao

Found 9 papers, 0 papers with code

On the optimization and generalization of overparameterized implicit neural networks

no code implementations30 Sep 2022 Tianxiang Gao, Hongyang Gao

We show that global convergence is guaranteed, even if only the implicit layer is trained.
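
Since the implicit layer here is defined by a fixed-point equation rather than a stack of distinct layers, a minimal numpy sketch may help fix ideas; the tanh update, weight scaling, and dimensions below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def implicit_layer(x, W, U, phi=np.tanh, tol=1e-6, max_iter=100):
    """Solve the fixed-point equation z = phi(W @ z + U @ x) by iteration.
    Convergence requires W to be suitably constrained (e.g., contractive)."""
    z = np.zeros(W.shape[0])
    for _ in range(max_iter):
        z_next = phi(W @ z + U @ x)
        if np.linalg.norm(z_next - z) < tol:
            return z_next
        z = z_next
    return z

rng = np.random.default_rng(0)
d, m = 8, 16
W = 0.5 * rng.standard_normal((m, m)) / np.sqrt(m)  # scaled toward contraction
U = rng.standard_normal((m, d))
z_star = implicit_layer(rng.standard_normal(d), W, U)
```

Training only the implicit layer then means updating just W and U, the weights that define the fixed-point map.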

Gradient Descent Optimizes Infinite-Depth ReLU Implicit Networks with Linear Widths

no code implementations16 May 2022 Tianxiang Gao, Hongyang Gao

Implicit deep learning has recently become popular in the machine learning community because implicit models can achieve performance competitive with state-of-the-art deep networks while using significantly less memory and computation.
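
The memory savings come from differentiating through the fixed point itself rather than storing layer activations. Below is a generic implicit-function-theorem sketch under an assumed tanh update; it illustrates the mechanism, not the paper's exact model.

```python
import numpy as np

def implicit_layer_grads(z_star, W, U, x, dL_dz):
    """Backprop through the fixed point z* = tanh(W @ z* + U @ x) using the
    implicit function theorem: solve (I - J)^T g = dL/dz*, where J is the
    Jacobian of the update at z*. No forward iterates are stored, so memory
    is constant in the effective depth."""
    pre = W @ z_star + U @ x
    D = np.diag(1.0 - np.tanh(pre) ** 2)   # tanh'(pre) on the diagonal
    J = D @ W                              # Jacobian of the update map
    g = np.linalg.solve((np.eye(z_star.size) - J).T, dL_dz)
    dL_dW = np.outer(D @ g, z_star)        # chain rule into W
    dL_dU = np.outer(D @ g, x)             # chain rule into U
    return dL_dW, dL_dU
```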

A global convergence theory for deep ReLU implicit networks via over-parameterization

no code implementations ICLR 2022 Tianxiang Gao, Hailiang Liu, Jia Liu, Hridesh Rajan, Hongyang Gao

Implicit deep learning has received increasing attention recently because it generalizes the recursive prediction rules of many commonly used neural network architectures.
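
One way to read "generalizes the recursive prediction rules": iterating the implicit update a finite number of times is exactly a weight-tied feedforward network, and the fixed point is its infinite-depth limit. A toy ReLU sketch (the contractive scaling of W is an assumption):

```python
import numpy as np

def weight_tied_relu_net(x, W, U, depth):
    """A depth-`depth` network that reuses the same weights at every layer.
    With W contractive, increasing depth approaches the implicit model's
    fixed point z = relu(W @ z + U @ x)."""
    z = np.zeros(W.shape[0])
    for _ in range(depth):
        z = np.maximum(W @ z + U @ x, 0.0)
    return z
```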

Alternate Model Growth and Pruning for Efficient Training of Recommendation Systems

no code implementations4 May 2021 Xiaocong Du, Bhargav Bhushanam, Jiecao Yu, Dhruv Choudhary, Tianxiang Gao, Sherman Wong, Louis Feng, Jongsoo Park, Yu Cao, Arun Kejariwal

Our method leverages structured sparsification to reduce computational cost without hurting model capacity at the end of offline training, so that a full-size model is available in the recurring training stage to learn new data in real time.

Recommendation Systems
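
As a rough illustration of structured sparsification that keeps the full-size tensor around for later regrowth, here is a hedged numpy sketch; the row-norm criterion and keep_ratio are assumptions, not the paper's exact procedure.

```python
import numpy as np

def structured_mask(weight, keep_ratio):
    """Zero out entire rows with the smallest L2 norms (structured sparsity).
    The full-size tensor is kept, so pruned rows can later be regrown by
    letting gradients repopulate them."""
    norms = np.linalg.norm(weight, axis=1)
    k = max(1, int(keep_ratio * weight.shape[0]))
    mask = np.zeros(weight.shape[0], dtype=bool)
    mask[np.argsort(norms)[-k:]] = True
    return weight * mask[:, None], mask

W = np.random.default_rng(0).standard_normal((8, 4))
W_pruned, mask = structured_mask(W, keep_ratio=0.5)  # half the rows zeroed
```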

Randomized Bregman Coordinate Descent Methods for Non-Lipschitz Optimization

no code implementations15 Jan 2020 Tianxiang Gao, Songtao Lu, Jia Liu, Chris Chu

Further, we show that the iteration complexity of the proposed method is $O(n\epsilon^{-2})$ to achieve an $\epsilon$-stationary point, where $n$ is the number of coordinate blocks.

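The $O(n\epsilon^{-2})$ rate concerns exactly this kind of randomized block update. A minimal sketch of the mechanics, using the entropy kernel as an assumed example of a Bregman reference function (not the paper's exact algorithm):

```python
import numpy as np

def rand_bregman_cd(grad, x0, eta=0.1, iters=500, n_blocks=4, seed=0):
    """Randomized block coordinate descent with the entropy kernel
    h(x) = sum_j x_j log x_j, whose Bregman step is multiplicative:
    x_b <- x_b * exp(-eta * grad_b(x)). One random block per iteration."""
    rng = np.random.default_rng(seed)
    x = x0.copy()
    blocks = np.array_split(np.arange(x.size), n_blocks)
    for _ in range(iters):
        b = blocks[rng.integers(n_blocks)]
        x[b] *= np.exp(-eta * grad(x)[b])
    return x

# toy problem: minimize 0.5 * ||x - c||^2 over the positive orthant
c = np.array([1.0, 2.0, 0.5, 3.0, 1.5, 0.2, 2.5, 0.8])
x = rand_bregman_cd(lambda x: x - c, np.ones_like(c))  # converges toward c
```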

Leveraging Two Reference Functions in Block Bregman Proximal Gradient Descent for Non-convex and Non-Lipschitz Problems

no code implementations16 Dec 2019 Tianxiang Gao, Songtao Lu, Jia Liu, Chris Chu

In signal processing and data analytics, there is a wide class of non-convex problems whose objective functions do not satisfy the common global Lipschitz continuous gradient assumption (e.g., the nonnegative matrix factorization (NMF) problem).
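
NMF is the canonical example of such a problem, and the classic Lee-Seung multiplicative updates show how it can be attacked without a global Lipschitz constant; the sketch below uses those standard updates as a stand-in, not the paper's two-reference-function Bregman steps.

```python
import numpy as np

def nmf_multiplicative(V, r, iters=200, eps=1e-10, seed=0):
    """Lee-Seung multiplicative updates for min ||V - W @ H||_F^2, W, H >= 0.
    Like Bregman proximal methods, these updates need no global Lipschitz
    constant (though they are not the paper's exact algorithm)."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W, H = rng.random((m, r)), rng.random((r, n))
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

V = np.random.default_rng(1).random((20, 30))
W, H = nmf_multiplicative(V, r=5)
```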

A Forest from the Trees: Generation through Neighborhoods

no code implementations4 Feb 2019 Yang Li, Tianxiang Gao, Junier B. Oliva

In this work, we propose to learn a generative model using both learned features (through a latent space) and memories (through neighbors).
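
A minimal sketch of the "memories through neighbors" ingredient: retrieve the stored latent codes nearest to a query, on which a generator could then be conditioned. The Euclidean retrieval rule is an illustrative assumption, not the paper's model.

```python
import numpy as np

def latent_neighborhood(z_query, Z_memory, k=5):
    """Return the k stored latent codes nearest to z_query (Euclidean)."""
    dists = np.linalg.norm(Z_memory - z_query, axis=1)
    return Z_memory[np.argsort(dists)[:k]]
```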

DID: Distributed Incremental Block Coordinate Descent for Nonnegative Matrix Factorization

no code implementations25 Feb 2018 Tianxiang Gao, Chris Chu

We propose a novel distributed algorithm, called distributed incremental block coordinate descent (DID), to solve the problem.

Dimensionality Reduction
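
To convey the distributed block structure, here is a hedged sketch in which columns of V are partitioned across workers; the multiplicative block updates are a stand-in for DID's exact rules, and the worker loop would run in parallel in a real deployment.

```python
import numpy as np

def distributed_bcd_nmf(V, r, workers=4, iters=100, eps=1e-10, seed=0):
    """Columns of V are partitioned across workers; each worker updates its
    block of H locally, and W is updated from aggregated statistics."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W, H = rng.random((m, r)), rng.random((r, n))
    parts = np.array_split(np.arange(n), workers)
    for _ in range(iters):
        for cols in parts:  # each block could run on a separate worker
            H[:, cols] *= (W.T @ V[:, cols]) / (W.T @ W @ H[:, cols] + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)  # aggregation step
    return W, H
```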

Degrees of Freedom in Deep Neural Networks

no code implementations30 Mar 2016 Tianxiang Gao, Vladimir Jojic

The degrees of freedom of deep networks are dramatically smaller than their number of parameters; on some real datasets, by several orders of magnitude.

General Classification, Multi-class Classification
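
The gap between parameter count and effective degrees of freedom is easiest to see in a linear smoother, where df = tr(H) for yhat = H @ y. This ridge-regression toy (not the paper's deep-network setting) makes the point numerically:

```python
import numpy as np

# Effective degrees of freedom of a linear smoother yhat = H @ y is tr(H).
# For ridge regression, H = X @ (X^T X + lam * I)^{-1} @ X^T.
rng = np.random.default_rng(0)
n, p, lam = 200, 100, 50.0
X = rng.standard_normal((n, p))
H = X @ np.linalg.solve(X.T @ X + lam * np.eye(p), X.T)
print(f"parameters: {p}, effective degrees of freedom: {np.trace(H):.1f}")
```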
