Search Results for author: Linjian Ma

Found 8 papers, 4 papers with code

Accelerating Alternating Least Squares for Tensor Decomposition by Pairwise Perturbation

2 code implementations • 26 Nov 2018 • Linjian Ma, Edgar Solomonik

The alternating least squares algorithm for CP and Tucker decomposition is dominated in cost by the tensor contractions necessary to set up the quadratic optimization subproblems.

Numerical Analysis

Inefficiency of K-FAC for Large Batch Size Training

no code implementations • 14 Mar 2019 • Linjian Ma, Gabe Montague, Jiayu Ye, Zhewei Yao, Amir Gholami, Kurt Keutzer, Michael W. Mahoney

In stochastic optimization, using large batch sizes during training can leverage parallel resources to produce faster wall-clock training times per training epoch.

Stochastic Optimization

AutoHOOT: Automatic High-Order Optimization for Tensors

1 code implementation • 10 May 2020 • Linjian Ma, Jiayu Ye, Edgar Solomonik

High-order optimization methods, including Newton's method and its variants as well as alternating minimization methods, dominate the optimization algorithms for tensor decompositions and tensor networks.

Mathematical Software • Numerical Analysis

Fast and Accurate Randomized Algorithms for Low-rank Tensor Decompositions

no code implementations • NeurIPS 2021 • Linjian Ma, Edgar Solomonik

Experimental results show that this new ALS algorithm, combined with a new initialization scheme based on randomized range finder, yields up to $22.0\%$ relative decomposition residual improvement compared to the state-of-the-art sketched randomized algorithm for Tucker decomposition of various synthetic and real datasets.
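The randomized range finder mentioned in the abstract is a standard building block: sketch the input with a Gaussian test matrix, then orthonormalize to get an approximate basis for its range. A minimal NumPy sketch of that generic primitive (Halko-Martinsson-Tropp style, not the paper's specific initialization; `randomized_range_finder` is an illustrative name):

```python
import numpy as np

def randomized_range_finder(A, k, oversample=10, seed=0):
    """Return an orthonormal basis Q whose span approximates range(A)."""
    rng = np.random.default_rng(seed)
    # Multiply by a Gaussian test matrix to sample the range, then QR.
    Omega = rng.standard_normal((A.shape[1], k + oversample))
    Q, _ = np.linalg.qr(A @ Omega)
    return Q
```

For a matrix that is exactly or nearly rank-k, `Q @ Q.T @ A` then recovers A up to a small residual, which is what makes this a natural initializer for low-rank tensor factor matrices.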

LEAP: Learnable Pruning for Transformer-based Models

1 code implementation • 30 May 2021 • Zhewei Yao, Xiaoxia Wu, Linjian Ma, Sheng Shen, Kurt Keutzer, Michael W. Mahoney, Yuxiong He

Moreover, in order to reduce hyperparameter tuning, a novel adaptive regularization coefficient is deployed to control the regularization penalty adaptively.

QQP

Cost-efficient Gaussian Tensor Network Embeddings for Tensor-structured Inputs

no code implementations • 26 May 2022 • Linjian Ma, Edgar Solomonik

We provide a systematic way to design tensor network embeddings consisting of Gaussian random tensors, such that for inputs with more general tensor network structures, both the sketch size (row size of $S$) and the sketching computational cost are low.

Dimensionality Reduction • Tensor Decomposition
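The sketching matrix $S$ in the abstract compresses a tall problem so that downstream work (e.g. a least-squares solve) runs on the small sketched problem. A minimal NumPy sketch with a dense Gaussian $S$ is below; this is the unstructured baseline, whereas the paper builds $S$ itself from a tensor network of Gaussian random tensors to reduce both the row size of $S$ and the cost of applying it (`sketched_lstsq` is an illustrative name):

```python
import numpy as np

def sketched_lstsq(A, b, m, seed=0):
    """Solve min_x ||S(Ax - b)|| with a dense m-row Gaussian sketch S."""
    rng = np.random.default_rng(seed)
    # Gaussian embedding, scaled so that E[||Sx||^2] = ||x||^2.
    S = rng.standard_normal((m, A.shape[0])) / np.sqrt(m)
    x, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)
    return x
```

Applying a dense Gaussian $S$ to a tensor-structured input costs as much as reading the full input, which is exactly the inefficiency the paper's structured tensor-network embeddings avoid.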

TongueSAM: An Universal Tongue Segmentation Model Based on SAM with Zero-Shot

1 code implementation • 12 Aug 2023 • Shan Cao, Qunsheng Ruan, Linjian Ma

To address this issue, this paper proposes a universal tongue segmentation model named TongueSAM based on SAM (Segment Anything Model).

Interactive Segmentation • object-detection • +4
