Search Results for author: Meixia Lin

Found 8 papers, 1 paper with code

DNNLasso: Scalable Graph Learning for Matrix-Variate Data

1 code implementation • 5 Mar 2024 • Meixia Lin, Yangjing Zhang

We consider the problem of jointly learning row-wise and column-wise dependencies of matrix-variate observations, which are modelled separately by two precision matrices.

Graph Learning
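As a rough orientation, one common way to pose the joint row-wise/column-wise estimation problem described above is a penalized Gaussian likelihood in which the two precision matrices enter through a Kronecker sum; the objective below is an illustrative sketch under that assumption, not a statement of the DNNLasso formulation.

\[
\min_{\Theta \succ 0,\ \Psi \succ 0}\ -\log\det(\Theta \oplus \Psi) + \operatorname{tr}\big(S\,(\Theta \oplus \Psi)\big) + \lambda_1 \|\Theta\|_{1,\mathrm{off}} + \lambda_2 \|\Psi\|_{1,\mathrm{off}}, \qquad \Theta \oplus \Psi := \Theta \otimes I_q + I_p \otimes \Psi,
\]

where $\Theta \in \mathbb{R}^{p \times p}$ and $\Psi \in \mathbb{R}^{q \times q}$ are the row-wise and column-wise precision matrices, $S$ is the sample covariance of the vectorized observations, and $\|\cdot\|_{1,\mathrm{off}}$ sums the absolute values of the off-diagonal entries.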

Determinantal point processes based on orthogonal polynomials for sampling minibatches in SGD

no code implementations • NeurIPS 2021 • Rémi Bardenet, Subhro Ghosh, Meixia Lin

In particular, we show how specific DPPs and a string of controlled approximations can lead to gradient estimators with a variance that decays faster with the batchsize than under uniform sampling.

Point Processes
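To make the variance claim concrete, a standard device for non-uniform minibatch sampling is Horvitz-Thompson reweighting by the first-order inclusion probabilities of the sampling process; the estimator below is a generic sketch of that idea, not the specific estimator constructed in the paper.

\[
\widehat{\nabla F}(\theta) \;=\; \frac{1}{N} \sum_{i \in B} \frac{\nabla \ell_i(\theta)}{\pi_i}, \qquad \pi_i := \Pr(i \in B),
\]

where $F(\theta) = \frac{1}{N}\sum_{i=1}^{N} \ell_i(\theta)$ and $B$ is a minibatch drawn from a DPP over the $N$ data points. The estimator is unbiased whenever every $\pi_i > 0$, and the negative correlations a DPP induces between inclusions can lower its variance relative to uniform sampling at the same batch size.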

Signal Analysis via the Stochastic Geometry of Spectrogram Level Sets

no code implementations • 6 May 2021 • Subhroshekhar Ghosh, Meixia Lin, Dongfang Sun

In this work, we investigate spectrogram analysis via an examination of the stochastic geometric properties of their level sets.
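For readers unfamiliar with the objects involved, the short Python sketch below computes a spectrogram and extracts one super-level set (the region where the log-spectrogram exceeds a threshold). The signal, threshold, and all parameter values are illustrative assumptions; this is not the paper's analysis pipeline.

    import numpy as np
    from scipy.signal import spectrogram

    rng = np.random.default_rng(0)
    fs = 1024.0                                  # sampling rate in Hz (assumed)
    t = np.arange(0.0, 2.0, 1.0 / fs)
    x = np.sin(2 * np.pi * 100 * t) + 0.5 * rng.standard_normal(t.size)  # tone plus white noise

    f, tt, Sxx = spectrogram(x, fs=fs, nperseg=256, noverlap=192)

    # A super-level set of the log-spectrogram at threshold c is the region of the
    # time-frequency plane where the log-spectrogram exceeds c; its geometry
    # (connected components, area, boundary) is the kind of object a
    # stochastic-geometric analysis studies.
    log_S = np.log(Sxx + 1e-12)
    c = np.quantile(log_S, 0.90)                 # illustrative threshold
    level_set = log_S >= c                       # boolean mask of the super-level set
    print("Fraction of the time-frequency plane above the threshold:", level_set.mean())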

Estimation of sparse Gaussian graphical models with hidden clustering structure

no code implementations • 17 Apr 2020 • Meixia Lin, Defeng Sun, Kim-Chuan Toh, Chengjing Wang

The sparsity and clustering structure of the concentration matrix is enforced to reduce model complexity and describe inherent regularities.

Clustering
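As a sketch of the kind of estimator involved, one way to encode both sparsity and a clustering structure in the concentration matrix is to add a pairwise comparison penalty on top of the usual graphical-lasso penalty; the formulation below is illustrative, and the exact penalty used in the paper may differ.

\[
\min_{\Theta \succ 0}\ -\log\det\Theta + \operatorname{tr}(S\Theta) + \rho_1 \sum_{i \neq j} |\Theta_{ij}| + \rho_2 \sum_{(i,j) \prec (k,l)} \big|\,|\Theta_{ij}| - |\Theta_{kl}|\,\big|,
\]

where $S$ is the sample covariance matrix and $\prec$ denotes any fixed ordering of the off-diagonal index pairs, so each pair of entries is compared once. The first penalty promotes sparsity; the second pulls off-diagonal entries with similar magnitudes toward common values, which is what produces the hidden clustering structure.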

Efficient algorithms for multivariate shape-constrained convex regression problems

no code implementations • 26 Feb 2020 • Meixia Lin, Defeng Sun, Kim-Chuan Toh

We prove that the least squares estimator is computable via solving a constrained convex quadratic programming (QP) problem with $(d+1)n$ variables and at least $n(n-1)$ linear inequality constraints, where $n$ is the number of data points.

Regression
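The variable and constraint counts quoted above come from the standard finite-dimensional characterization of the convex least squares estimator: each data point gets a fitted value and a subgradient, and convexity is enforced pairwise. A sketch of this QP, in the case without additional shape constraints, is:

\[
\min_{\theta \in \mathbb{R}^n,\ \xi_1,\dots,\xi_n \in \mathbb{R}^d}\ \sum_{i=1}^{n} (y_i - \theta_i)^2 \quad \text{subject to} \quad \theta_j \;\ge\; \theta_i + \xi_i^{\top}(x_j - x_i) \quad \text{for all } i \neq j,
\]

which has $n$ fitted values plus $n$ subgradients of dimension $d$, i.e. $(d+1)n$ variables, and $n(n-1)$ pairwise inequality constraints; extra shape constraints (e.g. monotonicity or Lipschitz bounds) only add to the constraint count.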

A dual Newton based preconditioned proximal point algorithm for exclusive lasso models

no code implementations • 1 Feb 2019 • Meixia Lin, Defeng Sun, Kim-Chuan Toh, Yancheng Yuan

In addition, we derive the corresponding HS-Jacobian of the proximal mapping and analyze its structure, which plays an essential role in the efficient computation of the PPA subproblem via a semismooth Newton method applied to its dual.
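For context, the exclusive lasso regularizer and the proximal mapping to which the HS-Jacobian above is attached take the following standard form (the group weights $w_g$ are an illustrative assumption):

\[
\Delta(x) \;=\; \sum_{g \in \mathcal{G}} w_g \, \|x_g\|_1^2, \qquad \operatorname{prox}_{\lambda\Delta}(v) \;=\; \operatorname*{arg\,min}_{x \in \mathbb{R}^p} \Big\{ \tfrac{1}{2}\|x - v\|^2 + \lambda \Delta(x) \Big\},
\]

where $\mathcal{G}$ is a partition of $\{1,\dots,p\}$ into groups and $x_g$ is the subvector of $x$ indexed by $g$. The regularizer promotes sparsity within each group rather than across groups, and a generalized Jacobian of this proximal mapping is what a semismooth Newton method differentiates through.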

Efficient sparse semismooth Newton methods for the clustered lasso problem

no code implementations • 22 Aug 2018 • Meixia Lin, Yong-Jin Liu, Defeng Sun, Kim-Chuan Toh

Based on the new formulation, we derive an efficient procedure for its computation.
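For reference, the clustered lasso regularizer combines an $\ell_1$ penalty with a penalty on all pairwise differences (the parameters $\rho_1, \rho_2$ below are illustrative), and it is typically the proximal mapping of this regularizer that such reformulations make efficient to compute:

\[
p(x) \;=\; \rho_1 \sum_{i=1}^{n} |x_i| \;+\; \rho_2 \sum_{1 \le i < j \le n} |x_i - x_j|, \qquad \rho_1, \rho_2 \ge 0 .
\]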
