Search Results for author: Lican Kang

Found 6 papers, 0 papers with code

Latent Schrödinger Bridge Diffusion Model for Generative Learning

no code implementations20 Apr 2024 Yuling Jiao, Lican Kang, Huazhen Lin, Jin Liu, Heng Zuo

Our theoretical analysis establishes an end-to-end error analysis for learning distributions via the latent Schrödinger bridge diffusion model.

An empirical study of the effect of background data size on the stability of SHapley Additive exPlanations (SHAP) for deep learning models

no code implementations24 Apr 2022 Han Yuan, Mingxuan Liu, Lican Kang, Chenkui Miao, Ying Wu

In our empirical study on the MIMIC-III dataset, we show that the two core explanations, SHAP values and variable rankings, fluctuate when different background datasets are drawn by random sampling, indicating that users cannot unquestioningly trust the one-shot interpretation from SHAP.
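The background dependence described above can be illustrated without the SHAP library: for a linear model, the exact SHAP value of feature $j$ is $w_j(x_j - \mathbb{E}[x_j])$, where the expectation is taken over the background dataset, so different random background samples shift the attributions. The model, data, and sample sizes below are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)
w = np.array([1.0, -2.0, 0.5])     # assumed linear model weights
x = np.array([0.8, 0.1, 1.2])      # instance to explain
data = rng.normal(size=(1000, 3))  # stand-in for the full dataset

def linear_shap(x, w, background):
    # Exact SHAP values for f(x) = w @ x relative to a background sample:
    # phi_j = w_j * (x_j - mean_j(background)).
    return w * (x - background.mean(axis=0))

# Two different 50-row backgrounds drawn at random, as in the paper's setting.
bg_a = data[rng.choice(len(data), 50, replace=False)]
bg_b = data[rng.choice(len(data), 50, replace=False)]

phi_a = linear_shap(x, w, bg_a)
phi_b = linear_shap(x, w, bg_b)
print(phi_a, phi_b)  # attributions differ across backgrounds
```

Even in this exactly solvable case, the attributions (and potentially the variable ranking) change with the background sample, which is the instability the paper measures empirically.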

A Data-Driven Line Search Rule for Support Recovery in High-dimensional Data Analysis

no code implementations21 Nov 2021 Peili Li, Yuling Jiao, Xiliang Lu, Lican Kang

In this work, we consider an algorithm for (nonlinear) regression problems with an $\ell_0$ penalty.

regression
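For context, a standard baseline for $\ell_0$-constrained least squares is iterative hard thresholding (IHT) with a fixed step size; the paper's contribution is a data-driven line search rule, which this sketch does not implement. All problem sizes and the step-size choice here are assumptions for illustration.

```python
import numpy as np

def iht(X, y, k, step=None, iters=200):
    # Iterative hard thresholding for min ||y - X beta||^2 s.t. ||beta||_0 <= k.
    n, p = X.shape
    if step is None:
        step = 1.0 / np.linalg.norm(X, 2) ** 2  # safe fixed step 1/L
    beta = np.zeros(p)
    for _ in range(iters):
        grad = X.T @ (X @ beta - y)
        z = beta - step * grad
        idx = np.argsort(np.abs(z))[-k:]        # keep k largest in magnitude
        beta = np.zeros(p)
        beta[idx] = z[idx]
    return beta

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 20))
true = np.zeros(20)
true[[2, 7, 11]] = [3.0, -2.0, 1.5]
y = X @ true + 0.01 * rng.normal(size=100)
beta = iht(X, y, k=3)
print(np.sort(np.nonzero(beta)[0]))  # recovered support
```

A data-driven line search would replace the fixed `step` with one chosen adaptively at each iteration, which is where the support-recovery guarantees of the paper come in.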

Convergence Analysis of Schrödinger-Föllmer Sampler without Convexity

no code implementations10 Jul 2021 Yuling Jiao, Lican Kang, Yanyan Liu, Youzhou Zhou

Schrödinger-Föllmer sampler (SFS) is a novel and efficient approach for sampling from possibly unnormalized distributions without ergodicity.
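A minimal sketch of the SFS idea, assuming a simple 1-D mixture target: simulate $dX_t = b(X_t, t)\,dt + dB_t$ on $[0,1]$ from $X_0 = 0$, where $b(x,t) = \nabla_x \log \mathbb{E}_Z[f(x + \sqrt{1-t}\,Z)]$ and $f = d\pi/dN(0,1)$, with the drift estimated by Monte Carlo via Stein's identity. The target, step count, and Monte Carlo sample sizes are illustrative choices, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(2)

def phi(x):  # standard normal density
    return np.exp(-0.5 * x**2) / np.sqrt(2 * np.pi)

def target(x):  # assumed target pi: 0.5 N(-2, 0.5^2) + 0.5 N(2, 0.5^2)
    return 0.5 * phi((x + 2) / 0.5) / 0.5 + 0.5 * phi((x - 2) / 0.5) / 0.5

def f(x):  # Radon-Nikodym derivative dpi/dN(0,1)
    return target(x) / phi(x)

def drift(x, t, m=256):
    s = np.sqrt(max(1.0 - t, 1e-4))  # floor avoids the t = 1 singularity
    z = rng.normal(size=(m, 1))
    fx = f(x[None, :] + s * z)       # shape (m, n_particles)
    # Stein's identity: grad_x E[f(x + sZ)] = E[Z f(x + sZ)] / s,
    # so b(x, t) = E[Z f] / (s E[f]).
    return (z * fx).mean(axis=0) / (s * fx.mean(axis=0))

n, steps = 2000, 100
h = 1.0 / steps
x = np.zeros(n)
for k in range(steps):  # Euler-Maruyama discretization on [0, 1]
    x = x + h * drift(x, k * h) + np.sqrt(h) * rng.normal(size=n)
print(x.mean(), x.std())  # samples should resemble the bimodal mixture
```

Note that the process starts from a point mass at 0 and reaches the target at $t = 1$, which is why no ergodicity (long-run mixing) argument is needed.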

On Newton Screening

no code implementations27 Jan 2020 Jian Huang, Yuling Jiao, Lican Kang, Jin Liu, Yanyan Liu, Xiliang Lu, Yuanyuan Yang

Based on this KKT system, a built-in working set of relatively small size is first determined using the sum of the primal and dual variables generated from the previous iteration. The primal variable is then updated by solving a least-squares problem on the working set, and the dual variable is updated via a closed-form expression.

Sparse Learning
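The working-set update described above can be sketched for a Lasso-type KKT system. The thresholding rule and variable names below are a plausible reconstruction from the abstract, not the paper's exact algorithm.

```python
import numpy as np

def newton_screening_step(X, y, beta, d, lam):
    # One working-set iteration: beta is the primal variable,
    # d = X^T (y - X beta) / n is the dual (residual correlation).
    n, p = X.shape
    # Working set from the sum of primal and dual variables.
    A = np.where(np.abs(beta + d) > lam)[0]
    beta_new = np.zeros(p)
    # Primal update: least squares restricted to the working set.
    beta_new[A], *_ = np.linalg.lstsq(X[:, A], y, rcond=None)
    # Dual update: closed-form residual correlation, zeroed on A.
    d_new = X.T @ (y - X @ beta_new) / n
    d_new[A] = 0.0
    return beta_new, d_new

rng = np.random.default_rng(3)
X = rng.normal(size=(80, 30))
true = np.zeros(30)
true[[1, 4]] = [2.0, -1.5]
y = X @ true                       # noiseless toy problem
beta, d = np.zeros(30), X.T @ y / 80
for _ in range(5):
    beta, d = newton_screening_step(X, y, beta, d, lam=0.5)
print(np.sort(np.nonzero(beta)[0]))
```

Because each iteration solves only a small least-squares problem on the working set, the per-iteration cost stays low even when $p$ is large, which is the screening benefit the paper exploits.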

A Support Detection and Root Finding Approach for Learning High-dimensional Generalized Linear Models

no code implementations16 Jan 2020 Jian Huang, Yuling Jiao, Lican Kang, Jin Liu, Yanyan Liu, Xiliang Lu

Feature selection is important for modeling high-dimensional data, where the number of variables can be much larger than the sample size.

feature selection
