Search Results for author: Lingzhou Xue

Found 20 papers, 0 papers with code

A Graphical Point Process Framework for Understanding Removal Effects in Multi-Touch Attribution

no code implementations · 13 Feb 2023 · Jun Tao, Qian Chen, James W. Snyder Jr., Arava Sai Kumar, Amirhossein Meisami, Lingzhou Xue

Marketers employ various online advertising channels to reach customers, and they are particularly interested in attribution for measuring the degree to which individual touchpoints contribute to an eventual conversion.


Theoretical Guarantees for Sparse Principal Component Analysis based on the Elastic Net

no code implementations · 29 Dec 2022 · Teng Zhang, Haoyi Yang, Lingzhou Xue

Sparse principal component analysis (SPCA) has been widely used for dimensionality reduction and feature extraction in high-dimensional data analysis.

Dimensionality Reduction
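As a rough illustration of the kind of estimator this line of work studies (not the paper's elastic-net formulation or its guarantees), the following pure-Python sketch runs power iteration with an l1 soft-thresholding step to extract a sparse leading eigenvector; the toy covariance matrix, the penalty `lam`, and all names are invented for the example:

```python
# Minimal sparse-PCA sketch: power iteration with a soft-thresholding
# (l1 shrinkage) step. Illustrative only; NOT the paper's estimator.
import math

def soft_threshold(x, lam):
    # Shrink each entry toward zero by lam, clipping at zero.
    return [math.copysign(max(abs(v) - lam, 0.0), v) for v in x]

def sparse_pc(cov, lam=0.1, iters=200):
    """Approximate leading sparse eigenvector of a covariance matrix."""
    n = len(cov)
    v = [1.0 / math.sqrt(n)] * n
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(n)) for i in range(n)]  # cov @ v
        w = soft_threshold(w, lam)                                       # sparsify
        norm = math.sqrt(sum(x * x for x in w)) or 1.0
        v = [x / norm for x in w]                                        # renormalize
    return v

# Toy covariance: signal concentrated on the first two coordinates.
cov = [[2.0, 0.9, 0.0, 0.0],
       [0.9, 2.0, 0.0, 0.0],
       [0.0, 0.0, 0.3, 0.0],
       [0.0, 0.0, 0.0, 0.3]]
v = sparse_pc(cov, lam=0.2)
print([round(x, 3) for x in v])  # last two loadings shrink to exactly 0
```

The soft-thresholding step is what distinguishes this from plain power iteration: small loadings are zeroed out rather than merely shrunk.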

Nonlinear Sufficient Dimension Reduction for Distribution-on-Distribution Regression

no code implementations · 11 Jul 2022 · Qi Zhang, Bing Li, Lingzhou Xue

We introduce a novel framework for nonlinear sufficient dimension reduction where both the predictor and the response are distributional data, which are modeled as members of a metric space.

Dimensionality Reduction · Regression

An additive graphical model for discrete data

no code implementations · 29 Dec 2021 · Jun Tao, Bing Li, Lingzhou Xue

We introduce a nonparametric graphical model for discrete node variables based on additive conditional independence.

Dimension Reduction for Fréchet Regression

no code implementations · 1 Oct 2021 · Qi Zhang, Lingzhou Xue, Bing Li

In this paper, we introduce a flexible sufficient dimension reduction (SDR) method for Fréchet regression to achieve two purposes: to mitigate the curse of dimensionality caused by high-dimensional predictors and to provide a visual inspection tool for Fréchet regression.

Data Visualization · Dimensionality Reduction · +1

Robust High-Dimensional Regression with Coefficient Thresholding and its Application to Imaging Data Analysis

no code implementations · 30 Sep 2021 · Bingyuan Liu, Qi Zhang, Lingzhou Xue, Peter X. K. Song, Jian Kang

It is important to develop statistical techniques that can analyze high-dimensional data in the presence of both complex dependence and possible outliers, as arises in real-world applications such as imaging data analysis.

Association · Regression

Improving Neural Network Robustness through Neighborhood Preserving Layers

no code implementations · 28 Jan 2021 · Bingyuan Liu, Christopher Malon, Lingzhou Xue, Erik Kruus

Finally, we empirically show that our designed network architecture is more robust against state-of-the-art gradient-descent-based attacks, such as the PGD attack, on the benchmark datasets MNIST and CIFAR10.

Adversarial Attack
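For readers unfamiliar with the attack mentioned above: PGD repeatedly steps in the sign of the loss gradient and projects back into an epsilon-ball around the clean input. A one-dimensional toy sketch follows; the loss, step sizes, and function names are invented for illustration (a real attack uses the network's gradients):

```python
# Projected gradient descent (PGD) attack sketch on a toy 1-D problem:
# ascend the loss via sign-of-gradient steps, then project back into
# the eps-ball around the clean input x0. Didactic example only.
def pgd_attack(x0, grad_fn, eps=0.3, alpha=0.05, steps=20):
    x = x0
    for _ in range(steps):
        g = grad_fn(x)
        x = x + alpha * (1 if g > 0 else -1)   # ascent step on the loss
        x = min(max(x, x0 - eps), x0 + eps)    # project into the eps-ball
    return x

# Toy loss L(x) = (x - 2)^2 with gradient 2*(x - 2); the attack maximizes L,
# so it pushes x away from 2 until it hits the ball boundary.
adv = pgd_attack(0.0, lambda x: 2 * (x - 2))
print(adv)  # → -0.3
```

The projection step is what makes the perturbation "bounded": however many steps are taken, the adversarial input stays within eps of the original.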

A Manifold Proximal Linear Method for Sparse Spectral Clustering with Application to Single-Cell RNA Sequencing Data Analysis

no code implementations · 18 Jul 2020 · Zhongruo Wang, Bingyuan Liu, Shixiang Chen, Shiqian Ma, Lingzhou Xue, Hongyu Zhao

This paper considers a widely adopted model for sparse spectral clustering (SSC), which can be formulated as an optimization problem over the Stiefel manifold with a nonsmooth, nonconvex objective.

Fisher's combined probability test for high-dimensional covariance matrices

no code implementations · 31 May 2020 · Xiufan Yu, Danning Li, Lingzhou Xue

Testing large covariance matrices is of fundamental importance in statistical analysis with high-dimensional data.
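For context, the classical combination rule behind the title works as follows. This sketch implements only the textbook Fisher rule for independent p-values, with invented example inputs; the paper's contribution concerns combining dependent covariance-matrix test statistics, which this does not capture:

```python
# Fisher's combined probability test: T = -2 * sum(log p_i) follows a
# chi-square distribution with 2m degrees of freedom when the m p-values
# are independent and uniform under the null.
import math

def fisher_combine(pvals):
    """Return the combined statistic, its degrees of freedom, and its p-value."""
    T = -2.0 * sum(math.log(p) for p in pvals)
    m = len(pvals)
    x = T / 2.0
    # Closed-form chi-square survival function for even df = 2m:
    # P(chi2_{2m} > T) = exp(-T/2) * sum_{k < m} (T/2)^k / k!
    pval = math.exp(-x) * sum(x ** k / math.factorial(k) for k in range(m))
    return T, 2 * m, pval

T, df, p = fisher_combine([0.01, 0.20, 0.45])
print(round(T, 3), df, round(p, 4))  # → 14.026 6 0.0293
```

Note how one small p-value dominates the sum of logs, which is why Fisher-type combinations are sensitive to a single strong signal among many weak ones.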

Riemannian Stochastic Proximal Gradient Methods for Nonsmooth Optimization over the Stiefel Manifold

no code implementations · 3 May 2020 · Bokun Wang, Shiqian Ma, Lingzhou Xue

However, most of the existing Riemannian stochastic algorithms require the objective function to be differentiable, and they do not apply to the case where the objective function is nonsmooth.

Low-Rank Matrix Completion · Riemannian Optimization
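For intuition about optimization over the Stiefel manifold St(n, p), the simplest case St(n, 1) is the unit sphere, where one Riemannian gradient step is: project the Euclidean gradient onto the tangent space, take a step, then retract back to the manifold by renormalization. A didactic sketch with invented names and a toy objective, not the paper's stochastic proximal algorithm:

```python
# One Riemannian gradient-descent step on the unit sphere (St(n, 1)):
# tangent projection, Euclidean step, then retraction by renormalization.
import math

def sphere_step(x, grad, lr=0.1):
    dot = sum(a * b for a, b in zip(x, grad))
    tangent = [g - dot * a for a, g in zip(x, grad)]  # remove normal component
    y = [a - lr * t for a, t in zip(x, tangent)]      # Euclidean descent step
    norm = math.sqrt(sum(v * v for v in y))
    return [v / norm for v in y]                      # retract onto the sphere

# Minimize f(x) = -x[0] over the sphere; gradient is [-1, 0] everywhere,
# and the minimizer is the first coordinate vector e1.
x = [0.6, 0.8]
for _ in range(100):
    x = sphere_step(x, [-1.0, 0.0])
print([round(v, 3) for v in x])  # converges to [1.0, 0.0]
```

The retraction is what the nonsmooth case complicates: with a nonsmooth term in the objective, the plain gradient step above must be replaced by a proximal-type step, which is the setting these Riemannian stochastic methods address.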

Understanding Attention Mechanisms

no code implementations · 25 Sep 2019 · Bingyuan Liu, Yogesh Balaji, Lingzhou Xue, Martin Renqiang Min

Attention mechanisms have advanced the state of the art in several machine learning tasks.
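For reference, the basic scaled dot-product attention computation is a softmax over query-key similarities followed by a weighted sum of values. A minimal pure-Python sketch with invented toy inputs (real implementations operate on batched tensors):

```python
# Scaled dot-product attention for a single query:
# weights = softmax(q . k_i / sqrt(d)), output = sum_i weights_i * v_i.
import math

def attention(query, keys, values):
    d = len(query)
    scores = [sum(qi * ki for qi, ki in zip(query, k)) / math.sqrt(d)
              for k in keys]
    mx = max(scores)                          # subtract max for stability
    exps = [math.exp(s - mx) for s in scores]
    z = sum(exps)
    weights = [e / z for e in exps]
    return [sum(w * v[j] for w, v in zip(weights, values))
            for j in range(len(values[0]))]

# The query matches the first key more closely, so the output leans
# toward the first value vector.
out = attention([1.0, 0.0], [[1.0, 0.0], [0.0, 1.0]], [[10.0], [0.0]])
print([round(v, 3) for v in out])
```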

An Alternating Manifold Proximal Gradient Method for Sparse PCA and Sparse CCA

no code implementations · 27 Mar 2019 · Shixiang Chen, Shiqian Ma, Lingzhou Xue, Hui Zou

Sparse principal component analysis (PCA) and sparse canonical correlation analysis (CCA) are two essential techniques from high-dimensional statistics and machine learning for analyzing large-scale data.

Model-Based Clustering of Nonparametric Weighted Networks with Application to Water Pollution Analysis

no code implementations · 21 Dec 2017 · Amal Agarwal, Lingzhou Xue

The power of our proposed methods is demonstrated in simulation studies and a real application to sulfate pollution network analysis in the Ohio watershed located in Pennsylvania, United States.

Model-Based Clustering of Time-Evolving Networks through Temporal Exponential-Family Random Graph Models

no code implementations · 20 Dec 2017 · Kevin H. Lee, Lingzhou Xue, David R. Hunter

To choose the number of communities, we use conditional likelihood to construct an effective model selection criterion.

Model Selection

Inverse Moment Methods for Sufficient Forecasting using High-Dimensional Predictors

no code implementations · 1 May 2017 · Wei Luo, Lingzhou Xue, Jiawei Yao, Xiufan Yu

Assuming that the predictors affect the response through the latent factors, we propose to first conduct factor analysis and then apply sufficient dimension reduction on the estimated factors, to derive the reduced data for subsequent forecasting.

Dimensionality Reduction · Model Selection · +1

Nonparametric mixture of Gaussian graphical models

no code implementations · 31 Dec 2015 · Kevin Lee, Lingzhou Xue

Graphical models have been widely used to investigate the complex dependence structure of high-dimensional data, and it is common to assume that observed data follow a homogeneous graphical model.

Joint limiting laws for high-dimensional independence tests

no code implementations · 30 Dec 2015 · Danning Li, Lingzhou Xue

Using extreme-value form statistics to test against sparse alternatives and using quadratic form statistics to test against dense alternatives are two important testing procedures for high-dimensional independence.
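To make the two families of statistics concrete, here is a minimal sketch that computes a max-type (extreme-value) and a sum-type (quadratic) statistic from the off-diagonal sample correlations; the helper name, toy correlation matrix, and scaling are invented for illustration and do not reproduce the paper's calibrated tests or limiting laws:

```python
# Two canonical test statistics for high-dimensional independence testing:
# the extreme-value (max) form, powerful against sparse alternatives, and
# the quadratic (sum) form, powerful against dense alternatives.
def max_and_quad_stats(corr, n):
    """corr: symmetric sample correlation matrix; n: sample size."""
    p = len(corr)
    offdiag = [corr[i][j] for i in range(p) for j in range(i + 1, p)]
    max_stat = n * max(r * r for r in offdiag)    # driven by the single largest correlation
    quad_stat = n * sum(r * r for r in offdiag)   # aggregates all correlations
    return max_stat, quad_stat

corr = [[1.0, 0.5, 0.0],
        [0.5, 1.0, 0.1],
        [0.0, 0.1, 1.0]]
m, q = max_and_quad_stats(corr, n=100)
print(round(m, 2), round(q, 2))  # → 25.0 26.0
```

The contrast is visible even in this toy example: the max statistic ignores the weak 0.1 correlation entirely, while the quadratic statistic accumulates it.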

Sufficient Forecasting Using Factor Models

no code implementations · 27 May 2015 · Jianqing Fan, Lingzhou Xue, Jiawei Yao

Our method and theory allow the number of predictors to be larger than the number of observations.

Dimensionality Reduction · Regression · +1

Strong oracle optimality of folded concave penalized estimation

no code implementations · 22 Oct 2012 · Jianqing Fan, Lingzhou Xue, Hui Zou

Folded concave penalization methods have been shown to enjoy the strong oracle property for high-dimensional sparse estimation.

