Search Results for author: Jun S. Liu

Found 14 papers, 3 papers with code

Multi-Response Heteroscedastic Gaussian Process Models and Their Inference

no code implementations • 29 Aug 2023 • TaeHee Lee, Jun S. Liu

By overcoming the limitations of traditional Gaussian process models, our proposed framework offers a robust and versatile tool for a wide array of applications.

regression Variational Inference

Neural Gaussian Mirror for Controlled Feature Selection in Neural Networks

no code implementations • 13 Oct 2020 • Xin Xing, Yu Gui, Chenguang Dai, Jun S. Liu

Deep neural networks (DNNs) have become increasingly popular and achieved outstanding performance in predictive tasks.

Feature Importance feature selection

Measurement error models: from nonparametric methods to deep neural networks

no code implementations • 15 Jul 2020 • Zhirui Hu, Zheng Tracy Ke, Jun S. Liu

The success of deep learning has inspired recent interest in applying neural networks to statistical inference.

Efficient Neural Network regression +1

Probabilistic Connection Importance Inference and Lossless Compression of Deep Neural Networks

no code implementations • ICLR 2020 • Xin Xing, Long Sha, Pengyu Hong, Zuofeng Shang, Jun S. Liu

Deep neural networks (DNNs) can be huge in size, requiring a considerable amount of energy and computational resources to operate, which limits their applications in numerous scenarios.

False Discovery Rate Control via Data Splitting

1 code implementation • 20 Feb 2020 • Chenguang Dai, Buyu Lin, Xin Xing, Jun S. Liu

Selecting relevant features associated with a given response variable is an important issue in many scientific fields.

Methodology
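The data-splitting idea in this paper can be sketched with a mirror-statistic recipe. The specific construction below (LassoCV on one half, an OLS refit on the other, a sign-and-magnitude mirror statistic, and a data-driven threshold) is an illustrative assumption in the spirit of the title, not necessarily the authors' exact procedure, and all numbers are made up:

```python
# Hypothetical sketch of FDR control via data splitting with mirror statistics.
import numpy as np
from sklearn.linear_model import LassoCV, LinearRegression

rng = np.random.default_rng(0)
n, p, s = 400, 50, 5
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:s] = 2.0                       # first s features are true signals
y = X @ beta + rng.standard_normal(n)

# 1) Split the data into two independent halves.
half = n // 2
X1, y1, X2, y2 = X[:half], y[:half], X[half:], y[half:]

# 2) Fit a sparse estimate on one half and an unpenalized refit on the other,
#    so the two coefficient vectors are independent given the design.
b1 = LassoCV(cv=5).fit(X1, y1).coef_
b2 = LinearRegression().fit(X2, y2).coef_

# 3) Mirror statistic: large and positive only when both halves agree
#    on the sign and magnitude of a coefficient; roughly symmetric
#    around zero for null features.
M = np.sign(b1 * b2) * (np.abs(b1) + np.abs(b2))

# 4) Smallest threshold whose estimated false discovery proportion is <= q.
q = 0.1
tau = np.inf
for t in np.sort(np.abs(M[M != 0])):
    fdp = (M <= -t).sum() / max((M >= t).sum(), 1)
    if fdp <= q:
        tau = t
        break

selected = np.flatnonzero(M >= tau)
print(sorted(selected))
```

The key design point is that symmetry of null mirror statistics lets the negative tail estimate the number of false positives in the positive tail, which is what makes the threshold rule an FDR estimate.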

Minimax Nonparametric Two-sample Test under Smoothing

no code implementations • 6 Nov 2019 • Xin Xing, Zuofeng Shang, Pang Du, Ping Ma, Wenxuan Zhong, Jun S. Liu

Under such a framework, the probability density comparison is equivalent to testing the presence/absence of interactions.

Two-sample testing

The Wang-Landau Algorithm as Stochastic Optimization and Its Acceleration

no code implementations • 27 Jul 2019 • Chenguang Dai, Jun S. Liu

The optimization formulation provides a new way to establish the convergence rate of the Wang-Landau algorithm, by exploiting the fact that, almost surely, the density estimates (on the logarithmic scale) remain in a compact set on which the objective function is strongly convex.

Stochastic Optimization
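A minimal Wang-Landau sketch on a toy bit-string state space makes the stochastic-optimization view concrete: each update adds a step size `f` to the running log density-of-states estimate, and `f` is annealed. The flat-histogram rule and schedule below are standard textbook choices, assumed for illustration rather than taken from this paper:

```python
# Toy Wang-Landau: estimate log-counts of energy levels E(x) = popcount(x)
# over x in {0,1}^m; the exact answer is log C(m, E).
import numpy as np
from math import comb, log

rng = np.random.default_rng(1)
m = 8
levels = m + 1
logg = np.zeros(levels)   # running log density-of-states estimate
hist = np.zeros(levels)   # visit histogram for the current stage
f = 1.0                   # log modification factor (the "step size")
x = rng.integers(0, 2, m)

while f > 1e-4:
    # Propose a single bit flip; Metropolis acceptance targets 1/g(E(x)),
    # which drives the chain toward a flat histogram over energy levels.
    i = rng.integers(m)
    xp = x.copy()
    xp[i] ^= 1
    e, ep = int(x.sum()), int(xp.sum())
    if log(rng.random()) < logg[e] - logg[ep]:
        x = xp
    e = int(x.sum())
    logg[e] += f          # stochastic-approximation update of log g
    hist[e] += 1
    # When the histogram is roughly flat, halve f and start a new stage.
    if hist.min() > 0.8 * hist.mean():
        f /= 2.0
        hist[:] = 0

# Compare with the exact log binomial coefficients (up to an additive
# constant, since log g is only identified up to a shift).
est = logg - logg[0]
exact = np.array([log(comb(m, k)) for k in range(levels)])
print(np.max(np.abs(est - exact)))
```

Annealing `f` toward zero is exactly where the stochastic-optimization reading bites: the update is a decreasing-step-size stochastic approximation whose fixed point is the true log density of states.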

Generative Parameter Sampler For Scalable Uncertainty Quantification

no code implementations • 28 May 2019 • Minsuk Shin, Young Lee, Jun S. Liu

Uncertainty quantification has long been at the core of statistical machine learning, but its computational bottleneck poses a serious challenge for both Bayesians and frequentists.

General Classification Uncertainty Quantification

Sentence Segmentation for Classical Chinese Based on LSTM with Radical Embedding

no code implementations • 5 Oct 2018 • Xu Han, Hongsu Wang, Sanqian Zhang, Qunchao Fu, Jun S. Liu

In this paper, we develop a lower-than-character-level feature embedding called radical embedding, and apply it to an LSTM model for sentence segmentation of pre-modern Chinese texts.

Segmentation Sentence +1

L1-Regularized Least Squares for Support Recovery of High Dimensional Single Index Models with Gaussian Designs

no code implementations • 25 Nov 2015 • Matey Neykov, Jun S. Liu, Tianxi Cai

In the present paper we analyze algorithms based on covariance screening and least squares with $L_1$ penalization (i.e., the LASSO) and demonstrate that they can also enjoy optimal (up to a scalar) rescaled sample size in terms of support recovery, albeit under slightly different assumptions on $f$ and $\varepsilon$ compared to the SIR-based algorithms.
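The headline claim, that an ordinary LASSO fit can recover the support of a single index model even though the link is unknown, is easy to illustrate. The design, link function (tanh), and tuning below are assumptions chosen for the sketch, not the paper's setup:

```python
# Hedged illustration: LASSO support recovery under a single index model
# y = f(<x, beta>) + eps with a Gaussian design and unknown monotone f.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(2)
n, p, s = 500, 200, 4
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:s] = 1.0
# Unknown link: by Stein's identity, Cov(X_j, f(X beta)) is proportional
# to beta_j under a Gaussian design, so L1-penalized least squares still
# targets a scalar multiple of beta.
y = np.tanh(X @ beta) + 0.1 * rng.standard_normal(n)

fit = Lasso(alpha=0.05).fit(X, y)
support = np.flatnonzero(fit.coef_)
print(sorted(support))
```

The estimated coefficients are biased toward zero by the link and the penalty, but for support recovery only the pattern of nonzeros matters, which is exactly the "up to a scalar" caveat in the abstract.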

Signed Support Recovery for Single Index Models in High-Dimensions

no code implementations • 7 Nov 2015 • Matey Neykov, Qian Lin, Jun S. Liu

When $s=O(p^{1-\delta})$ for some $\delta>0$, we demonstrate that both procedures can succeed in recovering the support of $\boldsymbol{\beta}$ as long as the rescaled sample size $\kappa=\frac{n}{s\log(p-s)}$ is larger than a certain critical threshold.

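The rescaled sample size in the abstract is a direct computation; the numbers below are illustrative only, and the critical threshold it must exceed is specific to the paper's analysis:

```python
# kappa = n / (s * log(p - s)) from the abstract, with made-up n, s, p.
import math

def rescaled_sample_size(n, s, p):
    """Rescaled sample size for support recovery with s signals among p."""
    return n / (s * math.log(p - s))

print(rescaled_sample_size(n=2000, s=10, p=1000))
```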

A Unified Theory of Confidence Regions and Testing for High Dimensional Estimating Equations

no code implementations • 30 Oct 2015 • Matey Neykov, Yang Ning, Jun S. Liu, Han Liu

Our main theoretical contribution is to establish a unified Z-estimation theory of confidence regions for high dimensional problems.


Interpretable Selection and Visualization of Features and Interactions Using Bayesian Forests

1 code implementation • 8 Jun 2015 • Viktoriya Krakovna, Jiong Du, Jun S. Liu

In many practical applications, it is of interest to know which features and feature interactions are relevant to the prediction task.

feature selection General Classification
