Search Results for author: Chengrui Li

Found 6 papers, 2 papers with code

Multi-Region Markovian Gaussian Process: An Efficient Method to Discover Directional Communications Across Multiple Brain Regions

no code implementations5 Feb 2024 Weihan Li, Chengrui Li, Yule Wang, Anqi Wu

Consequently, the model achieves a linear inference cost over time points and provides an interpretable low-dimensional representation, revealing communication directions across brain regions and separating oscillatory communications into different frequency bands.
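The linear inference cost over time points is characteristic of Markovian (state-space) reformulations of Gaussian processes, where Kalman filtering replaces the cubic-cost kernel-matrix solve. A minimal single-output sketch of that idea, not the paper's multi-region model: an Ornstein-Uhlenbeck (Matérn-1/2) GP filtered in O(T).

```python
import numpy as np

def ou_kalman_loglik(y, dt, ell=1.0, sigma2=1.0, noise=0.1):
    """O(T) marginal log-likelihood of noisy observations of an
    Ornstein-Uhlenbeck (Matern-1/2) GP via Kalman filtering.
    Toy illustration; not the paper's multi-region model."""
    m, P = 0.0, sigma2            # stationary prior mean and variance
    A = np.exp(-dt / ell)         # per-step state transition
    Q = sigma2 * (1 - A ** 2)     # process noise preserving stationarity
    ll = 0.0
    for t, yt in enumerate(y):
        if t > 0:                 # predict step
            m, P = A * m, A ** 2 * P + Q
        S = P + noise             # innovation variance
        ll += -0.5 * (np.log(2 * np.pi * S) + (yt - m) ** 2 / S)
        K = P / S                 # Kalman gain; update step
        m, P = m + K * (yt - m), (1 - K) * P
    return ll

rng = np.random.default_rng(0)
y = rng.normal(size=200)
print(ou_kalman_loglik(y, dt=0.1))
```

One filtering pass over the T time points replaces inverting the T-by-T kernel matrix, which is what makes the cost linear rather than cubic in T.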

A Differentiable Partially Observable Generalized Linear Model with Forward-Backward Message Passing

no code implementations2 Feb 2024 Chengrui Li, Weihan Li, Yule Wang, Anqi Wu

For (1), we propose a new differentiable POGLM that enables the pathwise gradient estimator, which performs better than the score function gradient estimator used in existing works.

Variational Inference
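The pathwise (reparameterization) estimator typically has much lower variance than the score function (REINFORCE) estimator. A toy comparison on the gradient of E[z²] with z ~ N(μ, 1) — an illustration of the two estimators in general, not of the paper's POGLM:

```python
import numpy as np

# Pathwise vs. score-function gradients of E_{z ~ N(mu,1)}[z^2]
# w.r.t. mu; the true gradient is 2*mu. Toy example, not the POGLM.
rng = np.random.default_rng(1)
mu, n = 1.5, 100_000
eps = rng.normal(size=n)
z = mu + eps                        # reparameterized samples

pathwise = 2 * z                    # d/dmu f(mu + eps) = 2*(mu + eps)
score = z ** 2 * (z - mu)           # f(z) * d/dmu log N(z; mu, 1)

print(pathwise.mean(), pathwise.var())  # both means approach 2*mu = 3
print(score.mean(), score.var())        # score variance is much larger
```

Both estimators are unbiased, but the score-function version pays a large variance penalty, which is the motivation for making the model differentiable so the pathwise estimator applies.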

Forward $χ^2$ Divergence Based Variational Importance Sampling

no code implementations4 Nov 2023 Chengrui Li, Yule Wang, Weihan Li, Anqi Wu

Maximizing the log-likelihood is a crucial aspect of learning latent variable models, and variational inference (VI) stands as the commonly adopted method.

Variational Inference
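Importance sampling gives a consistent estimate of the log-likelihood itself, log p(x) ≈ log mean_i[p(x, z_i)/q(z_i)] with z_i ~ q; the variance of the importance weights, which forward χ² minimization targets, controls how accurate this is. A minimal sketch on an assumed toy model (z ~ N(0,1), x|z ~ N(z, 0.5) — not the paper's model or proposal):

```python
import numpy as np

def log_normal(x, mean, var):
    return -0.5 * (np.log(2 * np.pi * var) + (x - mean) ** 2 / var)

# Importance-sampling estimate of log p(x) under an assumed toy model:
# z ~ N(0, 1), x | z ~ N(z, 0.5), proposal q(z) = N(m, s2).
rng = np.random.default_rng(2)
x, n = 0.8, 50_000
m, s2 = 0.5, 0.6
z = m + np.sqrt(s2) * rng.normal(size=n)

log_w = log_normal(z, 0.0, 1.0) + log_normal(x, z, 0.5) - log_normal(z, m, s2)
log_px = np.log(np.mean(np.exp(log_w - log_w.max()))) + log_w.max()

exact = log_normal(x, 0.0, 1.5)     # marginal of x is N(0, 1 + 0.5)
print(log_px, exact)
```

The log-sum-exp shift (`log_w - log_w.max()`) keeps the weight average numerically stable; a proposal closer to the posterior in forward χ² divergence shrinks the weight variance and tightens the estimate.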

One-hot Generalized Linear Model for Switching Brain State Discovery

no code implementations23 Oct 2023 Chengrui Li, Soon Ho Kim, Chris Rodgers, Hannah Choi, Anqi Wu

We introduce both a Gaussian prior and a one-hot prior over the GLM in each state.

Extraction and Recovery of Spatio-Temporal Structure in Latent Dynamics Alignment with Diffusion Models

1 code implementation9 Jun 2023 Yule Wang, Zijing Wu, Chengrui Li, Anqi Wu

Specifically, the latent dynamics structures of the source domain are first extracted by a diffusion model.

Inverse Kernel Decomposition

1 code implementation11 Nov 2022 Chengrui Li, Anqi Wu

To deal with very noisy data with weak correlations, we propose two solutions -- blockwise and geodesic -- that make use of locally correlated data points and provide better and numerically more stable latent estimates.

Dimensionality Reduction
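The core idea of recovering latents from a kernel matrix can be illustrated by inverting an RBF Gram matrix into squared distances and applying a classical MDS eigendecomposition. This is a generic sketch of kernel inversion plus eigendecomposition, not the paper's blockwise or geodesic algorithms:

```python
import numpy as np

# Sketch: recover 2-D latents from an RBF Gram matrix by inverting the
# kernel to squared distances, then classical MDS (eigendecomposition).
# Illustration only; not the paper's blockwise/geodesic solutions.
rng = np.random.default_rng(3)
X = rng.normal(size=(100, 2))                    # ground-truth latents
D2 = ((X[:, None] - X[None, :]) ** 2).sum(-1)
K = np.exp(-D2 / 2.0)                            # RBF kernel, lengthscale 1

D2_hat = -2.0 * np.log(np.clip(K, 1e-12, None))  # invert the kernel entrywise
J = np.eye(100) - np.ones((100, 100)) / 100      # centering matrix
B = -0.5 * J @ D2_hat @ J                        # Gram matrix of centered latents
vals, vecs = np.linalg.eigh(B)                   # eigenvalues in ascending order
Z = vecs[:, -2:] * np.sqrt(vals[-2:])            # top-2 MDS embedding

# Recovered pairwise distances match the originals (latents are
# identified only up to rotation and reflection).
D2_rec = ((Z[:, None] - Z[None, :]) ** 2).sum(-1)
print(np.abs(D2_rec - D2).max())
```

In this noiseless case the inversion is exact; with very noisy kernel estimates the entrywise log becomes unstable, which is the regime the blockwise and geodesic variants above are designed to handle.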
