Search Results for author: Ruilin Li

Found 13 papers, 2 papers with code

Brain-JEPA: Brain Dynamics Foundation Model with Gradient Positioning and Spatiotemporal Masking

no code implementations28 Sep 2024 Zijian Dong, Ruilin Li, Yilei Wu, Thuan Tinh Nguyen, Joanna Su Xian Chong, Fang Ji, Nathanael Ren Jie Tong, Christopher Li Hsian Chen, Juan Helen Zhou

Brain Gradient Positioning introduces a functional coordinate system for brain functional parcellation, enhancing the positional encoding of different Regions of Interest (ROIs).
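
As a rough, hypothetical sketch of what a gradient-based positional encoding could look like (the sinusoidal mapping, the number of gradients, and the function name below are illustrative assumptions, not Brain-JEPA's actual design): each ROI's coordinates along a few functional-connectivity gradients are mapped to an embedding that can be added to the ROI tokens.

```python
import numpy as np

# Hypothetical sketch: turn per-ROI functional-gradient coordinates into
# positional embeddings via sinusoidal features. This mapping is an
# illustrative assumption, not the paper's actual positional encoding.
def gradient_positional_encoding(gradients, dim):
    """gradients: (n_roi, k) coordinates of each ROI along k functional gradients.
    Returns an (n_roi, ~dim) positional-embedding matrix."""
    n_roi, k = gradients.shape
    n_freq = dim // (2 * k)                                  # frequencies per gradient axis
    freqs = np.exp(np.linspace(0.0, -4.0, n_freq))           # geometric frequency ladder
    angles = gradients[:, :, None] * freqs[None, None, :]    # (n_roi, k, n_freq)
    pe = np.concatenate([np.sin(angles), np.cos(angles)], axis=-1)
    return pe.reshape(n_roi, -1)

# e.g. 400 ROIs, 3 gradients, 192-dim embeddings:
# pe = gradient_positional_encoding(np.random.rand(400, 3), dim=192)
```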

Shadow-Enlightened Image Outpainting

no code implementations CVPR 2024 Hang Yu, Ruilin Li, Shaorong Xie, Jiayan Qiu

In this paper, we propose to extract and utilize the hidden information of unobserved areas from their shadows to enhance image outpainting.

Image Outpainting

A Decomposition-Based Hybrid Ensemble CNN Framework for Driver Fatigue Recognition

no code implementations14 Mar 2022 Ruilin Li, Ruobin Gao, P. N. Suganthan

Specifically, the performance of different decomposition methods and ensemble modes was further compared.

EEG, EEG Decoding +1
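
As a minimal structural sketch of a decomposition-then-ensemble pipeline (the moving-average two-band split and probability averaging below are illustrative stand-ins; the paper compares several decomposition methods and ensemble modes, none of which is reproduced exactly here):

```python
import numpy as np

def decompose(signal, window=16):
    """Stand-in decomposition: split a 1-D EEG signal into a smooth trend
    (moving-average low-pass) and the residual detail component."""
    kernel = np.ones(window) / window
    trend = np.convolve(signal, kernel, mode="same")
    return [trend, signal - trend]

def ensemble_predict(models, signal):
    """Feed each component to its own model and average the class probabilities."""
    components = decompose(signal)
    probs = [model(component) for model, component in zip(models, components)]
    return np.mean(probs, axis=0)
```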

Towards Best Practice of Interpreting Deep Learning Models for EEG-based Brain Computer Interfaces

1 code implementation12 Feb 2022 Jian Cui, Liqiang Yuan, Zhaoxiang Wang, Ruilin Li, Tianzi Jiang

In addition, we also find that the quality of the interpretation results is inconsistent across individual samples, even when a method with good overall performance is used.

EEG
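
One representative interpretation technique in this space is a gradient-based saliency map; the sketch below (gradient x input, with an assumed classifier `model` returning class logits of shape (1, n_classes)) is only an illustration of that family, not the paper's specific benchmarking protocol.

```python
import torch

def saliency_map(model, eeg, target_class):
    """Gradient x input attribution for an EEG classifier.
    eeg: tensor of shape (1, channels, time); returns a (channels, time) map."""
    eeg = eeg.detach().clone().requires_grad_(True)
    score = model(eeg)[0, target_class]   # logit of the class being explained
    score.backward()
    return (eeg.grad * eeg).detach().squeeze(0).abs()
```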

The Mirror Langevin Algorithm Converges with Vanishing Bias

no code implementations24 Sep 2021 Ruilin Li, Molei Tao, Santosh S. Vempala, Andre Wibisono

The Mirror Langevin Diffusion (MLD) is a sampling analogue of mirror flow in continuous time, and it has nice convergence properties under log-Sobolev or Poincaré inequalities relative to the Hessian metric, as shown by Chewi et al. (2020).
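
For concreteness, a minimal sketch of an Euler-Maruyama-style discretization of MLD is given below, using one specific mirror map (phi(x) = sum_i x_i log x_i - x_i on the positive orthant) chosen purely for illustration; the mirror map, step size, and target are assumptions, not the paper's setting.

```python
import numpy as np

def mirror_langevin(grad_f, x0, step, n_iter, rng=None):
    """Sketch of Mirror Langevin steps with mirror map phi(x) = sum_i x_i log x_i - x_i:
    grad_phi(x) = log x, hess_phi(x) = diag(1/x), (grad_phi)^{-1}(y) = exp(y).
    Update: y = log(x) - h*grad_f(x) + sqrt(2h)*hess_phi(x)^{1/2} xi;  x = exp(y).
    Requires x0 > 0 componentwise."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    samples = []
    for _ in range(n_iter):
        noise = rng.standard_normal(x.shape) / np.sqrt(x)   # diag(1/sqrt(x)) applied to xi
        y = np.log(x) - step * grad_f(x) + np.sqrt(2.0 * step) * noise
        x = np.exp(y)
        samples.append(x.copy())
    return np.array(samples)
```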

Sqrt(d) Dimension Dependence of Langevin Monte Carlo

no code implementations ICLR 2022 Ruilin Li, Hongyuan Zha, Molei Tao

This article considers the popular MCMC method of unadjusted Langevin Monte Carlo (LMC) and provides a non-asymptotic analysis of its sampling error in 2-Wasserstein distance.
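
For reference, the LMC recursion being analyzed is the standard unadjusted Langevin update x_{k+1} = x_k - h*grad_f(x_k) + sqrt(2h)*xi_k for a target proportional to exp(-f); the step size and Gaussian example below are illustrative choices, not the paper's.

```python
import numpy as np

def lmc(grad_f, x0, step, n_iter, rng=None):
    """Unadjusted Langevin Monte Carlo for a target pi(x) proportional to exp(-f(x))."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    samples = []
    for _ in range(n_iter):
        x = x - step * grad_f(x) + np.sqrt(2.0 * step) * rng.standard_normal(x.shape)
        samples.append(x.copy())
    return np.array(samples)

# Illustrative use: standard Gaussian target, f(x) = ||x||^2 / 2, so grad_f(x) = x.
# draws = lmc(grad_f=lambda x: x, x0=np.zeros(10), step=0.05, n_iter=5000)
```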

Mean-Square Analysis with An Application to Optimal Dimension Dependence of Langevin Monte Carlo

no code implementations NeurIPS 2021 Ruilin Li, Hongyuan Zha, Molei Tao

The resulting $\widetilde{\mathcal{O}}\left(\frac{\sqrt{d}}{\epsilon}\right)$ iteration-complexity bound improves the best previously known $\widetilde{\mathcal{O}}\left(\frac{d}{\epsilon}\right)$ result and is optimal in both dimension $d$ and accuracy tolerance $\epsilon$ for log-smooth and log-strongly-convex target measures.
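
Read alongside the "Sqrt(d) Dimension Dependence" entry above, the improvement in iteration complexity for reaching 2-Wasserstein error $\epsilon$ can be summarized as below (a paraphrase of the stated rates, not a verbatim theorem statement):

```latex
% Iterations of LMC needed for W_2 error at most \epsilon
% (log-smooth, log-strongly-convex targets):
\text{previous best: } \widetilde{\mathcal{O}}\!\left(\frac{d}{\epsilon}\right)
\quad\longrightarrow\quad
\text{this work: } \widetilde{\mathcal{O}}\!\left(\frac{\sqrt{d}}{\epsilon}\right)
```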

Hessian-Free High-Resolution Nesterov Acceleration for Sampling

no code implementations16 Jun 2020 Ruilin Li, Hongyuan Zha, Molei Tao

Nesterov's Accelerated Gradient (NAG) for optimization has better performance than its continuous-time limit (noiseless kinetic Langevin) when a finite step size is employed (Shi et al., 2021).

Vocal Bursts Intensity Prediction
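
For context only, the continuous-time baseline mentioned above is kinetic (underdamped) Langevin dynamics; the sketch below is a plain Euler-Maruyama discretization of that baseline with assumed friction and step size, not the paper's Hessian-free high-resolution sampler.

```python
import numpy as np

def kinetic_langevin(grad_f, x0, step=1e-2, gamma=2.0, n_iter=10_000, rng=None):
    """Euler-Maruyama discretization of
    dx = v dt,  dv = -(gamma*v + grad_f(x)) dt + sqrt(2*gamma) dB."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)
    samples = []
    for _ in range(n_iter):
        v = v - step * (gamma * v + grad_f(x)) + np.sqrt(2.0 * gamma * step) * rng.standard_normal(x.shape)
        x = x + step * v
        samples.append(x.copy())
    return np.array(samples)
```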

Improving Sampling Accuracy of Stochastic Gradient MCMC Methods via Non-uniform Subsampling of Gradients

no code implementations20 Feb 2020 Ruilin Li, Xin Wang, Hongyuan Zha, Molei Tao

In our practical implementation of EWSG, the non-uniform subsampling is performed efficiently via a Metropolis-Hastings chain on the data index, which is coupled to the MCMC algorithm.

Computational Efficiency
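
A heavily simplified sketch of that coupling is shown below: a Metropolis-Hastings move on the data index selects which per-datum gradient drives the next Langevin-style parameter update. The acceptance rule (ratio of per-datum gradient norms) and the scaling by the dataset size are illustrative stand-ins, not EWSG's exact weights.

```python
import numpy as np

def ewsg_style_step(theta, idx, per_datum_grad, n_data, step, rng):
    """One sketch step: MH move on the data index, then a Langevin-style update
    driven by the selected datum's gradient (scaled by n_data as a crude
    surrogate for the full-sum gradient; priors are ignored here)."""
    prop = rng.integers(n_data)                      # uniform index proposal
    g_cur = per_datum_grad(theta, idx)
    g_prop = per_datum_grad(theta, prop)
    accept = min(1.0, np.linalg.norm(g_prop) / (np.linalg.norm(g_cur) + 1e-12))
    if rng.random() < accept:
        idx, g_cur = prop, g_prop
    theta = theta - step * n_data * g_cur + np.sqrt(2.0 * step) * rng.standard_normal(theta.shape)
    return theta, idx
```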

Learning to Match via Inverse Optimal Transport

no code implementations10 Feb 2018 Ruilin Li, Xiaojing Ye, Haomin Zhou, Hongyuan Zha

We emphasize that discrete optimal transport plays the role of a variational principle, giving rise to an optimization-based framework for modeling the observed empirical matching data.
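
A minimal sketch of the forward matching model such a framework typically rests on is entropy-regularized optimal transport computed by Sinkhorn iterations; learning the cost matrix from observed matchings is the inverse problem and is not shown here. The regularization strength and iteration count below are assumptions.

```python
import numpy as np

def sinkhorn_plan(C, mu, nu, eps=0.1, n_iter=200):
    """Entropic OT plan between marginals mu and nu for cost matrix C (forward model)."""
    K = np.exp(-C / eps)
    u = np.ones_like(mu)
    for _ in range(n_iter):
        v = nu / (K.T @ u)
        u = mu / (K @ v)
    return u[:, None] * K * v[None, :]   # transport plan with marginals mu, nu
```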

Efficient fetal-maternal ECG signal separation from two channel maternal abdominal ECG via diffusion-based channel selection

no code implementations7 Feb 2017 Ruilin Li, Martin G. Frasch, Hau-Tieng Wu

There is a need for affordable, widely deployable maternal-fetal ECG monitors to improve maternal and fetal health during pregnancy and delivery.
