Search Results for author: Yen Ting Lin

Found 9 papers, 4 papers with code

Using Ornstein-Uhlenbeck Process to understand Denoising Diffusion Probabilistic Model and its Noise Schedules

no code implementations • 29 Nov 2023 • Javier E. Santos, Yen Ting Lin

The aim of this short note is to show that the Denoising Diffusion Probabilistic Model (DDPM), a non-homogeneous discrete-time Markov process, can be represented by a time-homogeneous continuous-time Markov process observed at non-uniformly sampled discrete times.

Denoising
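
A minimal worked version of the correspondence, in notation assumed here (a standard Ornstein-Uhlenbeck process with unit relaxation rate, and \beta_k for the DDPM noise schedule), illustrating the claim above:

    dX_t = -X_t\,dt + \sqrt{2}\,dW_t,
    \qquad
    X_{t+\Delta} = e^{-\Delta} X_t + \sqrt{1 - e^{-2\Delta}}\,\varepsilon .

Matching this exact OU transition to the DDPM forward step x_k = \sqrt{1-\beta_k}\,x_{k-1} + \sqrt{\beta_k}\,\varepsilon forces e^{-\Delta_k} = \sqrt{1-\beta_k}; that is, the time-homogeneous OU process is observed at the non-uniform spacings \Delta_k = -\tfrac{1}{2}\ln(1-\beta_k).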

Improving Estimation of the Koopman Operator with Kolmogorov-Smirnov Indicator Functions

1 code implementation • 9 Jun 2023 • Van A. Ngo, Yen Ting Lin, Danny Perez

It has become common to perform kinetic analysis using approximate Koopman operators that transform high-dimensional time series of observables into ranked dynamical modes.

Time Series
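
A minimal sketch of the kind of approximation referred to above: an EDMD-style least-squares Koopman estimate from a time series of observables, with modes ranked by eigenvalue magnitude. This is generic NumPy, not the paper's Kolmogorov-Smirnov indicator-function construction, and all names are illustrative.

    import numpy as np

    def koopman_edmd(X, Y):
        # Least-squares Koopman estimate K with Y ~ X @ K,
        # where X, Y hold the observables at times t and t+1.
        K, *_ = np.linalg.lstsq(X, Y, rcond=None)
        eigvals, eigvecs = np.linalg.eig(K)
        order = np.argsort(-np.abs(eigvals))  # slowest-decaying modes first
        return K, eigvals[order], eigvecs[:, order]

    # traj: (T, d) time series of observables (stand-in data here)
    traj = np.random.default_rng(0).standard_normal((1000, 5))
    K, lam, modes = koopman_edmd(traj[:-1], traj[1:])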

Blackout Diffusion: Generative Diffusion Models in Discrete-State Spaces

1 code implementation • 18 May 2023 • Javier E. Santos, Zachary R. Fox, Nicholas Lubbers, Yen Ting Lin

Generalizing from specific (Gaussian) forward processes to discrete-state processes without a variational approximation sheds light on how to interpret diffusion models, which we discuss.

Image Generation
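
As a concrete illustration of a discrete-state forward process of the kind described above, here is a NumPy sketch of a pure-death ("blackout") channel in which each quantum of a pixel's integer intensity independently survives to time t with probability exp(-t). The channel and parameters are assumptions for illustration, not necessarily the paper's exact scheme.

    import numpy as np

    rng = np.random.default_rng(0)

    def blackout_forward(x0, t):
        # Each quantum of intensity survives to time t with prob exp(-t),
        # so x(t) | x(0) ~ Binomial(x(0), exp(-t)).
        return rng.binomial(x0, np.exp(-t))

    x0 = rng.integers(0, 256, size=(8, 8))  # toy 8x8 image, 8-bit intensities
    for t in (0.1, 1.0, 5.0):
        print(t, blackout_forward(x0, t).mean())  # intensity decays to blackout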

Regression-based projection for learning Mori-Zwanzig operators

no code implementations • 10 May 2022 • Yen Ting Lin, Yifeng Tian, Danny Perez, Daniel Livescu

We propose to adopt statistical regression as the projection operator to enable data-driven learning of the operators in the Mori-Zwanzig formalism.

regression
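
A minimal discrete-time sketch of the idea: regress the next value of the observables on a finite history window, so that the lag-0 coefficients play the role of the Markov operator and the higher-lag coefficients play the role of discrete memory kernels. The window length and the linear-in-history form are illustrative assumptions, not the paper's exact construction.

    import numpy as np

    def learn_mz_operators(A, n_mem):
        # Regress A[t+1] on A[t], A[t-1], ..., A[t-n_mem].
        T, d = A.shape
        X = np.hstack([A[n_mem - k : T - 1 - k] for k in range(n_mem + 1)])
        Y = A[n_mem + 1 : T]
        coef, *_ = np.linalg.lstsq(X, Y, rcond=None)
        # One d x d operator per lag: ops[0] Markov term, ops[1:] memory kernels.
        return [coef[k * d : (k + 1) * d].T for k in range(n_mem + 1)]

    rng = np.random.default_rng(0)
    A = np.cumsum(rng.standard_normal((2000, 3)), axis=0)  # stand-in observables
    ops = learn_mz_operators(A, n_mem=5)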

Gene expression noise accelerates the evolution of a biological oscillator

1 code implementation • 21 Mar 2022 • Yen Ting Lin, Nicolas E. Buchler

Here, we show that gene expression noise counter-intuitively accelerates the evolution of a biological oscillator and, thus, can impart a benefit to living organisms.

A phase transition for finding needles in nonlinear haystacks with LASSO artificial neural networks

1 code implementation • 21 Jan 2022 • Xiaoyu Ma, Sylvain Sardy, Nick Hengartner, Nikolai Bobenko, Yen Ting Lin

To fit sparse linear associations, a LASSO sparsity-inducing penalty with a single hyperparameter provably recovers the important features (needles) with high probability in certain regimes, even when the sample size is smaller than the dimension of the input vector (haystack).
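
A minimal sketch of the regime described above, using scikit-learn's Lasso with its single penalty hyperparameter to recover a handful of needles when n < p. The problem sizes and the alpha value are illustrative, not the paper's calibrated choices.

    import numpy as np
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(0)
    n, p, k = 80, 200, 5                 # fewer samples than features
    X = rng.standard_normal((n, p))
    beta = np.zeros(p)
    beta[:k] = 3.0                       # 5 needles in a 200-dim haystack
    y = X @ beta + 0.5 * rng.standard_normal(n)

    model = Lasso(alpha=0.3).fit(X, y)   # single hyperparameter
    support = np.flatnonzero(model.coef_)
    print(support)                       # ideally recovers indices 0..4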

Implementation of a practical Markov chain Monte Carlo sampling algorithm in PyBioNetFit

no code implementations • 29 Sep 2021 • Jacob Neumann, Yen Ting Lin, Abhishek Mallela, Ely F. Miller, Joshua Colvin, Abell T. Duprat, Ye Chen, William S. Hlavacek, Richard G. Posner

Bayesian inference in biological modeling commonly relies on Markov chain Monte Carlo (MCMC) sampling of a multidimensional and non-Gaussian posterior distribution that is not analytically tractable.

Bayesian Inference • Management +2
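
For orientation, a minimal random-walk Metropolis sketch targeting a non-Gaussian (bimodal) stand-in posterior. This is generic MCMC, not necessarily the specific sampling algorithm implemented in PyBioNetFit.

    import numpy as np

    rng = np.random.default_rng(0)

    def log_posterior(theta):
        # Stand-in non-Gaussian target: a two-component Gaussian mixture.
        return np.logaddexp(-0.5 * np.sum((theta - 2.0) ** 2),
                            -0.5 * np.sum((theta + 2.0) ** 2))

    def metropolis(log_post, theta0, n_steps, step=0.5):
        theta, lp = np.asarray(theta0, float), log_post(theta0)
        chain = []
        for _ in range(n_steps):
            prop = theta + step * rng.standard_normal(theta.shape)
            lp_prop = log_post(prop)
            if np.log(rng.random()) < lp_prop - lp:  # accept/reject
                theta, lp = prop, lp_prop
            chain.append(theta.copy())
        return np.array(chain)

    chain = metropolis(log_posterior, np.zeros(2), 5000)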

Data-driven Optimized Control of the COVID-19 Epidemics

no code implementations • 4 Sep 2020 • Afroza Shirin, Yen Ting Lin, Francesco Sorrentino

We then introduce a time-varying control input that represents the level of social distancing imposed on the population of a given area. We solve an optimal control problem that minimizes the economic impact of social distancing subject to relevant constraints, such as a desired level of suppression of the epidemic at a terminal time.
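
A toy sketch of such an optimal control problem: a piecewise-constant distancing control u(t) in [0, 1] scales down transmission in an SIR model, and we minimize the economic burden (total distancing) with the suppression target imposed as a soft terminal penalty. All rates, costs, and the penalty formulation are illustrative assumptions, not the paper's fitted model.

    import numpy as np
    from scipy.optimize import minimize

    beta0, gamma = 0.3, 0.1
    steps, K = 180, 12                       # days, control intervals

    def terminal_infected(u):
        S, I = 0.99, 0.01
        for t in range(steps):
            ut = u[min(t * K // steps, K - 1)]
            beta = beta0 * (1.0 - ut)        # distancing lowers transmission
            dS = -beta * S * I
            dI = beta * S * I - gamma * I
            S, I = S + dS, I + dI            # forward Euler, dt = 1 day
        return I

    def objective(u):
        econ_cost = u.sum() * (steps / K)    # economic burden of distancing
        excess = max(0.0, terminal_infected(u) - 0.01)  # suppression target
        return econ_cost + 1e4 * excess ** 2 # soft terminal constraint

    res = minimize(objective, x0=np.full(K, 0.5),
                   bounds=[(0.0, 1.0)] * K, method="L-BFGS-B")
    print(res.x.round(2))                    # distancing schedule over time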

What needles do sparse neural networks find in nonlinear haystacks

no code implementations • 7 Jun 2020 • Sylvain Sardy, Nicolas W. Hengartner, Nikolai Bonenko, Yen Ting Lin

Using a sparsity-inducing penalty in artificial neural networks (ANNs) avoids over-fitting, especially when noise is high and the training set is small relative to the number of features.
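
A minimal PyTorch sketch of the idea: an L1 (sparsity-inducing) penalty on the input layer of a small MLP drives the weights of irrelevant features toward zero. The architecture and penalty weight are illustrative assumptions, not the paper's exact setup.

    import torch

    torch.manual_seed(0)
    n, p = 100, 50
    X = torch.randn(n, p)
    y = torch.sin(X[:, 0]) + X[:, 1] ** 2    # only 2 of 50 features matter

    net = torch.nn.Sequential(torch.nn.Linear(p, 16), torch.nn.Tanh(),
                              torch.nn.Linear(16, 1))
    opt = torch.optim.Adam(net.parameters(), lr=1e-2)
    lam = 1e-2                               # illustrative penalty weight

    for _ in range(2000):
        opt.zero_grad()
        mse = torch.mean((net(X).squeeze() - y) ** 2)
        l1 = net[0].weight.abs().sum()       # sparsity-inducing penalty
        (mse + lam * l1).backward()
        opt.step()

    print(net[0].weight.abs().sum(dim=0))    # near-zero columns mark pruned features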
