no code implementations • 2 Apr 2024 • Ying Li, Zhidi Lin, Feng Yin, Michael Minyi Zhang
Gaussian process latent variable models (GPLVMs) are a versatile family of unsupervised learning models, commonly used for dimensionality reduction.
no code implementations • 1 Nov 2023 • Taole Sha, Michael Minyi Zhang
Time-dependent data often exhibit characteristics, such as non-stationarity and heavy-tailed errors, that would be inappropriate to model with the typical assumptions used in popular models.
no code implementations • 27 Aug 2023 • Forough Fazeli-Asl, Michael Minyi Zhang
Generative models have emerged as a promising technique for producing high-quality images that are indistinguishable from real images.
1 code implementation • 14 Jun 2023 • Michael Minyi Zhang, Gregory W. Gundersen, Barbara E. Engelhardt
The Gaussian process latent variable model (GPLVM) is a popular probabilistic method used for nonlinear dimension reduction, matrix factorization, and state-space modeling.
no code implementations • 5 Mar 2023 • Forough Fazeli-Asl, Michael Minyi Zhang, Lizhen Lin
Bayesian methods for GOF can be appealing due to their ability to incorporate expert knowledge through prior distributions.
no code implementations • 20 May 2022 • Michael Minyi Zhang
We propose a non-linear, Bayesian non-parametric latent variable model where the latent space is assumed to be sparse and infinite dimensional a priori using an Indian buffet process prior.
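The Indian buffet process referenced above has a standard "customers and dishes" generative scheme: customer $i$ takes an existing feature $k$ with probability $m_k/i$ (where $m_k$ counts previous takers) and then samples $\mathrm{Poisson}(\alpha/i)$ new features. A minimal sketch of that generic scheme (not the paper's implementation; function and variable names are illustrative):

```python
import numpy as np

def sample_ibp(n_customers, alpha, seed=None):
    """Draw a binary feature matrix Z from the Indian buffet process.

    alpha controls the expected number of active features; E[K] grows
    as alpha * H_n (the n-th harmonic number).
    """
    rng = np.random.default_rng(seed)
    dish_counts = []   # m_k: how many customers have taken each dish
    rows = []
    for i in range(1, n_customers + 1):
        # take existing dish k with probability m_k / i
        row = [rng.random() < m / i for m in dish_counts]
        for k, taken in enumerate(row):
            if taken:
                dish_counts[k] += 1
        # sample Poisson(alpha / i) brand-new dishes
        n_new = rng.poisson(alpha / i)
        dish_counts.extend([1] * n_new)
        row.extend([True] * n_new)
        rows.append(row)
    # assemble into a left-ordered binary matrix
    K = len(dish_counts)
    Z = np.zeros((n_customers, K), dtype=int)
    for i, row in enumerate(rows):
        Z[i, :len(row)] = row
    return Z
```

The matrix has a finite number of non-zero columns for any finite sample, while the number of potential features is unbounded a priori, which is what makes the prior suitable for a sparse, infinite-dimensional latent space.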
no code implementations • 18 Oct 2020 • Lizhen Lin, Bayan Saparbayeva, Michael Minyi Zhang, David B. Dunson
One of the key challenges for optimization on manifolds is the difficulty of verifying properties of the objective function, e.g., whether it is convex or non-convex, and the degree of non-convexity.
2 code implementations • 19 Jun 2020 • Gregory W. Gundersen, Michael Minyi Zhang, Barbara E. Engelhardt
By approximating a nonlinear relationship between the latent space and the observations with a function that is linear with respect to random features, we induce closed-form gradients of the posterior distribution with respect to the latent variable.
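The key trick described here is that a random-feature expansion (e.g., random Fourier features) is nonlinear in the latent variable but linear in the features, so the gradient of the error with respect to the latent variable is available in closed form via the chain rule. A minimal sketch under that assumption (all names and dimensions are illustrative, not the paper's code):

```python
import numpy as np

rng = np.random.default_rng(0)
D_latent, M, D_obs = 2, 50, 5          # latent dim, random features, observed dim

W = rng.normal(size=(D_latent, M))     # random frequencies (fixed)
b = rng.uniform(0.0, 2.0 * np.pi, M)   # random phases (fixed)
beta = rng.normal(size=(M, D_obs))     # weights: the map phi(z) -> y is linear

def phi(z):
    # random Fourier features: nonlinear in z
    return np.sqrt(2.0 / M) * np.cos(z @ W + b)

def loss_and_grad(z, y):
    """Squared error and its closed-form gradient w.r.t. the latent z."""
    r = phi(z) @ beta - y                              # residual, shape (1, D_obs)
    dphi = -np.sqrt(2.0 / M) * np.sin(z @ W + b)       # d phi_m / d(zW + b)_m
    grad = ((r @ beta.T) * dphi) @ W.T                 # chain rule, shape of z
    return 0.5 * np.sum(r ** 2), grad
```

Because the gradient is exact and cheap, the latent variables can be updated with standard gradient-based samplers or optimizers instead of gradient-free methods.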
no code implementations • 15 Jan 2020 • Avinava Dubey, Michael Minyi Zhang, Eric P. Xing, Sinead A. Williamson
Bayesian nonparametric (BNP) models provide elegant methods for discovering underlying latent features within a data set, but inference in such models can be slow.
no code implementations • 15 Oct 2019 • Fernando Perez-Cruz, Pablo M. Olmos, Michael Minyi Zhang, Howard Huang
In this paper, we take a new approach to time-of-arrival geo-localization.
1 code implementation • 24 May 2019 • Michael Minyi Zhang, Bianca Dumitrascu, Sinead A. Williamson, Barbara E. Engelhardt
Many machine learning problems can be framed in the context of estimating functions, and often these are time-dependent functions that are estimated in real-time as observations arrive.
no code implementations • 18 Apr 2019 • Sinead A. Williamson, Michael Minyi Zhang, Paul Damien
These random, observed responses are typically affected by many unobserved, latent factors (or features) within the building such as the number of individuals, the turning on and off of electrical devices, power surges, etc.
no code implementations • NeurIPS 2018 • Bayan Saparbayeva, Michael Minyi Zhang, Lizhen Lin
Our work aims to fill a critical gap in the literature by generalizing parallel inference algorithms to optimization on manifolds.
no code implementations • 19 May 2017 • Michael Minyi Zhang, Sinead A. Williamson, Fernando Perez-Cruz
First, we introduce an accelerated feature proposal mechanism that we show is a valid MCMC algorithm for posterior inference.
1 code implementation • 27 Feb 2017 • Michael Minyi Zhang, Sinead A. Williamson
Training Gaussian process-based models typically involves an $O(N^3)$ computational bottleneck due to inverting the covariance matrix.
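The $O(N^3)$ cost comes from factorizing the $N \times N$ kernel matrix when evaluating the GP marginal likelihood. A minimal numpy sketch of that standard computation (an RBF kernel is assumed for concreteness; this is generic GP code, not the paper's):

```python
import numpy as np

def rbf_kernel(X, lengthscale=1.0, variance=1.0):
    # squared-exponential kernel on the rows of X
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def gp_log_marginal(X, y, noise=0.1):
    """Exact GP log marginal likelihood; the Cholesky step is the O(N^3) bottleneck."""
    N = len(y)
    K = rbf_kernel(X) + noise * np.eye(N)
    L = np.linalg.cholesky(K)                              # O(N^3)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))    # K^{-1} y via triangular solves
    return (-0.5 * y @ alpha
            - np.sum(np.log(np.diag(L)))                   # 0.5 * log det K
            - 0.5 * N * np.log(2.0 * np.pi))
```

Every evaluation of this quantity during training re-pays the cubic factorization cost, which is what motivates the approximations and parallelization schemes studied in the paper.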
no code implementations • 19 Oct 2016 • Michael Minyi Zhang, Henry Lam, Lizhen Lin
Effective and accurate model selection is an important problem in modern data analysis.