1 code implementation • Scientific Reports 2024 • Michael C. Burkhart, Liz Y. Lee, Delshad Vaghari, An Qi Toh, Eddie Chong, Christopher Chen, Peter Tiňo, Zoe Kourtzi
In contrast to supervised classification approaches that require labeled data, we propose an unsupervised multimodal trajectory modeling (MTM) approach based on a mixture of state space models that captures changes in longitudinal data (i.e., trajectories) and stratifies individuals without using clinical diagnosis for model training.
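To make the mixture-of-state-space-models idea concrete, here is a minimal sketch that clusters longitudinal trajectories with hard-EM assignments, assuming one-dimensional random-walk-with-drift components; the function names, hyperparameters, and toy data are illustrative assumptions, not the authors' MTM implementation. Each trajectory is scored under every component with a Kalman-filter likelihood and assigned to the best-fitting one.

```python
# Hypothetical sketch: hard-EM clustering of longitudinal trajectories with a
# mixture of K random-walk-with-drift state space models. A simplification of
# the mixture-of-state-space-models idea, not the authors' MTM implementation.
import numpy as np

def kalman_loglik(y, drift, q=0.1, r=0.5):
    """Log-likelihood of a 1-D trajectory y under x_t = x_{t-1} + drift + N(0, q),
    y_t = x_t + N(0, r), computed with a standard Kalman filter."""
    m, p, ll = y[0], 1.0, 0.0
    for t in range(1, len(y)):
        m_pred, p_pred = m + drift, p + q              # predict
        s = p_pred + r                                 # innovation variance
        innov = y[t] - m_pred
        ll += -0.5 * (np.log(2 * np.pi * s) + innov**2 / s)
        k = p_pred / s                                 # Kalman gain
        m, p = m_pred + k * innov, (1 - k) * p_pred    # update
    return ll

def fit_mixture(trajs, K=2, n_iter=20, seed=0):
    drifts = np.random.default_rng(seed).normal(0, 1, K)   # component drifts
    for _ in range(n_iter):
        # E-step (hard): assign each trajectory to its best-scoring component
        z = np.array([np.argmax([kalman_loglik(y, d) for d in drifts])
                      for y in trajs])
        # M-step: re-estimate each component's drift from its members
        for k in range(K):
            members = [np.diff(y).mean() for y, zk in zip(trajs, z) if zk == k]
            if members:
                drifts[k] = np.mean(members)
    return z, drifts

# Toy longitudinal data: two groups with different rates of change over time
rng = np.random.default_rng(1)
stable = [np.cumsum(rng.normal(0.0, 0.3, 10)) for _ in range(20)]
decline = [np.cumsum(rng.normal(-0.8, 0.3, 10)) for _ in range(20)]
labels, drifts = fit_mixture(stable + decline, K=2)
print("estimated component drifts:", drifts)
```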
1 code implementation • Journal of Computational Science 2023 • Michael C. Burkhart, Gabriel Ruiz
In contrast to previous approaches that encourage the distribution of representations to be treatment-invariant, we leverage a genetic algorithm that optimizes over representations useful for predicting the outcome to select those less useful for predicting the treatment.
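As a rough illustration of the selection criterion described above, the sketch below runs a small genetic algorithm over binary feature masks whose fitness rewards out-of-sample prediction of the outcome and penalizes prediction of the treatment. The fitness function, genetic operators, and toy data are assumptions; this is not the representation learner from the paper.

```python
# Hypothetical sketch: a genetic algorithm over binary feature masks whose
# fitness rewards predicting the outcome y and penalizes predicting the
# treatment t. Illustrative only; not the representation learner in the paper.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def fitness(mask, X, y, t):
    if mask.sum() == 0:
        return -np.inf
    Xm = X[:, mask.astype(bool)]
    acc_y = cross_val_score(LogisticRegression(max_iter=1000), Xm, y, cv=3).mean()
    acc_t = cross_val_score(LogisticRegression(max_iter=1000), Xm, t, cv=3).mean()
    return acc_y - acc_t          # useful for the outcome, not for the treatment

def genetic_search(X, y, t, pop=20, gens=15, seed=0):
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    population = rng.integers(0, 2, size=(pop, d))
    for _ in range(gens):
        scores = np.array([fitness(m, X, y, t) for m in population])
        parents = population[np.argsort(scores)[-pop // 2:]]      # selection
        children = []
        while len(children) < pop - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, d)                              # crossover
            child = np.concatenate([a[:cut], b[cut:]])
            flip = rng.random(d) < 0.05                           # mutation
            children.append(np.where(flip, 1 - child, child))
        population = np.vstack([parents, np.array(children)])
    scores = np.array([fitness(m, X, y, t) for m in population])
    return population[np.argmax(scores)]

# Toy data: the outcome depends on features 0-1, the treatment on features 2-3,
# so a good mask keeps the former and drops the latter
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 8))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
t = (X[:, 2] + X[:, 3] > 0).astype(int)
print("selected feature mask:", genetic_search(X, y, t))
```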
no code implementations • 22 May 2022 • Michael C. Burkhart, Kyle Shan
Given a small set of labeled data and a large set of unlabeled data, semi-supervised learning (SSL) attempts to leverage the location of the unlabeled datapoints in order to create a better classifier than could be obtained from supervised methods applied to the labeled training set alone.
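A small, generic illustration of this setting using scikit-learn's graph-based LabelSpreading, which propagates the few available labels through the geometry of the unlabeled points; this is a standard baseline, not the specific method analyzed in the paper.

```python
# Generic illustration of the semi-supervised setting with scikit-learn's
# graph-based LabelSpreading (a standard baseline, not the method in the paper).
import numpy as np
from sklearn.datasets import make_moons
from sklearn.semi_supervised import LabelSpreading

X, y = make_moons(n_samples=300, noise=0.1, random_state=0)
y_partial = np.full_like(y, -1)            # -1 marks unlabeled points
labeled = np.random.default_rng(0).choice(len(y), size=10, replace=False)
y_partial[labeled] = y[labeled]            # keep only 10 labels

model = LabelSpreading(kernel="knn", n_neighbors=7).fit(X, y_partial)
print("transductive accuracy:", (model.transduction_ == y).mean())
```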
no code implementations • 21 May 2022 • Michael C. Burkhart, Gabriel Ruiz
Within the field of causal inference, we consider the problem of estimating heterogeneous treatment effects from data.
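To make the problem concrete, a common baseline (a T-learner) fits separate outcome models on treated and control units and takes their difference as the estimated conditional average treatment effect; the sketch below is a generic illustration on synthetic data, not the estimator studied in the paper.

```python
# Generic T-learner baseline: fit separate outcome regressions for treated and
# control units and take their difference as the estimated conditional average
# treatment effect (CATE). Illustrative only; not the estimator in the paper.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n, d = 2000, 5
X = rng.normal(size=(n, d))
T = rng.integers(0, 2, size=n)                     # randomized binary treatment
tau = 1.0 + X[:, 0]                                # true heterogeneous effect
Y = X[:, 1] + T * tau + rng.normal(scale=0.5, size=n)

mu1 = RandomForestRegressor(random_state=0).fit(X[T == 1], Y[T == 1])
mu0 = RandomForestRegressor(random_state=0).fit(X[T == 0], Y[T == 0])
cate_hat = mu1.predict(X) - mu0.predict(X)
print("mean abs error vs. true effect:", np.abs(cate_hat - tau).mean())
```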
1 code implementation • 27 Apr 2021 • Michael C. Burkhart
To minimize the average of a set of log-convex functions, the stochastic Newton method iteratively updates its estimate using subsampled versions of the full objective's gradient and Hessian.
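A minimal sketch of that iteration, applied to the average of the log-convex functions f_i(w) = exp((a_i^T w - b_i)^2 / 2); the gradient and Hessian are each estimated from a fresh random subsample at every step. The toy objective, damping term, and subsample sizes are assumptions, not the paper's setup.

```python
# Minimal sketch of a stochastic Newton iteration on the average of the
# log-convex functions f_i(w) = exp((a_i^T w - b_i)^2 / 2); the gradient and
# Hessian are re-estimated from fresh random subsamples at every step.
# The toy objective, damping, and subsample sizes are assumptions.
import numpy as np

rng = np.random.default_rng(0)
n, d = 5000, 10
A = rng.normal(size=(n, d)) / np.sqrt(d)
w_true = rng.normal(size=d)
b = A @ w_true + rng.normal(scale=0.1, size=n)

def sub_grad_hess(w, idx):
    """Subsampled gradient and Hessian of (1/|idx|) * sum_i exp(r_i^2 / 2)."""
    Ai, bi = A[idx], b[idx]
    r = Ai @ w - bi
    e = np.exp(0.5 * r**2)
    grad = (Ai * (e * r)[:, None]).mean(axis=0)
    hess = (Ai.T * (e * (r**2 + 1))) @ Ai / len(idx)
    return grad, hess

w = np.zeros(d)
for _ in range(50):
    g, _ = sub_grad_hess(w, rng.choice(n, size=256, replace=False))
    _, H = sub_grad_hess(w, rng.choice(n, size=256, replace=False))
    w -= np.linalg.solve(H + 1e-6 * np.eye(d), g)   # damped Newton step
print("distance to w_true:", np.linalg.norm(w - w_true))
```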
1 code implementation • 1 May 2020 • Michael C. Burkhart, David M. Brandman, Brian Franco, Leigh R. Hochberg, Matthew T. Harrison
Extensions to the Kalman filter, including the extended and unscented Kalman filters, incorporate linearizations for models where the observation model p(observation∣state) is nonlinear.
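A minimal one-dimensional extended Kalman filter illustrates this linearization: the nonlinear observation function is replaced by its Jacobian at the predicted mean in the update step. This is a generic EKF, assuming y_t = sin(x_t) + noise, not the filter developed in the paper.

```python
# Minimal 1-D extended Kalman filter: linear state dynamics with a nonlinear
# observation model y_t = sin(x_t) + noise, linearized via its Jacobian (cos)
# at the predicted mean. A generic EKF, not the filter developed in the paper.
import numpy as np

def ekf(ys, a=0.95, q=0.05, r=0.1, h=np.sin, h_jac=np.cos):
    m, p, means = 0.0, 1.0, []
    for y in ys:
        m_pred, p_pred = a * m, a * a * p + q         # predict
        H = h_jac(m_pred)                             # linearize h at the prediction
        s = H * p_pred * H + r                        # innovation variance
        k = p_pred * H / s                            # Kalman gain
        m = m_pred + k * (y - h(m_pred))              # update mean
        p = (1 - k * H) * p_pred                      # update variance
        means.append(m)
    return np.array(means)

# Simulate a toy latent trajectory and filter its nonlinear observations
rng = np.random.default_rng(0)
x = np.zeros(100)
for t in range(1, 100):
    x[t] = 0.95 * x[t - 1] + rng.normal(scale=np.sqrt(0.05))
y = np.sin(x) + rng.normal(scale=np.sqrt(0.1), size=100)
print("RMSE of filtered means:", np.sqrt(np.mean((ekf(y) - x) ** 2)))
```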
1 code implementation • 17 Jul 2018 • Michael C. Burkhart
Given a stationary state-space model that relates a sequence of hidden states and corresponding measurements or observations, Bayesian filtering provides a principled statistical framework for inferring the posterior distribution of the current state given all measurements up to the present time.
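For reference, this posterior is obtained by alternating a Chapman–Kolmogorov prediction step with a Bayes-rule update (standard notation, with hidden state x_t and measurements y_{1:t}):

```latex
\begin{align*}
\text{predict:}\quad p(x_t \mid y_{1:t-1})
  &= \int p(x_t \mid x_{t-1})\, p(x_{t-1} \mid y_{1:t-1})\, \mathrm{d}x_{t-1},\\
\text{update:}\quad p(x_t \mid y_{1:t})
  &\propto p(y_t \mid x_t)\, p(x_t \mid y_{1:t-1}).
\end{align*}
```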
no code implementations • 23 Aug 2016 • Michael C. Burkhart, David M. Brandman, Carlos E. Vargas-Irwin, Matthew T. Harrison
The Kalman filter (KF) is used in a variety of applications for computing the posterior distribution of latent states in a state space model.
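For reference, in the linear-Gaussian case this posterior is Gaussian and the KF computes it in closed form; the notation below (state transition A, observation matrix H, noise covariances Q and R) is generic textbook notation, not necessarily the paper's.

```latex
\begin{align*}
\text{predict:}\quad
  \hat{x}_{t\mid t-1} &= A\,\hat{x}_{t-1}, &
  P_{t\mid t-1} &= A P_{t-1} A^{\top} + Q,\\
\text{update:}\quad
  K_t &= P_{t\mid t-1} H^{\top}\bigl(H P_{t\mid t-1} H^{\top} + R\bigr)^{-1},\\
  \hat{x}_{t} &= \hat{x}_{t\mid t-1} + K_t\bigl(y_t - H\hat{x}_{t\mid t-1}\bigr), &
  P_{t} &= (I - K_t H)\,P_{t\mid t-1}.
\end{align*}
```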