Search Results for author: Shu Ge

Found 2 papers, 1 paper with code

A Knowledge Distillation Approach for Sepsis Outcome Prediction from Multivariate Clinical Time Series

no code implementations · 16 Nov 2023 · Anna Wong, Shu Ge, Nassim Oufattole, Adam Dejl, Megan Su, Ardavan Saeedi, Li-wei H. Lehman

In this work, we use knowledge distillation via constrained variational inference to transfer the knowledge of a high-capacity "teacher" neural network into a "student" latent variable model. The student learns interpretable hidden state representations while retaining strong predictive performance for sepsis outcome prediction.

Knowledge Distillation · Time Series · +1
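The abstract above does not show the paper's constrained variational objective, but the soft-target distillation term at its core can be sketched generically. The following NumPy snippet is an illustrative sketch of standard temperature-scaled distillation (Hinton-style soft targets), not the authors' implementation; the function names are my own:

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T yields softer distributions.
    z = np.asarray(logits, dtype=float) / T
    z -= z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by T^2 as in standard soft-target distillation.
    p = softmax(teacher_logits, T)  # teacher soft targets
    q = softmax(student_logits, T)  # student predictions
    return float(np.sum(p * (np.log(p) - np.log(q)))) * T ** 2

# A student that matches the teacher's logits incurs zero loss;
# a mismatched student incurs a positive loss.
print(distillation_loss([2.0, 0.5, -1.0], [2.0, 0.5, -1.0]))  # → 0.0
print(distillation_loss([0.1, 0.2, 0.3], [2.0, 0.5, -1.0]) > 0)  # → True
```

In the paper's setting this loss would be one term alongside the student's own variational objective, so the latent variable model is pulled toward the teacher's predictive distribution.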

Model-agnostic Measure of Generalization Difficulty

1 code implementation · 1 May 2023 · Akhilan Boopathy, Kevin Liu, Jaedong Hwang, Shu Ge, Asaad Mohammedsaleh, Ila Fiete

The measure of a machine learning algorithm is the difficulty of the tasks it can perform, and sufficiently difficult tasks are critical drivers of strong machine learning models.

Inductive Bias · Meta-Learning
