The more balanced labels increase minority-class performance, which in turn allows the model to outperform the previous baseline by 0.6, 1.7, and 2.4 mIoU for budgets of 5%, 10%, and 20%, respectively.
no code implementations • 5 Jul 2023 • Marshall Davey, Charles Puelz, Simone Rossi, Margaret Anne Smith, David R. Wells, Greg Sturgeon, W. Paul Segars, John P. Vavalle, Charles S. Peskin, Boyce E. Griffith
Here we introduce and benchmark a comprehensive mathematical model of cardiac fluid dynamics in the human heart.
We introduce Functional Diffusion Processes (FDPs), which generalize score-based diffusion models to infinite-dimensional function spaces.
Ranked #21 on Image Generation on CelebA 64x64
Score-based diffusion models are a class of generative models whose dynamics are described by stochastic differential equations that map noise into data.
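As an illustration of this finite-dimensional SDE view (not the functional generalization introduced above), here is a minimal sketch of the variance-preserving forward SDE simulated with Euler–Maruyama; the function name `forward_vp_sde` and the linear beta schedule are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def forward_vp_sde(x0, n_steps=1000, beta_min=0.1, beta_max=20.0, seed=0):
    """Euler-Maruyama simulation of the variance-preserving forward SDE
    dx = -0.5 * beta(t) * x dt + sqrt(beta(t)) dW, which maps data to noise."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    dt = 1.0 / n_steps
    for i in range(n_steps):
        t = i * dt
        beta_t = beta_min + t * (beta_max - beta_min)  # linear noise schedule
        drift = -0.5 * beta_t * x
        diffusion = np.sqrt(beta_t)
        x = x + drift * dt + diffusion * np.sqrt(dt) * rng.standard_normal(x.shape)
    return x

# Data points drift toward an approximately standard normal sample.
x_T = forward_vp_sde(np.full(1000, 3.0))
```

Generation runs the corresponding reverse-time SDE, whose drift requires the score of the noised data distribution; that score is what the neural network is trained to estimate.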
We develop a novel method for carrying out model selection for Bayesian autoencoders (BAEs) by means of prior hyper-parameter optimization.
This poses a challenge because modern neural networks are characterized by a large number of parameters, and the choice of these priors has an uncontrolled effect on the induced functional prior, which is the distribution of the functions obtained by sampling the parameters from their prior distribution.
The Bayesian treatment of neural networks dictates that a prior distribution is considered over the weight and bias parameters of the network.
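To make the weight-space-to-function-space connection concrete, here is a minimal sketch of how a Gaussian prior over weights induces a distribution over functions; the helper `sample_prior_functions` and the one-hidden-layer tanh network are illustrative assumptions, not the paper's architecture.

```python
import numpy as np

def sample_prior_functions(x, n_samples=200, hidden=50, weight_std=1.0, seed=0):
    """Sample functions from the prior induced by a Gaussian weight prior
    on a one-hidden-layer tanh network: f(x) = W2 @ tanh(W1 x + b1) + b2."""
    rng = np.random.default_rng(seed)
    fs = []
    for _ in range(n_samples):
        W1 = rng.normal(0, weight_std, size=(hidden, 1))
        b1 = rng.normal(0, weight_std, size=(hidden, 1))
        W2 = rng.normal(0, weight_std / np.sqrt(hidden), size=(1, hidden))
        b2 = rng.normal(0, weight_std)
        fs.append((W2 @ np.tanh(W1 @ x[None, :] + b1) + b2).ravel())
    return np.stack(fs)  # rows are draws from the induced functional prior

x = np.linspace(-3, 3, 50)
f_narrow = sample_prior_functions(x, weight_std=0.5)
f_wide = sample_prior_functions(x, weight_std=2.0)
```

Comparing the spread of `f_narrow` and `f_wide` shows the uncontrolled effect described above: changing a single weight-space hyper-parameter changes the entire distribution over functions.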
Variational inference techniques based on inducing variables provide an elegant framework for scalable posterior estimation in Gaussian process (GP) models.
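A minimal sketch of the inducing-variable idea, assuming a 1-D RBF kernel and the closed-form predictive mean shared by subset-of-regressors and Titsias-style variational approximations; the helper names `rbf` and `sparse_gp_mean` are illustrative, not from the paper.

```python
import numpy as np

def rbf(a, b, lengthscale=1.0):
    """Squared-exponential kernel on 1-D inputs."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / lengthscale**2)

def sparse_gp_mean(x_train, y_train, x_test, z, noise=0.1):
    """Predictive mean of a sparse GP with inducing inputs z.
    Cost is O(n m^2) in the m inducing points rather than O(n^3)."""
    Kzz = rbf(z, z) + 1e-8 * np.eye(len(z))   # jitter for numerical stability
    Kzx = rbf(z, x_train)
    Ksz = rbf(x_test, z)
    A = Kzz + (Kzx @ Kzx.T) / noise**2
    return Ksz @ np.linalg.solve(A, Kzx @ y_train) / noise**2

x_train = np.linspace(-3, 3, 200)
y_train = np.sin(x_train)
z = np.linspace(-3, 3, 10)                    # 10 inducing inputs summarize 200 points
x_test = np.linspace(-2.5, 2.5, 9)
mean = sparse_gp_mean(x_train, y_train, x_test, z)
```

The ten inducing inputs act as a compressed summary of the full training set, which is what makes posterior estimation scalable.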
Variational inference offers scalable and flexible tools to tackle intractable Bayesian inference of modern statistical models like Bayesian neural networks and Gaussian processes.
Over-parameterized models, such as DeepNets and ConvNets, form a class of models that are routinely adopted in a wide variety of applications, and for which Bayesian inference is desirable but extremely challenging.
Stochastic variational inference is an established way to carry out approximate Bayesian inference for deep models.
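As a toy illustration of stochastic variational inference, a conjugate Gaussian model where the exact posterior is known lets the reparameterization-trick updates be checked against closed form; the helper `svi_gaussian` is hypothetical and far simpler than the deep models discussed above.

```python
import numpy as np

def svi_gaussian(y, n_iters=2000, lr=0.01, seed=0):
    """Stochastic variational inference for w with prior N(0, 1) and
    likelihood y_i ~ N(w, 1), using the reparameterization trick.
    The exact posterior is N(sum(y) / (n + 1), 1 / (n + 1))."""
    rng = np.random.default_rng(seed)
    mu, log_sigma = 0.0, 0.0  # parameters of q(w) = N(mu, sigma^2)
    for _ in range(n_iters):
        eps = rng.standard_normal()
        sigma = np.exp(log_sigma)
        w = mu + sigma * eps                  # reparameterized sample
        dlogp_dw = np.sum(y - w) - w          # gradient of log joint at w
        # Pathwise ELBO gradients; the entropy term contributes +1 to log_sigma
        mu += lr * dlogp_dw
        log_sigma += lr * (dlogp_dw * sigma * eps + 1.0)
    return mu, np.exp(log_sigma)

y_obs = np.linspace(-1.0, 2.0, 20)
mu_hat, sigma_hat = svi_gaussian(y_obs)
```

Each iteration uses a single Monte Carlo sample of the ELBO gradient, which is the "stochastic" part; the same recipe scales to mini-batched data and deep models with many variational parameters.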