1 code implementation • 20 Jan 2023 • Jed A. Duersch
When the basis spans univariate quadratics in each parameter, feasible densities are Gaussian and the projective integral updates yield quasi-Newton variational Bayes (QNVB).
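The Gaussian claim follows from completing the square: a log-density built from univariate quadratics in each parameter exponentiates, after normalization, to a product of independent Gaussians. A sketch of this standard identity, in our own notation rather than the paper's:

\[
q(\theta) \propto \exp\!\Big(\sum_i a_i \theta_i^2 + b_i \theta_i\Big),\quad a_i < 0
\;\Longrightarrow\;
q(\theta) = \prod_i \mathcal{N}\!\big(\theta_i \,\big|\, \mu_i, \sigma_i^2\big),
\quad
\mu_i = -\frac{b_i}{2a_i},\quad \sigma_i^2 = -\frac{1}{2a_i}.
\]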
no code implementations • 27 Apr 2022 • Niladri Das, Jed A. Duersch, Thomas A. Catanach
In this paper, we address the convergence of the sequential variational inference filter (VIF) through a robust variational objective and an H∞-norm based correction for a linear Gaussian system.
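The robust objective and H∞ correction are the paper's contributions and are not reproduced here; as context, for a linear Gaussian system the exact Bayes filter that a VIF approximates reduces to Kalman-style predict/update steps. A minimal sketch, with all variable names ours:

```python
import numpy as np

def predict(mu, P, A, Q):
    # Propagate the Gaussian belief N(mu, P) through x' = A x + w, w ~ N(0, Q).
    return A @ mu, A @ P @ A.T + Q

def update(mu, P, H, R, y):
    # Condition the belief on an observation y = H x + v, v ~ N(0, R).
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # gain
    return mu + K @ (y - H @ mu), (np.eye(len(mu)) - K @ H) @ P
```

Running predict and update in sequence over a measurement stream produces the filtering distribution whose convergence behavior the paper's correction is designed to stabilize.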
no code implementations • 16 Mar 2022 • Jed A. Duersch, Thomas A. Catanach, Niladri Das
Further, we represent belief tables using a basis that directly associates the number of nonzero parameters with the effective arity of the belief function, capturing a concrete relationship between logical complexity and efficient parameter representations.
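One concrete basis with this property, offered here as an illustration rather than the paper's construction, is the parity (Walsh-Hadamard) basis for functions on binary variables: coefficients are indexed by variable subsets, so if every nonzero coefficient touches at most k variables, the belief function has effective arity k.

```python
import itertools
import numpy as np

def parity_coefficients(table):
    """Expand f: {0,1}^n -> R in the parity basis chi_S(x) = (-1)^(sum_{i in S} x_i).

    If every nonzero coefficient is indexed by a subset of size <= k, then f
    involves at most k-way interactions (effective arity k).
    """
    n = int(np.log2(len(table)))
    coeffs = {}
    subsets = itertools.chain.from_iterable(
        itertools.combinations(range(n), r) for r in range(n + 1))
    for S in subsets:
        total = sum(((-1) ** sum((x >> i) & 1 for i in S)) * table[x]
                    for x in range(2 ** n))
        coeffs[S] = total / 2 ** n
    return coeffs

# AND of variables 0 and 1, embedded among three variables: only subsets of
# {0, 1} receive nonzero coefficients, certifying effective arity 2.
table = [float((x & 1) and ((x >> 1) & 1)) for x in range(8)]
print({S: c for S, c in parity_coefficients(table).items() if abs(c) > 1e-12})
```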
no code implementations • 3 Mar 2021 • Jed A. Duersch, Thomas A. Catanach
Bayesian inference provides a uniquely rigorous approach to obtaining principled justification for uncertainty in predictions. Yet it is difficult to articulate suitably general prior beliefs in the machine learning context, where computational architectures are pure abstractions, subject to frequent modification by practitioners attempting to improve results.
no code implementations • 21 Nov 2019 • Jed A. Duersch, Thomas A. Catanach
Rather than simply gauging uncertainty, information in this theory measures change in belief.
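One standard way to make "change in belief" quantitative, stated in our notation rather than the paper's axiomatic development, is the Kullback-Leibler divergence from the prior to the posterior induced by evidence \(y\):

\[
I(y) \;=\; D_{\mathrm{KL}}\!\left(p(\theta \mid y)\,\middle\|\,p(\theta)\right)
\;=\; \int p(\theta \mid y) \,\log \frac{p(\theta \mid y)}{p(\theta)} \, d\theta,
\]

which vanishes exactly when the evidence leaves the belief unchanged.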
no code implementations • 22 Aug 2018 • David Hong, Tamara G. Kolda, Jed A. Duersch
Tensor decomposition is a fundamental unsupervised machine learning method in data science, with applications including network analysis and sensor data processing.
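As a concrete instance of this decomposition class, a rank-R canonical polyadic (CP) factorization of a 3-way tensor can be fit by alternating least squares. A minimal sketch assuming a dense numpy tensor; variable names are ours, and this is not the paper's generalized algorithm:

```python
import numpy as np

def cp_als(X, rank, iters=100):
    # Fit X[i, j, k] ~= sum_r A[i, r] * B[j, r] * C[k, r] by alternating
    # least squares over the three factor matrices.
    I, J, K = X.shape
    rng = np.random.default_rng(0)
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    X0 = X.reshape(I, J * K)                      # mode-0 unfolding
    X1 = np.moveaxis(X, 1, 0).reshape(J, I * K)   # mode-1 unfolding
    X2 = np.moveaxis(X, 2, 0).reshape(K, I * J)   # mode-2 unfolding
    kr = lambda U, V: (U[:, None, :] * V[None, :, :]).reshape(-1, rank)
    for _ in range(iters):
        A = X0 @ np.linalg.pinv(kr(B, C).T)       # least-squares solve of X0 ~= A @ kr(B, C).T
        B = X1 @ np.linalg.pinv(kr(A, C).T)
        C = X2 @ np.linalg.pinv(kr(A, B).T)
    return A, B, C
```

The loss here is implicit least squares; generalized variants of CP swap in other statistically motivated losses while keeping the same low-rank structure.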