Search Results for author: Hong Ge

Found 14 papers, 7 papers with code

Beyond Intuition, a Framework for Applying GPs to Real-World Data

1 code implementation • 6 Jul 2023 • Kenza Tazi, Jihao Andreas Lin, Ross Viljoen, Alex Gardner, ST John, Hong Ge, Richard E. Turner

Gaussian Processes (GPs) offer an attractive method for regression over small, structured and correlated datasets.

Gaussian Processes • regression
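
To ground the abstract, here is a minimal GP regression sketch in Python/numpy, assuming an RBF kernel and a fixed noise level (a generic textbook construction, not the framework proposed in the paper):

    import numpy as np

    def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
        # Squared-exponential kernel: k(x, x') = s^2 exp(-||x - x'||^2 / (2 l^2)).
        d2 = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2 * X1 @ X2.T
        return variance * np.exp(-0.5 * d2 / lengthscale**2)

    def gp_posterior(X, y, Xs, noise=1e-2):
        # Standard GP regression: posterior mean and covariance at test inputs Xs.
        K = rbf_kernel(X, X) + noise * np.eye(len(X))
        Ks, Kss = rbf_kernel(X, Xs), rbf_kernel(Xs, Xs)
        L = np.linalg.cholesky(K)  # Cholesky factorisation for numerical stability
        alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
        V = np.linalg.solve(L, Ks)
        return Ks.T @ alpha, Kss - V.T @ V

    X = np.linspace(0, 5, 10)[:, None]  # small, correlated 1-D dataset
    mean, cov = gp_posterior(X, np.sin(X).ravel(), np.linspace(0, 5, 50)[:, None])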

Bayesian inference and neural estimation of acoustic wave propagation

no code implementations • 28 May 2023 • Yongchao Huang, Yuhang He, Hong Ge

In this work, we introduce a novel framework that combines physics-based and machine-learning methods to analyse acoustic signals.

Bayesian Inference • Room Impulse Response (RIR)

Neural Characteristic Activation Value Analysis for Improved ReLU Network Feature Learning

1 code implementation • 25 May 2023 • Wenlin Chen, Hong Ge

This work examines the characteristic activation values of individual ReLU units in neural networks.
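
As a rough illustration of the idea (toy sizes, not the paper's analysis): a ReLU unit relu(w @ x + b) switches on across the hyperplane {x : w @ x + b = 0}, so the location of that hyperplane characterises where the unit is active in input space.

    import numpy as np

    # For each unit relu(W[i] @ x + b[i]), the signed distance of its
    # activation boundary from the origin is -b[i] / ||W[i]||. Sizes here
    # are made up for illustration.
    rng = np.random.default_rng(0)
    W = rng.normal(size=(16, 2))  # 16 ReLU units on 2-D inputs
    b = rng.normal(size=16)

    distances = -b / np.linalg.norm(W, axis=1)
    print("activation-boundary distances per unit:", np.round(distances, 2))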

Understanding Sparse Feature Updates in Deep Networks using Iterative Linearisation

no code implementations • 22 Nov 2022 • Adrian Goldwaser, Hong Ge

These theoretical tools cannot, however, fully explain finite networks: the empirical kernel changes significantly during gradient-descent-based training, in contrast to infinite networks.

Gaussian Processes
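
A toy numpy sketch of the phenomenon described above: the empirical (neural tangent) kernel of a small finite network, recomputed after plain gradient-descent training, drifts away from its value at initialisation. The network size, data, and learning rate are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)
    h = 64  # hidden width (illustrative)
    w, c, v = rng.normal(size=h), rng.normal(size=h), rng.normal(size=h) / np.sqrt(h)

    def jac(x):
        # Parameter gradient of f(x) = v @ relu(w*x + c) for scalar input x.
        pre = w * x + c
        act = np.maximum(pre, 0.0)
        gate = (pre > 0).astype(float)
        return np.concatenate([v * gate * x, v * gate, act])  # d/dw, d/dc, d/dv

    def ntk(xs):
        # Empirical NTK Gram matrix: K[i, j] = <J(x_i), J(x_j)>.
        J = np.stack([jac(x) for x in xs])
        return J @ J.T

    xs = np.linspace(-1.0, 1.0, 5)
    ys = np.sin(3.0 * xs)
    K0 = ntk(xs)

    lr = 0.01
    for _ in range(300):  # plain per-example gradient descent on squared error
        for x, y in zip(xs, ys):
            pre = w * x + c
            act = np.maximum(pre, 0.0)
            gate = (pre > 0).astype(float)
            err = v @ act - y
            w -= lr * err * v * gate * x
            c -= lr * err * v * gate
            v -= lr * err * act

    print("relative change in empirical NTK:",
          np.linalg.norm(ntk(xs) - K0) / np.linalg.norm(K0))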

Numerically Stable Sparse Gaussian Processes via Minimum Separation using Cover Trees

1 code implementation • 14 Oct 2022 • Alexander Terenin, David R. Burt, Artem Artemev, Seth Flaxman, Mark van der Wilk, Carl Edward Rasmussen, Hong Ge

For low-dimensional tasks such as geospatial modeling, we propose an automated method for computing inducing points satisfying these conditions.

Bayesian Optimization • Decision Making • +1
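
A naive sketch of the minimum-separation condition, using a greedy scan as a stand-in for the paper's cover-tree construction (which enforces the same condition far more efficiently):

    import numpy as np

    def min_separation_inducing_points(X, spacing):
        # Greedily keep points at least `spacing` away from every point
        # kept so far. A naive O(n * m) stand-in for the cover-tree
        # construction, which achieves the same guarantee efficiently.
        Z = [X[0]]
        for x in X[1:]:
            if min(np.linalg.norm(x - z) for z in Z) >= spacing:
                Z.append(x)
        return np.array(Z)

    X = np.random.default_rng(0).uniform(size=(500, 2))  # toy geospatial inputs
    Z = min_separation_inducing_points(X, spacing=0.1)
    print(len(Z), "inducing points with pairwise separation >= 0.1")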

DynamicPPL: Stan-like Speed for Dynamic Probabilistic Models

2 code implementations • 7 Feb 2020 • Mohamed Tarek, Kai Xu, Martin Trapp, Hong Ge, Zoubin Ghahramani

Since DynamicPPL is a modular, stand-alone library, any probabilistic programming system written in Julia, such as Turing.jl, can use DynamicPPL to specify models and trace their model parameters.

Probabilistic Programming
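
A toy Python sketch of what "tracing model parameters" means in a dynamic probabilistic program; this illustrates the concept only and is not DynamicPPL's Julia API:

    import math, random

    # Running the model records every random variable's name, value, and
    # log density in a trace, which inference code can then manipulate.

    def normal_logpdf(x, mu, sigma):
        return -0.5 * math.log(2 * math.pi * sigma**2) - (x - mu)**2 / (2 * sigma**2)

    def sample(trace, name, mu, sigma):
        value = random.gauss(mu, sigma)
        trace[name] = (value, normal_logpdf(value, mu, sigma))
        return value

    def model(trace, data):
        m = sample(trace, "m", 0.0, 1.0)  # latent mean with a standard normal prior
        for i, y in enumerate(data):
            # Observed variables are scored rather than sampled.
            trace[f"y{i}"] = (y, normal_logpdf(y, m, 1.0))
        return trace

    trace = model({}, data=[0.8, 1.1, 0.9])
    joint_logp = sum(lp for _, lp in trace.values())  # joint log density from the trace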

AdvancedHMC.jl: A robust, modular and efficient implementation of advanced HMC algorithms

1 code implementation • AABI Symposium 2019 • Kai Xu, Hong Ge, Will Tebbutt, Mohamed Tarek, Martin Trapp, Zoubin Ghahramani

Stan's Hamiltonian Monte Carlo (HMC) has demonstrated remarkable sampling robustness and efficiency in a wide range of Bayesian inference problems, through carefully crafted adaptation schemes for the celebrated No-U-Turn sampler (NUTS) algorithm.

Bayesian Inference • Benchmarking
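
For orientation, a bare-bones HMC transition in Python/numpy: the textbook leapfrog integrator followed by a Metropolis correction, without the adaptation schemes (step size, mass matrix, NUTS) that AdvancedHMC.jl actually provides:

    import numpy as np

    def hmc_step(q, logp, grad_logp, step=0.1, n_leapfrog=20, rng=np.random.default_rng(0)):
        # One HMC transition: resample momentum, simulate Hamiltonian
        # dynamics with the leapfrog integrator, then Metropolis-accept.
        p = rng.normal(size=q.shape)
        qn, pn = q.copy(), p.copy()
        pn += 0.5 * step * grad_logp(qn)  # initial half step for momentum
        for _ in range(n_leapfrog - 1):
            qn += step * pn
            pn += step * grad_logp(qn)
        qn += step * pn
        pn += 0.5 * step * grad_logp(qn)  # final half step for momentum
        log_accept = (logp(qn) - 0.5 * pn @ pn) - (logp(q) - 0.5 * p @ p)
        return qn if np.log(rng.uniform()) < log_accept else q

    # Smoke test: sample a standard 2-D Gaussian.
    logp = lambda q: -0.5 * q @ q
    grad_logp = lambda q: -q
    q, draws = np.zeros(2), []
    for _ in range(1000):
        q = hmc_step(q, logp, grad_logp)
        draws.append(q)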

Graph Tracking in Dynamic Probabilistic Programs via Source Transformations

no code implementations • AABI Symposium 2019 • Philipp Gabler, Martin Trapp, Hong Ge, Franz Pernkopf

Many modern machine learning algorithms, such as automatic differentiation (AD) and versions of approximate Bayesian inference, can be understood as a particular case of message passing on some computation graph.

BIG-bench Machine Learning • Probabilistic Programming
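
A minimal reverse-mode AD sketch of this message-passing view, with illustrative class and function names: each node receives adjoint messages from its consumers and forwards locally weighted messages to its parents.

    # Each Node stores the adjoint message it has accumulated so far.
    class Node:
        def __init__(self, value, parents=(), grads=()):
            self.value, self.parents, self.grads = value, parents, grads
            self.adjoint = 0.0

    def mul(a, b):
        return Node(a.value * b.value, (a, b), (b.value, a.value))

    def add(a, b):
        return Node(a.value + b.value, (a, b), (1.0, 1.0))

    def backward(out):
        # Propagate adjoint messages from the output towards the leaves.
        # For simplicity this assumes a node is popped only after all its
        # consumers have sent their messages (true for this small graph);
        # a real implementation walks the graph in reverse topological order.
        out.adjoint = 1.0
        stack = [out]
        while stack:
            node = stack.pop()
            for parent, local_grad in zip(node.parents, node.grads):
                parent.adjoint += node.adjoint * local_grad
                stack.append(parent)

    x, y = Node(2.0), Node(3.0)
    z = add(mul(x, y), x)        # z = x*y + x
    backward(z)
    print(x.adjoint, y.adjoint)  # dz/dx = y + 1 = 4.0, dz/dy = x = 2.0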

Bayesian Learning of Sum-Product Networks

1 code implementation • NeurIPS 2019 • Martin Trapp, Robert Peharz, Hong Ge, Franz Pernkopf, Zoubin Ghahramani

While parameter learning in SPNs is well developed, structure learning leaves something to be desired: even though there is a plethora of SPN structure learners, most of them are somewhat ad hoc and based on intuition rather than a clear learning principle.
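
For context, a toy SPN in Python showing what such a network computes once a structure is fixed (Bernoulli leaves, made-up weights); the paper's contribution is learning the structure in a Bayesian way, which this sketch does not attempt:

    import math

    def leaf(var, p):
        # Bernoulli leaf over one variable.
        return lambda x: p if x[var] == 1 else 1 - p

    def product(*children):
        # Product node: factorises over disjoint variable scopes.
        return lambda x: math.prod(c(x) for c in children)

    def sum_node(weights, *children):
        # Sum node: weighted mixture of children over the same scope.
        return lambda x: sum(w * c(x) for w, c in zip(weights, children))

    # A two-component mixture over binary variables X0, X1.
    root = sum_node([0.6, 0.4],
                    product(leaf(0, 0.9), leaf(1, 0.2)),
                    product(leaf(0, 0.1), leaf(1, 0.7)))

    print(root({0: 1, 1: 0}))  # joint probability P(X0=1, X1=0) = 0.444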

Particle Gibbs for Infinite Hidden Markov Models

no code implementations • NeurIPS 2015 • Nilesh Tripuraneni, Shixiang (Shane) Gu, Hong Ge, Zoubin Ghahramani

Infinite Hidden Markov Models (iHMMs) are an attractive, nonparametric generalization of the classical Hidden Markov Model that can automatically infer the number of hidden states in the system.

Dirichlet Fragmentation Processes

no code implementations • 16 Sep 2015 • Hong Ge, Yarin Gal, Zoubin Ghahramani

In this paper, we first review the theory of random fragmentation processes [Bertoin, 2006] and a number of existing methods for modelling trees, including the popular nested Chinese restaurant process (nCRP).

Clustering
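
A small sketch of the nCRP mentioned above, assuming a fixed tree depth and concentration parameter gamma: each data point descends the tree, running a Chinese restaurant process over the children at every level.

    import random

    def crp_choice(counts, gamma, rng):
        # CRP: existing table i with prob counts[i]/(n + gamma),
        # a new table with prob gamma/(n + gamma).
        r = rng.uniform(0, sum(counts) + gamma)
        for i, count in enumerate(counts):
            r -= count
            if r < 0:
                return i
        return len(counts)  # open a new table

    def ncrp_path(tree, depth, gamma, rng):
        # Descend `depth` levels, running a CRP over the children of the
        # current node at each level, so paths cluster hierarchically.
        path, node = [], tree
        for _ in range(depth):
            k = crp_choice([child["n"] for child in node["children"]], gamma, rng)
            if k == len(node["children"]):
                node["children"].append({"n": 0, "children": []})
            node = node["children"][k]
            node["n"] += 1
            path.append(k)
        return path

    rng = random.Random(0)
    tree = {"n": 0, "children": []}
    paths = [ncrp_path(tree, depth=3, gamma=1.0, rng=rng) for _ in range(10)]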

A Linear-Time Particle Gibbs Sampler for Infinite Hidden Markov Models

no code implementations • 3 May 2015 • Nilesh Tripuraneni, Shane Gu, Hong Ge, Zoubin Ghahramani

Infinite Hidden Markov Models (iHMMs) are an attractive, nonparametric generalization of the classical Hidden Markov Model that can automatically infer the number of hidden states in the system.
