no code implementations • 12 Feb 2024 • Peter Orbanz
We also explain connections to the Hunt-Stein theorem on invariant tests.
no code implementations • 8 Jun 2023 • Ryan P. Adams, Peter Orbanz
The linear representation generalizes the Fourier basis to crystallographically invariant basis functions.
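The idea of symmetrizing Fourier-type basis functions can be illustrated with a minimal sketch (not the paper's construction): averaging a plane wave over a point group, here the 4-fold rotation group of the square lattice, yields a function invariant under that group.

```python
import math

def rot90(x, y, times):
    """Rotate the point (x, y) by 90 degrees `times` times."""
    for _ in range(times % 4):
        x, y = -y, x
    return x, y

def plane_wave(kx, ky, x, y):
    """A standard Fourier basis function on the plane."""
    return math.cos(kx * x + ky * y)

def symmetrized(kx, ky, x, y):
    """Group-average the plane wave over the cyclic rotation group C4."""
    return sum(plane_wave(kx, ky, *rot90(x, y, t)) for t in range(4)) / 4.0

# Invariance check: the symmetrized function takes the same value at a
# point and at its image under a 90-degree rotation.
x, y = 0.3, 0.7
a = symmetrized(1.0, 2.0, x, y)
b = symmetrized(1.0, 2.0, *rot90(x, y, 1))
```

The invariance follows because rotating the argument only permutes the terms of the group average; the crystallographic case additionally handles translations and reflections.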
1 code implementation • 18 Feb 2022 • Kevin Han Huang, Peter Orbanz, Morgane Austern
We provide results that exactly quantify how data augmentation affects the variance and limiting distribution of estimates, and analyze several specific models in detail.
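The variance effect can be seen in a toy sketch (my illustration, not the paper's analysis): for data whose distribution is invariant under sign flips, averaging a statistic over the orbit {x, -x} leaves the estimate's mean unchanged while shrinking its per-sample variance.

```python
import statistics

def f(x):
    # Test statistic with both a symmetric (x**2) and antisymmetric (x) part
    return x + x**2

# A sign-symmetric sample: the distribution of X matches that of -X.
data = [-2.0, -1.0, -0.5, 0.5, 1.0, 2.0]

# Plain estimator: average f over the raw sample.
plain = statistics.fmean(f(x) for x in data)

# Augmented estimator: average f over each point's orbit {x, -x} first.
augmented = statistics.fmean(0.5 * (f(x) + f(-x)) for x in data)

# Orbit averaging projects out the antisymmetric part of f, so the
# per-sample variance cannot increase.
var_plain = statistics.variance([f(x) for x in data])
var_aug = statistics.variance([0.5 * (f(x) + f(-x)) for x in data])
```

Here both estimators equal 1.75, but the augmented one has strictly smaller sample variance (3.15 vs 5.25), matching the intuition that augmentation acts as a variance-reducing projection.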
1 code implementation • 27 Jun 2018 • Victor Veitch, Morgane Austern, Wenda Zhou, David M. Blei, Peter Orbanz
We solve this problem using recent ideas from graph sampling theory to (i) define an empirical risk for relational data and (ii) obtain stochastic gradients for this empirical risk that are automatically unbiased.
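A simplified stand-in for the idea (the paper's scheme uses specific graph-sampling designs; this sketch uses plain uniform edge sampling) shows how subsampling a relational dataset yields an unbiased estimate of an empirical risk defined as a mean over edges.

```python
import random

# Toy graph: each edge carries a loss; the full empirical risk is the
# mean per-edge loss. The loss values here are arbitrary stand-ins.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
loss = {e: float(i + 1) for i, e in enumerate(edges)}

full_risk = sum(loss.values()) / len(edges)  # = 3.0

def minibatch_risk(batch_size, rng):
    """Sample edges uniformly with replacement; the sample mean is an
    unbiased estimate of the full empirical risk."""
    batch = [rng.choice(edges) for _ in range(batch_size)]
    return sum(loss[e] for e in batch) / batch_size

# Monte Carlo check of unbiasedness: the average of many minibatch
# estimates concentrates around the full risk.
rng = random.Random(0)
estimates = [minibatch_risk(2, rng) for _ in range(20000)]
avg = sum(estimates) / len(estimates)
```

Gradients inherit the same property: the gradient of an unbiased risk estimate is an unbiased stochastic gradient, which is what makes minibatch training on relational data sound.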
1 code implementation • ICLR 2019 • Wenda Zhou, Victor Veitch, Morgane Austern, Ryan P. Adams, Peter Orbanz
Our main technical result is a generalization bound for compressed networks based on the compressed size.
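To convey the flavor of a bound driven by compressed size, here is a textbook Occam-style finite-class bound (a standard result, not the paper's sharper theorem): a hypothesis describable in c bits belongs to a class of at most 2^c hypotheses, so Hoeffding's inequality plus a union bound controls the generalization gap in terms of c.

```python
import math

def compression_bound(emp_risk, bits, n, delta=0.05):
    """With probability >= 1 - delta over n i.i.d. samples,
        risk <= emp_risk + sqrt((bits * ln 2 + ln(1/delta)) / (2 n))
    for any hypothesis encoded in `bits` bits fixed before seeing the data.
    """
    return emp_risk + math.sqrt(
        (bits * math.log(2) + math.log(1.0 / delta)) / (2.0 * n)
    )

# Shrinking the compressed size tightens the bound at fixed empirical risk.
loose = compression_bound(emp_risk=0.10, bits=8000, n=50000)
tight = compression_bound(emp_risk=0.10, bits=1000, n=50000)
```

This is why compressibility matters: the bound depends on the size of the compressed network, not on the (much larger) raw parameter count.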
1 code implementation • 19 Dec 2016 • Benjamin Bloem-Reddy, Peter Orbanz
We introduce a class of generative network models that insert edges by connecting the starting and terminal vertices of a random walk on the network graph.
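A minimal simulation sketch of the edge-insertion mechanism (the vertex-growth rule here is an assumption for illustration, not the authors' specification): at each step the graph either attaches a fresh vertex, or runs a short random walk and connects its starting and terminal vertices.

```python
import random

def grow(n_steps, walk_len, p_new, rng):
    """Grow an undirected graph, stored as an adjacency dict of sets."""
    adj = {0: {1}, 1: {0}}  # seed graph: a single edge
    for _ in range(n_steps):
        if rng.random() < p_new:
            # Assumed growth rule: attach a fresh vertex to a uniform
            # existing vertex (a stand-in, not the paper's rule).
            u = len(adj)
            v = rng.randrange(u)
            adj[u] = set()
        else:
            # Random-walk edge: walk `walk_len` uniform-neighbour steps
            # from a uniform start, then connect start and terminus.
            u = rng.randrange(len(adj))
            v = u
            for _ in range(walk_len):
                v = rng.choice(sorted(adj[v]))
        if u != v:  # skip the self-loop when the walk returns to its start
            adj[u].add(v)
            adj[v].add(u)
    return adj

g = grow(n_steps=200, walk_len=3, p_new=0.2, rng=random.Random(0))
```

Short walks tend to close triangles around the starting vertex, which is one way such models produce the clustering seen in real networks.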
no code implementations • 30 Dec 2013 • Peter Orbanz, Daniel M. Roy
The natural habitat of most Bayesian methods is data represented by exchangeable sequences of observations, for which de Finetti's theorem provides the theoretical foundation.
no code implementations • NeurIPS 2009 • Peter Orbanz
We consider the general problem of constructing nonparametric Bayesian models on infinite-dimensional random objects, such as functions, infinite graphs or infinite permutations.