no code implementations • 8 Mar 2023 • Qizhao Chen, Morgane Austern, Vasilis Syrgkanis
Estimating optimal dynamic policies from offline data is a fundamental problem in dynamic decision making.
no code implementations • 3 Jun 2022 • Qizhao Chen, Vasilis Syrgkanis, Morgane Austern
For instance, we show that the stability properties we propose are satisfied by bagged ensemble estimators built via sub-sampling without replacement, a popular technique in machine learning practice.
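For concreteness, here is a minimal sketch of the kind of estimator referenced here: a bagged ensemble formed by averaging a base estimator over subsamples drawn without replacement. The function name and defaults are illustrative choices, not taken from the paper.

```python
import numpy as np

def bagged_estimate(data, base_estimator, n_bags=50, subsample_frac=0.5, seed=None):
    """Average a base estimator over subsamples drawn without replacement.

    Generic illustration of sub-sampled bagging; the paper's stability
    analysis concerns estimators of this form, but this code is not from it.
    """
    rng = np.random.default_rng(seed)
    n = len(data)
    m = max(1, int(subsample_frac * n))
    estimates = [base_estimator(data[rng.choice(n, size=m, replace=False)])
                 for _ in range(n_bags)]
    return np.mean(estimates, axis=0)

# Example: a bagged mean of a heavy-tailed sample.
x = np.random.default_rng(0).standard_t(df=3, size=1000)
print(bagged_estimate(x, np.mean, seed=1))
```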
no code implementations • 18 Feb 2022 • Kevin H. Huang, Peter Orbanz, Morgane Austern
We provide results that exactly quantify how data augmentation affects the convergence rate and variance of estimates.
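As a toy illustration of the setting (not the paper's analysis), the sketch below compares a plain Monte Carlo estimate of E[X₁²] with one that averages the statistic over random rotations of each sample. When the data distribution is rotation invariant, the augmented estimator remains unbiased but its variance changes, which is the kind of effect such results quantify.

```python
import numpy as np

def estimate(x, n_aug=0, seed=None):
    """Estimate E[X_1^2], optionally averaging over random rotations
    of each 2-D sample (a simple form of data augmentation)."""
    rng = np.random.default_rng(seed)
    if n_aug == 0:
        return np.mean(x[:, 0] ** 2)
    theta = rng.uniform(0, 2 * np.pi, size=(n_aug, len(x)))
    rotated_first = np.cos(theta) * x[:, 0] - np.sin(theta) * x[:, 1]
    return np.mean(rotated_first ** 2)

rng = np.random.default_rng(0)
plain = [estimate(rng.standard_normal((200, 2))) for _ in range(500)]
aug = [estimate(rng.standard_normal((200, 2)), n_aug=50) for _ in range(500)]
print(np.var(plain), np.var(aug))  # the augmented estimator has lower variance here
```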
no code implementations • NeurIPS 2021 • Morgane Austern, Vasilis Syrgkanis
One of the most commonly used methods for forming confidence intervals is the empirical bootstrap, which is especially expedient when the limiting distribution of the estimator is unknown.
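A minimal sketch of the empirical (percentile) bootstrap the sentence refers to; the helper name and defaults are illustrative, not from the paper.

```python
import numpy as np

def bootstrap_ci(x, statistic, n_boot=2000, alpha=0.05, seed=None):
    """Percentile bootstrap confidence interval: resample with replacement,
    recompute the statistic, and take quantiles of the bootstrap draws."""
    rng = np.random.default_rng(seed)
    n = len(x)
    boots = np.array([statistic(x[rng.integers(0, n, size=n)])
                      for _ in range(n_boot)])
    return tuple(np.quantile(boots, [alpha / 2, 1 - alpha / 2]))

# CI for the median of an exponential sample -- a case where the
# limiting distribution is awkward to work with directly.
sample = np.random.default_rng(0).exponential(size=500)
print(bootstrap_ci(sample, np.median, seed=1))
```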
2 code implementations • 6 Jul 2021 • Andrew Davison, Morgane Austern
We prove, under the assumption that the graph is exchangeable, that the distribution of the learned embedding vectors asymptotically decouples.
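For context, the exchangeability assumption is the standard graphon construction: i.i.d. uniform latent variables and conditionally independent edges. The sketch below samples such a graph; the particular graphon W is an arbitrary example, not one used in the paper.

```python
import numpy as np

def sample_exchangeable_graph(n, W, seed=None):
    """Sample an exchangeable graph: U_i ~ Uniform(0,1) i.i.d.,
    edge (i, j) present independently with probability W(U_i, U_j)."""
    rng = np.random.default_rng(seed)
    u = rng.uniform(size=n)
    p = W(u[:, None], u[None, :])
    upper = np.triu(rng.uniform(size=(n, n)) < p, k=1)
    return (upper | upper.T).astype(int)

A = sample_exchangeable_graph(200, lambda a, b: 0.8 * np.minimum(a, b), seed=0)
print(A.sum() // 2, "edges")
```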
no code implementations • 23 Nov 2020 • Morgane Austern, Vasilis Syrgkanis
One of the most commonly used methods for forming confidence intervals in statistical inference is the empirical bootstrap, which is especially expedient when the limiting distribution of the estimator is unknown.
1 code implementation • 27 Jun 2018 • Victor Veitch, Morgane Austern, Wenda Zhou, David M. Blei, Peter Orbanz
We solve this problem using recent ideas from graph sampling theory to (i) define an empirical risk for relational data and (ii) obtain stochastic gradients for this empirical risk that are automatically unbiased.
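A schematic sketch of that two-step idea, under simplifying assumptions: the sampling scheme here is uniform vertex sampling and `link_loss` is a placeholder loss, neither taken from the paper. Because the empirical risk is defined as the expected loss over sampled induced subgraphs, the loss on a single sampled subgraph, and hence its gradient, is unbiased by construction.

```python
import numpy as np

def link_loss(sub_adj, emb):
    """Placeholder per-subgraph loss: reconstruct edges from inner
    products of node embeddings (illustrative, not the paper's loss)."""
    return np.mean((sub_adj - emb @ emb.T) ** 2)

def stochastic_risk(adj, emb, k, seed=None):
    """One unbiased Monte Carlo estimate of the sampling-defined
    empirical risk: the loss on a uniformly sampled induced subgraph."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(adj.shape[0], size=k, replace=False)
    return link_loss(adj[np.ix_(idx, idx)], emb[idx])

# Averaging many such estimates recovers the empirical risk; in training,
# one would differentiate a single estimate w.r.t. emb for an SGD step.
rng = np.random.default_rng(0)
adj = (rng.uniform(size=(100, 100)) < 0.1).astype(float)
adj = np.triu(adj, 1); adj = adj + adj.T
emb = 0.1 * rng.standard_normal((100, 8))
print(np.mean([stochastic_risk(adj, emb, k=20) for _ in range(200)]))
```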
1 code implementation • ICLR 2019 • Wenda Zhou, Victor Veitch, Morgane Austern, Ryan P. Adams, Peter Orbanz
Our main technical result is a generalization bound for compressed networks based on the compressed size.
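To make the flavour of such a result concrete, here is a much simpler Occam-style bound driven by compressed size: if the trained network can be described by `code_bits` bits under a prefix-free code, a union bound over code words gives the guarantee computed below. This is not the paper's PAC-Bayesian bound, only a hedged stand-in showing why a smaller compressed size yields a tighter guarantee.

```python
import math

def occam_compression_bound(train_error, code_bits, n_samples, delta=0.05):
    """Occam/union bound: with probability >= 1 - delta over the training set,
    test_error <= train_error + sqrt((code_bits*ln2 + ln(1/delta)) / (2n))
    for a hypothesis described by a prefix-free code of length code_bits."""
    slack = math.sqrt((code_bits * math.log(2) + math.log(1 / delta))
                      / (2 * n_samples))
    return train_error + slack

# e.g. a network compressed to 100 KB, trained on 1.2M labelled examples:
print(occam_compression_bound(0.10, code_bits=8 * 100_000, n_samples=1_200_000))
```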