no code implementations • 18 Jun 2019 • Bolton Bailey, Ziwei Ji, Matus Telgarsky, Ruicheng Xian
This paper investigates the approximation power of three types of random neural networks: (a) infinite width networks, with weights following an arbitrary distribution; (b) finite width networks obtained by subsampling the preceding infinite width networks; (c) finite width networks obtained by starting with standard Gaussian initialization, and then adding a vanishingly small correction to the weights.
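To make the three constructions concrete, here is a minimal sketch (not the paper's actual construction) of a random-features network: hidden weights are drawn i.i.d. from some distribution (Gaussian here, standing in for the "arbitrary distribution" of case (a)), and a narrower network of case (b) is obtained by subsampling rows from the same weight pool. The ReLU activation, dimensions, and output head are illustrative assumptions.

```python
# Sketch: f(x) = (1/m) * sum_i a_i * relu(w_i . x), with w_i i.i.d. random.
import numpy as np

rng = np.random.default_rng(0)

def random_features_net(x, weights, head):
    """Finite-width ReLU network with fixed random hidden weights."""
    hidden = np.maximum(weights @ x, 0.0)   # random ReLU features
    return hidden @ head / len(head)        # averaged linear output head

d, m = 5, 10_000                            # input dim, width (hypothetical)
weights = rng.standard_normal((m, d))       # (a) weight distribution; Gaussian here
head = rng.standard_normal(m)               # hypothetical output weights

x = rng.standard_normal(d)
print(random_features_net(x, weights, head))

# (b) subsample a narrower network from the same random weight pool:
idx = rng.choice(m, size=100, replace=False)
print(random_features_net(x, weights[idx], head[idx]))
```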
no code implementations • 8 Jun 2019 • Yu-cheng Chen, Matus Telgarsky, Chao Zhang, Bolton Bailey, Daniel Hsu, Jian Peng
This paper provides a simple procedure for fitting generative networks to target distributions, with the goal of achieving a small Wasserstein distance (or other optimal transport cost).
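As a toy illustration of the objective (not the paper's procedure), the sketch below fits a one-dimensional "generator" g(z) = a*z + b, applied to Gaussian noise, by directly minimizing the empirical 1-Wasserstein distance to samples from a hypothetical target distribution; the linear generator form, sample sizes, and optimizer choice are all assumptions made for brevity.

```python
# Sketch: minimize W1( g(noise), target ) over generator parameters (a, b).
import numpy as np
from scipy.optimize import minimize
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
z = rng.standard_normal(2000)                        # generator input noise
target = rng.normal(loc=3.0, scale=0.5, size=2000)   # hypothetical target samples

def w1_loss(params):
    a, b = params
    return wasserstein_distance(a * z + b, target)   # empirical 1-Wasserstein

res = minimize(w1_loss, x0=[1.0, 0.0], method="Nelder-Mead")
print(res.x, w1_loss(res.x))   # parameters should land near (0.5, 3.0), up to sign
```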
no code implementations • NeurIPS 2018 • Bolton Bailey, Matus Telgarsky
This paper investigates the ability of generative networks to convert their input noise distributions into other distributions.
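A minimal sketch of the phenomenon under study (not the paper's analysis): a generator is, at bottom, a deterministic map pushing an input noise distribution forward onto another distribution. The classical inverse-CDF transform below, mapping uniform noise onto an Exponential(1) distribution, is an illustrative stand-in for such a map.

```python
# Sketch: push Uniform(0,1) noise forward onto Exponential(1) via inverse CDF.
import numpy as np

rng = np.random.default_rng(0)
u = rng.uniform(size=10_000)     # input noise: Uniform(0, 1)
samples = -np.log1p(-u)          # pushforward map: inverse CDF of Exp(1)

# sanity check: the mean and variance of Exp(1) are both 1
print(samples.mean(), samples.var())
```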