KernelNet: A Data-Dependent Kernel Parameterization for Deep Generative Modeling

Learning with kernels is a fundamental concept in machine learning. Standard kernel methods typically rely on predefined kernels that require careful hyperparameter selection. To mitigate this burden, we propose a framework to construct and learn a data-dependent kernel based on random features, with an implicit spectral distribution parameterized by a deep neural network. The resulting network (called KernelNet) can be applied to deep generative modeling in various scenarios, including two popular learning paradigms: MMD-GAN and the implicit Variational Autoencoder (VAE). We show that the proposed kernel is well-defined in these applications and is guaranteed to be positive definite. Furthermore, the induced Maximum Mean Discrepancy (MMD) can be made continuous in the weak topology through simple regularization. Extensive experiments indicate that KernelNet consistently achieves better performance than related methods.
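The core construction can be illustrated with a minimal sketch: by Bochner's theorem, a shift-invariant positive-definite kernel corresponds to a spectral distribution over frequencies, so drawing frequencies through a neural network applied to base noise yields an implicit, learnable spectral distribution, and the kernel is then approximated by random Fourier features. The sketch below uses a fixed random two-layer map (`spectral_net`) as a stand-in for a trained deep network; all names, layer sizes, and the feature count `m` are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "spectral network": maps base noise to kernel frequencies omega.
# In the paper's setting this would be a trained deep network; here the
# weights are fixed random matrices purely for illustration.
W1 = rng.normal(size=(4, 16))
W2 = rng.normal(size=(16, 2))  # output dim = data dim d = 2 (assumed)

def spectral_net(eps):
    # eps: (m, 4) base noise -> (m, d) frequencies omega,
    # i.e. samples from an implicit spectral distribution.
    return np.tanh(eps @ W1) @ W2

def random_features(X, omega, b):
    # Random Fourier features: phi(x) = sqrt(2/m) * cos(omega . x + b),
    # so that phi(x) . phi(y) approximates the kernel k(x, y).
    m = omega.shape[0]
    return np.sqrt(2.0 / m) * np.cos(X @ omega.T + b)

def kernel(X, Y, m=2048):
    # Monte Carlo kernel estimate k(x, y) ~ E_omega[cos(omega . (x - y))].
    eps = rng.normal(size=(m, 4))
    omega = spectral_net(eps)                 # implicit spectral samples
    b = rng.uniform(0.0, 2.0 * np.pi, size=m)
    return random_features(X, omega, b) @ random_features(Y, omega, b).T
```

Because the kernel estimate is an inner product of explicit feature maps, the resulting Gram matrix is positive semidefinite by construction, matching the positive-definiteness guarantee stated in the abstract.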
