no code implementations • 5 Mar 2024 • Sahil Sidheekh, Pranuthi Tenali, Saurabh Mathur, Erik Blasch, Kristian Kersting, Sriraam Natarajan
We consider the problem of late multi-modal fusion for discriminative learning.
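Only the problem setting survives in this excerpt; as a generic illustration of late fusion (not the paper's method), per-modality classifiers can each produce class probabilities that are combined afterward. The function and weighting scheme below are illustrative assumptions:

```python
import numpy as np

def late_fusion(probs_per_modality, weights=None):
    """Combine per-modality class probabilities by weighted averaging.

    probs_per_modality: list of arrays, each of shape (n_classes,),
    one per modality (e.g. image, text). Illustrative sketch only.
    """
    probs = np.stack(probs_per_modality)          # (n_modalities, n_classes)
    if weights is None:
        weights = np.full(len(probs), 1.0 / len(probs))
    fused = np.average(probs, axis=0, weights=weights)
    return fused / fused.sum()                    # renormalize to a distribution

# Two modalities that disagree; fusion balances their votes.
image_probs = np.array([0.7, 0.2, 0.1])
text_probs = np.array([0.2, 0.6, 0.2])
fused = late_fusion([image_probs, text_probs])
```

Because fusion happens at the prediction level, each modality's model can be trained independently, which is the defining trait of the "late" fusion family.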
no code implementations • 1 Feb 2024 • Sahil Sidheekh, Sriraam Natarajan
We present a comprehensive survey of the advancements and techniques in the field of tractable probabilistic generative modeling, primarily focusing on Probabilistic Circuits (PCs).
no code implementations • 22 Mar 2022 • Sahil Sidheekh, Chris B. Dock, Tushar Jain, Radu Balan, Maneesh K. Singh
Normalizing flows provide an elegant approach to generative modeling that allows for efficient sampling and exact density evaluation of unknown data distributions.
no code implementations • 20 Jun 2021 • Aroof Aimen, Sahil Sidheekh, Narayanan C. Krishnan
Popular approaches to meta-learning either learn a generalizable initial model or a generic parametric optimizer through episodic training.
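The first family (learning an initialization) can be sketched as a first-order MAML-style loop on a toy quadratic task; everything below, including the loss and learning rates, is an illustrative stand-in rather than the paper's setup:

```python
# Schematic first-order MAML: learn an initialization w0 such that one
# inner-loop gradient step adapts well to each task. Each task t uses the
# toy loss L_t(w) = 0.5 * (w - w_t)**2, a stand-in for a real model.
def maml_init(task_targets, inner_lr=0.1, outer_lr=0.05, steps=200):
    w0 = 0.0
    for _ in range(steps):                         # outer (meta) loop
        outer_grad = 0.0
        for wt in task_targets:                    # episode over sampled tasks
            w_adapt = w0 - inner_lr * (w0 - wt)    # one inner adaptation step
            outer_grad += w_adapt - wt             # first-order meta-gradient
        w0 -= outer_lr * outer_grad / len(task_targets)
    return w0

# For this quadratic loss the learned initialization converges to the
# mean of the task optima.
targets = [-1.0, 1.0, 3.0]
w0 = maml_init(targets)
```

The second family instead parameterizes the update rule itself (e.g. with an LSTM) and meta-learns that optimizer over the same kind of episodes.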
1 code implementation • 11 May 2021 • Sahil Sidheekh, Aroof Aimen, Narayanan C. Krishnan
Finally, we experimentally validate the usefulness of the proximal duality gap for monitoring and influencing GAN training.
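As a rough illustration of what a duality-gap monitor computes (using a toy min-max game and plain gradient steps, not the paper's proximal estimator), the gap is the difference between an approximate inner maximum and inner minimum of the game value, and it vanishes at an equilibrium:

```python
def numgrad(g, t, eps=1e-5):
    """Central-difference numerical gradient of a scalar function g."""
    return (g(t + eps) - g(t - eps)) / (2 * eps)

def duality_gap(f, x, y, lr=0.1, steps=500):
    """Estimate DG(x, y) = max_y' f(x, y') - min_x' f(x', y)
    by running a few gradient ascent/descent steps from (x, y)."""
    yw = y
    for _ in range(steps):
        yw += lr * numgrad(lambda t: f(x, t), yw)   # inner max over y
    xw = x
    for _ in range(steps):
        xw -= lr * numgrad(lambda t: f(t, y), xw)   # inner min over x
    return f(x, yw) - f(xw, y)                      # >= 0; 0 at equilibrium

# Toy game f(x, y) = x^2 - y^2: at (1, 1) the gap is
# max_y f(1, y) - min_x f(x, 1) = 1 - (-1) = 2; at the equilibrium (0, 0) it is 0.
f = lambda x, y: x**2 - y**2
gap = duality_gap(f, 1.0, 1.0)
```

Tracking this quantity during training gives a scalar that decreases toward zero as the two-player game approaches equilibrium, which is what makes it usable as a monitoring signal.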
no code implementations • 21 Jan 2021 • Aroof Aimen, Sahil Sidheekh, Vineet Madan, Narayanan C. Krishnan
Our results show a rapid degradation in the performance of initialization-based meta-learning strategies (MAML, TAML, and MetaSGD), while, surprisingly, the approach that learns an optimization strategy (MetaLSTM) performs significantly better.
1 code implementation • 3 Jan 2021 • Sahil Sidheekh
The availability of large amounts of data and powerful computational resources has made deep learning models popular for text classification and sentiment analysis.
1 code implementation • 12 Dec 2020 • Sahil Sidheekh, Aroof Aimen, Vineet Madan, Narayanan C. Krishnan
Further, we show that our estimate, through its ability to identify model convergence and divergence, is a potential performance measure for tuning the hyperparameters of a GAN.