1 code implementation • 2 Jun 2023 • Andrew Jesson, Chris Lu, Gunshi Gupta, Angelos Filos, Jakob Nicolaus Foerster, Yarin Gal
We show that the additive term is bounded in proportion to the Lipschitz constant of the value function, which provides theoretical grounding for spectral normalization of critic weights.
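A minimal sketch of the idea referenced above, assuming a PyTorch critic (not the paper's code): applying spectral normalization to each linear layer bounds each layer's spectral norm and thereby constrains the critic's Lipschitz constant.

```python
# Sketch: spectral normalization of critic weights in PyTorch (illustrative,
# not the paper's implementation).
import torch
import torch.nn as nn
from torch.nn.utils import spectral_norm

class Critic(nn.Module):
    def __init__(self, obs_dim: int, hidden: int = 256):
        super().__init__()
        # spectral_norm rescales each weight matrix by its largest singular
        # value, bounding the layer's (and hence the network's) Lipschitz constant.
        self.net = nn.Sequential(
            spectral_norm(nn.Linear(obs_dim, hidden)),
            nn.ReLU(),
            spectral_norm(nn.Linear(hidden, hidden)),
            nn.ReLU(),
            spectral_norm(nn.Linear(hidden, 1)),
        )

    def forward(self, obs: torch.Tensor) -> torch.Tensor:
        return self.net(obs)

critic = Critic(obs_dim=8)
value = critic(torch.randn(4, 8))  # (batch, 1) value estimates
```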
2 code implementations • NeurIPS 2020 • Gunshi Gupta, Karmesh Yadav, Liam Paull
The continual learning problem involves training a model of limited capacity to perform well on an unknown number of sequentially arriving tasks.
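For concreteness, a toy sketch of this setting (hypothetical task stream and model, not the paper's meta-learning algorithm): a single fixed-capacity model is updated on tasks one at a time, without access to earlier tasks' data.

```python
# Illustrative continual-learning loop over a hypothetical task stream.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(10, 64), nn.ReLU(), nn.Linear(64, 2))
opt = torch.optim.SGD(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

def task_stream(num_tasks=3, samples=32):
    # Hypothetical stream: each task is a small batch of (inputs, labels).
    for _ in range(num_tasks):
        yield torch.randn(samples, 10), torch.randint(0, 2, (samples,))

for x, y in task_stream():          # tasks arrive sequentially
    for _ in range(5):              # a few gradient steps per task
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()
    # once training moves on, this task's data is no longer available
```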
1 code implementation • 23 Oct 2019 • Sanjay Thakur, Herke van Hoof, Gunshi Gupta, David Meger
PAC-Bayes is a generalized framework that is more resistant to overfitting and yields performance bounds that hold with arbitrarily high probability, even on unjustified extrapolations.
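As a hedged illustration of the kind of guarantee meant here (a standard McAllester-style PAC-Bayes bound, not necessarily the paper's exact formulation): with probability at least 1 − δ over a sample of size n, the expected risk under the posterior Q is bounded by the empirical risk plus a complexity term driven by KL(Q‖P).

```python
# Sketch: evaluating a McAllester-style PAC-Bayes bound for a diagonal-Gaussian
# posterior Q and prior P (illustrative assumptions throughout).
import math

def kl_diag_gaussians(mu_q, var_q, mu_p, var_p):
    """KL(Q || P) for diagonal Gaussians given as lists of means/variances."""
    kl = 0.0
    for mq, vq, mp, vp in zip(mu_q, var_q, mu_p, var_p):
        kl += 0.5 * (math.log(vp / vq) + (vq + (mq - mp) ** 2) / vp - 1.0)
    return kl

def pac_bayes_bound(empirical_risk, kl, n, delta=0.05):
    """Upper bound on expected risk: empirical risk + sqrt((KL + ln(2*sqrt(n)/delta)) / (2n))."""
    complexity = math.sqrt((kl + math.log(2.0 * math.sqrt(n) / delta)) / (2.0 * n))
    return empirical_risk + complexity

kl = kl_diag_gaussians([0.1, -0.2], [0.5, 0.5], [0.0, 0.0], [1.0, 1.0])
print(pac_bayes_bound(empirical_risk=0.12, kl=kl, n=1000))
```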
no code implementations • 11 Apr 2018 • Ganesh Iyer, J. Krishna Murthy, Gunshi Gupta, K. Madhava Krishna, Liam Paull
We show that, by using a noisy teacher (which could be a standard VO pipeline) and by designing a loss term that enforces geometric consistency of the trajectory, we can train accurate deep models for VO that do not require ground-truth labels.
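One way such a trajectory-consistency term can be expressed (a sketch under assumed SE(3) matrix representations, not the paper's exact loss): the composition of predicted frame-to-frame transforms over a window should agree with the transform predicted directly across that window.

```python
# Sketch: a geometric consistency penalty on composed relative poses.
import torch

def compose(poses):
    """Chain a list of 4x4 relative SE(3) transforms."""
    out = torch.eye(4)
    for T in poses:
        out = out @ T
    return out

def consistency_loss(pairwise_poses, window_pose):
    """Penalize disagreement between the composed pairwise transforms and the
    direct window-level transform (Frobenius norm for simplicity)."""
    return torch.linalg.norm(compose(pairwise_poses) - window_pose)

# Toy usage with identity transforms standing in for network outputs.
T_ab, T_bc, T_ac = torch.eye(4), torch.eye(4), torch.eye(4)
print(consistency_loss([T_ab, T_bc], T_ac))  # tensor(0.)
```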