no code implementations • 26 Nov 2022 • Chandrajit Bajaj, Omatharv Bharat Vaidya, Yi Wang
Task learning in neural networks typically requires finding a global minimizer of a loss objective.
no code implementations • 8 May 2022 • Omatharv Bharat Vaidya, Rithvik Terence DSouza, Snehanshu Saha, Soma Dhavala, Swagatam Das
We introduce the Hamiltonian Monte Carlo Particle Swarm Optimizer (HMC-PSO), an optimization algorithm that reaps the benefits of both Exponentially Averaged Momentum PSO and HMC sampling.
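The swarm side of HMC-PSO builds on exponentially averaged momentum: each particle's past velocities are smoothed into a momentum term before the usual personal-best/global-best attraction is applied. A minimal sketch of that update is below; the exact update rule, coefficient names (`w`, `c1`, `c2`, `beta`), and the way momentum enters the velocity are assumptions for illustration, not the authors' precise formulation, and the HMC sampling component is omitted.

```python
import numpy as np

def em_pso(f, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5,
           beta=0.9, seed=0):
    """Sketch of PSO with an exponentially averaged momentum term
    (assumed form; the HMC component of HMC-PSO is not shown)."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5.0, 5.0, (n_particles, dim))   # particle positions
    v = np.zeros_like(x)                             # velocities
    m = np.zeros_like(x)                             # exp.-averaged momentum
    pbest = x.copy()
    pval = np.apply_along_axis(f, 1, x)              # personal-best values
    g = pbest[pval.argmin()].copy()                  # global best position
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        m = beta * m + (1 - beta) * v                # exponential averaging
        v = w * m + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = x + v
        fx = np.apply_along_axis(f, 1, x)
        improved = fx < pval
        pbest[improved], pval[improved] = x[improved], fx[improved]
        g = pbest[pval.argmin()].copy()
    return g, float(pval.min())

# Usage: minimize the sphere function in 5 dimensions.
sphere = lambda z: float(np.sum(z * z))
best, val = em_pso(sphere, dim=5)
```

On a smooth unimodal objective like the sphere function, the swarm collapses toward the origin; the momentum term damps oscillations relative to reusing raw velocities.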
no code implementations • 10 Apr 2021 • Urvil Nileshbhai Jivani, Omatharv Bharat Vaidya, Anwesh Bhattacharya, Snehanshu Saha
This paper introduces the application of Exponentially Averaged Momentum Particle Swarm Optimization (EM-PSO) as a derivative-free optimizer for neural networks.
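The derivative-free idea can be illustrated by training a tiny network with a particle swarm instead of backpropagation: each particle is a flattened weight vector, and the loss is evaluated without gradients. The sketch below uses a plain PSO update and a hypothetical 2-2-1 network on XOR purely for illustration; it is not the paper's architecture, update rule, or hyperparameters.

```python
import numpy as np

rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])               # XOR targets

def loss(theta):
    """MSE of a 2-2-1 tanh/sigmoid net; theta is a flat 9-vector."""
    W1, b1 = theta[:4].reshape(2, 2), theta[4:6]
    W2, b2 = theta[6:8], theta[8]
    h = np.tanh(X @ W1 + b1)
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))
    return float(np.mean((out - y) ** 2))

# Plain PSO over the 9 network parameters -- no gradients used.
n, dim, iters = 40, 9, 300
x = rng.uniform(-2.0, 2.0, (n, dim))
v = np.zeros_like(x)
pbest = x.copy()
pval = np.array([loss(p) for p in x])
g = pbest[pval.argmin()].copy()
for _ in range(iters):
    r1, r2 = rng.random((2, n, dim))
    v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
    x = x + v
    fx = np.array([loss(p) for p in x])
    imp = fx < pval
    pbest[imp], pval[imp] = x[imp], fx[imp]
    g = pbest[pval.argmin()].copy()

best_loss = float(pval.min())
```

Because only loss evaluations are needed, the same loop works for non-differentiable activations or losses, which is the practical appeal of swarm-based training.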