no code implementations • 22 Dec 2023 • Yuhao Chen, Chloe Wong, Hanwen Yang, Juan Aguenza, Sai Bhujangari, Benthan Vu, Xun Lei, Amisha Prasad, Manny Fluss, Eric Phuong, Minghao Liu, Raja Kumar, Vanshika Vats, James Davis
This study critically evaluates the efficacy of prompting methods in enhancing the mathematical reasoning capability of large language models (LLMs).
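For context on what such prompting methods look like in practice, below is a minimal sketch contrasting a direct (zero-shot) prompt with a chain-of-thought prompt on a math question. The `query_llm` function and the prompt wording are hypothetical placeholders, not taken from the study.

```python
# Sketch of two prompting styles commonly compared in such evaluations.

def query_llm(prompt: str) -> str:
    """Hypothetical stand-in for a real chat-completion client."""
    raise NotImplementedError

def zero_shot(question: str) -> str:
    # Ask for the answer directly, with no intermediate reasoning.
    return query_llm(f"Question: {question}\nAnswer:")

def chain_of_thought(question: str) -> str:
    # Elicit step-by-step reasoning before the final answer.
    return query_llm(
        f"Question: {question}\nLet's think step by step, then give the final answer."
    )
```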
no code implementations • 26 Aug 2023 • Raja Kumar, Jiahao Luo, Alex Pang, James Davis
Existing methods for 3D face reconstruction from a few casually captured images employ deep-learning-based models along with a 3D Morphable Model (3DMM) as a face geometry prior.
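For background on the 3DMM prior mentioned above, here is a minimal NumPy sketch of the linear model underlying most 3DMM-based pipelines: a face mesh is a mean shape plus identity and expression bases weighted by low-dimensional coefficients, and reconstruction from images amounts to optimizing those coefficients. All array sizes and names here are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

# Illustrative sizes: N mesh vertices, K_id identity / K_exp expression components.
N, K_id, K_exp = 5000, 80, 30

mean_shape = np.zeros(3 * N)               # mean face geometry, flattened (x, y, z)
id_basis   = np.random.randn(3 * N, K_id)  # identity PCA basis (placeholder values)
exp_basis  = np.random.randn(3 * N, K_exp) # expression PCA basis (placeholder values)

def reconstruct(alpha: np.ndarray, beta: np.ndarray) -> np.ndarray:
    """Linear 3DMM: geometry = mean + id_basis @ alpha + exp_basis @ beta."""
    return (mean_shape + id_basis @ alpha + exp_basis @ beta).reshape(N, 3)

# Fitting then optimizes alpha/beta (plus pose and camera) so the projected
# mesh matches landmarks or photometric evidence in the captured images.
vertices = reconstruct(np.zeros(K_id), np.zeros(K_exp))  # the mean face
```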
1 code implementation • 30 Mar 2023 • Raja Kumar
Existing conversational models are backed by database (DB) and API-based systems.
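To make the DB/API-based setup concrete, here is a minimal sketch of a turn handler that routes a parsed intent either to a database lookup or to a stubbed external API call. The intents, table schema, and slot names are hypothetical, not from the paper.

```python
import sqlite3

def handle_turn(intent: str, slots: dict, db: sqlite3.Connection) -> str:
    """Route one dialogue turn to a DB query or an API call (illustrative)."""
    if intent == "find_restaurant":
        # DB-backed path: answer from a local table.
        row = db.execute(
            "SELECT name FROM restaurants WHERE cuisine = ? LIMIT 1",
            (slots.get("cuisine", ""),),
        ).fetchone()
        return f"How about {row[0]}?" if row else "No match found."
    if intent == "check_weather":
        # API-backed path: an external service would be called here (stubbed).
        return f"(weather API call for {slots.get('city', 'unknown')})"
    return "Sorry, I didn't understand that."

# Usage with an in-memory database:
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE restaurants (name TEXT, cuisine TEXT)")
db.execute("INSERT INTO restaurants VALUES ('Trattoria Roma', 'italian')")
print(handle_turn("find_restaurant", {"cuisine": "italian"}, db))
```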
no code implementations • 1 Jan 2021 • Tej Pratap GVSL, Raja Kumar, Pradeep NS
The Hybrid-Quantization scheme determines each layer's sensitivity to per-tensor and per-channel quantization, and thereby generates hybrid quantized models that are $10 - 20\%$ faster at inference while achieving the same or better accuracy than per-channel quantization.
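As a rough illustration of per-layer sensitivity selection, the sketch below quantizes a layer's weights both per-tensor and per-channel and keeps whichever scheme yields the lower reconstruction error. This is a hedged reading of the idea, not the paper's algorithm; the channel axis and symmetric int8 range are assumptions.

```python
import numpy as np

def quantize(w: np.ndarray, scale) -> np.ndarray:
    """Symmetric int8 quantize-dequantize with the given scale(s)."""
    return np.clip(np.round(w / scale), -127, 127) * scale

def choose_scheme(w: np.ndarray) -> str:
    """Pick per-tensor vs per-channel for one layer by quantization error.

    Assumes w has shape (out_channels, ...), i.e. channel axis 0.
    """
    s_tensor = np.abs(w).max() / 127.0
    s_channel = np.abs(w).reshape(w.shape[0], -1).max(axis=1) / 127.0
    s_channel = s_channel.reshape((-1,) + (1,) * (w.ndim - 1))
    err_tensor = np.mean((w - quantize(w, s_tensor)) ** 2)
    err_channel = np.mean((w - quantize(w, s_channel)) ** 2)
    return "per-tensor" if err_tensor <= err_channel else "per-channel"

print(choose_scheme(np.random.randn(64, 3, 3, 3)))  # e.g. 'per-channel'
```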
no code implementations • 26 Dec 2020 • Tej Pratap GVSL, Raja Kumar
Existing quantization-aware training methods attempt to compensate for quantization loss by leveraging training data, as most post-training quantization methods do, and are also time-consuming.
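To make the contrast with data-driven methods concrete, here is a minimal sketch of data-free weight quantization: scales come from weight statistics alone, so no training data or fine-tuning is required. The function and layer names are illustrative, not the paper's method.

```python
import numpy as np

def data_free_quantize(weights: dict) -> dict:
    """Quantize each layer's weights to int8 using only weight statistics."""
    out = {}
    for name, w in weights.items():
        scale = max(np.abs(w).max() / 127.0, 1e-12)  # guard against zero scale
        out[name] = (np.round(w / scale).astype(np.int8), scale)
    return out

# Usage: dequantize with q * scale at inference time.
model = {"conv1": np.random.randn(16, 3, 3, 3), "fc": np.random.randn(10, 128)}
quantized = data_free_quantize(model)
```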