no code implementations • 27 Nov 2023 • Yu-Chen Lin, Akhilesh Kumar, Norman Chang, Wenliang Zhang, Muhammad Zakir, Rucha Apte, Haiyang He, Chao Wang, Jyh-Shing Roger Jang
We present four main contributions to enhance the performance of Large Language Models (LLMs) in generating domain-specific code: (i) utilizing LLM-based data splitting and data renovation techniques to improve the semantic representation of the embedding space; (ii) introducing the LLM-driven Chain of Density for Renovation Credibility (CoDRC) and the Adaptive Text Renovation (ATR) algorithm for assessing the reliability of data renovation; (iii) developing the Implicit Knowledge Expansion and Contemplation (IKEC) prompt technique; and (iv) effectively refactoring existing scripts to generate new, high-quality scripts with LLMs.
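The abstract does not spell out the IKEC prompt's wording, so the following is only a minimal sketch of what an IKEC-style prompt might look like, assuming the technique asks the model to expand and reflect on implicit domain knowledge before emitting code. The template text, the `build_ikec_prompt` helper, and the example domain/task strings are all hypothetical.

```python
# Hypothetical sketch of an IKEC-style (Implicit Knowledge Expansion and
# Contemplation) prompt. The paper's exact prompt is not given in the
# abstract; this phrasing is illustrative only.

IKEC_TEMPLATE = """You are an expert in {domain}.
Before writing any code, silently expand on the implicit background
knowledge relevant to the task (APIs, conventions, physical units),
contemplate how that knowledge constrains the solution, and only then
produce the final script.

Task: {task}
"""

def build_ikec_prompt(domain: str, task: str) -> str:
    """Fill the IKEC-style template with a domain and a task description."""
    return IKEC_TEMPLATE.format(domain=domain, task=task)

if __name__ == "__main__":
    prompt = build_ikec_prompt(
        domain="signal-integrity simulation scripting",
        task="Generate a script that sweeps supply voltage and reports IR drop.",
    )
    print(prompt)
```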
no code implementations • 19 Jun 2023 • Rucha Apte, Sheel Nidhan, Rishikesh Ranade, Jay Pathak
In a preliminary attempt to address data scarcity in physics-based machine learning, we introduce a novel methodology for data generation in physics-based simulations.
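The abstract does not describe the methodology itself, so the sketch below only illustrates the general pattern of simulation-driven data generation: solve a cheap physics model (here, the 1D heat equation) for randomly sampled parameters and collect (parameter, solution) pairs as training data. The solver, grid sizes, and parameter range are illustrative assumptions, not the paper's method.

```python
# Generic illustration of physics-based data generation: solve the 1D heat
# equation u_t = kappa * u_xx for sampled diffusivities and store
# (parameter, solution) pairs as an ML training set. Not the paper's method.
import numpy as np

def solve_heat_1d(kappa, nx=64, nt=200, dt=1e-4):
    """Explicit finite-difference solve of u_t = kappa * u_xx on [0, 1]."""
    dx = 1.0 / (nx - 1)
    x = np.linspace(0.0, 1.0, nx)
    u = np.sin(np.pi * x)  # smooth initial condition, zero at the boundaries
    for _ in range(nt):
        # interior update; dt is chosen so kappa*dt/dx^2 < 0.5 (stable)
        u[1:-1] += kappa * dt / dx**2 * (u[2:] - 2 * u[1:-1] + u[:-2])
    return u

rng = np.random.default_rng(0)
params = rng.uniform(0.1, 1.0, size=100)            # sampled diffusivities
dataset = [(k, solve_heat_1d(k)) for k in params]   # (input, target) pairs
```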
no code implementations • 30 Aug 2019 • Rohan Akut, Sumukh Marathe, Rucha Apte, Ishan Joshi, Siddhivinayak Kulkarni
The advantages of GANs for image generation over conventional methods, as well as their disadvantages compared with other frameworks, are presented.
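For readers unfamiliar with the framework this survey reviews, here is a textbook-style minimal GAN in PyTorch showing the adversarial objective: a discriminator learns to separate real from generated samples while a generator learns to fool it. The layer sizes and data stand-in are arbitrary assumptions; this is not code from the paper.

```python
# Minimal GAN sketch (PyTorch): one adversarial loss computation.
import torch
import torch.nn as nn

latent_dim, data_dim = 16, 784  # e.g. flattened 28x28 images

generator = nn.Sequential(
    nn.Linear(latent_dim, 128), nn.ReLU(),
    nn.Linear(128, data_dim), nn.Tanh(),
)
discriminator = nn.Sequential(
    nn.Linear(data_dim, 128), nn.LeakyReLU(0.2),
    nn.Linear(128, 1), nn.Sigmoid(),
)

bce = nn.BCELoss()
z = torch.randn(32, latent_dim)
fake = generator(z)
real = torch.rand(32, data_dim) * 2 - 1  # stand-in for real images in [-1, 1]

# Discriminator: label real as 1, generated (detached) as 0.
d_loss = bce(discriminator(real), torch.ones(32, 1)) + \
         bce(discriminator(fake.detach()), torch.zeros(32, 1))
# Generator: push the discriminator toward labeling fakes as real.
g_loss = bce(discriminator(fake), torch.ones(32, 1))
```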