Search Results for author: Surajsinh Parmar

Found 5 papers, 0 papers with code

Beyond LoRA: Exploring Efficient Fine-Tuning Techniques for Time Series Foundational Models

no code implementations • 17 Sep 2024 • Divij Gupta, Anubhav Bhatti, Surajsinh Parmar

Time Series Foundation Models (TSFMs) have recently garnered attention for their ability to model complex, large-scale time series data across domains such as retail, finance, and transportation.

Domain Adaptation • Parameter-Efficient Fine-Tuning • +1
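
The title of this entry centers on LoRA and parameter-efficient fine-tuning. As rough context only, below is a minimal sketch of a generic LoRA-style low-rank adapter; it assumes PyTorch, the class name `LoRALinear` and the rank/alpha values are illustrative choices, and it is not drawn from the paper, whose aim is to explore techniques beyond LoRA.

```python
# Minimal, illustrative LoRA-style adapter (not the paper's implementation).
import torch
import torch.nn as nn


class LoRALinear(nn.Module):
    """Frozen linear layer plus a trainable low-rank update: W x + (B A x) * scale."""

    def __init__(self, base: nn.Linear, r: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False  # freeze the pretrained weights

        # Low-rank factors: A projects down to rank r, B projects back up.
        self.lora_A = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(base.out_features, r))
        self.scale = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Frozen path plus the low-rank correction; only A and B are trained.
        return self.base(x) + (x @ self.lora_A.T @ self.lora_B.T) * self.scale


# Example: adapt a single (hypothetical) 512-dim projection layer.
layer = LoRALinear(nn.Linear(512, 512), r=8)
out = layer(torch.randn(4, 512))
```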

Towards Democratizing Multilingual Large Language Models For Medicine Through A Two-Stage Instruction Fine-tuning Approach

no code implementations • 9 Sep 2024 • Meng Zhou, Surajsinh Parmar, Anubhav Bhatti

Open-source, multilingual medical large language models (LLMs) have the potential to serve linguistically diverse populations across different regions.

Computational Efficiency • Continual Pretraining • +1

SM70: A Large Language Model for Medical Devices

no code implementations • 12 Dec 2023 • Anubhav Bhatti, Surajsinh Parmar, San Lee

We introduce SM70, a 70-billion-parameter Large Language Model designed specifically for SpassMed's medical devices under the brand name 'JEE1' (pronounced 'G1', meaning 'Life').

Decision Making • Information Retrieval • +3
