Search Results for author: Doohyuk Jang

Found 1 paper, 0 papers with code

PromptKD: Distilling Student-Friendly Knowledge for Generative Language Models via Prompt Tuning

no code implementations · 20 Feb 2024 · Gyeongman Kim, Doohyuk Jang, Eunho Yang

Recent advancements in large language models (LLMs) have raised concerns about inference costs, increasing the need for research into model compression.

Instruction Following · Knowledge Distillation · +1
