Search Results for author: Nakyeong Yang

Found 6 papers, 0 papers with code

SKIP: Skill-Localized Prompt Tuning for Inference Speed Boost-Up

no code implementations · 18 Apr 2024 · Nakyeong Yang, Junseok Kim, Jiwon Moon, Yunah Jang, Kyomin Jung

Prompt-tuning methods have shown performance comparable to that of parameter-efficient fine-tuning (PEFT) methods in various natural language understanding tasks.

Language Modelling · Natural Language Understanding

CRISPR: Eliminating Bias Neurons from an Instruction-following Language Model

no code implementations · 16 Nov 2023 · Nakyeong Yang, Taegwan Kang, Kyomin Jung

Large language models (LLMs) executing tasks through instruction-based prompts often face challenges stemming from distribution differences between user instructions and training instructions.

Instruction Following · Language Modelling

Is it Really Negative? Evaluating Natural Language Video Localization Performance on Multiple Reliable Videos Pool

no code implementations · 15 Aug 2023 · Nakyeong Yang, Minsung Kim, Seunghyun Yoon, Joongbo Shin, Kyomin Jung

With the explosion of multimedia content in recent years, Video Corpus Moment Retrieval (VCMR), which aims to detect a video moment that matches a given natural language query from multiple videos, has become a critical problem.

Contrastive Learning · Moment Retrieval · +3

Multi-View Zero-Shot Open Intent Induction from Dialogues: Multi Domain Batch and Proxy Gradient Transfer

no code implementations · 23 Mar 2023 · Hyukhun Koh, Haesung Pyun, Nakyeong Yang, Kyomin Jung

In Task-Oriented Dialogue (TOD) systems, detecting and inducing new intents are two main challenges in applying the system to the real world.

Task-specific Compression for Multi-task Language Models using Attribution-based Pruning

no code implementations · 9 May 2022 · Nakyeong Yang, Yunah Jang, Hwanhee Lee, Seohyeong Jung, Kyomin Jung

However, these language models use an unnecessarily large number of parameters, even when deployed for only a single task.

Natural Language Understanding
