1 code implementation • 12 Oct 2023 • Seungone Kim, Jamin Shin, Yejin Cho, Joel Jang, Shayne Longpre, Hwaran Lee, Sangdoo Yun, Seongjin Shin, Sungdong Kim, James Thorne, Minjoon Seo
We first construct the Feedback Collection, a new dataset that consists of 1K fine-grained score rubrics, 20K instructions, and 100K responses and language feedback generated by GPT-4.
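The description above lists four components per example (a fine-grained score rubric, an instruction, a response, and GPT-4-generated language feedback). A minimal sketch of what one such record might look like follows; the field names, the example text, and the assumed 1-5 score scale are illustrative assumptions, not the dataset's actual schema.

```python
# Hypothetical sketch of a single Feedback Collection record, built only
# from the components named in the abstract. Field names and the 1-5
# score scale are assumptions for illustration.
from dataclasses import dataclass

@dataclass
class FeedbackRecord:
    instruction: str   # one of the ~20K instructions
    response: str      # one of the ~100K responses to be evaluated
    score_rubric: str  # one of the ~1K fine-grained score rubrics
    feedback: str      # GPT-4-generated language feedback
    score: int         # assumed rubric-based score on a 1-5 scale

record = FeedbackRecord(
    instruction="Summarize the article in two sentences.",
    response="The article argues that ...",
    score_rubric="Does the summary capture the main claims accurately?",
    feedback="The summary covers the central claim but omits key evidence.",
    score=4,
)
```

A record like this pairs each response with both a numeric score and free-form feedback, which is what lets a model be trained to produce fine-grained evaluations rather than a single scalar.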
no code implementations • NAACL 2022 • Seongjin Shin, Sang-Woo Lee, Hwijeen Ahn, Sungdong Kim, HyoungSeok Kim, Boseop Kim, Kyunghyun Cho, Gichang Lee, WooMyoung Park, Jung-Woo Ha, Nako Sung
Many recent studies have reported that large-scale language models exhibit successful in-context zero- and few-shot learning.
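In-context few-shot learning means the model conditions on a handful of labeled demonstrations placed directly in the prompt and labels a new input with no parameter updates. A minimal sketch of such prompt construction, with an illustrative sentiment task (the task and examples are assumptions, not from the paper):

```python
# Minimal sketch of in-context few-shot prompting: k labeled
# demonstrations are concatenated ahead of the query, and the model
# is expected to continue the pattern. Task/examples are illustrative.
def build_few_shot_prompt(demos, query):
    """Format (input, label) demonstrations followed by the unlabeled query."""
    lines = [f"Review: {x}\nSentiment: {y}" for x, y in demos]
    lines.append(f"Review: {query}\nSentiment:")
    return "\n\n".join(lines)

demos = [("Great movie!", "positive"), ("Waste of time.", "negative")]
prompt = build_few_shot_prompt(demos, "Loved every minute.")
```

The zero-shot case is the same construction with an empty demonstration list, leaving only the query and the answer cue.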
1 code implementation • 25 Oct 2020 • Seongbin Kim, Gyuwan Kim, Seongjin Shin, Sangmin Lee
End-to-end approaches open a path toward more accurate and efficient spoken language understanding (SLU) systems by alleviating the drawbacks of traditional pipeline systems.
Ranked #3 on Spoken Language Understanding on Fluent Speech Commands (using extra training data)
Automatic Speech Recognition (ASR) +4