no code implementations • COLING (PEOPLES) 2020 • Jonggu Kim, Hyeonmok Ko, Seoha Song, Saebom Jang, Jiyeon Hong
We first apply ELECTRA, a state-of-the-art pretrained language model, and validate its performance on emotion recognition in conversations.
no code implementations • 28 Oct 2019 • Jonggu Kim, Jong-Hyeok Lee
We propose two methods to capture relevant history information in a multi-turn dialogue by modeling inter-speaker relationship for spoken language understanding (SLU).
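One way to picture inter-speaker modeling is to summarize the history separately for turns spoken by the current speaker and turns spoken by the other party, then combine the two summaries. The sketch below is a minimal illustration of that idea, not the paper's actual architecture; the function name, mean-pooling, and concatenation are assumptions for demonstration.

```python
import numpy as np

def speaker_aware_context(turn_vecs, speakers, current_speaker):
    """Toy speaker-aware history summary (illustrative, not the paper's model).

    turn_vecs: list of (d,) encoded utterance vectors, oldest first
    speakers:  list of speaker ids, aligned with turn_vecs
    current_speaker: id of the speaker of the utterance being understood
    """
    # Split the history by who spoke each turn.
    same = [v for v, s in zip(turn_vecs, speakers) if s == current_speaker]
    other = [v for v, s in zip(turn_vecs, speakers) if s != current_speaker]
    d = turn_vecs[0].shape[0]
    # Mean-pool each group (zero vector if a group is empty).
    same_vec = np.mean(same, axis=0) if same else np.zeros(d)
    other_vec = np.mean(other, axis=0) if other else np.zeros(d)
    # Concatenate so a downstream SLU classifier can weigh the two roles differently.
    return np.concatenate([same_vec, other_vec])
```

A downstream intent or slot classifier would consume this concatenated vector alongside the current utterance encoding.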
1 code implementation • NAACL 2019 • Jonggu Kim, Jong-Hyeok Lee
To capture salient contextual information for spoken language understanding (SLU) of a dialogue, we propose time-aware models that automatically learn a latent time-decay function over the history, removing the need for a manually specified time-decay function.
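The core intuition of a learned time decay can be sketched as attention over history turns whose scores are penalized by the turn's age, with the penalty rate a trainable parameter rather than a hand-picked schedule. The following numpy sketch assumes a simple linear decay term and a dot-product score; the actual model in the paper learns the decay function jointly with the encoder, so treat the names and the functional form here as illustrative assumptions.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

def time_decay_attention(history, query, delta_t, lam):
    """Toy time-aware attention (illustrative, not the paper's exact model).

    history: (T, d) encoded past utterances
    query:   (d,) encoding of the current utterance
    delta_t: (T,) elapsed time (or turn distance) of each history turn
    lam:     decay rate; in the paper's setting this would be learned, here fixed
    """
    # Relevance score minus a learned-rate penalty on older turns.
    scores = history @ query - lam * delta_t
    weights = softmax(scores)
    # Weighted summary of the history for the SLU classifier.
    return weights @ history, weights
```

With `lam = 0` this reduces to plain attention; a larger `lam` pushes the weight toward recent turns, which is the behavior a manual time-decay function would hard-code.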
no code implementations • 23 May 2018 • Jonggu Kim, Doyeon Kong, Jong-Hyeok Lee
Using a sequence-to-sequence framework, many neural conversation models for chit-chat succeed in generating natural responses.
no code implementations • 5 Jul 2017 • Jonggu Kim, Jong-Hyeok Lee
Most neural approaches to relation classification have focused on finding short patterns that represent the semantic relation using Convolutional Neural Networks (CNNs), and these approaches have generally outperformed those using Recurrent Neural Networks (RNNs).
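The "short pattern" intuition behind CNN-based relation classifiers is a convolution of fixed-width filters over the token embedding sequence followed by max-pooling, so each filter fires on its best-matching n-gram anywhere in the sentence. The sketch below shows that single building block in plain numpy; filter shapes and the tanh nonlinearity are common choices but assumptions here, not details taken from this paper.

```python
import numpy as np

def conv_max_pool(embeddings, filters):
    """One CNN feature layer for relation classification (illustrative sketch).

    embeddings: (n, d) word embeddings for an n-token sentence
    filters:    (k, w, d) — k convolution filters of width w over d-dim embeddings
    returns:    (k,) max-pooled feature vector
    """
    n, d = embeddings.shape
    k, w, _ = filters.shape
    feats = np.full(k, -np.inf)
    # Slide each width-w filter over every n-gram window and keep the max response,
    # so a filter detects its pattern regardless of position in the sentence.
    for i in range(n - w + 1):
        window = embeddings[i:i + w]                              # (w, d)
        responses = np.tensordot(filters, window, axes=([1, 2], [0, 1]))  # (k,)
        feats = np.maximum(feats, responses)
    return np.tanh(feats)
```

In a full model this pooled vector (typically with position embeddings marking the two entity mentions) would feed a softmax layer over relation labels.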