1 code implementation • Applied Sciences 2022 • HeeSeung Jung, Kangil Kim, Jong-Hun Shin, Seung-Hoon Na, SangKeun Jung, Sangmin Woo
Most neural machine translation models are implemented within a conditional language model framework composed of an encoder and a decoder.
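As a minimal sketch of that framework, the following toy NumPy model (all weights, sizes, and the mean-pooling encoder are illustrative assumptions, not the paper's architecture) shows the encoder producing a context vector and the decoder acting as a conditional language model, predicting each target token from the context and the previous token:

```python
import numpy as np

rng = np.random.default_rng(0)
V, d = 6, 4                        # toy vocabulary size and hidden size
EOS = 0                            # end-of-sentence token id (assumed)

E = rng.standard_normal((V, d))    # toy shared embedding table
W_enc = rng.standard_normal((d, d))
W_dec = rng.standard_normal((d, d))
W_out = rng.standard_normal((d, V))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def translate(src_ids, max_len=10):
    # Encoder: compress the source into one context vector (toy mean pooling).
    ctx = np.tanh(E[src_ids].mean(axis=0) @ W_enc)
    # Decoder: a conditional language model; each step models
    # P(y_t | y_<t, x) given the context and the previous token.
    out, prev = [], EOS
    for _ in range(max_len):
        h = np.tanh(ctx + E[prev] @ W_dec)
        p = softmax(h @ W_out)     # distribution over the target vocabulary
        prev = int(p.argmax())     # greedy decoding
        if prev == EOS:
            break
        out.append(prev)
    return out

hyp = translate([2, 3, 4])         # a (random-weight) toy "translation"
```

A trained model would replace the random matrices with learned parameters and the mean-pooling encoder with an RNN or Transformer, but the conditional factorization of the decoder is the same.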
no code implementations • 12 Aug 2021 • HeeSeung Jung, Kangil Kim, Hoyong Kim, Jong-Hun Shin
The flexibility of decision boundaries in neural networks that are unguided by training data is a well-known problem typically resolved with generalization methods.
no code implementations • 8 May 2020 • Youngduck Choi, Yoonho Na, Youngjik Yoon, Jong-Hun Shin, Chan Bae, Hongseok Suh, Byung-soo Kim, Jaewe Heo
Finally, Rocket provides students with fine-grained information on their learning path, giving them an avenue to assess their own skills and track their learning progress.
no code implementations • CoNLL 2019 • Seung-Hoon Na, Jinwoon Min, Kwanghyeon Park, Jong-Hun Shin, Young-Kil Kim
We propose a unified parsing model using biaffine attention (Dozat and Manning, 2017), consisting of 1) a BERT-BiLSTM encoder and 2) a biaffine attention decoder.
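The biaffine attention decoder of Dozat and Manning (2017) scores every (dependent, head) token pair with a bilinear term plus a linear head term. A minimal NumPy sketch (random toy weights and sizes; the real model uses separate learned MLP projections on top of the encoder states):

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 5, 8                             # toy sentence length and feature size

H_head = rng.standard_normal((n, d))    # head representations from the encoder
H_dep = rng.standard_normal((n, d))     # dependent representations

U = rng.standard_normal((d, d))         # bilinear weight matrix
u = rng.standard_normal(d)              # linear weight for head tokens
b = 0.1                                 # scalar bias

# scores[i, j] = biaffine score of token j being the head of token i:
#   H_dep[i] @ U @ H_head[j] + u @ H_head[j] + b
scores = H_dep @ U @ H_head.T + H_head @ u + b
pred_heads = scores.argmax(axis=1)      # greedy head choice per token
```

A full parser would add a second biaffine classifier for dependency labels and decode the arc scores with a tree-constrained algorithm rather than a per-token argmax.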
no code implementations • IJCNLP 2017 • Kangil Kim, Jong-Hun Shin, Seung-Hoon Na, SangKeun Jung
Neural machine translation decoders are usually conditional language models that sequentially generate the words of the target sentence.
no code implementations • LREC 2018 • Gyu-Hyeon Choi, Jong-Hun Shin, Young-Kil Kim
We found that corpus extension can also improve the performance of multi-source neural machine translation.