no code implementations • 14 Oct 2022 • Hyunjae Kim, Jaehyo Yoo, Seunghyun Yoon, Jaewoo Kang
Most weakly supervised named entity recognition (NER) models rely on domain-specific dictionaries provided by experts.
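The dictionary-based weak supervision this line refers to can be illustrated with a short sketch: match dictionary entries against tokens and emit BIO tags as weak labels. The toy dictionary, entity types, and function name below are illustrative assumptions, not the actual pipeline of any of these models.

```python
# Minimal sketch: weak NER labels from a domain dictionary (illustrative only).
# Tokens matching a dictionary entry receive BIO tags; everything else gets O.
DICTIONARY = {("aspirin",): "Drug", ("breast", "cancer"): "Disease"}  # toy entries

def weak_label(tokens):
    tags = ["O"] * len(tokens)
    i = 0
    while i < len(tokens):
        matched = False
        # Try the longest dictionary entry first at this position.
        for entry, etype in sorted(DICTIONARY.items(), key=lambda kv: -len(kv[0])):
            span = tuple(t.lower() for t in tokens[i:i + len(entry)])
            if span == entry:
                tags[i] = f"B-{etype}"
                for j in range(i + 1, i + len(entry)):
                    tags[j] = f"I-{etype}"
                i += len(entry)
                matched = True
                break
        if not matched:
            i += 1
    return tags

print(weak_label("She was prescribed aspirin for breast cancer".split()))
# ['O', 'O', 'O', 'B-Drug', 'O', 'B-Disease', 'I-Disease']
```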
no code implementations • 24 May 2022 • Jaehyo Yoo, Jaewoo Kang
While most training sentences are created via automatic techniques such as crawling and sentence-alignment methods, the test sentences are annotated by humans with fluency in mind.
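To make the sentence-alignment step concrete, here is a toy dynamic-programming aligner in the spirit of length-based methods such as Gale-Church; the cost function, skip penalty, and example sentences are illustrative assumptions, and real pipelines use far more sophisticated scoring.

```python
import math

def length_cost(a, b):
    # Penalize sentence pairs whose character lengths diverge
    # (the intuition behind length-based aligners like Gale-Church).
    return abs(math.log((len(a) + 1.0) / (len(b) + 1.0)))

def align(src, tgt, skip_cost=1.0):
    # Forward DP over (i, j): either match src[i] with tgt[j],
    # or skip one sentence on either side at a fixed cost.
    n, m = len(src), len(tgt)
    cost = [[math.inf] * (m + 1) for _ in range(n + 1)]
    back = [[None] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(n + 1):
        for j in range(m + 1):
            here = cost[i][j]
            if here == math.inf:
                continue
            if i < n and j < m and here + length_cost(src[i], tgt[j]) < cost[i + 1][j + 1]:
                cost[i + 1][j + 1] = here + length_cost(src[i], tgt[j])
                back[i + 1][j + 1] = "match"
            if i < n and here + skip_cost < cost[i + 1][j]:
                cost[i + 1][j] = here + skip_cost
                back[i + 1][j] = "skip_src"
            if j < m and here + skip_cost < cost[i][j + 1]:
                cost[i][j + 1] = here + skip_cost
                back[i][j + 1] = "skip_tgt"
    # Trace back to recover the matched sentence pairs.
    pairs, i, j = [], n, m
    while i > 0 or j > 0:
        op = back[i][j]
        if op == "match":
            pairs.append((src[i - 1], tgt[j - 1]))
            i, j = i - 1, j - 1
        elif op == "skip_src":
            i -= 1
        else:  # "skip_tgt"
            j -= 1
    return pairs[::-1]

print(align(["Hello world.", "Short."], ["Bonjour le monde.", "Court."]))
```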
1 code implementation • 16 Dec 2021 • Hyunjae Kim, Jaehyo Yoo, Seunghyun Yoon, Jinhyuk Lee, Jaewoo Kang
Recent named entity recognition (NER) models often rely on human-annotated datasets, which require significant expert knowledge of the target domain and entities.
2 code implementations • 1 Jul 2020 • Minbyul Jeong, Mujeen Sung, Gangwoo Kim, Donghyeon Kim, Wonjin Yoon, Jaehyo Yoo, Jaewoo Kang
We observe that BioBERT trained on the NLI dataset obtains better performance on Yes/No (+5.59%), Factoid (+0.53%), and List type (+13.58%) questions compared to the performance obtained in a previous challenge (BioASQ 7B Phase B).
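The intermediate-task transfer this result describes (NLI fine-tuning before biomedical QA) can be sketched with standard Hugging Face tooling. The sketch below is a minimal illustration under assumptions: the public BioBERT checkpoint stands in for the paper's model, MNLI stands in for its NLI data, and the hyperparameters are placeholders rather than the actual setup.

```python
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

name = "dmis-lab/biobert-base-cased-v1.1"
tok = AutoTokenizer.from_pretrained(name)

def encode(batch):
    return tok(batch["premise"], batch["hypothesis"],
               truncation=True, padding="max_length", max_length=128)

nli = load_dataset("glue", "mnli", split="train[:1%]").map(encode, batched=True)

# Stage 1: fine-tune on NLI (3 labels: entailment / neutral / contradiction).
model = AutoModelForSequenceClassification.from_pretrained(name, num_labels=3)
Trainer(model=model,
        args=TrainingArguments(output_dir="biobert-nli", num_train_epochs=1,
                               per_device_train_batch_size=16),
        train_dataset=nli).train()
model.save_pretrained("biobert-nli")

# Stage 2: initialize a yes/no QA classifier from the NLI checkpoint;
# the new 2-way head is randomly initialized, the encoder keeps NLI weights.
qa_model = AutoModelForSequenceClassification.from_pretrained(
    "biobert-nli", num_labels=2, ignore_mismatched_sizes=True)
# ... fine-tune qa_model on BioASQ yes/no question-snippet pairs the same way.
```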
no code implementations • 13 Sep 2019 • Cheonbok Park, Inyoup Na, Yongjang Jo, Sungbok Shin, Jaehyo Yoo, Bum Chul Kwon, Jian Zhao, Hyungjong Noh, Yeonsoo Lee, Jaegul Choo
Attention networks, a deep neural network architecture inspired by humans' attention mechanism, have seen significant success in image captioning, machine translation, and many other applications.
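The core computation these attention networks share, and the quantity their visualizations typically display, is a softmax-weighted combination of values. Below is a minimal NumPy sketch of standard scaled dot-product attention, not any specific paper's architecture; the shapes are arbitrary examples.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    scores = Q @ K.T / np.sqrt(K.shape[-1])          # similarity of queries to keys
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over key positions
    return weights @ V, weights                      # output and attention map

Q = np.random.randn(4, 8); K = np.random.randn(6, 8); V = np.random.randn(6, 8)
out, attn = scaled_dot_product_attention(Q, K, V)
print(out.shape, attn.shape)  # (4, 8) (4, 6) -- attn is what attention visualizations show
```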