Search Results for author: Minbyul Jeong

Found 11 papers, 11 papers with code

Improving Medical Reasoning through Retrieval and Self-Reflection with Retrieval-Augmented Large Language Models

1 code implementation • 27 Jan 2024 • Minbyul Jeong, Jiwoong Sohn, Mujeen Sung, Jaewoo Kang

To address questions that cannot be answered with the knowledge encoded in LLMs alone, various retrieval-augmented generation (RAG) methods have been developed that search documents in a knowledge corpus and append them, unconditionally or selectively, to the input of LLMs for generation.
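The unconditional vs. selective retrieval distinction can be sketched in a few lines. This is a minimal toy illustration, not the paper's method: `retrieve`, `rag_prompt`, and the word-overlap scoring are all hypothetical stand-ins for a real retriever and LLM.

```python
# Toy sketch of RAG prompt construction. The "selective" variant skips
# retrieval when the model is already confident in its encoded knowledge.

def retrieve(query: str, corpus: list[str], k: int = 2) -> list[str]:
    """Hypothetical retriever: rank documents by word overlap with the query."""
    q_words = set(query.lower().split())
    scored = sorted(corpus, key=lambda d: -len(q_words & set(d.lower().split())))
    return scored[:k]

def rag_prompt(query: str, corpus: list[str], selective: bool = False,
               confident: bool = False) -> str:
    """Append retrieved documents to the LLM input, unconditionally or selectively."""
    if selective and confident:
        return query  # selective mode: no retrieval when the model "knows"
    context = "\n".join(retrieve(query, corpus))
    return f"Context:\n{context}\n\nQuestion: {query}"

corpus = ["Aspirin treats headache.", "Insulin regulates blood glucose."]
print(rag_prompt("What regulates blood glucose?", corpus))
```

A real system would replace the overlap scorer with a dense or sparse retriever over a biomedical corpus and pass the prompt to an LLM.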

Multiple-choice Question Answering +1

BERN2: an advanced neural biomedical named entity recognition and normalization tool

1 code implementation • 6 Jan 2022 • Mujeen Sung, Minbyul Jeong, Yonghwa Choi, Donghyeon Kim, Jinhyuk Lee, Jaewoo Kang

In biomedical natural language processing, named entity recognition (NER) and named entity normalization (NEN) are key tasks that enable the automatic extraction of biomedical entities (e.g., diseases and drugs) from the ever-growing biomedical literature.
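The NER-then-NEN pipeline can be illustrated with a toy dictionary lookup. This is a hedged sketch only: BERN2 uses neural models, and the `ENTITY_DICT` mentions and identifiers below are hypothetical examples, not BERN2's actual output.

```python
# Toy NER -> NEN pipeline: NER finds entity mentions in text, NEN maps each
# mention to a canonical identifier. A dictionary stands in for neural models.

ENTITY_DICT = {
    # mention -> (entity type, hypothetical normalized identifier)
    "aspirin": ("drug", "MESH:D001241"),
    "breast cancer": ("disease", "MESH:D001943"),
}

def ner_nen(text: str) -> list[tuple[str, str, str]]:
    """Return (mention, entity_type, normalized_id) triples found in the text."""
    lowered = text.lower()
    return [(m, etype, norm_id)
            for m, (etype, norm_id) in ENTITY_DICT.items()
            if m in lowered]

print(ner_nen("Aspirin is not a treatment for breast cancer."))
```

A real tool replaces the dictionary with a trained NER tagger and a normalization model that resolves each mention against ontologies such as MeSH.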

graph construction named-entity-recognition +2

Graph Transformer Networks: Learning Meta-path Graphs to Improve GNNs

1 code implementation • 11 Jun 2021 • Seongjun Yun, Minbyul Jeong, Sungdong Yoo, Seunghun Lee, Sean S. Yi, Raehyun Kim, Jaewoo Kang, Hyunwoo J. Kim

Despite the success of GNNs, most existing GNNs are designed to learn node representations on fixed and homogeneous graphs.

Node Classification

Regularization for Long Named Entity Recognition

1 code implementation • 15 Apr 2021 • Minbyul Jeong, Jaewoo Kang

Pre-trained language models (PLMs) are used to solve NER tasks and tend to be biased toward dataset patterns such as length statistics, surface form, and skewed class distribution.

named-entity-recognition Named Entity Recognition +1

Transferability of Natural Language Inference to Biomedical Question Answering

2 code implementations • 1 Jul 2020 • Minbyul Jeong, Mujeen Sung, Gangwoo Kim, Donghyeon Kim, Wonjin Yoon, Jaehyo Yoo, Jaewoo Kang

We observe that BioBERT trained on the NLI dataset obtains better performance on Yes/No (+5.59%), Factoid (+0.53%), and List type (+13.58%) questions compared to performance obtained in a previous challenge (BioASQ 7B Phase B).

Natural Language Inference Question Answering +2

Graph Transformer Networks

1 code implementation • NeurIPS 2019 • Seongjun Yun, Minbyul Jeong, Raehyun Kim, Jaewoo Kang, Hyunwoo J. Kim

In this paper, we propose Graph Transformer Networks (GTNs), which generate new graph structures by identifying useful connections between unconnected nodes on the original graph, while learning effective node representations on the new graphs in an end-to-end fashion.
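The core composition idea behind such new graph structures can be sketched with plain matrix algebra. This is an illustrative sketch, not GTN itself: GTNs learn a soft selection over adjacency matrices, whereas here two edge-type adjacencies are simply multiplied to form a meta-path adjacency that connects previously unconnected nodes.

```python
# Composing two edge types into a meta-path adjacency on a tiny 3-node graph.
import numpy as np

A_ap = np.array([[0, 1, 0],
                 [0, 0, 0],
                 [0, 0, 0]])  # edge type 1: node 0 -> node 1 (e.g. author->paper)
A_pv = np.array([[0, 0, 0],
                 [0, 0, 1],
                 [0, 0, 0]])  # edge type 2: node 1 -> node 2 (e.g. paper->venue)

# Matrix product composes the two relations: node 0 now reaches node 2,
# a connection present in neither original edge type.
A_meta = A_ap @ A_pv
print(A_meta)
```

GTNs make the choice of which adjacency matrices to compose differentiable, so useful meta-paths are learned rather than hand-designed.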

General Classification Link Prediction +2

Pre-trained Language Model for Biomedical Question Answering

3 code implementations • 18 Sep 2019 • Wonjin Yoon, Jinhyuk Lee, Donghyeon Kim, Minbyul Jeong, Jaewoo Kang

The recent success of question answering systems is largely attributed to pre-trained language models.

Language Modelling Question Answering

HATS: A Hierarchical Graph Attention Network for Stock Movement Prediction

3 code implementations • 7 Aug 2019 • Raehyun Kim, Chan Ho So, Minbyul Jeong, Sang-Hoon Lee, Jinkyu Kim, Jaewoo Kang

Methods that use relational data for stock market prediction have been recently proposed, but they are still in their infancy.

Graph Attention Graph Classification +2
