Deformable Graph Transformer
no code implementations • 29 Jun 2022 • Jinyoung Park, Seongjun Yun, Hyeonjin Park, Jaewoo Kang, Jisu Jeong, Kyung-Min Kim, Jung-Woo Ha, Hyunwoo J. Kim
Transformer-based models have recently shown success in representation learning on graph-structured data beyond natural language processing and computer vision.
Neo-GNNs: Neighborhood Overlap-aware Graph Neural Networks for Link Prediction
1 code implementation • NeurIPS 2021 • Seongjun Yun, Seoyoon Kim, Junhyun Lee, Jaewoo Kang, Hyunwoo J. Kim
Graph Neural Networks (GNNs) have been widely applied to various fields for learning over graph-structured data.
Graph Transformer Networks: Learning Meta-path Graphs to Improve GNNs
1 code implementation • 11 Jun 2021 • Seongjun Yun, Minbyul Jeong, Sungdong Yoo, Seunghun Lee, Sean S. Yi, Raehyun Kim, Jaewoo Kang, Hyunwoo J. Kim
Despite the success of GNNs, most existing models are designed to learn node representations on fixed and homogeneous graphs.
Graph Transformer Networks
1 code implementation • NeurIPS 2019 • Seongjun Yun, Minbyul Jeong, Raehyun Kim, Jaewoo Kang, Hyunwoo J. Kim
In this paper, we propose Graph Transformer Networks (GTNs), which generate new graph structures by identifying useful connections between unconnected nodes on the original graph, while learning effective node representations on the new graphs in an end-to-end fashion.
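To make the mechanism this abstract describes concrete, here is a minimal PyTorch sketch of one Graph Transformer layer under simplifying assumptions: each layer softly selects two convex combinations of the edge-type adjacency matrices and multiplies them, so the product acts as the adjacency matrix of a learned length-2 meta-path. The class and parameter names (`GTLayerSketch`, `scores1`, `scores2`) are hypothetical, not taken from the authors' released code, which implements the soft selection as a 1x1 convolution over the stacked adjacency tensor.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GTLayerSketch(nn.Module):
    """Illustrative GT layer: two softmax-weighted selections over the
    K edge-type adjacency matrices, composed by matrix multiplication."""

    def __init__(self, num_edge_types: int):
        super().__init__()
        # One learnable score per edge type for each of the two selections.
        self.scores1 = nn.Parameter(torch.randn(num_edge_types))
        self.scores2 = nn.Parameter(torch.randn(num_edge_types))

    def forward(self, adj: torch.Tensor) -> torch.Tensor:
        # adj: (K, N, N) stack of adjacency matrices, one per edge type.
        w1 = F.softmax(self.scores1, dim=0)      # (K,) selection weights
        w2 = F.softmax(self.scores2, dim=0)      # (K,)
        a1 = torch.einsum("k,kij->ij", w1, adj)  # soft-selected graph 1
        a2 = torch.einsum("k,kij->ij", w2, adj)  # soft-selected graph 2
        # The product connects nodes reachable via the two selected
        # edge types in sequence, i.e. a candidate meta-path graph.
        return a1 @ a2                           # (N, N)

# Toy usage: 3 edge types on a 5-node heterogeneous graph.
adj = torch.randint(0, 2, (3, 5, 5)).float()
layer = GTLayerSketch(num_edge_types=3)
meta_path_adj = layer(adj)  # learned meta-path adjacency, (5, 5)
```

Stacking such layers yields longer meta-paths, and a downstream GNN can then learn node representations on the resulting graphs, which is the end-to-end setup the abstract refers to.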
Ranking Paragraphs for Improving Answer Recall in Open-Domain Question Answering
1 code implementation • EMNLP 2018 • Jinhyuk Lee, Seongjun Yun, Hyunjae Kim, Miyoung Ko, Jaewoo Kang
Recently, open-domain question answering (QA) has been combined with machine comprehension models to find answers in a large knowledge source.