no code implementations • 1 Jan 2021 • Tae Gyoon Kang, Ho-Gyeong Kim, Min-Joong Lee, Jihyun Lee, Seongmin Ok, Hoshik Lee, Young Sang Choi
Transformers with soft attention have been widely adopted for a variety of sequence-to-sequence tasks.
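As context for the abstract above, the snippet below sketches the standard soft (scaled dot-product) attention that Transformers use — softmax weights make the output a smooth weighted average of the values rather than a hard selection. This is a generic illustration, not the paper's model; the function name and shapes are my own.

```python
import numpy as np

def soft_attention(Q, K, V):
    """Scaled dot-product (soft) attention: every query attends to
    all keys with softmax weights, so the output is a weighted
    average of the value vectors."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # (n_q, n_k) similarity scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over the keys
    return weights @ V                               # (n_q, d_v) mixed values

rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))   # 2 queries
K = rng.normal(size=(3, 4))   # 3 keys
V = rng.normal(size=(3, 4))   # 3 values
print(soft_attention(Q, K, V).shape)  # (2, 4)
```

When all keys are identical the softmax weights are uniform, so the output reduces to the plain mean of the values — a quick sanity check on the "soft" averaging behavior.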
no code implementations • NeurIPS 2020 • Seongmin Ok
The idea leads to a simple and efficient graph similarity, which we name Weisfeiler-Leman similarity (WLS).
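WLS builds on the classical Weisfeiler-Leman relabeling procedure. As a generic illustration of that underlying idea (not the paper's WLS definition), the sketch below runs 1-dimensional WL color refinement and compares graphs by the multiset of refined labels; all function names and the histogram comparison are my own assumptions.

```python
from collections import Counter

def wl_refine(adj, labels, rounds=2):
    """1-dimensional Weisfeiler-Leman color refinement: each node's
    new label encodes its old label together with the sorted multiset
    of its neighbors' labels."""
    labels = dict(labels)
    for _ in range(rounds):
        labels = {
            v: hash((labels[v], tuple(sorted(labels[u] for u in adj[v]))))
            for v in adj
        }
    return labels

def wl_histogram(adj, labels, rounds=2):
    """Multiset of refined labels; two graphs with different histograms
    are certainly non-isomorphic, which is the signal WL-style
    similarities exploit."""
    return Counter(wl_refine(adj, labels, rounds).values())

# Two isomorphic triangles produce identical label histograms.
tri1 = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
tri2 = {"a": ["b", "c"], "b": ["a", "c"], "c": ["a", "b"]}
h1 = wl_histogram(tri1, {v: 0 for v in tri1})
h2 = wl_histogram(tri2, {v: 0 for v in tri2})
print(h1 == h2)  # True
```

A path graph, whose endpoints have degree 1, refines to a different histogram than the triangle, so the two are distinguished after a single round.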