IEEE/ACM Transactions on Audio, Speech, and Language Processing, 2021 • Zhiwen Xie, Runjie Zhu, Jin Liu, Guangyou Zhou, and Jimmy Xiangji Huang
Abstract—The graph attention network (GAT) [1] has become a mainstream neural network architecture since its introduction in 2018, yielding remarkable performance gains in various natural language processing (NLP) tasks.
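For readers unfamiliar with GAT, the following is a minimal NumPy sketch of the standard graph attention layer from [1] (not this paper's specific model): each node's features are linearly transformed, attention logits are computed for every edge with a shared attention vector, normalized with a softmax over each node's neighborhood, and used to aggregate neighbor features. The graph, dimensions, and weights here are illustrative toy values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy graph: 4 nodes; boolean adjacency with self-loops, as is standard in GAT.
A = np.array([[1, 1, 0, 0],
              [1, 1, 1, 0],
              [0, 1, 1, 1],
              [0, 0, 1, 1]], dtype=bool)

F_in, F_out = 5, 3
H = rng.normal(size=(4, F_in))      # input node features
W = rng.normal(size=(F_in, F_out))  # shared linear transform
a = rng.normal(size=(2 * F_out,))   # attention vector for [z_i || z_j]

Z = H @ W  # transformed features, shape (4, F_out)

def leaky_relu(x, slope=0.2):
    return np.where(x > 0, x, slope * x)

# Attention logits e_ij = LeakyReLU(a^T [z_i || z_j]);
# the concatenation splits into a "source" and "target" dot product.
e = leaky_relu((Z @ a[:F_out])[:, None] + (Z @ a[F_out:])[None, :])

# Mask non-edges, then softmax over each node's neighborhood.
e = np.where(A, e, -np.inf)
alpha = np.exp(e - e.max(axis=1, keepdims=True))
alpha /= alpha.sum(axis=1, keepdims=True)

# Aggregate neighbor features with the attention weights.
H_out = alpha @ Z  # shape (4, F_out)
```

Each row of `alpha` sums to one and is zero outside the node's neighborhood, so `H_out` is a convex combination of the (transformed) neighbor features; the full GAT layer additionally applies a nonlinearity and multi-head attention.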