To tackle this issue, we propose r-GAT, a relational graph attention network that learns multi-channel entity representations.
Knowledge graph embedding (KGE), which aims to embed entities and relations into low-dimensional vectors, has recently attracted wide attention.
Graph embedding (GE) methods embed the nodes (and/or edges) of a graph into a low-dimensional semantic space, and have shown their effectiveness in modeling multi-relational data.
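The idea of embedding entities and relations into low-dimensional vectors can be sketched with a translation-based (TransE-style) scoring function. This is a minimal illustration only: the toy vocabulary, the embedding dimension, and the untrained random embeddings below are assumptions for demonstration, not the method of any model mentioned here.

```python
# Minimal TransE-style scoring sketch for knowledge graph embeddings.
# Embeddings are random and untrained, so the scores are arbitrary;
# the point is only the shape of the computation.
import numpy as np

rng = np.random.default_rng(0)
dim = 8
entities = {"Paris": 0, "France": 1, "Berlin": 2, "Germany": 3}
relations = {"capital_of": 0}

E = rng.normal(size=(len(entities), dim))   # entity embeddings
R = rng.normal(size=(len(relations), dim))  # relation embeddings

def score(h, r, t):
    """TransE score: smaller ||h + r - t|| means a more plausible triple."""
    return np.linalg.norm(E[entities[h]] + R[relations[r]] - E[entities[t]])

print(score("Paris", "capital_of", "France"))
print(score("Paris", "capital_of", "Germany"))
```

During training, such a model would push the score of observed triples below that of corrupted (negative) triples; here the embeddings are left random.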
While various models have been proposed for the link prediction task, most are designed around a few known relation patterns observed in several well-known datasets.
Learning text representations is crucial for text classification and other language-related tasks.
At Microsoft, we develop a time-series anomaly detection service that helps customers monitor their time series continuously and alerts them to potential incidents in a timely manner.