Clustering is a widely used unsupervised learning technique that involves a computationally intensive discrete optimization problem.
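As an illustration of that discrete optimization, here is a minimal NumPy sketch of Lloyd's k-means algorithm (a standard clustering baseline, not a method proposed in the work above): the assignment step is the discrete part, and the centroid update is the continuous part.

```python
import numpy as np

def init_centers(X, k):
    # Farthest-first traversal: a deterministic, spread-out initialization.
    centers = [X[0]]
    for _ in range(k - 1):
        d = np.min([np.linalg.norm(X - c, axis=1) for c in centers], axis=0)
        centers.append(X[d.argmax()])
    return np.array(centers)

def kmeans(X, k, iters=20):
    """Lloyd's algorithm: alternate a discrete assignment step with a
    continuous centroid-update step."""
    centers = init_centers(X, k)
    for _ in range(iters):
        # Discrete step: assign every point to its nearest center.
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Continuous step: move each center to the mean of its cluster.
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers
```

Each iteration cannot increase the sum of squared distances, which is why the alternation converges to a local optimum of the discrete assignment problem.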
However, the dynamic (i.e., input-dependent) nature of these pathways makes it difficult to prune dense self-attention during training.
Ranked #10 on Graph Regression on PCQM4Mv2-LSC
Visual document classifiers have shown impressive performance on in-distribution test sets.
Our work combines aspects of three promising paradigms in machine learning, namely, attention mechanisms, energy-based models, and associative memories.
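The connection between these paradigms can be shown with a toy example (a generic sketch, not this paper's model): a single step of softmax attention over a set of stored patterns acts as an energy-decreasing associative-memory retrieval, as in modern Hopfield networks.

```python
import numpy as np

def retrieve(query, memory, beta=4.0):
    """One attention step over stored patterns (rows of `memory`).

    The softmax weights concentrate on the stored pattern most similar
    to `query`, and the weighted sum moves the query toward it, which
    is one update of a modern Hopfield network.
    """
    scores = beta * memory @ query        # similarity to each stored pattern
    w = np.exp(scores - scores.max())     # numerically stable softmax
    w /= w.sum()
    return w @ memory                     # convex combination of patterns
```

Larger `beta` (the inverse temperature) sharpens the softmax and makes retrieval closer to an exact nearest-pattern lookup.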
The network embedding task is to represent each node in a network as a low-dimensional vector that incorporates the network's topological and structural information.
We examine recurrent, convolutional, and Transformer-based encoder-decoder models to automatically generate natural language summaries from numeric temporal personal health data.
It is therefore of great interest to learn predictive models from these long textual documents, especially for forecasting numerical key performance indicators (KPIs).
Keyphrase extraction is the task of finding the salient phrases in a text document that together summarize its main topics.
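A deliberately naive baseline makes the task concrete (this is my own illustration, not a method from the work above, and the stopword list is a small stand-in): take maximal runs of non-stopword tokens as candidate phrases and rank them by frequency.

```python
import re
from collections import Counter

# A tiny stand-in stopword list; real systems use much larger ones.
STOPWORDS = {"the", "a", "an", "of", "in", "on", "and", "to", "is", "are",
             "for", "with"}

def extract_keyphrases(text, top_k=3):
    """Naive extractor: candidates are maximal runs of non-stopword
    tokens; rank by frequency, breaking ties by phrase length."""
    tokens = re.findall(r"[a-z]+", text.lower())
    phrases, current = Counter(), []
    for tok in tokens + [""]:             # empty sentinel flushes the last run
        if tok and tok not in STOPWORDS:
            current.append(tok)
        elif current:
            phrases[" ".join(current)] += 1
            current = []
    ranked = sorted(phrases, key=lambda p: (phrases[p], len(p)), reverse=True)
    return ranked[:top_k]
```

Frequency alone is a weak signal; learned extractors improve on this by scoring candidates with document-level features, which is what motivates the research described above.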
The resultant framework - which we call Edge-augmented Graph Transformer (EGT) - can directly accept, process and output structural information of arbitrary form, which is important for effective learning on graph-structured data.
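A minimal sketch of the underlying idea (my own single-head illustration, not the authors' code; all array names are placeholders): pairwise edge features can enter self-attention as an additive bias on the attention logits, so every layer sees the graph structure.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def edge_biased_attention(H, E, Wq, Wk, Wv):
    """Single-head self-attention whose logits are shifted by an
    edge-feature matrix E[i, j], injecting structure into attention."""
    Q, K, V = H @ Wq, H @ Wk, H @ Wv
    logits = Q @ K.T / np.sqrt(K.shape[1]) + E   # additive structural bias
    A = softmax(logits)                          # rows sum to 1
    return A @ V, A
```

Because the bias is just another logit term, `E` can itself be updated layer by layer, which is how edge channels can carry structural information through the network.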
Ranked #1 on Graph Regression on PCQM4Mv2-LSC
The information is extracted and stored in a structured format using knowledge graphs such that the semantics of the threat intelligence can be preserved and shared at scale with other security analysts.
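The storage format can be sketched with a minimal in-memory triple store (my own toy illustration; real systems serialize knowledge graphs in standard formats such as RDF, and the entity names in the example are invented):

```python
from collections import defaultdict

class TripleStore:
    """Minimal in-memory knowledge graph: (subject, relation, object)
    triples, indexed by subject for fast lookup."""
    def __init__(self):
        self._by_subject = defaultdict(set)

    def add(self, subject, relation, obj):
        self._by_subject[subject].add((relation, obj))

    def query(self, subject):
        # All (relation, object) pairs known for this subject.
        return sorted(self._by_subject[subject])
```

Keeping facts as typed triples rather than free text is what preserves the semantics when the intelligence is shared between analysts.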
In this work we study a mathematical formalization of this network motif and apply it to learning the correlational structure between words and their context in a corpus of unstructured text, a common natural language processing (NLP) task.
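One common concrete instance of capturing word-context correlations (a generic sketch, not necessarily this work's formalization) is a window-based co-occurrence matrix:

```python
import numpy as np

def cooccurrence(tokens, window=2):
    """Count how often each word appears within `window` positions of
    each other word (symmetric context), returning vocab and matrix."""
    vocab = sorted(set(tokens))
    idx = {w: i for i, w in enumerate(vocab)}
    M = np.zeros((len(vocab), len(vocab)))
    for i, w in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                M[idx[w], idx[tokens[j]]] += 1
    return vocab, M
```

Factorizing a (reweighted) matrix of this kind is the classical route from raw co-occurrence counts to dense word vectors.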
Food recommendation has become an important means to help guide users to adopt healthy dietary habits.
In this paper, we propose an end-to-end graph learning framework, namely Iterative Deep Graph Learning (IDGL), for jointly and iteratively learning graph structure and graph embedding.
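The iterative idea can be sketched in a few lines (a toy NumPy analogue under my own simplifying assumptions, not the IDGL implementation): alternate between inferring a similarity graph from the current embeddings and smoothing the embeddings over that graph.

```python
import numpy as np

def knn_graph(Z, k=2):
    """Adjacency from the k most cosine-similar neighbours of each row."""
    Zn = Z / np.linalg.norm(Z, axis=1, keepdims=True)
    S = Zn @ Zn.T
    np.fill_diagonal(S, -np.inf)          # no self-loops
    A = np.zeros_like(S)
    for i, row in enumerate(S):
        A[i, np.argsort(row)[-k:]] = 1.0
    return np.maximum(A, A.T)             # symmetrize

def iterate(X, k=2, steps=3, alpha=0.5):
    """Alternate: (1) infer a graph from the current embeddings,
    (2) refine embeddings by averaging over the inferred graph."""
    Z = X.copy()
    for _ in range(steps):
        A = knn_graph(Z, k)
        D = A.sum(axis=1, keepdims=True)
        Z = (1 - alpha) * Z + alpha * (A @ Z) / np.maximum(D, 1)
    return A, Z
```

In the full framework the embedding step is a trained GNN rather than plain neighbourhood averaging, but the alternation between structure inference and representation refinement is the same.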
The knowledge graph that uses MALOnt is instantiated from a corpus comprising hundreds of annotated malware threat reports.
In this work, we focus on a more realistic setting where we aim to generate questions from a KG subgraph and target answers.
Ranked #3 on KG-to-Text Generation on WebQuestions
While it has become easier for individuals to track their personal health data (e.g., heart rate, step count, food log), there is still a wide chasm between collecting that data and generating meaningful explanations that help users understand what it means to them.
In this paper, we propose an end-to-end graph learning framework, namely Deep Iterative and Adaptive Learning for Graph Neural Networks (DIAL-GNN), for jointly learning the graph structure and graph embeddings.
Natural question generation (QG) aims to generate questions from a passage and an answer.
The proposed GraphFlow model can effectively capture conversational flow in a dialog, and shows competitive performance compared to existing state-of-the-art methods on CoQA, QuAC and DoQA benchmarks.
When answering natural language questions over knowledge bases (KBs), different question components and KB aspects play different roles.
However, these platforms are not a good match for distributed graph mining problems, such as finding frequent subgraphs in a graph.