Recent breakthroughs in large language models have spurred rigorous exploration of their application to diverse tabular-data tasks, such as prediction, tabular data synthesis, question answering, and table understanding.
Large Language Models (LLMs) trained on large volumes of data excel at various natural language tasks, but they struggle with tasks requiring knowledge absent from their training data.
Large Language Models (LLMs) can adapt to new tasks via in-context learning (ICL).
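The mechanics of in-context learning can be illustrated with a minimal sketch: rather than updating model weights, we prepend labeled demonstrations to the prompt and let the model infer the task pattern from them. The prompt format below is an assumption for illustration; real systems vary in how they template demonstrations.

```python
def build_icl_prompt(demonstrations, query):
    """Assemble a few-shot ICL prompt from (input, label) pairs plus a query.

    The "Input:/Label:" template is a hypothetical format chosen for
    illustration, not a fixed standard.
    """
    lines = []
    for text, label in demonstrations:
        lines.append(f"Input: {text}\nLabel: {label}")
    # The final entry leaves the label blank for the model to complete.
    lines.append(f"Input: {query}\nLabel:")
    return "\n\n".join(lines)

demos = [("The movie was wonderful.", "positive"),
         ("Utterly boring plot.", "negative")]
prompt = build_icl_prompt(demos, "A delightful surprise.")
print(prompt)
```

Sending such a prompt to a frozen LLM adapts it to the labeling task without any gradient updates, which is what distinguishes ICL from fine-tuning.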
Recent advances in large language models have revolutionized many sectors, including the database industry.
Recent advances in tabular data generation have greatly enhanced synthetic data quality.
However, GNN explanation methods for link prediction (LP) remain scarce in the literature.
Graph Neural Networks (GNNs) currently dominate the modeling of graph-structured data, yet their heavy reliance on graph structure at inference time significantly hinders their widespread application.
A large number of real-world graphs or networks are inherently heterogeneous, involving a diversity of node types and relation types.
We propose a new STAcked and Reconstructed Graph Convolutional Networks (STAR-GCN) architecture to learn node representations that boost the performance of recommender systems, especially in cold-start scenarios.
Keyphrase generation (KG), a fundamental task in natural language processing (NLP), aims to generate a set of keyphrases for a given document.
We propose a new network architecture, Gated Attention Networks (GaAN), for learning on graphs.
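The key idea in GaAN, a scalar gate that modulates each attention head's contribution per node, can be sketched as follows. This is a toy pure-Python illustration under stated assumptions: in the actual GaAN model the gates are produced by a small subnetwork, whereas here they are passed in directly, and the per-head projection matrices are supplied as plain nested lists.

```python
import math

def softmax(scores):
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def matvec(W, x):
    return [dot(row, x) for row in W]

def gated_attention_aggregate(center, neighbors, head_projs, gates):
    """Aggregate neighbor features with multiple attention heads, each
    scaled by a scalar gate in [0, 1]. A gate near 0 lets the model
    suppress an uninformative head for this particular node."""
    out = []
    for W, g in zip(head_projs, gates):
        proj_c = matvec(W, center)
        proj_n = [matvec(W, n) for n in neighbors]
        # Attention over the neighborhood via dot-product scores.
        attn = softmax([dot(proj_c, p) for p in proj_n])
        head = [g * sum(a * p[i] for a, p in zip(attn, proj_n))
                for i in range(len(proj_c))]
        out.extend(head)  # concatenate gated head outputs
    return out

center = [1.0, 0.0]
neigh = [[0.5, 0.5], [0.0, 1.0]]
heads = [[[1.0, 0.0], [0.0, 1.0]],   # head 1: identity projection
         [[0.0, 1.0], [1.0, 0.0]]]   # head 2: swaps coordinates
vec = gated_attention_aggregate(center, neigh, heads, gates=[1.0, 0.0])
```

With the second gate set to 0, the second head's slice of the output is zeroed out, showing how gating prunes a head's influence node by node.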
However, prior work attends only to sentiment information and ignores aspect-related information in the text, which may mismatch sentiment words with aspects when a sentiment word unrelated to the given aspect nonetheless appears semantically meaningful for it.
Knowledge Tracing (KT) is the task of tracing the evolving knowledge states of students with respect to one or more concepts as they engage in a sequence of learning activities.
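A classical baseline for KT is Bayesian Knowledge Tracing (BKT), which maintains a probability that the student has mastered a concept and updates it after each response. The sketch below uses illustrative parameter values (guess, slip, and learn probabilities are assumptions, not fitted estimates).

```python
def bkt_update(p_known, correct, p_guess=0.2, p_slip=0.1, p_learn=0.3):
    """One step of Bayesian Knowledge Tracing.

    First compute the posterior probability of mastery given the observed
    response (accounting for guessing and slipping), then apply the
    learning transition. Parameter values are illustrative assumptions.
    """
    if correct:
        evidence = p_known * (1 - p_slip)
        total = evidence + (1 - p_known) * p_guess
    else:
        evidence = p_known * p_slip
        total = evidence + (1 - p_known) * (1 - p_guess)
    posterior = evidence / total
    # Learning transition: an unmastered concept may become mastered.
    return posterior + (1 - posterior) * p_learn

p = 0.3  # prior probability of mastery
for resp in [True, True, False, True]:
    p = bkt_update(p, resp)
```

After this response sequence the estimated mastery rises well above the prior, dipping temporarily at the incorrect answer; deep KT models replace this two-state update with learned sequence models but trace the same quantity.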