Search Results for author: Meng Qu

Found 43 papers, 26 papers with code

Few-shot Relation Extraction via Bayesian Meta-learning on Task Graphs

no code implementations ICML 2020 Meng Qu, Tianyu Gao, Louis-Pascal Xhonneux, Jian Tang

This paper studies few-shot relation extraction, which aims at predicting the relation for a pair of entities in a sentence by training with a few labeled examples in each relation.

Graph Neural Network, Meta-Learning, +3

Prioritizing Safeguarding Over Autonomy: Risks of LLM Agents for Science

no code implementations 6 Feb 2024 Xiangru Tang, Qiao Jin, Kunlun Zhu, Tongxin Yuan, Yichi Zhang, Wangchunshu Zhou, Meng Qu, Yilun Zhao, Jian Tang, Zhuosheng Zhang, Arman Cohan, Zhiyong Lu, Mark Gerstein

Intelligent agents powered by large language models (LLMs) have demonstrated substantial promise in autonomously conducting experiments and facilitating scientific discoveries across various disciplines.

GraphText: Graph Reasoning in Text Space

1 code implementation 2 Oct 2023 Jianan Zhao, Le Zhuo, Yikang Shen, Meng Qu, Kai Liu, Michael Bronstein, Zhaocheng Zhu, Jian Tang

Furthermore, GraphText paves the way for interactive graph reasoning, allowing both humans and LLMs to communicate with the model seamlessly using natural language.

In-Context Learning, Text Generation

TGNN: A Joint Semi-supervised Framework for Graph-level Classification

no code implementations 23 Apr 2023 Wei Ju, Xiao Luo, Meng Qu, Yifan Wang, Chong Chen, Minghua Deng, Xian-Sheng Hua, Ming Zhang

The twin modules collaborate by exchanging instance-similarity knowledge to fully exploit the structural information of both labeled and unlabeled data.

Graph Classification, Graph Neural Network

Learning on Large-scale Text-attributed Graphs via Variational Inference

2 code implementations 26 Oct 2022 Jianan Zhao, Meng Qu, Chaozhuo Li, Hao Yan, Qian Liu, Rui Li, Xing Xie, Jian Tang

In this paper, we propose an efficient and effective solution to learning on large text-attributed graphs by fusing graph structure and language learning with a variational Expectation-Maximization (EM) framework, called GLEM.

Variational Inference
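As a rough illustration of the variational EM alternation GLEM describes, the sketch below alternates between a stand-in "text model" and a stand-in "GNN" that exchange soft pseudo-labels on a toy text-attributed graph. All model choices, names, and update rules here are simplified assumptions for illustration, not the GLEM implementation (the actual code accompanies the paper).

```python
# Toy sketch of a variational-EM-style alternation on a text-attributed graph:
# a stand-in "text model" and a stand-in "GNN" exchange soft pseudo-labels.
# This is an assumption-laden illustration, not the GLEM code.
import numpy as np

rng = np.random.default_rng(0)
n, d, c = 8, 5, 2
X = rng.random((n, d))                              # per-node "text" features
A = (rng.random((n, n)) < 0.3).astype(float)
A = np.maximum(A, A.T); np.fill_diagonal(A, 1.0)
A_norm = A / A.sum(1, keepdims=True)                # mean-aggregation operator
gold = {0: 0, 1: 1}                                 # only two labeled nodes

def softmax(Z):
    Z = Z - Z.max(axis=1, keepdims=True)
    E = np.exp(Z)
    return E / E.sum(axis=1, keepdims=True)

def fit_linear(inputs, targets, steps=100, lr=0.5):
    """Train a linear softmax classifier on (possibly soft) targets."""
    W = np.zeros((inputs.shape[1], targets.shape[1]))
    for _ in range(steps):
        W -= lr * inputs.T @ (softmax(inputs @ W) - targets) / len(inputs)
    return W

# Initialize pseudo-labels: gold where available, uniform elsewhere.
q = np.full((n, c), 1.0 / c)
for i, y in gold.items():
    q[i] = np.eye(c)[y]

for _ in range(5):
    # "E-step"-like update: fit the text model on the current pseudo-labels.
    W_text = fit_linear(X, q)
    # "M-step"-like update: fit the GNN on the text model's predictions (gold kept fixed).
    targets = softmax(X @ W_text)
    for i, y in gold.items():
        targets[i] = np.eye(c)[y]
    W_gnn = fit_linear(A_norm @ X, targets)
    # The GNN's predictions become the next round's pseudo-labels.
    q = softmax(A_norm @ X @ W_gnn)
    for i, y in gold.items():
        q[i] = np.eye(c)[y]

print("GNN predictions after alternation:", q.argmax(1))
```

In GLEM proper the two modules are a pre-trained language model and a GNN trained at scale; the toy loop above only shows the shape of the alternation.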

KGNN: Harnessing Kernel-based Networks for Semi-supervised Graph Classification

no code implementations 21 May 2022 Wei Ju, Junwei Yang, Meng Qu, Weiping Song, Jianhao Shen, Ming Zhang

This problem is typically solved with graph neural networks (GNNs), which nevertheless rely on a large number of labeled graphs for training and are unable to leverage unlabeled graphs.

Graph Classification, Graph Neural Network

Neural Structured Prediction for Inductive Node Classification

1 code implementation ICLR 2022 Meng Qu, Huiyu Cai, Jian Tang

This problem has been extensively studied with graph neural networks (GNNs), which learn effective node representations, as well as with traditional structured prediction methods that model the structured output of node labels, e.g., conditional random fields (CRFs).

Classification, Node Classification, +1

Structured Multi-task Learning for Molecular Property Prediction

1 code implementation 22 Feb 2022 Shengchao Liu, Meng Qu, Zuobai Zhang, Huiyu Cai, Jian Tang

However, in contrast to other domains, the performance of multi-task learning in drug discovery is still not satisfactory, as the amount of labeled data for each task is too limited; this calls for additional data to mitigate the data scarcity.

Drug Discovery, Graph Neural Network, +5

TorchDrug: A Powerful and Flexible Machine Learning Platform for Drug Discovery

1 code implementation 16 Feb 2022 Zhaocheng Zhu, Chence Shi, Zuobai Zhang, Shengchao Liu, Minghao Xu, Xinyu Yuan, Yangtian Zhang, Junkun Chen, Huiyu Cai, Jiarui Lu, Chang Ma, Runcheng Liu, Louis-Pascal Xhonneux, Meng Qu, Jian Tang

However, the lack of domain knowledge (e.g., which tasks to work on), standard benchmarks, and data preprocessing pipelines is the main obstacle for machine learning researchers working in this domain.

BIG-bench Machine Learning, Drug Discovery, +2

Joint Modeling of Visual Objects and Relations for Scene Graph Generation

no code implementations NeurIPS 2021 Minghao Xu, Meng Qu, Bingbing Ni, Jian Tang

We further propose an efficient and effective algorithm for inference based on mean-field variational inference, in which we first provide a warm initialization by independently predicting the objects and their relations according to the current model, followed by a few iterations of relational reasoning.

Graph Generation, Knowledge Graph Embedding, +5
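The inference procedure sketched in the excerpt (an independent warm start, then a few rounds of relational refinement) has the general shape of a mean-field update loop. The toy code below shows that generic shape with random unary and pairwise scores; the scores, shapes, and update schedule are assumptions for illustration and not the paper's model.

```python
# Generic mean-field update loop: start from independent (unary) predictions as a
# warm initialization, then refine each marginal for a few iterations using
# pairwise compatibilities. The scores below are random toys, not the paper's model.
import numpy as np

rng = np.random.default_rng(0)
num_vars, num_labels = 4, 3
unary = rng.random((num_vars, num_labels))             # independent per-variable scores
pairwise = rng.random((num_vars, num_vars, num_labels, num_labels))  # compatibilities

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Warm initialization: normalize the unary scores independently.
q = np.stack([softmax(unary[i]) for i in range(num_vars)])

for _ in range(5):                                     # a few rounds of relational reasoning
    for i in range(num_vars):
        msg = sum(pairwise[i, j] @ q[j] for j in range(num_vars) if j != i)
        q[i] = softmax(unary[i] + msg)                 # mean-field coordinate update

print("refined marginals:\n", q.round(3))
```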

Multi-task Learning with Domain Knowledge for Molecular Property Prediction

no code implementations NeurIPS Workshop AI4Scien 2021 Shengchao Liu, Meng Qu, Zuobai Zhang, Huiyu Cai, Jian Tang

In this paper, we study multi-task learning for molecule property prediction in a different setting, where a relation graph between different tasks is available.

Drug Discovery, Molecular Property Prediction, +4

Predicting Infectiousness for Proactive Contact Tracing

1 code implementation ICLR 2021 Yoshua Bengio, Prateek Gupta, Tegan Maharaj, Nasim Rahaman, Martin Weiss, Tristan Deleu, Eilif Muller, Meng Qu, Victor Schmidt, Pierre-Luc St-Charles, Hannah Alsdurf, Olexa Bilanuik, David Buckeridge, Gáetan Marceau Caron, Pierre-Luc Carrier, Joumana Ghosn, Satya Ortiz-Gagne, Chris Pal, Irina Rish, Bernhard Schölkopf, Abhinav Sharma, Jian Tang, Andrew Williams

Predictions are used to provide personalized recommendations to the individual via an app, as well as to send anonymized messages to the individual's contacts, who use this information to better predict their own infectiousness, an approach we call proactive contact tracing (PCT).

RNNLogic: Learning Logic Rules for Reasoning on Knowledge Graphs

2 code implementations ICLR 2021 Meng Qu, Junkun Chen, Louis-Pascal Xhonneux, Yoshua Bengio, Jian Tang

In the E-step, we select a set of high-quality rules from all generated rules with both the rule generator and the reasoning predictor via posterior inference; in the M-step, the rule generator is updated with the rules selected in the E-step.

Knowledge Graphs
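To make the E-step/M-step description above concrete, here is a deliberately simplified skeleton in which rules are relation paths on a toy knowledge graph, the "reasoning predictor" is replaced by a precision score, and the "rule generator" by a categorical distribution over relations. These stand-ins are assumptions for illustration; the released RNNLogic code implements the actual components.

```python
# Illustrative skeleton of the E-step / M-step loop described above, with a toy
# knowledge graph and rules represented as relation paths. Rule scoring and the
# generator update are simplistic stand-ins, not RNNLogic itself.
import random
from collections import Counter

random.seed(0)

# Toy KG: facts (head, relation, tail); target relation to explain: "grandparent".
facts = {("a", "parent", "b"), ("b", "parent", "c"), ("d", "parent", "e"),
         ("e", "parent", "f"), ("a", "grandparent", "c"), ("d", "grandparent", "f")}
relations = ["parent", "sibling", "spouse"]
target = "grandparent"

def apply_rule(path):
    """Entity pairs reachable by composing the relations in `path`."""
    pairs = {(h, t) for (h, r, t) in facts if r == path[0]}
    for rel in path[1:]:
        pairs = {(h, t2) for (h, t) in pairs
                 for (h2, r, t2) in facts if h2 == t and r == rel}
    return pairs

def predictor_score(path):
    """Stand-in reasoning predictor: precision of the rule on observed facts."""
    derived = apply_rule(path)
    if not derived:
        return 0.0
    hits = sum((h, target, t) in facts for (h, t) in derived)
    return hits / len(derived)

# Generator: a categorical distribution over relations, used to sample rule bodies.
gen_weights = Counter({r: 1.0 for r in relations})

for step in range(3):
    total = sum(gen_weights.values())
    # Sample candidate rules (relation paths of length 2) from the generator.
    candidates = [tuple(random.choices(relations,
                                       weights=[gen_weights[r] / total for r in relations],
                                       k=2)) for _ in range(20)]
    # E-step (illustrative): keep the rules the predictor scores highly.
    selected = [p for p in set(candidates) if predictor_score(p) > 0.5]
    # M-step (illustrative): refit the generator toward relations in the selected rules.
    for path in selected:
        for rel in path:
            gen_weights[rel] += 1.0
    print(f"step {step}: selected rules = {selected}")
```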

Few-shot Relation Extraction via Bayesian Meta-learning on Relation Graphs

1 code implementation 5 Jul 2020 Meng Qu, Tianyu Gao, Louis-Pascal A. C. Xhonneux, Jian Tang

To more effectively generalize to new relations, in this paper we study the relationships between different relations and propose to leverage a global relation graph.

Graph Neural Network, Meta-Learning, +3

Graph Policy Network for Transferable Active Learning on Graphs

1 code implementation NeurIPS 2020 Shengding Hu, Zheng Xiong, Meng Qu, Xingdi Yuan, Marc-Alexandre Côté, Zhiyuan Liu, Jian Tang

Graph neural networks (GNNs) have been attracting increasing popularity due to their simplicity and effectiveness in a variety of fields.

Active Learning

Continuous Graph Neural Networks

1 code implementation ICML 2020 Louis-Pascal A. C. Xhonneux, Meng Qu, Jian Tang

The key idea is how to characterise the continuous dynamics of node representations, i.e., the derivatives of node representations with respect to time.

Node Classification
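The excerpt frames node representations as a continuous-time system, so as an illustration the snippet below integrates one simple diffusion-style choice of dynamics, dH/dt = (A_hat - I) H + X, with explicit Euler steps. The particular dynamics, normalization, and integrator here are assumptions for the sketch and are not claimed to be the paper's exact parameterization.

```python
# Minimal sketch of an ODE view of message passing: node representations H(t)
# evolve under fixed dynamics and are read out at some end time. The dynamics
# dH/dt = (A_hat - I) H + X is a simple diffusion-style choice for illustration.
import numpy as np

rng = np.random.default_rng(0)
n, d = 6, 4
X = rng.random((n, d))                         # initial node features
A = (rng.random((n, n)) < 0.4).astype(float)
A = np.maximum(A, A.T); np.fill_diagonal(A, 1.0)
deg = A.sum(1)
A_hat = A / np.sqrt(np.outer(deg, deg))        # symmetrically normalized adjacency

def dH_dt(H):
    # Diffusion toward neighbors plus a constant source term from the input features.
    return (A_hat - np.eye(n)) @ H + X

def integrate(H0, t_end=1.0, steps=100):
    H, dt = H0.copy(), t_end / steps
    for _ in range(steps):                     # explicit Euler integration
        H = H + dt * dH_dt(H)
    return H

H_T = integrate(X)
print("representation change per node:", np.linalg.norm(H_T - X, axis=1).round(3))
```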

Recurrent Event Network: Global Structure Inference Over Temporal Knowledge Graph

no code implementations 25 Sep 2019 Woojeong Jin, He Jiang, Meng Qu, Tong Chen, Changlin Zhang, Pedro Szekely, Xiang Ren

We present Recurrent Event Network (RE-Net), a novel autoregressive architecture for modeling temporal sequences of multi-relational graphs (e.g., temporal knowledge graphs), which can perform sequential, global structure inference over future time stamps to predict new events.

Link Prediction, Temporal Sequences

Transfer Active Learning For Graph Neural Networks

no code implementations 25 Sep 2019 Shengding Hu, Meng Qu, Zhiyuan Liu, Jian Tang

Moreover, we study how to learn a universal policy for labeling nodes from multiple training graphs and then transfer the learned policy to unseen graphs.

Active Learning, Node Classification, +1

GraphMix: Regularized Training of Graph Neural Networks for Semi-Supervised Learning

no code implementations 25 Sep 2019 Vikas Verma, Meng Qu, Alex Lamb, Yoshua Bengio, Juho Kannala, Jian Tang

We present GraphMix, a regularization technique for Graph Neural Network based semi-supervised object classification, leveraging the recent advances in the regularization of classical deep neural networks.

Graph Neural Network

GraphMix: Improved Training of GNNs for Semi-Supervised Learning

1 code implementation 25 Sep 2019 Vikas Verma, Meng Qu, Kenji Kawaguchi, Alex Lamb, Yoshua Bengio, Juho Kannala, Jian Tang

We present GraphMix, a regularization method for Graph Neural Network based semi-supervised object classification, whereby we propose to train a fully-connected network jointly with the graph neural network via parameter sharing and interpolation-based regularization.

Generalization Bounds, Graph Attention, +2
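The description above centers on interpolation-based (mixup-style) regularization of an auxiliary fully-connected network trained alongside the GNN. The snippet below shows only that interpolation idea on toy node features and labels; the parameter sharing with the GNN and all other training details are omitted, and every name and shape here is an illustrative assumption rather than the released GraphMix code.

```python
# Toy sketch of interpolation-based (mixup-style) regularization for an auxiliary
# fully-connected network: mix pairs of node features and their label distributions,
# then train on the mixed pairs. Parameter sharing with the GNN is omitted.
import numpy as np

rng = np.random.default_rng(0)
n, d, c = 20, 8, 3
X = rng.random((n, d))                       # node features
y = rng.integers(0, c, size=n)
Y = np.eye(c)[y]                             # one-hot label distributions
W = rng.normal(size=(d, c)) * 0.1            # the "fully-connected network" (one layer)

def softmax(Z):
    Z = Z - Z.max(axis=1, keepdims=True)
    E = np.exp(Z)
    return E / E.sum(axis=1, keepdims=True)

lr, alpha = 0.5, 1.0
for step in range(200):
    lam = rng.beta(alpha, alpha)             # mixup coefficient
    perm = rng.permutation(n)
    X_mix = lam * X + (1 - lam) * X[perm]    # interpolate features...
    Y_mix = lam * Y + (1 - lam) * Y[perm]    # ...and the corresponding soft labels
    P = softmax(X_mix @ W)
    W -= lr * X_mix.T @ (P - Y_mix) / n      # gradient step on cross-entropy

print("train accuracy of the mixup-trained FCN:",
      (softmax(X @ W).argmax(1) == y).mean())
```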

Collaborative Policy Learning for Open Knowledge Graph Reasoning

2 code implementations IJCNLP 2019 Cong Fu, Tong Chen, Meng Qu, Woojeong Jin, Xiang Ren

We propose a novel reinforcement learning framework to jointly train two collaborative agents, i.e., a multi-hop graph reasoner and a fact extractor.

Reinforcement Learning

Weakly-supervised Knowledge Graph Alignment with Adversarial Learning

no code implementations ICLR 2019 Meng Qu, Jian Tang, Yoshua Bengio

Therefore, in this paper we propose to study aligning knowledge graphs in a fully unsupervised or weakly supervised fashion, i.e., without aligned triplets or with only a few of them.

Knowledge Graphs

Probabilistic Logic Neural Networks for Reasoning

2 code implementations NeurIPS 2019 Meng Qu, Jian Tang

In the E-step, a knowledge graph embedding model is used for inferring the missing triplets, while in the M-step, the weights of logic rules are updated based on both the observed and predicted triplets.

Knowledge Graph Embedding, Knowledge Graphs
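As a schematic of the alternation in the excerpt, the toy loop below replaces the knowledge graph embedding model with a trivial rule-aware scorer (E-step: add high-scoring missing triplets) and updates a single rule weight from how often the rule's groundings are satisfied by observed plus predicted triplets (M-step). The entities, the rule, the scorer, and the update are all made-up simplifications, not the paper's model.

```python
# Schematic of the E-step / M-step alternation, with a trivially simple scorer in
# place of a knowledge graph embedding model and a count-based rule-weight update.
import itertools

# Observed triplets and one candidate rule: spouse(x, y) ^ parent(y, z) -> parent(x, z)
observed = {("alice", "spouse", "bob"), ("bob", "parent", "carl"),
            ("dora", "spouse", "erik")}
entities = {e for (h, _, t) in observed for e in (h, t)}
rule_weight = 0.5

def kge_score(h, r, t):
    """Stand-in for a KGE model: rule-supported triplets get a score equal to the
    current rule weight; everything else scores zero."""
    supported = any((h, "spouse", y) in observed and (y, "parent", t) in observed
                    for y in entities)
    return rule_weight if supported else 0.0

for it in range(3):
    # E-step (illustrative): infer missing triplets whose score clears a threshold.
    predicted = {(h, "parent", t)
                 for h, t in itertools.product(entities, entities)
                 if (h, "parent", t) not in observed and kge_score(h, "parent", t) > 0.3}
    # M-step (illustrative): set the rule weight to the fraction of the rule's
    # groundings satisfied by observed + predicted triplets.
    triplets = observed | predicted
    groundings = [(x, y, z) for x, y, z in itertools.product(entities, repeat=3)
                  if (x, "spouse", y) in triplets and (y, "parent", z) in triplets]
    satisfied = sum((x, "parent", z) in triplets for (x, y, z) in groundings)
    rule_weight = satisfied / max(len(groundings), 1)
    print(f"iteration {it}: predicted={sorted(predicted)}, rule_weight={rule_weight:.2f}")
```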

vGraph: A Generative Model for Joint Community Detection and Node Representation Learning

1 code implementation NeurIPS 2019 Fan-Yun Sun, Meng Qu, Jordan Hoffmann, Chin-wei Huang, Jian Tang

Experimental results on multiple real-world graphs show that vGraph is very effective in both community detection and node representation learning, outperforming many competitive baselines in both tasks.

Community Detection, Representation Learning, +1

GMNN: Graph Markov Neural Networks

1 code implementation 15 May 2019 Meng Qu, Yoshua Bengio, Jian Tang

Statistical relational learning methods can effectively model the dependency of object labels through conditional random fields for collective classification, whereas graph neural networks learn effective object representations for classification through end-to-end training.

Classification, General Classification, +4

Recurrent Event Network: Autoregressive Structure Inference over Temporal Knowledge Graphs

2 code implementations 11 Apr 2019 Woojeong Jin, Meng Qu, Xisen Jin, Xiang Ren

The task becomes more challenging on temporal knowledge graphs, where each fact is associated with a timestamp.

Knowledge Graphs, Link Prediction, +1

GraphVite: A High-Performance CPU-GPU Hybrid System for Node Embedding

1 code implementation 2 Mar 2019 Zhaocheng Zhu, Shizhen Xu, Meng Qu, Jian Tang

In this paper, we propose GraphVite, a high-performance CPU-GPU hybrid system for training node embeddings, by co-optimizing the algorithm and the system.

Dimensionality Reduction, Knowledge Graph Embedding, +4

Learning Dual Retrieval Module for Semi-supervised Relation Extraction

1 code implementation 20 Feb 2019 Hongtao Lin, Jun Yan, Meng Qu, Xiang Ren

In this paper, we leverage a key insight that retrieving sentences expressing a relation is a dual task of predicting the relation label for a given sentence: the two tasks are complementary and can be optimized jointly for mutual enhancement.

MULTI-VIEW LEARNING, Relation, +3

Knowledge Graph Embedding with Hierarchical Relation Structure

no code implementations EMNLP 2018 Zhao Zhang, Fuzhen Zhuang, Meng Qu, Fen Lin, Qing He

To this end, in this paper we extend the existing KGE models TransE, TransH, and DistMult to learn knowledge representations by leveraging information from the hierarchical relation structure (HRS).

Information Retrieval, Knowledge Base Completion, +4

An Attention-based Collaboration Framework for Multi-View Network Representation Learning

1 code implementation 19 Sep 2017 Meng Qu, Jian Tang, Jingbo Shang, Xiang Ren, Ming Zhang, Jiawei Han

Existing approaches usually study networks with a single type of proximity between nodes, which defines a single view of a network.

Representation Learning

Automatic Synonym Discovery with Knowledge Bases

1 code implementation 25 Jun 2017 Meng Qu, Xiang Ren, Jiawei Han

In this paper, we study the problem of automatic synonym discovery with knowledge bases, that is, identifying synonyms for knowledge base entities in a given domain-specific corpus.

Identity-sensitive Word Embedding through Heterogeneous Networks

no code implementations 29 Nov 2016 Jian Tang, Meng Qu, Qiaozhu Mei

Based on an identity-labeled text corpus, a heterogeneous network of words and word identities is constructed to model different levels of word co-occurrences.

Network Embedding, text-classification, +3

Meta-Path Guided Embedding for Similarity Search in Large-Scale Heterogeneous Information Networks

1 code implementation 31 Oct 2016 Jingbo Shang, Meng Qu, Jialu Liu, Lance M. Kaplan, Jiawei Han, Jian Peng

It models vertices as low-dimensional vectors to explore similarity embedded in the network structure.

CoType: Joint Extraction of Typed Entities and Relations with Knowledge Bases

2 code implementations 27 Oct 2016 Xiang Ren, Zeqiu Wu, Wenqi He, Meng Qu, Clare R. Voss, Heng Ji, Tarek F. Abdelzaher, Jiawei Han

We propose a novel domain-independent framework, called CoType, that runs a data-driven text segmentation algorithm to extract entity mentions, and jointly embeds entity mentions, relation mentions, text features and type labels into two low-dimensional spaces (for entity and relation mentions respectively), where, in each space, objects whose types are close will also have similar representations.

Joint Entity and Relation Extraction, Relation, +1

Label Noise Reduction in Entity Typing by Heterogeneous Partial-Label Embedding

3 code implementations 17 Feb 2016 Xiang Ren, Wenqi He, Meng Qu, Clare R. Voss, Heng Ji, Jiawei Han

Current systems of fine-grained entity typing use distant supervision in conjunction with existing knowledge bases to assign categories (type labels) to entity mentions.

Entity Typing, Semantic Similarity, +2

PTE: Predictive Text Embedding through Large-scale Heterogeneous Text Networks

1 code implementation 2 Aug 2015 Jian Tang, Meng Qu, Qiaozhu Mei

One possible reason is that these text embedding methods learn the representation of text in a fully unsupervised way, without leveraging the labeled information available for the task.

Representation Learning

LINE: Large-scale Information Network Embedding

8 code implementations 12 Mar 2015 Jian Tang, Meng Qu, Mingzhe Wang, Ming Zhang, Jun Yan, Qiaozhu Mei

This paper studies the problem of embedding very large information networks into low-dimensional vector spaces, which is useful in many tasks such as visualization, node classification, and link prediction.

Graph Embedding, Link Prediction, +2
