Search Results for author: Junru Zhou

Found 12 papers, 6 papers with code

Distance-Restricted Folklore Weisfeiler-Leman GNNs with Provable Cycle Counting Power

1 code implementation NeurIPS 2023 Junru Zhou, Jiarui Feng, Xiyuan Wang, Muhan Zhang

Many of the proposed GNN models with provable cycle counting power are based on subgraph GNNs, i.e., extracting a bag of subgraphs from the input graph, generating representations for each subgraph, and using them to augment the representation of the input graph.
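The bag-of-subgraphs recipe described above can be sketched in a few lines: extract one rooted subgraph per node, encode each subgraph, and pool the encodings into a graph representation. This is a minimal illustration of the general pipeline only, not the DR-FWL model from the paper; the k-hop extraction and the toy degree-sum encoder are illustrative assumptions.

```python
# Sketch of the subgraph-GNN pipeline: bag of rooted subgraphs ->
# per-subgraph encoding -> pooled graph representation.
# Illustrative only; not the paper's DR-FWL(2) architecture.

def k_hop_subgraph(adj, root, k):
    """Return the node set within k hops of `root` (BFS over adjacency sets)."""
    frontier, seen = {root}, {root}
    for _ in range(k):
        frontier = {v for u in frontier for v in adj[u]} - seen
        seen |= frontier
    return seen

def encode_subgraph(adj, nodes):
    """Toy subgraph encoder: sum of within-subgraph degrees."""
    return sum(len(adj[u] & nodes) for u in nodes)

def graph_representation(adj, k=1):
    """Bag of subgraphs: one k-hop subgraph rooted at each node, sum-pooled."""
    return sum(encode_subgraph(adj, k_hop_subgraph(adj, r, k)) for r in adj)

# 4-cycle: every 1-hop ego-net has 3 nodes and 2 edges (degree sum 4)
adj = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}
print(graph_representation(adj))  # 4 roots * 4 = 16
```

A real subgraph GNN replaces the toy encoder with a message-passing network and the sum-pool with a learned readout; the point of the papers above is that running a full GNN on every subgraph is expensive, which motivates cheaper designs with the same counting power.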

Efficiently Counting Substructures by Subgraph GNNs without Running GNN on Subgraphs

1 code implementation 19 Mar 2023 Zuoyu Yan, Junru Zhou, Liangcai Gao, Zhi Tang, Muhan Zhang

Among these works, a popular way is to use subgraph GNNs, which decompose the input graph into a collection of subgraphs and enhance the representation of the graph by applying GNN to individual subgraphs.

Graph Learning

Head-driven Phrase Structure Parsing in O($n^3$) Time Complexity

no code implementations 20 May 2021 Zuchao Li, Junru Zhou, Hai Zhao, Kevin Parnow

Constituent and dependency parsing, the two classic forms of syntactic parsing, have been found to benefit from joint training and decoding under a uniform formalism, Head-driven Phrase Structure Grammar (HPSG).

Dependency Parsing

SG-Net: Syntax Guided Transformer for Language Representation

no code implementations 27 Dec 2020 Zhuosheng Zhang, Yuwei Wu, Junru Zhou, Sufeng Duan, Hai Zhao, Rui Wang

In detail, for a Transformer-based encoder built on self-attention networks (SAN), we introduce a syntactic dependency of interest (SDOI) design into the SAN to form an SDOI-SAN with syntax-guided self-attention.
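The core of syntax-guided self-attention can be sketched as ordinary scaled dot-product attention whose scores are masked by a dependency tree, so each token attends only to tokens deemed syntactically relevant. The mask construction below (a token attends to itself and its dependency ancestors) and the toy dimensions are illustrative assumptions, not the exact SDOI-SAN design.

```python
# Sketch of syntax-guided self-attention: attention scores are masked so
# token i attends only to itself and its dependency-tree ancestors.
# Illustrative assumption, not SG-Net's exact SDOI construction.
import numpy as np

def sdoi_mask(heads):
    """heads[i] = dependency head of token i (-1 for the root).
    mask[i, j] is True iff token i may attend to token j."""
    n = len(heads)
    mask = np.zeros((n, n), dtype=bool)
    for i in range(n):
        j = i
        while j != -1:          # walk up to the root
            mask[i, j] = True
            j = heads[j]
    return mask

def syntax_guided_attention(q, k, v, heads):
    scores = q @ k.T / np.sqrt(q.shape[-1])
    scores = np.where(sdoi_mask(heads), scores, -1e9)  # block non-SDOI pairs
    w = np.exp(scores - scores.max(-1, keepdims=True))
    w /= w.sum(-1, keepdims=True)                      # row-wise softmax
    return w @ v

rng = np.random.default_rng(0)
q = k = v = rng.normal(size=(4, 8))
heads = [-1, 0, 1, 1]           # token 0 is root; 2 and 3 depend on 1
out = syntax_guided_attention(q, k, v, heads)
print(out.shape)  # (4, 8)
```

In the paper this syntax-guided branch is combined with the ordinary (unmasked) self-attention of the Transformer rather than replacing it.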

Machine Reading Comprehension · Machine Translation +2

LIMIT-BERT: Linguistics Informed Multi-Task BERT

1 code implementation Findings of the Association for Computational Linguistics 2020 Junru Zhou, Zhuosheng Zhang, Hai Zhao, Shuailiang Zhang

Besides, LIMIT-BERT adopts a semi-supervised learning strategy to provide linguistic task data on the same large scale as the data used for language model training.

Language Modelling · Multi-Task Learning +3

Semantics-Aware Inferential Network for Natural Language Understanding

no code implementations 28 Apr 2020 Shuailiang Zhang, Hai Zhao, Junru Zhou

Taking explicit contextualized semantics as a complementary input, the inferential module of SAIN enables a series of reasoning steps over semantic clues through an attention mechanism.
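The multi-step reasoning described above can be sketched as a state vector that repeatedly attends over a set of semantic-clue vectors and folds the attended summary back into itself. The step count, dimensions, and update rule below are illustrative assumptions, not SAIN's actual architecture.

```python
# Sketch of iterative attention-based reasoning over semantic clues,
# in the spirit of SAIN's inferential module. All specifics here
# (update rule, steps, sizes) are illustrative assumptions.
import numpy as np

def inferential_steps(state, clues, steps=3):
    for _ in range(steps):
        scores = clues @ state                   # relevance of each clue
        w = np.exp(scores - scores.max())
        w /= w.sum()                             # attention weights over clues
        state = 0.5 * state + 0.5 * (w @ clues)  # blend in the attended clue
    return state

rng = np.random.default_rng(1)
clues = rng.normal(size=(5, 16))   # 5 contextualized semantic-clue vectors
state = rng.normal(size=16)        # initial reasoning state
out = inferential_steps(state, clues)
print(out.shape)  # (16,)
```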

Machine Reading Comprehension · Natural Language Inference +1

Dependency and Span, Cross-Style Semantic Role Labeling on PropBank and NomBank

no code implementations 7 Nov 2019 Zuchao Li, Hai Zhao, Junru Zhou, Kevin Parnow, Shexia He

In this paper, we define a new cross-style semantic role label convention and propose a new cross-style joint optimization model designed around the most basic linguistic meaning of a semantic role, providing a solution that makes the results of the two styles more comparable and lets both formalisms of SRL benefit from their natural connections in both linguistics and computation.

Semantic Role Labeling

LIMIT-BERT: Linguistic Informed Multi-Task BERT

no code implementations 31 Oct 2019 Junru Zhou, Zhuosheng Zhang, Hai Zhao, Shuailiang Zhang

In this paper, we present a Linguistic Informed Multi-Task BERT (LIMIT-BERT) for learning language representations across multiple linguistic tasks by Multi-Task Learning (MTL).

Multi-Task Learning · POS +2

Parsing All: Syntax and Semantics, Dependencies and Spans

1 code implementation Findings of the Association for Computational Linguistics 2020 Junru Zhou, Zuchao Li, Hai Zhao

Both syntactic and semantic structures are key linguistic contextual clues, and parsing the latter has been well shown to benefit from parsing the former.

Semantic Parsing

Concurrent Parsing of Constituency and Dependency

no code implementations 18 Aug 2019 Junru Zhou, Shuailiang Zhang, Hai Zhao

Constituent and dependency representations of syntactic structure share many linguistic and computational characteristics; this paper therefore makes the first attempt to introduce a new model capable of parsing constituents and dependencies at the same time, letting each parser enhance the other.

Dependency Parsing

SG-Net: Syntax-Guided Machine Reading Comprehension

1 code implementation 14 Aug 2019 Zhuosheng Zhang, Yuwei Wu, Junru Zhou, Sufeng Duan, Hai Zhao, Rui Wang

In detail, for a Transformer-based encoder built on self-attention networks (SAN), we introduce a syntactic dependency of interest (SDOI) design into the SAN to form an SDOI-SAN with syntax-guided self-attention.

Language Modelling · Machine Reading Comprehension +1
