no code implementations • 31 Mar 2025 • Tongke Ni, Yang Fan, Junru Zhou, XiangPing Wu, Qingcai Chen
Text semantic segmentation involves partitioning a document into multiple paragraphs with continuous semantics based on the subject matter, contextual information, and document structure.
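A minimal sketch of the task itself (not the model proposed in this paper), assuming simple bag-of-words sentence similarity: a segment boundary is placed wherever the lexical cosine similarity between adjacent sentences dips below a threshold.

```python
# Toy illustration of text semantic segmentation (not the paper's model):
# start a new segment wherever the lexical cosine similarity between
# adjacent sentences drops below a threshold.
import math
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def segment(sentences, threshold=0.1):
    """Group sentences into segments, opening a new one on a similarity dip."""
    bags = [Counter(s.lower().split()) for s in sentences]
    segments, current = [], [sentences[0]]
    for prev, cur, sent in zip(bags, bags[1:], sentences[1:]):
        if cosine(prev, cur) < threshold:
            segments.append(current)
            current = []
        current.append(sent)
    segments.append(current)
    return segments

doc = ["cats are small pets", "cats like to sleep",
       "stock markets fell today", "markets react to interest rates"]
print(segment(doc))  # two segments: the cat sentences vs. the market sentences
```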
no code implementations • 13 Oct 2024 • Junru Zhou, Cai Zhou, Xiyuan Wang, Pan Li, Muhan Zhang
Graph neural networks (GNNs) have achieved remarkable success in a variety of machine learning tasks over graph data.
1 code implementation • 4 Oct 2024 • Junru Zhou, Muhan Zhang
The ability of graph neural networks (GNNs) to count homomorphisms has recently been proposed as a practical and fine-grained measure of their expressive power.
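For concreteness, here is a brute-force homomorphism counter; it is an illustration of the quantity being measured, not of anything a GNN itself would compute.

```python
# Brute-force graph homomorphism counting. A homomorphism is any vertex map
# that preserves edges; unlike subgraph isomorphism, it need not be injective.
from itertools import product

def hom_count(pattern_edges, pattern_n, host_edges, host_n):
    # Symmetrize the host edge set so undirected edges match in either order.
    host_adj = {(u, v) for u, v in host_edges} | {(v, u) for u, v in host_edges}
    count = 0
    for phi in product(range(host_n), repeat=pattern_n):
        if all((phi[u], phi[v]) in host_adj for u, v in pattern_edges):
            count += 1
    return count

# Triangle into a 4-cycle: zero homomorphisms, since the 4-cycle is bipartite
# and an odd cycle cannot map homomorphically into a bipartite graph.
triangle = [(0, 1), (1, 2), (2, 0)]
square = [(0, 1), (1, 2), (2, 3), (3, 0)]
print(hom_count(triangle, 3, square, 4))  # 0
# A single edge counts ordered pairs of adjacent vertices: 2 * |E| = 8.
print(hom_count([(0, 1)], 2, square, 4))  # 8
```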
1 code implementation • NeurIPS 2023 • Junru Zhou, Jiarui Feng, Xiyuan Wang, Muhan Zhang
Many of the proposed GNN models with provable cycle counting power are based on subgraph GNNs, i.e., extracting a bag of subgraphs from the input graph, generating a representation for each subgraph, and using them to augment the representation of the input graph.
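A minimal sketch of that pipeline, with an illustrative ego-network extraction policy and a toy degree-based encoder standing in for the per-subgraph GNN (not any specific published model):

```python
# Subgraph-GNN pipeline sketch: extract one ego network per node, run a
# simple message-passing round inside each subgraph, then pool all subgraph
# representations into a single graph-level representation.
def ego_subgraph(adj, root):
    """1-hop ego network around `root`: its node set and induced edges."""
    nodes = {root} | set(adj[root])
    edges = [(u, v) for u in nodes for v in adj[u] if v in nodes and u < v]
    return nodes, edges

def encode_subgraph(nodes, edges):
    """Toy subgraph encoder: one aggregation round over constant features."""
    deg = {v: 0 for v in nodes}
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    h = {v: 1.0 + deg[v] for v in nodes}  # node feature + neighbour messages
    return sum(h.values())                # readout: sum pooling

def graph_representation(adj):
    """Pool the bag of subgraph representations into one graph value."""
    return sum(encode_subgraph(*ego_subgraph(adj, v)) for v in adj)

# A triangle with one pendant node, as adjacency lists keyed by node id.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
print(graph_representation(adj))
```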
1 code implementation • 19 Mar 2023 • Zuoyu Yan, Junru Zhou, Liangcai Gao, Zhi Tang, Muhan Zhang
We investigate the enhancement of graph neural networks' (GNNs) representation power through their ability to count substructures.
no code implementations • 20 May 2021 • Zuchao Li, Junru Zhou, Hai Zhao, Kevin Parnow
Constituent and dependency parsing, the two classic forms of syntactic parsing, have been found to benefit from joint training and decoding under a uniform formalism, Head-driven Phrase Structure Grammar (HPSG).
no code implementations • 27 Dec 2020 • Zhuosheng Zhang, Yuwei Wu, Junru Zhou, Sufeng Duan, Hai Zhao, Rui Wang
In detail, for the self-attention network (SAN) that underlies the Transformer-based encoder, we introduce a syntactic dependency of interest (SDOI) design into the SAN to form an SDOI-SAN with syntax-guided self-attention.
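The general idea can be sketched as masked attention, assuming a precomputed 0/1 dependency mask; this illustrates syntax-guided self-attention in general, not the exact SDOI-SAN.

```python
# Sketch of syntax-guided self-attention: a dependency-derived 0/1 mask
# restricts each token to attend only to syntactically related words.
import numpy as np

def syntax_guided_attention(Q, K, V, dep_mask):
    """dep_mask[i, j] = 1 if token i may attend to token j, else 0."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)
    scores = np.where(dep_mask > 0, scores, -1e9)   # block non-related pairs
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V

rng = np.random.default_rng(0)
n, d = 4, 8                        # 4 tokens, hidden size 8
Q, K, V = (rng.standard_normal((n, d)) for _ in range(3))
# Hypothetical dependency mask: each token attends to itself and the root (0).
dep_mask = np.eye(n)
dep_mask[:, 0] = 1
print(syntax_guided_attention(Q, K, V, dep_mask).shape)  # (4, 8)
```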
1 code implementation • Findings of the Association for Computational Linguistics 2020 • Junru Zhou, Zhuosheng Zhang, Hai Zhao, Shuailiang Zhang
In addition, LIMIT-BERT adopts a semi-supervised learning strategy so that the linguistic tasks receive the same large amount of training data as the language model itself.
no code implementations • 28 Apr 2020 • Shuailiang Zhang, Hai Zhao, Junru Zhou
Taking explicit contextualized semantics as a complementary input, the inferential module of SAIN enables a series of reasoning steps over semantic clues through an attention mechanism.
1 code implementation • 7 Nov 2019 • Zuchao Li, Hai Zhao, Junru Zhou, Kevin Parnow, Shexia He
In this paper, we define a new cross-style semantic role label convention and propose a cross-style joint optimization model designed around the most basic linguistic meaning of a semantic role. This makes the results of the two styles more comparable and lets both formalisms of SRL benefit from their natural connections in both linguistics and computation.
no code implementations • 31 Oct 2019 • Junru Zhou, Zhuosheng Zhang, Hai Zhao, Shuailiang Zhang
In this paper, we present a Linguistic Informed Multi-Task BERT (LIMIT-BERT) for learning language representations across multiple linguistic tasks by Multi-Task Learning (MTL).
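A minimal sketch of the multi-task setup, assuming PyTorch, a toy shared encoder, and two hypothetical sentence-level classification heads; LIMIT-BERT's actual architecture is a full BERT trained with linguistically motivated objectives.

```python
# Multi-task learning sketch (illustrative; not LIMIT-BERT itself): one shared
# encoder feeds several task-specific heads, and the per-task losses are
# summed so gradients from every task shape the same representation.
import torch
import torch.nn as nn

class SharedEncoder(nn.Module):
    def __init__(self, vocab=1000, dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab, dim)
        self.proj = nn.Linear(dim, dim)

    def forward(self, tokens):  # tokens: (batch, seq)
        return self.proj(self.embed(tokens)).mean(dim=1)  # (batch, dim)

encoder = SharedEncoder()
heads = nn.ModuleDict({          # toy heads standing in for linguistic tasks,
    "pos": nn.Linear(64, 17),    # simplified here to sentence-level labels
    "srl": nn.Linear(64, 30),
})
loss_fn = nn.CrossEntropyLoss()

tokens = torch.randint(0, 1000, (8, 12))  # toy batch of 8 sequences
labels = {"pos": torch.randint(0, 17, (8,)), "srl": torch.randint(0, 30, (8,))}
h = encoder(tokens)
loss = sum(loss_fn(head(h), labels[name]) for name, head in heads.items())
loss.backward()  # gradients flow to the shared encoder from every task
print(float(loss))
```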
1 code implementation • Findings of the Association for Computational Linguistics 2020 • Junru Zhou, Zuchao Li, Hai Zhao
Both syntactic and semantic structures are key linguistic contextual clues, and parsing the latter has been shown to benefit from parsing the former.
no code implementations • 18 Aug 2019 • Junru Zhou, Shuailiang Zhang, Hai Zhao
Constituent and dependency representations of syntactic structure share many linguistic and computational characteristics; this paper therefore makes a first attempt at a new model capable of parsing constituents and dependencies at the same time, letting the two parsers enhance each other.
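One concrete piece of that shared structure: a constituency tree determines a dependency tree once each phrase type names its head child. A toy head-percolation sketch follows (illustrative rules, not the paper's method).

```python
# Converting a constituency tree to dependency arcs via head percolation.
HEAD_RULES = {"S": "VP", "VP": "V", "NP": "N"}   # toy head-child table

def lexical_head(node, arcs):
    """node = (label, children) for phrases, (tag, word_index) for leaves.
    Returns the node's lexical head index and appends (dependent, head)
    arcs to `arcs`."""
    label, body = node
    if not isinstance(body, list):               # leaf: body is a word index
        return body
    child_heads = [lexical_head(c, arcs) for c in body]
    head_label = HEAD_RULES[label]
    head = next(h for (lbl, _), h in zip(body, child_heads) if lbl == head_label)
    for h in child_heads:
        if h != head:
            arcs.append((h, head))               # dependent -> head arc
    return head

# "dogs chase cats": (S (NP (N dogs)) (VP (V chase) (NP (N cats))))
tree = ("S", [("NP", [("N", 0)]),
              ("VP", [("V", 1), ("NP", [("N", 2)])])])
arcs = []
root = lexical_head(tree, arcs)
print(root, arcs)   # 1 [(2, 1), (0, 1)]: "chase" heads both nouns
```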
1 code implementation • 14 Aug 2019 • Zhuosheng Zhang, Yuwei Wu, Junru Zhou, Sufeng Duan, Hai Zhao, Rui Wang
In detail, for the self-attention network (SAN) that underlies the Transformer-based encoder, we introduce a syntactic dependency of interest (SDOI) design into the SAN to form an SDOI-SAN with syntax-guided self-attention.
Ranked #5 on Question Answering on SQuAD2.0 dev
1 code implementation • ACL 2019 • Junru Zhou, Hai Zhao
In detail, we report 96.33 F1 for constituent parsing and 97.20% UAS for dependency parsing on PTB.
Ranked #5 on Constituency Parsing on Penn Treebank