Search Results for author: Minghao Hu

Found 10 papers, 5 papers with code

Extract-Select: A Span Selection Framework for Nested Named Entity Recognition with Generative Adversarial Training

no code implementations • Findings (ACL) 2022 • Peixin Huang, Xiang Zhao, Minghao Hu, Yang Fang, Xinyi Li, Weidong Xiao

Secondly, we propose a hybrid selection strategy in the extractor, which not only makes full use of span boundary information but also improves the recognition of long entities.

NER, Nested Named Entity Recognition
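For orientation, below is a minimal span-selection sketch for nested NER, assuming a generic encoder that produces one vector per token: every candidate span up to a maximum length is scored from its boundary token representations, and overlapping spans are scored independently so nested entities can coexist. This is an illustrative sketch, not the Extract-Select architecture; the class, method, and parameter names are hypothetical.

```python
# Minimal sketch (not the paper's architecture): boundary-based span scoring
# for nested NER, where overlapping spans may both be selected.
import torch
import torch.nn as nn

class SpanBoundaryScorer(nn.Module):
    """Scores every candidate span from its start/end token representations."""
    def __init__(self, hidden_size: int, num_types: int, max_span_len: int = 10):
        super().__init__()
        self.max_span_len = max_span_len
        self.classifier = nn.Linear(2 * hidden_size, num_types)

    def forward(self, token_reprs: torch.Tensor):
        # token_reprs: (seq_len, hidden_size) from any encoder, e.g. BERT
        seq_len = token_reprs.size(0)
        spans, scores = [], []
        for start in range(seq_len):
            for end in range(start, min(start + self.max_span_len, seq_len)):
                # Concatenate the boundary representations of the span.
                boundary = torch.cat([token_reprs[start], token_reprs[end]], dim=-1)
                spans.append((start, end))
                scores.append(self.classifier(boundary))
        return spans, torch.stack(scores)  # (num_spans, num_types)

    @torch.no_grad()
    def select(self, token_reprs: torch.Tensor, threshold: float = 0.5):
        spans, scores = self.forward(token_reprs)
        best_prob, best_type = scores.softmax(dim=-1).max(dim=-1)
        # No overlap constraint, so nested spans can both survive selection.
        return [(span, t.item()) for span, p, t in zip(spans, best_prob, best_type)
                if p > threshold]
```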

ICLEA: Interactive Contrastive Learning for Self-supervised Entity Alignment

no code implementations • 17 Jan 2022 • Kaisheng Zeng, Zhenhao Dong, Lei Hou, Yixin Cao, Minghao Hu, Jifan Yu, Xin Lv, Juanzi Li, Ling Feng

Self-supervised entity alignment (EA) aims to link equivalent entities across different knowledge graphs (KGs) without seed alignments.

Contrastive Learning, Entity Alignment +1
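As a rough picture of contrastive learning for entity alignment (not ICLEA's interactive scheme), the sketch below applies a symmetric InfoNCE loss to embeddings of pseudo-aligned entity pairs drawn from two KGs. The function name, the pseudo-pair batching, and the temperature are illustrative assumptions.

```python
# Generic contrastive-alignment sketch: embeddings of pseudo-aligned entity
# pairs from two KGs are pulled together, while other entities in the batch
# act as negatives (symmetric InfoNCE).
import torch
import torch.nn.functional as F

def contrastive_alignment_loss(src_emb: torch.Tensor,
                               tgt_emb: torch.Tensor,
                               temperature: float = 0.05) -> torch.Tensor:
    # src_emb, tgt_emb: (batch, dim); row i of each is assumed to be a
    # pseudo-aligned pair mined without seed alignments.
    src = F.normalize(src_emb, dim=-1)
    tgt = F.normalize(tgt_emb, dim=-1)
    logits = src @ tgt.t() / temperature          # scaled cosine similarities
    labels = torch.arange(src.size(0), device=src.device)
    # Symmetric loss: align source -> target and target -> source.
    return (F.cross_entropy(logits, labels) +
            F.cross_entropy(logits.t(), labels)) / 2
```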

Snapshot Ptychography on Array cameras

1 code implementation • 5 Nov 2021 • Chengyu Wang, Minghao Hu, Yuzuru Takashima, Timothy J. Schulz, David J. Brady

We use convolutional neural networks to recover images optically down-sampled by $6.7\times$ using coherent aperture synthesis over a 16-camera array.
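As a generic illustration of the recovery step, the sketch below uses a small convolutional network with a pixel-shuffle upsampler to map a down-sampled measurement back to a higher-resolution image. The layer sizes and the 4x factor are assumptions for illustration; this is not the paper's network or its 6.7x, 16-camera aperture-synthesis setup.

```python
# Minimal super-resolution-style CNN sketch: a few conv layers followed by a
# pixel-shuffle layer that rearranges channels into spatial resolution.
import torch
import torch.nn as nn

class SimpleRecoveryCNN(nn.Module):
    def __init__(self, channels: int = 1, upscale: int = 4):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(64, channels * upscale ** 2, kernel_size=3, padding=1),
            nn.PixelShuffle(upscale),  # (C*r^2, H, W) -> (C, H*r, W*r)
        )

    def forward(self, low_res: torch.Tensor) -> torch.Tensor:
        # low_res: (batch, channels, H, W) -> (batch, channels, H*upscale, W*upscale)
        return self.body(low_res)
```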

Modeling Dense Cross-Modal Interactions for Joint Entity-Relation Extraction

no code implementations • 1 Jul 2020 • Shan Zhao, Minghao Hu, Zhiping Cai, Fang Liu

The network is carefully constructed by stacking multiple attention units in depth to fully model dense interactions over token-label spaces, in which two basic attention units are proposed to explicitly capture fine-grained correlations across different modalities (e.g., token-to-token and label-to-token).

Joint Entity and Relation Extraction, Relation Classification
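To make the two basic attention units concrete, the sketch below pairs token-to-token self-attention with label-to-token attention over a learned label embedding space; stacking such blocks would model interactions in depth. The module layout, hidden size, and head count are illustrative assumptions, not the paper's exact units.

```python
# Illustrative cross-modal attention block over a token-label space
# (not the paper's exact formulation).
import torch
import torch.nn as nn

class CrossModalBlock(nn.Module):
    def __init__(self, hidden: int = 256, num_labels: int = 10, heads: int = 4):
        super().__init__()
        self.label_emb = nn.Embedding(num_labels, hidden)
        self.token_to_token = nn.MultiheadAttention(hidden, heads, batch_first=True)
        self.label_to_token = nn.MultiheadAttention(hidden, heads, batch_first=True)

    def forward(self, tokens: torch.Tensor):
        # tokens: (batch, seq_len, hidden)
        batch = tokens.size(0)
        labels = self.label_emb.weight.unsqueeze(0).expand(batch, -1, -1)
        # Unit 1: token-to-token self-attention captures fine-grained
        # correlations between tokens.
        tok, _ = self.token_to_token(tokens, tokens, tokens)
        # Unit 2: label-to-token attention lets each label representation
        # attend over the token sequence.
        lab, _ = self.label_to_token(labels, tok, tok)
        return tok, lab  # stacking such blocks models dense interactions in depth
```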

A Multi-Type Multi-Span Network for Reading Comprehension that Requires Discrete Reasoning

1 code implementation • IJCNLP 2019 • Minghao Hu, Yuxing Peng, Zhen Huang, Dongsheng Li

Rapid progress has been made in the field of reading comprehension and question answering, where several systems have achieved human parity in some simplified settings.

Question Answering, Reading Comprehension

Attention-Guided Answer Distillation for Machine Reading Comprehension

no code implementations • EMNLP 2018 • Minghao Hu, Yuxing Peng, Furu Wei, Zhen Huang, Dongsheng Li, Nan Yang, Ming Zhou

Although current reading comprehension systems have achieved significant advancements, their promising performance is often obtained at the cost of ensembling numerous models.

Knowledge Distillation, Machine Reading Comprehension
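For context, the sketch below shows plain answer distillation: a single student is trained on the softened output distribution of an (ensemble) teacher plus a hard loss on the gold answer. This is standard knowledge distillation, not the paper's attention-guided variant; the temperature and mixing weight are illustrative.

```python
# Standard knowledge-distillation loss sketch: soft targets from an ensemble
# teacher combined with the usual hard loss on gold labels.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      gold: torch.Tensor,
                      temperature: float = 2.0,
                      alpha: float = 0.5) -> torch.Tensor:
    # Soft loss: match the teacher's temperature-softened distribution.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    # Hard loss: cross-entropy against the gold answer positions.
    hard_loss = F.cross_entropy(student_logits, gold)
    return alpha * soft_loss + (1 - alpha) * hard_loss
```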

Reinforced Mnemonic Reader for Machine Reading Comprehension

3 code implementations • 8 May 2017 • Minghao Hu, Yuxing Peng, Zhen Huang, Xipeng Qiu, Furu Wei, Ming Zhou

In this paper, we introduce the Reinforced Mnemonic Reader for machine reading comprehension tasks, which enhances previous attentive readers in two aspects.

Machine Reading Comprehension, Question Answering +1
