Search Results for author: Nan Hu

Found 9 papers, 3 papers with code

Dual-Channel Evidence Fusion for Fact Verification over Texts and Tables

no code implementations NAACL 2022 Nan Hu, Zirui Wu, Yuxuan Lai, Xiao Liu, Yansong Feng

Unlike previous fact extraction and verification tasks, which consider evidence in only a single format, FEVEROUS poses further challenges by extending the evidence to both plain text and tables.

Fact Verification

An Empirical Study of Pre-trained Language Models in Simple Knowledge Graph Question Answering

1 code implementation 18 Mar 2023 Nan Hu, Yike Wu, Guilin Qi, Dehai Min, Jiaoyan Chen, Jeff Z. Pan, Zafar Ali

Large-scale pre-trained language models (PLMs) such as BERT have recently achieved great success and become a milestone in natural language processing (NLP).

Graph Question Answering Knowledge Distillation +1

Evaluation of ChatGPT as a Question Answering System for Answering Complex Questions

no code implementations 14 Mar 2023 Yiming Tan, Dehai Min, Yu Li, Wenbo Li, Nan Hu, Yongrui Chen, Guilin Qi

As ChatGPT covers resources such as Wikipedia and supports natural language question answering, it has garnered attention as a potential replacement for traditional knowledge-based question answering (KBQA) models.

Language Modelling Natural Language Understanding +2

HiCLRE: A Hierarchical Contrastive Learning Framework for Distantly Supervised Relation Extraction

1 code implementation Findings (ACL) 2022 Dongyang Li, Taolin Zhang, Nan Hu, Chengyu Wang, Xiaofeng He

In this paper, we propose a hierarchical contrastive learning framework for distantly supervised relation extraction (HiCLRE) to reduce noisy sentences, which integrates global structural information and local fine-grained interaction.

Contrastive Learning Data Augmentation +1
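A contrastive objective like the one named above can be sketched generically. The following is a minimal InfoNCE-style loss, not the paper's exact hierarchical objective; the embeddings, temperature, and helper names are all hypothetical:

```python
import numpy as np

def info_nce(anchor, positive, negatives, temperature=0.1):
    """Generic InfoNCE contrastive loss for one anchor.

    anchor, positive: 1-D embedding vectors; negatives: 2-D array
    with one negative embedding per row. This is a standard
    contrastive objective, not the HiCLRE loss itself.
    """
    def cos(u, v):
        return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

    # Positive pair sits at index 0 of the logit vector.
    logits = np.array([cos(anchor, positive)] +
                      [cos(anchor, n) for n in negatives]) / temperature
    logits -= logits.max()                    # numerical stability
    probs = np.exp(logits) / np.exp(logits).sum()
    return -np.log(probs[0])

# Toy check: a perfectly aligned positive against opposed negatives
# yields a near-zero loss.
a = np.ones(8)
loss = info_nce(a, np.ones(8), -np.ones((4, 8)))
assert loss < 0.01
```

Pulling the positive pair together while pushing negatives apart is the mechanism such frameworks rely on at each level of their hierarchy.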

DKPLM: Decomposable Knowledge-enhanced Pre-trained Language Model for Natural Language Understanding

1 code implementation 2 Dec 2021 Taolin Zhang, Chengyu Wang, Nan Hu, Minghui Qiu, Chengguang Tang, Xiaofeng He, Jun Huang

Knowledge-Enhanced Pre-trained Language Models (KEPLMs) are pre-trained models with relation triples injected from knowledge graphs to improve their language understanding abilities.

Knowledge Graphs Knowledge Probing +3

Benchmarking off-the-shelf statistical shape modeling tools in clinical applications

no code implementations 7 Sep 2020 Anupama Goparaju, Alexandre Bone, Nan Hu, Heath B. Henninger, Andrew E. Anderson, Stanley Durrleman, Matthijs Jacxsens, Alan Morris, Ibolya Csecs, Nassir Marrouche, Shireen Y. Elhabian

Statistical shape modeling (SSM) is widely used in biology and medicine as a new generation of morphometric approaches for the quantitative analysis of anatomical shapes.

Benchmarking

Graph Matching with Anchor Nodes: A Learning Approach

no code implementations CVPR 2013 Nan Hu, Raif M. Rustamov, Leonidas Guibas

In this paper, we consider the weighted graph matching problem with partially disclosed correspondences between a number of anchor nodes.

Graph Matching

Distributable Consistent Multi-Object Matching

no code implementations CVPR 2018 Nan Hu, Qi-Xing Huang, Boris Thibert, Leonidas Guibas

In this paper we propose an optimization-based framework for multi-object matching.

Stable and Informative Spectral Signatures for Graph Matching

no code implementations CVPR 2014 Nan Hu, Raif M. Rustamov, Leonidas Guibas

We also introduce the pairwise heat kernel distance as a stable second order compatibility term; we justify its plausibility by showing that in a certain limiting case it converges to the classical adjacency matrix-based second order compatibility function.

Graph Matching Informativeness
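The limiting-case claim in the abstract above can be illustrated numerically: the graph heat kernel H(t) = exp(-tL) satisfies H(t) ≈ I - tL for small t, so (I - H(t))/t recovers the Laplacian, whose off-diagonal entries are the (negated) adjacency matrix. A minimal sketch follows; the small example graph and time step are hypothetical, and this is not the paper's code:

```python
import numpy as np

# Hypothetical 4-node undirected graph.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A       # combinatorial graph Laplacian

def heat_kernel(L, t):
    """H(t) = exp(-t L), computed by eigendecomposition (L symmetric)."""
    w, V = np.linalg.eigh(L)
    return V @ np.diag(np.exp(-t * w)) @ V.T

# First-order expansion: H(t) ~ I - t L for small t, so (I - H(t)) / t
# converges to L, i.e. the adjacency structure off the diagonal.
t = 1e-5
recovered = (np.eye(len(L)) - heat_kernel(L, t)) / t
assert np.allclose(recovered, L, atol=1e-3)
```

This is the sense in which a heat-kernel-based second-order compatibility term reduces to the classical adjacency-based one in the limit.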
