Search Results for author: Andrew Arnold

Found 10 papers, 4 papers with code

H2KGAT: Hierarchical Hyperbolic Knowledge Graph Attention Network

no code implementations EMNLP 2020 Shen Wang, Xiaokai Wei, Cicero Nogueira dos Santos, Zhiguo Wang, Ramesh Nallapati, Andrew Arnold, Bing Xiang, Philip S. Yu

Existing knowledge graph embedding approaches concentrate on modeling relation patterns such as symmetry/asymmetry, inversion, and composition, but overlook the hierarchical nature of relations.

Graph Attention • Knowledge Graph Embedding +2
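
No code is released for this paper. As a rough, generic illustration of why hyperbolic geometry suits hierarchical relations, the sketch below computes the standard Poincaré-ball distance; it is not H2KGAT's attention or scoring function, just the distance that hyperbolic embedding methods typically build on.

```python
import numpy as np

def poincare_distance(u: np.ndarray, v: np.ndarray, eps: float = 1e-9) -> float:
    """Geodesic distance between two points inside the unit Poincare ball.

    d(u, v) = arcosh(1 + 2 * ||u - v||^2 / ((1 - ||u||^2) * (1 - ||v||^2)))
    Points near the origin behave like "roots" and points near the boundary
    like "leaves", which is why this geometry fits hierarchical relations.
    """
    sq_diff = np.sum((u - v) ** 2)
    denom = (1.0 - np.sum(u ** 2)) * (1.0 - np.sum(v ** 2))
    return float(np.arccosh(1.0 + 2.0 * sq_diff / max(denom, eps)))

# Toy example: a "parent" embedded near the origin, a "child" near the boundary.
parent = np.array([0.05, 0.02])
child = np.array([0.70, 0.65])
print(poincare_distance(parent, child))
```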

Debiasing Neural Retrieval via In-batch Balancing Regularization

no code implementations NAACL (GeBNLP) 2022 Yuantong Li, Xiaokai Wei, Zijian Wang, Shen Wang, Parminder Bhatia, Xiaofei Ma, Andrew Arnold

People frequently interact with information retrieval (IR) systems; however, IR models exhibit biases and discrimination toward various demographic groups.

Fairness • Passage Retrieval +1
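
No code is released here either. The snippet below is only a hypothetical in-batch balancing term (equalizing mean relevance scores across two demographic groups within a batch), written to illustrate what such a regularizer can look like; the group labels, weight, and exact formulation are assumptions, not the paper's.

```python
import torch

def balancing_penalty(scores: torch.Tensor, group_ids: torch.Tensor) -> torch.Tensor:
    """Generic in-batch balancing term (illustrative only).

    scores:    relevance scores for the passages in one batch, shape (B,)
    group_ids: 0/1 demographic label per passage, shape (B,)
    Returns the absolute gap between the two groups' mean scores, which can be
    added to the usual ranking loss with a small weight. Assumes both groups
    appear in the batch.
    """
    mean_a = scores[group_ids == 0].mean()
    mean_b = scores[group_ids == 1].mean()
    return (mean_a - mean_b).abs()

scores = torch.tensor([0.9, 0.4, 0.8, 0.3])
groups = torch.tensor([0, 0, 1, 1])
ranking_loss = torch.tensor(0.0)  # stand-in for the usual retrieval loss
total_loss = ranking_loss + 0.1 * balancing_penalty(scores, groups)
print(total_loss)
```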

DQ-BART: Efficient Sequence-to-Sequence Model via Joint Distillation and Quantization

2 code implementations ACL 2022 Zheng Li, Zijian Wang, Ming Tan, Ramesh Nallapati, Parminder Bhatia, Andrew Arnold, Bing Xiang, Dan Roth

Empirical analyses show that, despite the challenging nature of generative tasks, we were able to achieve a 16.5x model footprint compression ratio with little performance drop relative to the full-precision counterparts on multiple summarization and QA datasets.

Knowledge Distillation • Model Compression +2
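
For a quick sense of the quantization half of the idea, the sketch below applies PyTorch's off-the-shelf post-training dynamic quantization to a public BART checkpoint. This is not the paper's joint distillation-and-quantization training (see the linked implementations for that); the checkpoint name and int8 setting are assumptions.

```python
import torch
from transformers import BartForConditionalGeneration

# facebook/bart-base is an assumption here; the paper works with its own
# distilled student checkpoints rather than this stock model.
model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")

# Off-the-shelf post-training dynamic quantization: Linear weights -> int8.
# This is NOT the paper's quantization-aware joint training, only a quick
# way to shrink the same model family for CPU inference.
quantized = torch.quantization.quantize_dynamic(
    model, {torch.nn.Linear}, dtype=torch.qint8
)

torch.save(quantized.state_dict(), "bart_base_int8.pt")
```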

QaNER: Prompting Question Answering Models for Few-shot Named Entity Recognition

1 code implementation 3 Mar 2022 Andy T. Liu, Wei Xiao, Henghui Zhu, Dejiao Zhang, Shang-Wen Li, Andrew Arnold

Recently, prompt-based learning for pre-trained language models has succeeded in few-shot Named Entity Recognition (NER) by exploiting prompts as task guidance to increase label efficiency.

Few-shot NER • Named Entity Recognition +2
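
As a minimal sketch of the underlying idea (one extractive-QA prompt per entity type, with the predicted answer span taken as the entity), the snippet below uses a stock SQuAD-tuned model; the prompt wording and model choice are assumptions, not the paper's.

```python
from transformers import pipeline

# A stock extractive-QA model (an assumption; QaNER fine-tunes its own).
qa = pipeline("question-answering", model="deepset/roberta-base-squad2")

sentence = "Andrew Arnold joined Amazon Web Services in Seattle."

# One question ("prompt") per entity type; the QA span becomes the entity.
prompts = {
    "PER": "Which person is mentioned in the text?",
    "ORG": "Which organization is mentioned in the text?",
    "LOC": "Which location is mentioned in the text?",
}

for label, question in prompts.items():
    answer = qa(question=question, context=sentence)
    print(label, answer["answer"], round(answer["score"], 3))
```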

Knowledge Enhanced Pretrained Language Models: A Comprehensive Survey

no code implementations 16 Oct 2021 Xiaokai Wei, Shen Wang, Dejiao Zhang, Parminder Bhatia, Andrew Arnold

This new paradigm has revolutionized the entire field of natural language processing and set new state-of-the-art performance on a wide variety of NLP tasks.

Uncertainty-Based Adaptive Learning for Reading Comprehension

no code implementations 1 Jan 2021 Jing Wang, Jie Shen, Xiaofei Ma, Andrew Arnold

Recent years have witnessed a surge of successful applications of machine reading comprehension.

Machine Reading Comprehension

Neural document expansion for ad-hoc information retrieval

no code implementations 27 Dec 2020 Cheng Tang, Andrew Arnold

Recently, Nogueira et al. [2019] proposed a new approach to document expansion based on a neural Seq2Seq model, showing significant improvements on short-text retrieval tasks.

Ad-Hoc Information Retrieval • Information Retrieval +2
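
The doc2query idea referenced here (use a Seq2Seq model to generate plausible queries and append them to the document before indexing) can be sketched as below; the checkpoint name and decoding settings are assumptions, not this paper's setup.

```python
from transformers import T5ForConditionalGeneration, T5Tokenizer

# Public doc2query checkpoint (an assumption; this paper evaluates its own
# neural expansion models, not necessarily this one).
name = "castorini/doc2query-t5-base-msmarco"
tokenizer = T5Tokenizer.from_pretrained(name)
model = T5ForConditionalGeneration.from_pretrained(name)

doc = ("The Manhattan Project was a research effort during World War II "
       "that produced the first nuclear weapons.")

inputs = tokenizer(doc, return_tensors="pt", truncation=True)
outputs = model.generate(**inputs, max_length=32, do_sample=True,
                         top_k=10, num_return_sequences=3)

# Append the generated queries to the document text before indexing it,
# so short documents get extra matching vocabulary.
expansions = [tokenizer.decode(o, skip_special_tokens=True) for o in outputs]
expanded_doc = doc + " " + " ".join(expansions)
print(expanded_doc)
```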
