Search Results for author: Jiangnan Xia

Found 7 papers, 1 paper with code

Bayes-enhanced Multi-view Attention Networks for Robust POI Recommendation

no code implementations · 1 Nov 2023 · Jiangnan Xia, Yu Yang, Senzhang Wang, Hongzhi Yin, Jiannong Cao, Philip S. Yu

To this end, we investigate a novel problem of robust POI recommendation by considering the uncertainty factors of user check-ins, and propose a Bayes-enhanced Multi-view Attention Network.

Data Augmentation · Representation Learning

What Matters in Training a GPT4-Style Language Model with Multimodal Inputs?

2 code implementations · 5 Jul 2023 · Yan Zeng, Hanbo Zhang, Jiani Zheng, Jiangnan Xia, Guoqiang Wei, Yang Wei, Yuchen Zhang, Tao Kong

However, the performance of these models heavily relies on design choices such as network structures, training data, and training strategies, and these choices have not been extensively discussed in the literature, making it difficult to quantify progress in this field.

Instruction Following · Language Modelling

Incorporating External Knowledge into Machine Reading for Generative Question Answering

no code implementations · IJCNLP 2019 · Bin Bi, Chen Wu, Ming Yan, Wei Wang, Jiangnan Xia, Chenliang Li

Different from existing work on knowledge-aware QA, we focus on a more challenging task of leveraging external knowledge to generate answers in natural language for a given question with context.

Answer Generation · Generative Question Answering +1

StructBERT: Incorporating Language Structures into Pre-training for Deep Language Understanding

no code implementations · ICLR 2020 · Wei Wang, Bin Bi, Ming Yan, Chen Wu, Zuyi Bao, Jiangnan Xia, Liwei Peng, Luo Si

Recently, the pre-trained language model, BERT (and its robustly optimized version RoBERTa), has attracted a lot of attention in natural language understanding (NLU), and achieved state-of-the-art accuracy in various NLU tasks, such as sentiment classification, natural language inference, semantic textual similarity and question answering.

Language Modelling · Linguistic Acceptability +7

Incorporating Relation Knowledge into Commonsense Reading Comprehension with Multi-task Learning

no code implementations · 13 Aug 2019 · Jiangnan Xia, Chen Wu, Ming Yan

This paper focuses on how to take advantage of external relational knowledge to improve machine reading comprehension (MRC) with multi-task learning.
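
Below is a minimal, hypothetical sketch of the general multi-task setup named in this excerpt: a shared encoder feeds a span-extraction head for MRC and an auxiliary relation-classification head, with the two losses combined by a weighting factor. The class name, layer sizes, auxiliary task, and loss weight are illustrative assumptions, not the paper's actual architecture.

```python
# Hypothetical multi-task MRC sketch (illustration only, not the paper's model).
import torch
import torch.nn as nn

class MultiTaskMRC(nn.Module):
    def __init__(self, hidden_size=256, vocab_size=30000, num_relations=40):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        # Shared encoder used by both tasks.
        self.encoder = nn.LSTM(hidden_size, hidden_size,
                               batch_first=True, bidirectional=True)
        self.span_head = nn.Linear(2 * hidden_size, 2)              # start/end logits per token
        self.relation_head = nn.Linear(2 * hidden_size, num_relations)  # auxiliary relation task

    def forward(self, token_ids):
        hidden, _ = self.encoder(self.embed(token_ids))             # (B, T, 2H)
        start_end = self.span_head(hidden)                          # (B, T, 2)
        relation_logits = self.relation_head(hidden.mean(dim=1))    # pooled, (B, R)
        return start_end, relation_logits

def multitask_loss(start_end, relation_logits,
                   start_gold, end_gold, relation_gold, aux_weight=0.5):
    ce = nn.CrossEntropyLoss()
    start_logits, end_logits = start_end.unbind(dim=-1)             # each (B, T)
    mrc_loss = ce(start_logits, start_gold) + ce(end_logits, end_gold)
    rel_loss = ce(relation_logits, relation_gold)
    # aux_weight trades off the auxiliary relation task against span extraction.
    return mrc_loss + aux_weight * rel_loss
```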

Language Modelling · Machine Reading Comprehension +2

A Deep Cascade Model for Multi-Document Reading Comprehension

no code implementations · 28 Nov 2018 · Ming Yan, Jiangnan Xia, Chen Wu, Bin Bi, Zhongzhou Zhao, Ji Zhang, Luo Si, Rui Wang, Wei Wang, Haiqing Chen

To address this problem, we develop a novel deep cascade learning model, which progressively evolves from the document-level and paragraph-level ranking of candidate texts to more precise answer extraction with machine reading comprehension.
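
A minimal sketch of the cascade idea described in this excerpt: coarse document-level ranking, then paragraph-level ranking, then precise answer extraction on the small surviving set. The function names, scoring callables, and cut-off parameters are illustrative placeholders under that reading, not the paper's models.

```python
# Hypothetical cascade pipeline sketch (illustration only, not the paper's model).
from typing import Callable, List, Tuple

def cascade_answer(
    question: str,
    documents: List[List[str]],                      # each document is a list of paragraphs
    score_document: Callable[[str, List[str]], float],
    score_paragraph: Callable[[str, str], float],
    extract_answer: Callable[[str, str], Tuple[str, float]],
    top_docs: int = 3,
    top_paras: int = 5,
) -> str:
    # Stage 1: coarse document-level ranking.
    ranked_docs = sorted(documents,
                         key=lambda d: score_document(question, d),
                         reverse=True)[:top_docs]

    # Stage 2: paragraph-level ranking within the retained documents.
    paragraphs = [p for d in ranked_docs for p in d]
    ranked_paras = sorted(paragraphs,
                          key=lambda p: score_paragraph(question, p),
                          reverse=True)[:top_paras]

    # Stage 3: precise answer extraction only on the surviving paragraphs.
    candidates = [extract_answer(question, p) for p in ranked_paras]
    best_answer, _ = max(candidates, key=lambda pair: pair[1])
    return best_answer
```

The point of the cascade is that the cheap ranking stages prune most candidates before the expensive reading-comprehension extractor ever runs.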

Machine Reading Comprehension · Question Answering +2

Jiangnan at SemEval-2018 Task 11: Deep Neural Network with Attention Method for Machine Comprehension Task

no code implementations · SEMEVAL 2018 · Jiangnan Xia

This paper describes our submission for the International Workshop on Semantic Evaluation (SemEval-2018) shared task 11: Machine Comprehension using Commonsense Knowledge (Ostermann et al., 2018b).

Machine Reading Comprehension · Named Entity Recognition (NER) +1
