Search Results for author: Xuejie Zhang

Found 19 papers, 8 papers with code

Knowledge Distillation with Reptile Meta-Learning for Pretrained Language Model Compression

1 code implementation COLING 2022 Xinge Ma, Jin Wang, Liang-Chih Yu, Xuejie Zhang

The teacher continuously meta-learns from the student’s learning objective, adjusting its own parameters to maximize the student’s performance throughout the distillation process.

Knowledge Distillation Language Modelling +3
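For orientation, the sketch below shows a generic Reptile-style teacher update driven by a distillation loss, written in PyTorch. It is only a minimal illustration under assumed names (distill_loss, batch, the inner and meta learning rates), not the paper's exact objective or training loop.

    import copy
    import torch

    def reptile_teacher_step(teacher, student, batch, distill_loss,
                             inner_steps=3, inner_lr=1e-3, meta_lr=0.1):
        """Meta-update the teacher toward weights that lower the distillation loss on this batch."""
        fast = copy.deepcopy(teacher)                        # inner-loop copy of the teacher
        opt = torch.optim.SGD(fast.parameters(), lr=inner_lr)
        with torch.no_grad():
            s_out = student(batch)                           # student predictions, held fixed in the inner loop
        for _ in range(inner_steps):                         # a few SGD steps on the student-facing objective
            loss = distill_loss(fast(batch), s_out)
            opt.zero_grad()
            loss.backward()
            opt.step()
        with torch.no_grad():                                # Reptile: interpolate toward the adapted copy
            for p, q in zip(teacher.parameters(), fast.parameters()):
                p.add_(meta_lr * (q - p))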

YNU-HPCC at SemEval-2022 Task 6: Transformer-based Model for Intended Sarcasm Detection in English and Arabic

no code implementations SemEval (NAACL) 2022 Guangmin Zheng, Jin Wang, Xuejie Zhang

As participants in Task 6 (titled “iSarcasmEval: Intended Sarcasm Detection In English and Arabic”), we implement the sentiment system for all three subtasks in English and Arabic.

Binary Classification Classification +3

Accelerating Inference for Pretrained Language Models by Unified Multi-Perspective Early Exiting

no code implementations COLING 2022 Jun Kong, Jin Wang, Liang-Chih Yu, Xuejie Zhang

To address this limitation, a unified horizontal and vertical multi-perspective early exiting (MPEE) framework is proposed in this study to accelerate the inference of transformer-based models.
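As a rough illustration of the depth-wise side of early exiting, the sketch below runs transformer layers one by one and returns as soon as an attached exit classifier is confident enough. It assumes PyTorch and one exit head per layer, and is a generic confidence-based exit rather than the full MPEE framework.

    import torch

    def early_exit_forward(layers, exit_heads, hidden, threshold=0.9):
        """Run layers sequentially; stop once an exit head is confident for the whole batch."""
        for layer, head in zip(layers, exit_heads):
            hidden = layer(hidden)
            probs = torch.softmax(head(hidden[:, 0]), dim=-1)   # classify from the first ([CLS]) position
            conf, pred = probs.max(dim=-1)
            if conf.min() >= threshold:                         # every example is confident: exit early
                return pred
        return pred                                             # otherwise fall through to the last layer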

YNU-HPCC at SemEval-2022 Task 4: Finetuning Pretrained Language Models for Patronizing and Condescending Language Detection

no code implementations SemEval (NAACL) 2022 Wenqiang Bai, Jin Wang, Xuejie Zhang

For the multi-label classification, we use the fine-tuned BERT model to extract the sentiment score of the text and a fully connected layer to classify the text into the PCL categories.

Classification Multi-Label Classification +2
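A minimal sketch of that setup, a fine-tuned BERT encoder followed by a fully connected layer for multi-label PCL classification, might look as follows; it assumes the Hugging Face transformers library, and the model name and number of categories are illustrative.

    from torch import nn
    from transformers import AutoModel

    class PCLClassifier(nn.Module):
        def __init__(self, name="bert-base-uncased", num_labels=7):
            super().__init__()
            self.encoder = AutoModel.from_pretrained(name)
            self.classifier = nn.Linear(self.encoder.config.hidden_size, num_labels)

        def forward(self, input_ids, attention_mask):
            out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
            cls = out.last_hidden_state[:, 0]      # [CLS] representation of the text
            return self.classifier(cls)            # one logit per PCL category; train with nn.BCEWithLogitsLoss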

YNU-HPCC at SemEval-2022 Task 8: Transformer-based Ensemble Model for Multilingual News Article Similarity

no code implementations SemEval (NAACL) 2022 Zihan Nai, Jin Wang, Xuejie Zhang

This paper describes the system submitted by our team (YNU-HPCC) to SemEval-2022 Task 8: Multilingual news article similarity.

Personalized LoRA for Human-Centered Text Understanding

1 code implementation 10 Mar 2024 You Zhang, Jin Wang, Liang-Chih Yu, Dan Xu, Xuejie Zhang

Effectively and efficiently adapting a pre-trained language model (PLM) for human-centered text understanding (HCTU) is challenging, since user tokens number in the millions in most personalized applications and carry no concrete explicit semantics.

Language Modelling Zero-Shot Learning
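To make the setting concrete, the sketch below attaches a per-user low-rank (LoRA-style) update to a frozen linear layer, indexed by a user id. It is only a PyTorch illustration of the general idea, using a naive per-user embedding table; the paper's actual PLoRA design and its handling of million-level users differ.

    import torch
    from torch import nn

    class UserLoRALinear(nn.Module):
        def __init__(self, base: nn.Linear, num_users, rank=8, alpha=16.0):
            super().__init__()
            self.base = base
            for p in self.base.parameters():                  # keep the pretrained weight frozen
                p.requires_grad = False
            d_in, d_out = base.in_features, base.out_features
            self.A = nn.Embedding(num_users, rank * d_in)     # user-specific down-projection factors
            self.B = nn.Embedding(num_users, d_out * rank)    # user-specific up-projection factors
            nn.init.zeros_(self.B.weight)                     # start as a no-op, as in standard LoRA
            self.scale = alpha / rank
            self.rank, self.d_in, self.d_out = rank, d_in, d_out

        def forward(self, x, user_id):                        # x: (batch, seq, d_in); user_id: (batch,)
            A = self.A(user_id).view(-1, self.rank, self.d_in)
            B = self.B(user_id).view(-1, self.d_out, self.rank)
            delta = torch.einsum("bsi,bri->bsr", x, A)        # project down with the user's A
            delta = torch.einsum("bsr,bor->bso", delta, B)    # project up with the user's B
            return self.base(x) + self.scale * delta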

Learning to Memorize Entailment and Discourse Relations for Persona-Consistent Dialogues

1 code implementation 12 Jan 2023 Ruijun Chen, Jin Wang, Liang-Chih Yu, Xuejie Zhang

Both memories collaborate to obtain entailment and discourse representation for the generation, allowing a deeper understanding of both consistency and coherence.

Ranked #1 on Dialogue Generation on Persona-Chat (using extra training data)

Dialogue Generation

YNU-HPCC at SemEval-2021 Task 6: Combining ALBERT and Text-CNN for Persuasion Detection in Texts and Images

no code implementations SEMEVAL 2021 Xingyu Zhu, Jin Wang, Xuejie Zhang

In recent years, memes combining image and text have been widely used on social media and are among the most popular types of content in online disinformation campaigns.

Meme Classification text-classification +1

YNU-HPCC at SemEval-2021 Task 10: Using a Transformer-based Source-Free Domain Adaptation Model for Semantic Processing

no code implementations SEMEVAL 2021 Zhewen Yu, Jin Wang, Xuejie Zhang

To address the issue, the organizers provided participants with pre-trained models fine-tuned on a large amount of source-domain data, together with the dev data.

NER Source-Free Domain Adaptation

HPCC-YNU at SemEval-2020 Task 9: A Bilingual Vector Gating Mechanism for Sentiment Analysis of Code-Mixed Text

1 code implementation SEMEVAL 2020 Jun Kong, Jin Wang, Xuejie Zhang

In this paper, we (CodaLab username: kongjun) present a system that uses a bilingual vector gating mechanism over bilingual resources to complete the task.

Sentiment Analysis
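The gating idea can be pictured as below: an element-wise gate decides, per dimension, how much to take from each language-specific embedding. This is a minimal PyTorch sketch with assumed names and dimensions, not the exact architecture of the submitted system.

    import torch
    from torch import nn

    class BilingualVectorGate(nn.Module):
        def __init__(self, dim):
            super().__init__()
            self.gate = nn.Linear(2 * dim, dim)

        def forward(self, emb_lang1, emb_lang2):
            g = torch.sigmoid(self.gate(torch.cat([emb_lang1, emb_lang2], dim=-1)))  # element-wise gate in [0, 1]
            return g * emb_lang1 + (1 - g) * emb_lang2        # per-dimension mix of the two language views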
