no code implementations • ROCLING 2022 • Xiang Luo, Jin Wang, Xuejie Zhang
This paper adopts a transformer-based model with focal loss and regularization dropout.
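For reference, a minimal PyTorch sketch of multi-class focal loss follows; the gamma and alpha values are assumptions, not the paper's reported settings.

```python
import torch
import torch.nn.functional as F

def focal_loss(logits, targets, gamma=2.0, alpha=1.0):
    """Multi-class focal loss: down-weights well-classified examples.

    logits:  (batch, num_classes) raw scores
    targets: (batch,) integer class labels
    gamma and alpha are assumed hyperparameters, not the paper's values.
    """
    ce = F.cross_entropy(logits, targets, reduction="none")  # per-sample -log p_t
    p_t = torch.exp(-ce)                                     # probability of the true class
    return (alpha * (1.0 - p_t) ** gamma * ce).mean()

# usage
logits = torch.randn(4, 3)
targets = torch.tensor([0, 2, 1, 2])
print(focal_loss(logits, targets))
```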
1 code implementation • COLING 2022 • Xinge Ma, Jin Wang, Liang-Chih Yu, Xuejie Zhang
The teacher can continuously meta-learn the student’s learning objective to adjust its parameters for maximizing the student’s performance throughout the distillation process.
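As context for the distillation step the teacher is optimizing, a sketch of a standard knowledge-distillation objective is shown below; the temperature and mixing weight are assumptions, and the paper's meta-learned teacher update (the outer loop) is not reproduced here.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Inner-loop KD objective: soft teacher targets plus hard gold labels."""
    # Soft-label term: KL between temperature-scaled distributions.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-label term: ordinary cross-entropy on gold labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```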
no code implementations • COLING 2022 • Jun Kong, Jin Wang, Liang-Chih Yu, Xuejie Zhang
To address this limitation, a unified horizontal and vertical multi-perspective early exiting (MPEE) framework is proposed in this study to accelerate the inference of transformer-based models.
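A simplified sketch of the horizontal (depth-wise) direction of early exiting is given below: each layer gets a small classifier and inference stops once the prediction entropy falls below a threshold. The threshold and classifier placement are assumptions; the MPEE framework itself also exits along the vertical dimension, which is not shown.

```python
import torch
import torch.nn as nn

class EarlyExitEncoder(nn.Module):
    """Entropy-based early exiting over a stack of transformer layers (sketch)."""
    def __init__(self, layers, hidden_size, num_classes, threshold=0.3):
        super().__init__()
        self.layers = nn.ModuleList(layers)     # transformer encoder layers
        self.exits = nn.ModuleList(
            [nn.Linear(hidden_size, num_classes) for _ in layers]
        )
        self.threshold = threshold              # assumed entropy threshold

    def forward(self, hidden):                  # hidden: (batch=1, seq, hidden)
        for layer, exit_head in zip(self.layers, self.exits):
            hidden = layer(hidden)
            logits = exit_head(hidden[:, 0])    # classify from the first (CLS) position
            probs = torch.softmax(logits, dim=-1)
            entropy = -(probs * probs.clamp_min(1e-9).log()).sum(-1)
            if entropy.item() < self.threshold: # confident enough: exit early
                return logits
        return logits

# usage with standard PyTorch encoder layers
layers = [nn.TransformerEncoderLayer(d_model=768, nhead=12, batch_first=True) for _ in range(12)]
model = EarlyExitEncoder(layers, hidden_size=768, num_classes=2)
out = model(torch.randn(1, 16, 768))
```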
no code implementations • SemEval (NAACL) 2022 • Zihan Nai, Jin Wang, Xuejie Zhang
This paper describes the system submitted by our team (YNU-HPCC) to SemEval-2022 Task 8: Multilingual news article similarity.
no code implementations • SemEval (NAACL) 2022 • Guangmin Zheng, Jin Wang, Xuejie Zhang
As participants in Task 6 (titled “iSarcasmEval: Intended Sarcasm Detection In English and Arabic”), we implement the sentiment system for all three subtasks in English and Arabic.
no code implementations • SemEval (NAACL) 2022 • Chao Han, Jin Wang, Xuejie Zhang
We use Faster R-CNN to extract visual representations and utilize LXMERT's cross-attention for multi-modal alignment.
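A minimal sketch of feeding pre-extracted region features into LXMERT via the Hugging Face interface is shown below; random tensors stand in for the Faster R-CNN outputs, and the checkpoint name and feature dimensions follow the common unc-nlp/lxmert-base-uncased setup rather than the paper's exact configuration.

```python
import torch
from transformers import LxmertModel, LxmertTokenizer

tokenizer = LxmertTokenizer.from_pretrained("unc-nlp/lxmert-base-uncased")
model = LxmertModel.from_pretrained("unc-nlp/lxmert-base-uncased")

inputs = tokenizer("a meme caption to classify", return_tensors="pt")
visual_feats = torch.randn(1, 36, 2048)   # placeholder for 36 Faster R-CNN region features
visual_pos = torch.rand(1, 36, 4)         # normalized bounding-box coordinates

outputs = model(**inputs, visual_feats=visual_feats, visual_pos=visual_pos)
fused = outputs.pooled_output             # cross-attended multimodal representation
```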
no code implementations • SemEval (NAACL) 2022 • Wenqiang Bai, Jin Wang, Xuejie Zhang
For the multi-label classification, we use the fine-tuned BERT model to extract the sentiment score of the text and a fully connected layer to classify the text into the PCL categories.
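A sketch of the described setup, a fine-tuned BERT encoder with a fully connected multi-label head trained with a per-category sigmoid, is given below; the checkpoint name and the number of PCL categories are assumptions.

```python
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class MultiLabelBert(nn.Module):
    """BERT encoder with a fully connected multi-label classification head (sketch)."""
    def __init__(self, num_labels=7, model_name="bert-base-uncased"):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        self.classifier = nn.Linear(self.encoder.config.hidden_size, num_labels)

    def forward(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        return self.classifier(out.last_hidden_state[:, 0])   # [CLS] representation

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = MultiLabelBert()
batch = tokenizer(["an example sentence"], return_tensors="pt", padding=True)
logits = model(batch["input_ids"], batch["attention_mask"])
# Multi-label training uses a sigmoid per category rather than a softmax.
loss = nn.BCEWithLogitsLoss()(logits, torch.zeros_like(logits))
```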
no code implementations • SemEval (NAACL) 2022 • Kuanghong Liu, Jin Wang, Xuejie Zhang
However, for subtask A of idiomaticity detection, we only performed a few exploratory experiments based on the XLM-RoBERTa model.
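A minimal fine-tuning setup for binary idiomaticity detection with XLM-RoBERTa is sketched below; the checkpoint and label count are assumptions.

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModelForSequenceClassification.from_pretrained("xlm-roberta-base", num_labels=2)

enc = tokenizer("He kicked the bucket last night.", return_tensors="pt")
logits = model(**enc).logits            # scores for idiomatic vs. literal (assumed labels)
```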
1 code implementation • 12 Jan 2023 • Ruijun Chen, Jin Wang, Liang-Chih Yu, Xuejie Zhang
Both memories collaborate to obtain entailment and discourse representation for the generation, allowing a deeper understanding of both consistency and coherence.
Ranked #1 on Dialogue Generation on Persona-Chat
no code implementations • SEMEVAL 2021 • Zhewen Yu, Jin Wang, Xuejie Zhang
To address this issue, the organizers provided participants with pre-trained models fine-tuned on a large amount of source-domain data, together with the dev data.
no code implementations • SEMEVAL 2021 • Xingyu Zhu, Jin Wang, Xuejie Zhang
In recent years, memes combining image and text have been widely used in social media, and memes are one of the most popular types of content used in online disinformation campaigns.
1 code implementation • SEMEVAL 2021 • Ruijun Chen, Jin Wang, Xuejie Zhang
In this paper, a transformer-based model with auxiliary information is proposed for SemEval-2021 Task 5.
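Since Task 5 (Toxic Spans Detection) is a span-labelling problem, a plain token-classification sketch with a transformer encoder is shown below; it omits the paper's auxiliary information, and the checkpoint name and labelling scheme are assumptions.

```python
from transformers import AutoModelForTokenClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
model = AutoModelForTokenClassification.from_pretrained("bert-base-cased", num_labels=2)

enc = tokenizer("you are an utter idiot", return_tensors="pt")
pred = model(**enc).logits.argmax(-1)   # 1 = token inside a toxic span (assumed scheme)
```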
no code implementations • SEMEVAL 2021 • Xinge Ma, Jin Wang, Xuejie Zhang
This paper describes the system we built as the YNU-HPCC team in the SemEval-2021 Task 11: NLPContributionGraph.
1 code implementation • Asian Chapter of the Association for Computational Linguistics 2020 • Li Yuan, Jin Wang, Liang-Chih Yu, Xuejie Zhang
Recent studies used attention-based methods that can effectively improve the performance of aspect-level sentiment analysis.
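A generic sketch of aspect-conditioned attention, additive attention over context words guided by the aspect representation, is given below; it illustrates the family of methods referenced here, not the specific architecture proposed in the paper.

```python
import torch
import torch.nn as nn

class AspectAttention(nn.Module):
    """Additive attention over context words conditioned on the aspect term (sketch)."""
    def __init__(self, hidden):
        super().__init__()
        self.proj = nn.Linear(hidden * 2, hidden)
        self.score = nn.Linear(hidden, 1, bias=False)

    def forward(self, context, aspect):        # context: (B, T, H), aspect: (B, H)
        aspect_exp = aspect.unsqueeze(1).expand_as(context)
        energy = self.score(torch.tanh(self.proj(torch.cat([context, aspect_exp], dim=-1))))
        weights = torch.softmax(energy, dim=1) # (B, T, 1) attention over positions
        return (weights * context).sum(dim=1)  # aspect-aware sentence vector

attn = AspectAttention(hidden=256)
vec = attn(torch.randn(2, 10, 256), torch.randn(2, 256))
```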
no code implementations • SEMEVAL 2020 • Dawei Liao, Jin Wang, Xuejie Zhang
In this study, we propose a multi-granularity ordinal classification method to address the problem of emphasis selection.
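One common way to pose ordinal classification is to predict K-1 cumulative binary decisions ("label > k"), as sketched below; this is a generic formulation, not the paper's multi-granularity method, and the label counts are assumptions.

```python
import torch
import torch.nn as nn

def ordinal_targets(labels, num_classes):
    """labels: (batch,) ints in [0, num_classes) -> (batch, num_classes-1) cumulative targets."""
    thresholds = torch.arange(num_classes - 1)
    return (labels.unsqueeze(1) > thresholds.unsqueeze(0)).float()

logits = torch.randn(4, 3)                         # model outputs for 4 examples, 4 ordinal levels
targets = ordinal_targets(torch.tensor([0, 1, 3, 2]), num_classes=4)
loss = nn.BCEWithLogitsLoss()(logits, targets)     # one binary loss per threshold
```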
no code implementations • SEMEVAL 2020 • Joseph Tomasulo, Jin Wang, Xuejie Zhang
This paper describes an ensemble model designed for SemEval-2020 Task 7.
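A minimal sketch of prediction averaging across ensemble members is shown below; the member models (assumed to be Hugging Face sequence-classification models) and the weighting scheme are assumptions, not the paper's setup.

```python
import torch

def ensemble_predict(models, inputs):
    """Average the logits of several fine-tuned models (uniform weighting)."""
    with torch.no_grad():
        preds = [m(**inputs).logits for m in models]
    return torch.stack(preds).mean(dim=0)
```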
1 code implementation • SEMEVAL 2020 • Jun Kong, Jin Wang, Xuejie Zhang
In this paper, we (CodaLab username: kongjun) present a system that uses a bilingual vector gating mechanism for bilingual resources to complete the task.
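A sketch of a generic vector gating mechanism that mixes embeddings from two languages element-wise is given below; the exact bilingual formulation in the paper may differ, and the embedding dimension is an assumption.

```python
import torch
import torch.nn as nn

class VectorGate(nn.Module):
    """Element-wise gate that mixes embeddings from two languages (sketch)."""
    def __init__(self, dim):
        super().__init__()
        self.gate = nn.Linear(dim * 2, dim)

    def forward(self, emb_a, emb_b):           # (B, T, D) each
        g = torch.sigmoid(self.gate(torch.cat([emb_a, emb_b], dim=-1)))
        return g * emb_a + (1.0 - g) * emb_b   # gated bilingual representation

gate = VectorGate(dim=300)
mixed = gate(torch.randn(2, 8, 300), torch.randn(2, 8, 300))
```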