Search Results for author: Peiqin Lin

Found 8 papers, 5 papers with code

OFA: A Framework of Initializing Unseen Subword Embeddings for Efficient Large-scale Multilingual Continued Pretraining

1 code implementation • 15 Nov 2023 • Yihong Liu, Peiqin Lin, Mingyang Wang, Hinrich Schütze

Instead of pretraining multilingual language models from scratch, a more efficient method is to adapt existing pretrained language models (PLMs) to new languages via vocabulary extension and continued pretraining.

Language Modelling • Multilingual Word Embeddings
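The vocabulary-extension step the abstract mentions can be illustrated with the Hugging Face transformers API. The sketch below is a generic baseline, not OFA's method: it adds hypothetical new subwords and resizes the embedding matrix, which leaves the new rows randomly initialized, whereas OFA's contribution is precisely a better initialization of those unseen subword embeddings. The model name and token strings are placeholders.

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM

# Load an existing multilingual PLM (XLM-R used here purely for illustration).
tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModelForMaskedLM.from_pretrained("xlm-roberta-base")

# Hypothetical new-language subwords; in practice these would come from
# training a tokenizer on target-language text.
new_subwords = ["▁newlang_tok_a", "▁newlang_tok_b"]
tokenizer.add_tokens(new_subwords)

# Grow the embedding matrix to cover the extended vocabulary. The added
# rows are randomly initialized by default; OFA replaces this step with
# an informed initialization from related seen subwords.
model.resize_token_embeddings(len(tokenizer))

# Continued pretraining (e.g., masked language modeling) on
# target-language data would follow from here.
```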

mPLM-Sim: Better Cross-Lingual Similarity and Transfer in Multilingual Pretrained Language Models

1 code implementation • 23 May 2023 • Peiqin Lin, Chengzhi Hu, Zheyu Zhang, André F. T. Martins, Hinrich Schütze

Recent multilingual pretrained language models (mPLMs) have been shown to encode strong language-specific signals, which are not explicitly provided during pretraining.

Open-Ended Question Answering • Zero-Shot Cross-Lingual Transfer
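As a rough illustration of extracting a cross-lingual similarity signal from an mPLM (not mPLM-Sim's actual procedure, which derives language-level similarity from the model's internal representations), the sketch below mean-pools the hidden states of two parallel sentences and compares them with cosine similarity. The model name and sentences are illustrative.

```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModel.from_pretrained("xlm-roberta-base")
model.eval()

def mean_pooled(text: str) -> torch.Tensor:
    """Mean-pool the last hidden states over non-padding tokens."""
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state   # (1, seq_len, dim)
    mask = inputs["attention_mask"].unsqueeze(-1)    # (1, seq_len, 1)
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)

# Parallel sentences in two languages (illustrative).
en = mean_pooled("The weather is nice today.")
de = mean_pooled("Das Wetter ist heute schön.")
similarity = torch.nn.functional.cosine_similarity(en, de).item()
print(f"cross-lingual similarity: {similarity:.3f}")
```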

Modeling Content-Emotion Duality via Disentanglement for Empathetic Conversation

1 code implementation • 26 Sep 2022 • Peiqin Lin, Jiashuo Wang, Hinrich Schütze, Wenjie Li

To solve the task, it is essential to model the content-emotion duality of a dialogue, which is composed of the content view (i.e., what personal experiences are described) and the emotion view (i.e., the speaker's feelings about these experiences).

Disentanglement • Empathetic Response Generation • +1
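To give a schematic sense of what disentangling content and emotion views could look like, the toy module below projects a pooled dialogue representation into two separate subspaces and attaches an emotion classifier to one of them, encouraging that subspace to carry the affective signal. This is a hypothetical sketch for intuition only, not the architecture proposed in the paper; all names and dimensions are made up.

```python
import torch
import torch.nn as nn

class DualViewEncoder(nn.Module):
    """Toy encoder that splits a dialogue representation into separate
    content and emotion subspaces via two projection heads."""

    def __init__(self, hidden_dim: int = 768, view_dim: int = 256,
                 num_emotions: int = 32):
        super().__init__()
        self.content_head = nn.Linear(hidden_dim, view_dim)
        self.emotion_head = nn.Linear(hidden_dim, view_dim)
        # Classifying emotions from the emotion view pushes the affective
        # signal into that subspace and away from the content view.
        self.emotion_clf = nn.Linear(view_dim, num_emotions)

    def forward(self, dialogue_repr: torch.Tensor):
        content = self.content_head(dialogue_repr)
        emotion = self.emotion_head(dialogue_repr)
        emotion_logits = self.emotion_clf(emotion)
        return content, emotion, emotion_logits

# Usage with a dummy batch of pooled dialogue representations.
encoder = DualViewEncoder()
repr_batch = torch.randn(4, 768)
content, emotion, logits = encoder(repr_batch)
```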
