no code implementations • 24 Jan 2024 • Peiqin Lin, Shaoxiong Ji, Jörg Tiedemann, André F. T. Martins, Hinrich Schütze
Large language models have advanced the state of the art in natural language processing.
1 code implementation • arXiv 2023 • Stephen Mayhew, Terra Blevins, Shuheng Liu, Marek Šuppa, Hila Gonen, Joseph Marvin Imperial, Börje F. Karlsson, Peiqin Lin, Nikola Ljubešić, LJ Miranda, Barbara Plank, Arij Riabi, Yuval Pinter
We introduce Universal NER (UNER), an open, community-driven project to develop gold-standard NER benchmarks in many languages.
Ranked #1 on Named Entity Recognition (NER) on UNER v1 (Danish)
1 code implementation • 15 Nov 2023 • Yihong Liu, Peiqin Lin, Mingyang Wang, Hinrich Schütze
Instead of pretraining multilingual language models from scratch, a more efficient method is to adapt existing pretrained language models (PLMs) to new languages via vocabulary extension and continued pretraining.
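The vocabulary-extension step mentioned above can be sketched in a few lines. This is a minimal illustration, not the paper's exact method: it grows an embedding matrix by a few rows and initializes each new row near the mean of the existing embeddings, a common heuristic when adding new-language tokens to a pretrained model. The function name and the mean-initialization choice are illustrative assumptions.

```python
import numpy as np

def extend_embeddings(emb, n_new, rng=None):
    """Extend a (vocab_size, dim) embedding matrix with n_new rows.

    New rows are set to the mean of the existing embeddings plus small
    noise -- a common initialization heuristic for vocabulary extension
    (illustrative sketch, not the method from the paper above).
    """
    rng = rng or np.random.default_rng(0)
    mean = emb.mean(axis=0)
    noise = 0.01 * rng.standard_normal((n_new, emb.shape[1]))
    # Original rows are kept unchanged; only new rows are appended.
    return np.vstack([emb, mean + noise])

old = np.random.default_rng(1).standard_normal((100, 16))
new = extend_embeddings(old, n_new=8)
print(new.shape)  # (108, 16)
```

In practice the same idea applies to a real pretrained model: enlarge the tokenizer's vocabulary, resize the model's input embedding matrix accordingly, and then continue pretraining on data in the new languages.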
1 code implementation • 23 May 2023 • Peiqin Lin, Chengzhi Hu, Zheyu Zhang, André F. T. Martins, Hinrich Schütze
Recent multilingual pretrained language models (mPLMs) have been shown to encode strong language-specific signals, which are not explicitly provided during pretraining.
Open-Ended Question Answering • Zero-Shot Cross-Lingual Transfer
1 code implementation • 20 May 2023 • Ayyoob Imani, Peiqin Lin, Amir Hossein Kargaran, Silvia Severini, Masoud Jalili Sabet, Nora Kassner, Chunlan Ma, Helmut Schmid, André F. T. Martins, François Yvon, Hinrich Schütze
The NLP community has mainly focused on scaling Large Language Models (LLMs) vertically, i.e., making them better for about 100 languages.
1 code implementation • 26 Sep 2022 • Peiqin Lin, Jiashuo Wang, Hinrich Schütze, Wenjie Li
To solve the task, it is essential to model the content-emotion duality of a dialogue, which is composed of the content view (i.e., what personal experiences are described) and the emotion view (i.e., the speaker's feelings about these experiences).
no code implementations • 9 Oct 2021 • Jiashuo Wang, Wenjie Li, Peiqin Lin, Feiteng Mu
Empathetic response generation aims to comprehend the user's emotion and then respond to it appropriately.
no code implementations • Findings of the Association for Computational Linguistics 2020 • Peiqin Lin, Meng Yang
However, the relation between the target extraction task and the target classification task has not been well exploited.