no code implementations • 7 Oct 2024 • Fenia Christopoulou, Ronald Cardenas, Gerasimos Lampouras, Haitham Bou-Ammar, Jun Wang
Preference Optimization (PO) has proven to be an effective step for aligning language models with human-desired behaviors.
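As a rough illustration of what a preference-optimization objective looks like (this is a generic DPO-style loss sketch, not necessarily the method studied in the paper), one can score a chosen/rejected response pair by how much more the policy prefers the chosen response than a frozen reference model does:

```python
import math

def preference_loss(policy_chosen, policy_rejected,
                    ref_chosen, ref_rejected, beta=0.1):
    """DPO-style loss on one preference pair (illustrative sketch).

    Inputs are summed log-probabilities of the chosen / rejected
    responses under the policy and a frozen reference model.
    """
    # Implicit reward margin: how much more the policy prefers
    # the chosen response than the reference model does.
    margin = (policy_chosen - ref_chosen) - (policy_rejected - ref_rejected)
    # Negative log-sigmoid rewards a large positive margin.
    return -math.log(1.0 / (1.0 + math.exp(-beta * margin)))

# A pair the policy already ranks correctly incurs a small loss...
low = preference_loss(-2.0, -5.0, -3.0, -3.0)
# ...while a mis-ranked pair incurs a larger one.
high = preference_loss(-5.0, -2.0, -3.0, -3.0)
assert low < high
```

In practice the log-probabilities come from the language model itself and the loss is averaged over a batch of preference pairs; the scalar version above only shows the shape of the objective.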
1 code implementation • 12 Jul 2024 • Zafeirios Fountas, Martin A Benfeghoul, Adnan Oomerjee, Fenia Christopoulou, Gerasimos Lampouras, Haitham Bou-Ammar, Jun Wang
Large language models (LLMs) have shown remarkable capabilities, but still struggle with processing extensive contexts, limiting their ability to maintain coherence and accuracy over long sequences.
no code implementations • 8 Feb 2024 • Fenia Christopoulou, Guchun Zhang, Gerasimos Lampouras
Large pre-trained language models have recently been expanded and applied to programming language tasks with great success, often by further pre-training a model trained strictly on natural language, where training sequences typically contain both natural and (linearised) programming language.
no code implementations • 22 Oct 2022 • Chenxi Whitehouse, Fenia Christopoulou, Ignacio Iacobacci
We use Wikidata and English Wikipedia to construct an entity-centric CS corpus by switching entities to their counterparts in other languages.
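The entity-switching idea can be sketched in a few lines. The mini entity table below is invented for illustration; the paper builds its mapping from Wikidata and replaces entity mentions in English Wikipedia text:

```python
# Toy stand-in for a Wikidata-derived entity table (invented example data).
ENTITY_TABLE = {
    "Athens": {"el": "Αθήνα", "de": "Athen"},
    "Germany": {"el": "Γερμανία", "de": "Deutschland"},
}

def code_switch(sentence, target_lang):
    """Swap each known English entity for its target-language counterpart."""
    for english, translations in ENTITY_TABLE.items():
        if english in sentence and target_lang in translations:
            sentence = sentence.replace(english, translations[target_lang])
    return sentence

switched = code_switch("Athens is far from Germany .", "de")
# → "Athen is far from Deutschland ."
```

The non-entity words stay in English, producing the code-switched (CS) sentences used for pre-training.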
1 code implementation • 22 Oct 2022 • Fenia Christopoulou, Gerasimos Lampouras, Ignacio Iacobacci
Curriculum Learning (CL) is a technique for training models by ranking examples in order of (typically) increasing difficulty, with the aim of accelerating convergence and improving generalisability.
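A minimal sketch of the idea: score each example with some difficulty function, then present minibatches from easiest to hardest. Using sentence length as the scorer is an assumption for illustration; actual curricula use a variety of difficulty metrics:

```python
def curriculum_batches(examples, difficulty, batch_size=2):
    """Yield minibatches ordered from easiest to hardest.

    `difficulty` is any scoring function, e.g. sentence length
    (used here purely as an illustrative heuristic).
    """
    ordered = sorted(examples, key=difficulty)
    for i in range(0, len(ordered), batch_size):
        yield ordered[i:i + batch_size]

data = ["a short one", "tiny", "a much longer training sentence here"]
batches = list(curriculum_batches(data, difficulty=len, batch_size=2))
# The easiest (shortest) examples are seen first.
assert batches[0] == ["tiny", "a short one"]
```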
Natural Language Understanding • Zero-Shot Cross-Lingual Transfer
1 code implementation • 22 Jul 2022 • Fenia Christopoulou, Gerasimos Lampouras, Milan Gritta, Guchun Zhang, Yinpeng Guo, Zhongqi Li, Qi Zhang, Meng Xiao, Bo Shen, Lin Li, Hao Yu, Li Yan, Pingyi Zhou, Xin Wang, Yuchi Ma, Ignacio Iacobacci, Yasheng Wang, Guangtai Liang, Jiansheng Wei, Xin Jiang, Qianxiang Wang, Qun Liu
We present PanGu-Coder, a pretrained decoder-only language model adopting the PanGu-Alpha architecture for text-to-code generation, i.e. the synthesis of programming language solutions given a natural language problem description.
no code implementations • NAACL 2021 • Fenia Christopoulou, Makoto Miwa, Sophia Ananiadou
We propose a multi-task, probabilistic approach to facilitate distantly supervised relation extraction by bringing closer the representations of sentences that contain the same Knowledge Base pairs.
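The core intuition of pulling together sentences that share a Knowledge Base pair can be illustrated with a simple centroid-distance penalty. This is a purely illustrative stand-in, not the paper's probabilistic objective:

```python
def bag_pull_loss(sentence_vecs):
    """Mean squared distance of each sentence vector in a bag
    (sentences sharing the same KB entity pair) to the bag centroid.
    Minimising this pulls the bag's representations together.
    """
    dim = len(sentence_vecs[0])
    centroid = [sum(v[d] for v in sentence_vecs) / len(sentence_vecs)
                for d in range(dim)]
    return sum(sum((v[d] - centroid[d]) ** 2 for d in range(dim))
               for v in sentence_vecs) / len(sentence_vecs)

# A bag whose sentences already agree incurs a smaller penalty.
tight = bag_pull_loss([[1.0, 0.0], [1.1, 0.1]])
spread = bag_pull_loss([[1.0, 0.0], [-1.0, 2.0]])
assert tight < spread
```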
Ranked #4 on Relation Extraction on NYT Corpus
1 code implementation • IJCNLP 2019 • Fenia Christopoulou, Makoto Miwa, Sophia Ananiadou
We thus propose an edge-oriented graph neural model for document-level relation extraction.
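The edge-oriented idea can be sketched as follows: instead of updating node representations, each inference step refreshes the representation of an edge (a pair of mentions) using two-hop paths through intermediate nodes. Scalar edge scores and a max/product combiner below are simplifying assumptions; the model learns vector-valued edge representations:

```python
def refine_edges(edge, nodes):
    """One edge-oriented inference step: refresh each pair's edge
    score using two-hop paths through intermediate nodes.
    (Scalar scores and max/product are illustrative simplifications.)
    """
    new_edge = {}
    for i in nodes:
        for j in nodes:
            if i == j:
                continue
            direct = edge.get((i, j), 0.0)
            via = max((edge.get((i, k), 0.0) * edge.get((k, j), 0.0)
                       for k in nodes if k not in (i, j)), default=0.0)
            new_edge[(i, j)] = max(direct, via)
    return new_edge

# Mentions i-k and k-j are linked, so refinement connects i-j as well,
# letting evidence flow across sentences at the document level.
edges = {("i", "k"): 0.9, ("k", "j"): 0.8}
refined = refine_edges(edges, ["i", "j", "k"])
assert refined[("i", "j")] == 0.9 * 0.8
```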
no code implementations • ACL 2019 • Sunil Kumar Sahu, Fenia Christopoulou, Makoto Miwa, Sophia Ananiadou
Inter-sentence relation extraction deals with a number of complex semantic relationships in documents, which require local, non-local, syntactic and semantic dependencies.
1 code implementation • ACL 2018 • Fenia Christopoulou, Makoto Miwa, Sophia Ananiadou
We present a novel graph-based neural network model for relation extraction.
Ranked #1 on Relation Extraction on ACE 2005 (Cross Sentence metric)
no code implementations • SEMEVAL 2016 • Elisavet Palogiannidi, Athanasia Kolovou, Fenia Christopoulou, Filippos Kokkinos, Elias Iosif, Nikolaos Malandrakis, Haris Papageorgiou, Shrikanth Narayanan, Alexandros Potamianos