no code implementations • LREC 2022 • Michael Strobl, Amine Trabelsi, Osmar Zaïane
WEXEA is a tool to exhaustively annotate entities in the English Wikipedia.
no code implementations • 17 Apr 2024 • Mohammad Khosravani, Chenyang Huang, Amine Trabelsi
This paper introduces a novel extractive approach to key point generation that outperforms previous state-of-the-art methods for the task.
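The paper's exact method is not described in this snippet, but the general idea of an *extractive* approach can be sketched: instead of generating new text, select existing candidate sentences as key points. The scoring heuristic below (content-word overlap with the comment set) and all names are illustrative assumptions, not the paper's algorithm.

```python
def extract_key_points(candidates, comments, k=2):
    """Hypothetical extractive sketch: rank candidate sentences by how
    many comments share a content word with them, return the top-k.
    This is NOT the paper's method, just an illustration of the
    extractive (select-don't-generate) paradigm."""
    def words(s):
        return set(s.lower().split())

    def score(cand):
        cw = words(cand)
        # count comments that overlap the candidate in at least one word
        return sum(1 for c in comments if cw & words(c))

    return sorted(candidates, key=score, reverse=True)[:k]
```

In this setup, key points are guaranteed to be fluent (they are real sentences from the corpus), which is one common motivation for extractive over abstractive generation.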
no code implementations • 18 May 2023 • Mohammad Khosravani, Amine Trabelsi
This survey covers different recent techniques and models used for unsupervised summarization.
no code implementations • 19 Apr 2022 • Michael Strobl, Amine Trabelsi, Osmar Zaiane
The most common Named Entity Recognizers are sequence taggers trained on fully annotated corpora, i.e., corpora in which the class of every word, and hence of every entity, is known.
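"Fully annotated" in the sequence-tagging setting typically means every token carries a tag, commonly in the BIO scheme (B-egin, I-nside, O-utside). A minimal sketch of producing such token-level supervision from entity spans (the helper name and span format are assumptions for illustration):

```python
def spans_to_bio(tokens, spans):
    """Convert token-level entity spans (start, end-exclusive, label)
    into BIO tags, so that every token receives a class --
    the 'fully annotated' setting a sequence tagger is trained on."""
    tags = ["O"] * len(tokens)
    for start, end, label in spans:
        tags[start] = "B-" + label          # first token of the entity
        for i in range(start + 1, end):
            tags[i] = "I-" + label          # continuation tokens
    return tags
```

For example, `spans_to_bio(["Barack", "Obama", "visited", "Paris"], [(0, 2, "PER"), (3, 4, "LOC")])` yields `["B-PER", "I-PER", "O", "B-LOC"]`: non-entity words are explicitly labeled `O` rather than left unknown.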
1 code implementation • 14 Apr 2022 • Michael Strobl, Amine Trabelsi, Osmar Zaiane
To effectively train accurate Relation Extraction models, sufficient amounts of properly labeled data are required.
no code implementations • NAACL 2021 • Chenyang Huang, Amine Trabelsi, Xuebin Qin, Nawshad Farruque, Lili Mou, Osmar Zaïane
Multi-label emotion classification is an important task in NLP and is essential to many applications.
1 code implementation • SEMEVAL 2020 • Anandh Perumal, Chenyang Huang, Amine Trabelsi, Osmar R. Zaïane
To generate more meaningful explanations, we propose UNION, a unified end-to-end framework that leverages several existing commonsense datasets, allowing a model to learn richer dynamics within the scope of commonsense reasoning.
no code implementations • LREC 2020 • Michael Strobl, Amine Trabelsi, Osmar Zaiane
Building predictive models for information extraction from text, such as named entity recognition or the extraction of semantic relationships between named entities in text, requires a large corpus of annotated text.
no code implementations • 6 Nov 2019 • Chenyang Huang, Amine Trabelsi, Xuebin Qin, Nawshad Farruque, Osmar R. Zaïane
Most existing methods treat this task as a single-label multi-class text classification problem.
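The distinction the abstract draws can be made concrete: single-label multi-class prediction takes an argmax over mutually exclusive classes, whereas multi-label emotion classification usually applies an independent sigmoid per label and keeps every label above a threshold. The sketch below uses raw logits and a 0.5 threshold as illustrative assumptions, not the paper's exact model:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def single_label(logits, labels):
    # Single-label multi-class: exactly one emotion per text (argmax).
    return labels[max(range(len(logits)), key=logits.__getitem__)]

def multi_label(logits, labels, threshold=0.5):
    # Multi-label: independent sigmoid per emotion; any subset may fire.
    return [lab for lab, z in zip(labels, logits) if sigmoid(z) >= threshold]
```

With logits `[2.0, 1.5, -3.0]` over `["joy", "anger", "fear"]`, the single-label view is forced to pick only `"joy"`, while the multi-label view correctly returns both `"joy"` and `"anger"`.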
no code implementations • RANLP 2019 • Mansour Saffar Mehrjardi, Amine Trabelsi, Osmar R. Zaiane
Self-attentional models are a new paradigm for sequence modelling tasks. They differ from common sequence modelling methods, such as recurrence-based and convolution-based sequence learning, in that their architecture is based solely on the attention mechanism.
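The core operation this contrasts with recurrence and convolution can be sketched in a few lines: each position attends to all positions via scaled dot-product scores, with no sequential state. This minimal version omits the learned query/key/value projections and multiple heads of a full Transformer layer, so it is an illustration of the mechanism, not a faithful implementation:

```python
import math

def softmax(xs):
    m = max(xs)                      # subtract max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def self_attention(x):
    """Minimal scaled dot-product self-attention over a list of
    d-dimensional vectors. Every output position is a convex
    combination of ALL inputs -- no recurrence, no convolution."""
    d = len(x[0])
    out = []
    for q in x:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in x]
        weights = softmax(scores)
        out.append([sum(w * v[j] for w, v in zip(weights, x))
                    for j in range(d)])
    return out
```

Because every position sees every other position in one step, path length between any two tokens is constant, which is the usual argument for self-attention over recurrence on long sequences.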
no code implementations • 1 Aug 2019 • Amine Trabelsi, Osmar R. Zaiane
This work tackles the problem of unsupervised modeling and extraction of the main contrastive sentential reasons conveyed by divergent viewpoints on polarized issues.
1 code implementation • SEMEVAL 2019 • Chenyang Huang, Amine Trabelsi, Osmar R. Zaïane
This paper describes the system submitted by ANA Team for the SemEval-2019 Task 3: EmoContext.
Ranked #3 on Emotion Recognition in Conversation on EC
1 code implementation • NAACL 2018 • Chenyang Huang, Osmar Zaïane, Amine Trabelsi, Nouha Dziri
Despite myriad efforts in recent years to design neural dialogue generation systems, very few consider placing restrictions on the response itself.