Multi-Task Bidirectional Transformer Representations for Irony Detection

8 Sep 2019 · Chiyu Zhang, Muhammad Abdul-Mageed

Supervised deep learning requires large amounts of training data. In the context of the FIRE2019 Arabic irony detection shared task (IDAT@FIRE2019), we show how we mitigate this need by fine-tuning the pre-trained Bidirectional Encoder Representations from Transformers (BERT) model on gold data in a multi-task setting. ...
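The core idea the abstract describes, fine-tuning one shared pre-trained encoder across several tasks at once, can be sketched schematically. The snippet below is a minimal illustration, not the paper's implementation: a tiny NumPy encoder stands in for BERT, the task names and all dimensions are hypothetical, and training alternates batches between two task-specific heads so gradients from both tasks update the shared parameters.

```python
import numpy as np

# Hypothetical sketch of multi-task fine-tuning: a shared encoder
# (standing in for pre-trained BERT) feeds task-specific heads.
# Task names, sizes, and data here are illustrative only.

rng = np.random.default_rng(0)

DIM_IN, DIM_HID = 8, 4                               # toy sizes; BERT-base uses 768
W_shared = rng.normal(size=(DIM_IN, DIM_HID)) * 0.1  # the "pre-trained" encoder

# One linear classification head per task (e.g. irony vs. an auxiliary task).
heads = {
    "irony": rng.normal(size=(DIM_HID, 2)) * 0.1,
    "sentiment": rng.normal(size=(DIM_HID, 2)) * 0.1,
}

def forward(x, task):
    """Shared representation, then the selected task head, then softmax."""
    h = np.tanh(x @ W_shared)
    logits = h @ heads[task]
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return h, e / e.sum(axis=1, keepdims=True)

def train_step(x, y, task, lr=0.1):
    """One SGD step; gradients reach both the head and the shared encoder."""
    global W_shared
    h, p = forward(x, task)
    d_logits = (p - np.eye(2)[y]) / len(y)        # softmax cross-entropy gradient
    d_h = d_logits @ heads[task].T * (1 - h**2)   # backprop through tanh
    heads[task] -= lr * (h.T @ d_logits)
    W_shared -= lr * (x.T @ d_h)

# Alternate batches between tasks so the encoder learns from both signals.
for step in range(200):
    task = "irony" if step % 2 == 0 else "sentiment"
    x = rng.normal(size=(16, DIM_IN))
    y = (x[:, 0] > 0).astype(int)                 # synthetic labels
    train_step(x, y, task)
```

The design point this illustrates is the one the abstract relies on: when labeled data for the target task is scarce, updates from auxiliary tasks still shape the shared representation.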




