Abusive Language
50 papers with code • 0 benchmarks • 9 datasets
Most implemented papers
Racial Bias in Hate Speech and Abusive Language Detection Datasets
Technologies for abusive language detection are being developed and applied with little consideration of their potential biases.
Comparative Studies of Detecting Abusive Language on Twitter
However, this dataset has not yet been studied to its full potential.
Understanding Abuse: A Typology of Abusive Language Detection Subtasks
As the body of research on abusive language detection and analysis grows, there is a need for critical consideration of the relationships between different subtasks that have been grouped under this label.
One-step and Two-step Classification for Abusive Language Detection on Twitter
Automatic abusive language detection is a difficult but important task for online social media.
Emo2Vec: Learning Generalized Emotion Representation by Multi-task Training
In this paper, we propose Emo2Vec which encodes emotional semantics into vectors.
Sequence Classification with Human Attention
Learning attention functions requires large volumes of data, but many NLP tasks simulate human behavior; in this paper, we show that human attention provides a good inductive bias for many attention functions in NLP.
Stop PropagHate at SemEval-2019 Tasks 5 and 6: Are abusive language classification results reproducible?
This paper summarizes the participation of the Stop PropagHate team in SemEval 2019.
Multi-label Hate Speech and Abusive Language Detection in Indonesian Twitter
Hate speech and abusive language spreading on social media need to be detected automatically to avoid conflict among citizens.
The Discourse of Online Content Moderation: Investigating Polarized User Responses to Changes in Reddit's Quarantine Policy
Recent concerns over abusive behavior have pressured social media companies to strengthen the content moderation policies on their platforms.