General Classification
3929 papers with code • 11 benchmarks • 8 datasets
Algorithms that address the general task of classification.
Benchmarks
These leaderboards are used to track progress in General Classification.
Libraries
Use these libraries to find General Classification models and implementations.
Most implemented papers
Universal Language Model Fine-tuning for Text Classification
Inductive transfer learning has greatly impacted computer vision, but existing approaches in NLP still require task-specific modifications and training from scratch.
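A core ingredient of ULMFiT is discriminative fine-tuning: each layer of the pretrained language model gets its own learning rate, with lower layers tuned more gently. A minimal sketch of that schedule, assuming the paper's decay factor of 2.6 between adjacent layers (the function name is illustrative):

```python
def discriminative_lrs(base_lr, n_layers, factor=2.6):
    """ULMFiT-style discriminative fine-tuning: the top layer uses base_lr,
    and each layer below it divides the rate by `factor` (2.6 in the paper)."""
    return [base_lr / factor ** (n_layers - 1 - i) for i in range(n_layers)]
```

For a 3-layer model with `base_lr=0.01`, the top layer trains at 0.01 and the bottom layer at roughly 0.0015, which preserves the general-purpose features learned during pretraining.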
Bag of Tricks for Efficient Text Classification
This paper explores a simple and efficient baseline for text classification.
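The fastText baseline represents a text as the average of embeddings for its words and word n-grams, with n-grams hashed into a fixed number of buckets. A minimal sketch of that feature pipeline, assuming illustrative choices of hash function (CRC32) and bucket count:

```python
import zlib

def token_ids(tokens, buckets=1_000_000):
    """Hash words and word bigrams into a fixed number of buckets
    (the fastText hashing trick; hash and bucket count are illustrative)."""
    grams = tokens + [a + " " + b for a, b in zip(tokens, tokens[1:])]
    return [zlib.crc32(g.encode()) % buckets for g in grams]

def text_vector(ids, table, dim=2):
    """The text representation is the average of the looked-up embeddings."""
    vec = [0.0] * dim
    for i in ids:
        emb = table.get(i, [0.0] * dim)
        vec = [v + e for v, e in zip(vec, emb)]
    return [v / max(len(ids), 1) for v in vec]
```

A linear classifier on top of this averaged vector is the whole model, which is why training and inference are so fast.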
Aggregated Residual Transformations for Deep Neural Networks
Our simple design results in a homogeneous, multi-branch architecture that has only a few hyper-parameters to set.
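The ResNeXt block aggregates a set of parallel transformations with identical topology and adds the result to the identity shortcut; the number of branches is the "cardinality" hyper-parameter. A toy sketch of that aggregation over plain vectors (the real branches are small convolutional paths):

```python
def resnext_block(x, branches):
    """Aggregated residual transformation: y = x + sum of parallel branch
    outputs. `branches` (the cardinality) share the same topology."""
    out = list(x)
    for branch in branches:
        out = [o + b for o, b in zip(out, branch(x))]
    return out
```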
DARTS: Differentiable Architecture Search
This paper addresses the scalability challenge of architecture search by formulating the task in a differentiable manner.
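The differentiable formulation in DARTS replaces the discrete choice of operation on each edge with a softmax-weighted mixture of all candidate operations, so the architecture weights can be learned by gradient descent. A minimal sketch of that mixed operation on a scalar input:

```python
import math

def mixed_op(x, ops, alphas):
    """DARTS mixed operation: softmax over architecture weights `alphas`,
    then a weighted sum of every candidate op applied to the input."""
    m = max(alphas)                     # subtract max for numerical stability
    w = [math.exp(a - m) for a in alphas]
    s = sum(w)
    return sum((wi / s) * op(x) for wi, op in zip(w, ops))
```

After search, the discrete architecture is recovered by keeping the op with the largest weight on each edge.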
A Structured Self-attentive Sentence Embedding
This paper proposes a new model for extracting an interpretable sentence embedding by introducing self-attention.
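The model pools a sentence's hidden states with learned attention weights rather than taking the last state or a plain mean. A single-hop sketch of that pooling (the paper uses a matrix of several attention hops):

```python
import math

def attentive_embedding(hidden, scores):
    """Self-attentive pooling: softmax the per-token scores, then return
    the weighted sum of hidden states as the sentence embedding."""
    m = max(scores)
    w = [math.exp(s - m) for s in scores]
    z = sum(w)
    w = [wi / z for wi in w]
    dim = len(hidden[0])
    return [sum(w[t] * hidden[t][d] for t in range(len(hidden))) for d in range(dim)]
```

Because the weights are explicit per token, they can be visualized to show which words the embedding attends to, which is the source of the model's interpretability.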
Semi-Supervised Classification with Graph Convolutional Networks
We present a scalable approach for semi-supervised learning on graph-structured data that is based on an efficient variant of convolutional neural networks which operate directly on graphs.
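One GCN layer propagates features over the symmetrically normalized adjacency matrix with self-loops, then applies a linear map and a nonlinearity. A minimal dense-matrix sketch of that propagation rule:

```python
import math

def matmul(a, b):
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)] for row in a]

def gcn_layer(adj, feats, weight):
    """One GCN layer: H' = ReLU(D^-1/2 (A + I) D^-1/2 · H · W)."""
    n = len(adj)
    a = [[adj[i][j] + (1.0 if i == j else 0.0) for j in range(n)] for i in range(n)]
    deg = [sum(row) for row in a]
    norm = [[a[i][j] / math.sqrt(deg[i] * deg[j]) for j in range(n)] for i in range(n)]
    return [[max(0.0, v) for v in row] for row in matmul(matmul(norm, feats), weight)]
```

Stacking two such layers and training with a cross-entropy loss on the labeled nodes gives the semi-supervised classifier described in the paper.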
An Implementation of Faster RCNN with Study for Region Sampling
We adapt the joint-training scheme of the Faster RCNN framework from Caffe to TensorFlow as a baseline implementation for object detection.
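A central design choice the study examines is how region proposals are sampled for each training minibatch: a fixed fraction of positives (high-IoU proposals) with the remainder drawn from negatives. A simplified sketch of that sampling, with illustrative batch size, positive fraction, and IoU threshold:

```python
import random

def sample_rois(ious, batch=8, pos_frac=0.25, pos_thresh=0.5, rng=None):
    """Sample region proposals for one minibatch: at most `pos_frac` of the
    batch from positives (IoU >= pos_thresh), the rest from negatives.
    All constants here are illustrative, not the paper's exact settings."""
    rng = rng or random.Random(0)
    pos = [i for i, v in enumerate(ious) if v >= pos_thresh]
    neg = [i for i, v in enumerate(ious) if v < pos_thresh]
    n_pos = min(len(pos), int(batch * pos_frac))
    return rng.sample(pos, n_pos) + rng.sample(neg, min(len(neg), batch - n_pos))
```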
FastText.zip: Compressing text classification models
We consider the problem of producing compact architectures for text classification, such that the full model fits in a limited amount of memory.
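The compression relies on quantizing embedding vectors: each vector is split into sub-vectors and each sub-vector is replaced by the index of its nearest codebook centroid, so only small integer codes need to be stored. A minimal sketch of that product-quantization encoding step (codebook construction, e.g. by k-means, is omitted):

```python
def pq_encode(vec, codebooks):
    """Product quantization: split the vector into equal chunks and encode
    each chunk as the index of its nearest centroid in that chunk's codebook."""
    k = len(vec) // len(codebooks)
    codes = []
    for i, book in enumerate(codebooks):
        chunk = vec[i * k:(i + 1) * k]
        d2 = lambda c: sum((x - y) ** 2 for x, y in zip(chunk, c))
        codes.append(min(range(len(book)), key=lambda j: d2(book[j])))
    return codes
```

With 256 centroids per codebook, each sub-vector costs one byte instead of several floats, which is how the full model fits in a limited amount of memory.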
Prototypical Networks for Few-shot Learning
We propose prototypical networks for the problem of few-shot classification, where a classifier must generalize to new classes not seen in the training set, given only a small number of examples of each new class.
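Prototypical networks reduce few-shot classification to two steps: average each class's support embeddings into a prototype, then assign a query to the class with the nearest prototype. A minimal sketch over plain Python lists, using squared Euclidean distance as in the paper:

```python
def prototypes(support):
    """Class prototype = mean of that class's support-set embeddings."""
    return {c: [sum(col) / len(vecs) for col in zip(*vecs)]
            for c, vecs in support.items()}

def classify(query, protos):
    """Assign the query embedding to the nearest prototype (squared Euclidean)."""
    def d2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(protos, key=lambda c: d2(query, protos[c]))
```

Because prototypes are computed at test time from whatever support examples are given, the classifier generalizes to classes never seen during training.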
Weight Uncertainty in Neural Networks
We introduce a new, efficient, principled and backpropagation-compatible algorithm for learning a probability distribution on the weights of a neural network, called Bayes by Backprop.
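Bayes by Backprop keeps a distribution over each weight, parameterized by a mean and a softplus-transformed scale, and samples weights via the reparameterization trick so gradients flow through the sampling step. A minimal sketch of that per-weight sampling:

```python
import math
import random

def sample_weight(mu, rho, rng):
    """Reparameterized weight sample: w = mu + softplus(rho) * eps,
    eps ~ N(0, 1). Softplus keeps the standard deviation positive."""
    sigma = math.log1p(math.exp(rho))
    return mu + sigma * rng.gauss(0.0, 1.0)
```

Training then minimizes a variational free-energy objective over `(mu, rho)` by ordinary backpropagation, with the noise `eps` treated as a constant during the backward pass.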