About

Multi-task learning aims to learn multiple tasks simultaneously while maximizing performance on one or all of them.
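
The most common realization of this idea is hard parameter sharing: a shared encoder feeds one lightweight head per task, and the per-task losses are summed (optionally weighted) into a single training objective. Below is a minimal hedged PyTorch sketch of that setup; all module names are illustrative and not taken from any particular paper.

```python
# Hedged sketch of hard parameter sharing for multi-task learning:
# one shared encoder, one head per task, joint loss summed over tasks.
import torch
import torch.nn as nn

class MultiTaskNet(nn.Module):
    def __init__(self, in_dim=16, hidden=64, n_classes=3):
        super().__init__()
        self.shared = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.cls_head = nn.Linear(hidden, n_classes)  # task A: classification
        self.reg_head = nn.Linear(hidden, 1)          # task B: regression

    def forward(self, x):
        z = self.shared(x)                            # shared representation
        return self.cls_head(z), self.reg_head(z)

model = MultiTaskNet()
x = torch.randn(32, 16)                               # fake batch
y_cls = torch.randint(0, 3, (32,))
y_reg = torch.randn(32, 1)

logits, pred = model(x)
loss = (nn.functional.cross_entropy(logits, y_cls)
        + nn.functional.mse_loss(pred, y_reg))        # joint objective
loss.backward()  # gradients from both tasks update the shared encoder
```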

(Image credit: Cross-stitch Networks for Multi-task Learning)

Greatest papers with code

Semi-Supervised Sequence Modeling with Cross-View Training

EMNLP 2018 tensorflow/models

We therefore propose Cross-View Training (CVT), a semi-supervised learning algorithm that improves the representations of a Bi-LSTM sentence encoder using a mix of labeled and unlabeled data.

CCG SUPERTAGGING DEPENDENCY PARSING MACHINE TRANSLATION MULTI-TASK LEARNING NAMED ENTITY RECOGNITION PART-OF-SPEECH TAGGING UNSUPERVISED REPRESENTATION LEARNING
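
The core unlabeled-data objective in CVT is a consistency loss: auxiliary prediction modules that see restricted views of the input (e.g. only the forward LSTM states) are trained to match the full-view primary module. The following is a hedged PyTorch sketch of that idea for a tagging task, not the authors' code; all names are illustrative.

```python
# Sketch of cross-view consistency: the primary head (full Bi-LSTM view)
# acts as a teacher; an auxiliary head seeing only the forward direction
# is trained to match it on unlabeled data.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CVTTagger(nn.Module):
    def __init__(self, vocab_size, emb_dim, hidden, num_tags):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.bilstm = nn.LSTM(emb_dim, hidden, bidirectional=True,
                              batch_first=True)
        self.primary = nn.Linear(2 * hidden, num_tags)  # full view
        self.fwd_only = nn.Linear(hidden, num_tags)     # restricted view

    def forward(self, tokens):
        h, _ = self.bilstm(self.emb(tokens))            # (B, T, 2H)
        fwd, _bwd = h.chunk(2, dim=-1)                  # split directions
        return self.primary(h), self.fwd_only(fwd)

model = CVTTagger(vocab_size=1000, emb_dim=32, hidden=64, num_tags=5)
unlabeled = torch.randint(0, 1000, (8, 20))             # fake unlabeled batch

primary_logits, aux_logits = model(unlabeled)
with torch.no_grad():                                    # teacher is not updated
    target = F.softmax(primary_logits, dim=-1)
cvt_loss = F.kl_div(F.log_softmax(aux_logits, dim=-1), target,
                    reduction="batchmean")
cvt_loss.backward()
```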

DRAGNN: A Transition-based Framework for Dynamically Connected Neural Networks

13 Mar 2017 tensorflow/models

In this work, we present a compact, modular framework for constructing novel recurrent neural architectures.

DEPENDENCY PARSING MULTI-TASK LEARNING

fairseq S2T: Fast Speech-to-Text Modeling with fairseq

11 Oct 2020 huggingface/transformers

We introduce fairseq S2T, a fairseq extension for speech-to-text (S2T) modeling tasks such as end-to-end speech recognition and speech-to-text translation.

END-TO-END SPEECH RECOGNITION MACHINE TRANSLATION MULTI-TASK LEARNING SPEECH RECOGNITION SPEECH-TO-TEXT TRANSLATION

Language Models are Unsupervised Multitask Learners

Preprint 2019 huggingface/transformers

Natural language processing tasks, such as question answering, machine translation, reading comprehension, and summarization, are typically approached with supervised learning on task-specific datasets.

Ranked #1 on Language Modelling on enwik8 (using extra training data)

COMMON SENSE REASONING DATA-TO-TEXT GENERATION DOCUMENT SUMMARIZATION LANGUAGE MODELLING MACHINE TRANSLATION MULTI-TASK LEARNING QUESTION ANSWERING READING COMPREHENSION

Pentagon at MEDIQA 2019: Multi-task Learning for Filtering and Re-ranking Answers using Language Inference and Question Entailment

WS 2019 google-research/bert

Parallel deep learning architectures like fine-tuned BERT and MT-DNN have quickly become the state of the art, surpassing previous deep and shallow learning methods by a large margin.

DOCUMENT CLASSIFICATION MULTI-TASK LEARNING QUESTION ANSWERING

A Single-Shot Arbitrarily-Shaped Text Detector based on Context Attended Multi-Task Learning

15 Aug 2019 PaddlePaddle/PaddleOCR

Detecting scene text of arbitrary shapes has been a challenging task over the past years.

MULTI-TASK LEARNING SCENE TEXT

One Model To Learn Them All

16 Jun 2017 tensorflow/tensor2tensor

We present a single model that yields good results on a number of problems spanning multiple domains.

IMAGE CAPTIONING IMAGE CLASSIFICATION MULTI-TASK LEARNING

A Unified Framework for Structured Low-rank Matrix Learning

ICML 2018 microsoft/recommenders

We consider the problem of learning a low-rank matrix, constrained to lie in a linear subspace, and introduce a novel factorization for modeling such matrices.

MATRIX COMPLETION MULTI-TASK LEARNING
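
The paper's structured factorization and its optimization on the constrained set are more involved than can be shown here; as a grounding point, below is a hedged, generic sketch of low-rank matrix completion by factorization (X ≈ UVᵀ fit on observed entries only), with all names illustrative.

```python
# Generic low-rank matrix completion sketch: factor X ~ U @ V.T and fit
# the factors by gradient descent on the observed entries alone.
import torch

m, n, rank = 50, 40, 5
X_true = torch.randn(m, rank) @ torch.randn(rank, n)  # synthetic low-rank matrix
mask = torch.rand(m, n) < 0.3                          # 30% of entries observed

U = torch.randn(m, rank, requires_grad=True)
V = torch.randn(n, rank, requires_grad=True)
opt = torch.optim.Adam([U, V], lr=0.05)

for step in range(500):
    opt.zero_grad()
    loss = ((U @ V.T - X_true)[mask] ** 2).mean()      # loss on observed entries
    loss.backward()
    opt.step()
```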