Multi-Task Learning

449 papers with code • 7 benchmarks • 41 datasets

Multi-task learning aims to learn multiple related tasks simultaneously, with the goal of improving performance on one or all of the tasks, typically by sharing parameters or representations across them.

( Image credit: Cross-stitch Networks for Multi-task Learning )
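The most common setup is hard parameter sharing: a shared encoder feeds several task-specific heads, and training minimizes a weighted sum of per-task losses. A minimal pure-Python sketch of that loss structure (all names, weights, and targets here are illustrative, not taken from any paper below):

```python
# Hard-parameter-sharing sketch: one shared "encoder" feeds two
# task-specific heads; the training signal is a weighted sum of
# per-task squared errors. All numbers are toy values.

def shared_encoder(x, w_shared):
    # Shared representation: an elementwise product stands in for a layer.
    return [xi * wi for xi, wi in zip(x, w_shared)]

def task_head(h, w_head):
    # Each task head is a dot product over the shared representation.
    return sum(hi * wi for hi, wi in zip(h, w_head))

def multitask_loss(x, targets, w_shared, heads, task_weights):
    # Weighted sum of squared errors across all tasks.
    h = shared_encoder(x, w_shared)
    total = 0.0
    for w_head, y, lam in zip(heads, targets, task_weights):
        pred = task_head(h, w_head)
        total += lam * (pred - y) ** 2
    return total

x = [1.0, 2.0]
w_shared = [0.5, 0.5]
heads = [[1.0, 0.0], [0.0, 1.0]]   # task A reads dim 0, task B reads dim 1
targets = [0.5, 1.0]               # per-task regression targets
loss = multitask_loss(x, targets, w_shared, heads, [1.0, 1.0])  # 0.0 here
```

Gradients of this combined loss flow through both heads into the shared encoder, which is the mechanism by which one task's signal can help (or hurt) another's.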

Greatest papers with code

Semi-Supervised Sequence Modeling with Cross-View Training

tensorflow/models EMNLP 2018

We therefore propose Cross-View Training (CVT), a semi-supervised learning algorithm that improves the representations of a Bi-LSTM sentence encoder using a mix of labeled and unlabeled data.

CCG Supertagging • Dependency Parsing +5

DRAGNN: A Transition-based Framework for Dynamically Connected Neural Networks

tensorflow/models 13 Mar 2017

In this work, we present a compact, modular framework for constructing novel recurrent neural architectures.

Dependency Parsing • Extractive Summarization +1

Language Models are Unsupervised Multitask Learners

huggingface/transformers Preprint 2019

Natural language processing tasks, such as question answering, machine translation, reading comprehension, and summarization, are typically approached with supervised learning on task-specific datasets.

Ranked #1 on Language Modelling on enwik8 (using extra training data)

Common Sense Reasoning • Data-to-Text Generation +6

Pentagon at MEDIQA 2019: Multi-task Learning for Filtering and Re-ranking Answers using Language Inference and Question Entailment

google-research/bert WS 2019

Parallel deep learning architectures like fine-tuned BERT and MT-DNN have quickly become the state of the art, surpassing previous deep and shallow learning methods by a large margin.

Document Classification • Multi-Task Learning +2

DSelect-k: Differentiable Selection in the Mixture of Experts with Applications to Multi-Task Learning

google-research/google-research 7 Jun 2021

State-of-the-art MoE models use a trainable sparse gate to select a subset of the experts for each input example.

Multi-Task Learning • Recommendation Systems
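The trainable sparse gate that DSelect-k improves on can be sketched as a softmax over experts, truncated to the top k and renormalized (a toy scalar-output version, not the paper's differentiable selection mechanism):

```python
import math

def softmax(scores):
    # Numerically stable softmax over a list of gate scores.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

def topk_sparse_gate(gate_scores, expert_outputs, k):
    # Keep only the k highest-scoring experts, renormalize their
    # softmax weights, and return the weighted mix of their outputs.
    probs = softmax(gate_scores)
    top = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)[:k]
    norm = sum(probs[i] for i in top)
    return sum(probs[i] / norm * expert_outputs[i] for i in top)

# Four experts, each producing a scalar; this input is routed to k=2 of them.
scores = [2.0, 0.1, 1.5, -1.0]
outputs = [10.0, 20.0, 30.0, 40.0]
mixed = topk_sparse_gate(scores, outputs, k=2)
```

Because unselected experts receive exactly zero weight, only k experts are evaluated per example; the hard top-k selection is what makes the gate non-differentiable, which is the problem DSelect-k targets.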

A Single-Shot Arbitrarily-Shaped Text Detector based on Context Attended Multi-Task Learning

PaddlePaddle/PaddleOCR 15 Aug 2019

Detecting scene text of arbitrary shapes has been a challenging task over the past years.

Multi-Task Learning • Scene Text

One Model To Learn Them All

tensorflow/tensor2tensor 16 Jun 2017

We present a single model that yields good results on a number of problems spanning multiple domains.

Image Captioning • Image Classification +1

A Unified Framework for Structured Low-rank Matrix Learning

microsoft/recommenders ICML 2018

We consider the problem of learning a low-rank matrix, constrained to lie in a linear subspace, and introduce a novel factorization for modeling such matrices.

Matrix Completion • Multi-Task Learning
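A rank-r matrix is standardly parameterized by a factorization X = U Vᵀ with U of shape m×r and V of shape n×r; the paper's contribution is a novel structured variant of this idea. A toy pure-Python rank-1 construction (U, V, and the shapes here are illustrative only):

```python
def matmul(A, B):
    # Plain triple-loop product of A (m x k) and B (k x n).
    m, k, n = len(A), len(B), len(B[0])
    return [[sum(A[i][t] * B[t][j] for t in range(k)) for j in range(n)]
            for i in range(m)]

def transpose(M):
    return [list(row) for row in zip(*M)]

# Rank-1 factors U (3x1) and V (2x1); X = U V^T is a 3x2 rank-1 matrix.
U = [[1.0], [2.0], [3.0]]
V = [[4.0], [5.0]]
X = matmul(U, transpose(V))
# Every 2x2 minor of a rank-1 matrix vanishes:
# X[i][0] * X[j][1] == X[j][0] * X[i][1] for all rows i, j.
```

Optimizing over U and V instead of X enforces the rank constraint by construction, which is why this factorized form underlies matrix completion and multi-task formulations where tasks share a low-dimensional subspace.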

Language Models are Few-Shot Learners

openai/gpt-3 NeurIPS 2020

By contrast, humans can generally perform a new language task from only a few examples or from simple instructions - something which current NLP systems still largely struggle to do.

Common Sense Reasoning • Coreference Resolution +9