
Multi-Task Learning

88 papers with code · Methodology
Subtask of Transfer Learning

Multi-task learning aims to learn multiple tasks simultaneously while maximizing performance on one or all of them.
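A common realization of this idea is hard parameter sharing: a single shared encoder feeds several task-specific output heads, and training minimizes a weighted sum of per-task losses. A minimal NumPy sketch (all names, shapes, and loss weights are illustrative, not from any paper above):

```python
import numpy as np

rng = np.random.default_rng(0)

# Shared encoder: one linear layer reused by every task (hard parameter sharing).
W_shared = rng.normal(size=(16, 8))

# Task-specific heads: each task gets its own output layer.
heads = {
    "task_a": rng.normal(size=(8, 3)),   # e.g. a 3-way classification task
    "task_b": rng.normal(size=(8, 1)),   # e.g. a scalar regression task
}

def forward(x, task):
    h = np.maximum(x @ W_shared, 0.0)    # shared representation (ReLU)
    return h @ heads[task]               # task-specific prediction

# The joint objective is typically a weighted sum of per-task losses.
x = rng.normal(size=(4, 16))
loss = 0.5 * np.mean(forward(x, "task_a") ** 2) \
     + 0.5 * np.mean(forward(x, "task_b") ** 2)
```

Because the encoder's gradients come from every task, the shared representation is pushed toward features useful across tasks, which is the usual motivation for the approaches listed below.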

Greatest papers with code

DRAGNN: A Transition-based Framework for Dynamically Connected Neural Networks

13 Mar 2017 · tensorflow/models

In this work, we present a compact, modular framework for constructing novel recurrent neural architectures.

DEPENDENCY PARSING MULTI-TASK LEARNING

Semi-Supervised Sequence Modeling with Cross-View Training

EMNLP 2018 · tensorflow/models

We therefore propose Cross-View Training (CVT), a semi-supervised learning algorithm that improves the representations of a Bi-LSTM sentence encoder using a mix of labeled and unlabeled data.

CCG SUPERTAGGING DEPENDENCY PARSING MACHINE TRANSLATION MULTI-TASK LEARNING NAMED ENTITY RECOGNITION (NER) UNSUPERVISED REPRESENTATION LEARNING

One Model To Learn Them All

16 Jun 2017 · tensorflow/tensor2tensor

We present a single model that yields good results on a number of problems spanning multiple domains.

IMAGE CAPTIONING IMAGE CLASSIFICATION MULTI-TASK LEARNING

Learning General Purpose Distributed Sentence Representations via Large Scale Multi-task Learning

ICLR 2018 · facebookresearch/InferSent

In this work, we present a simple, effective multi-task learning framework for sentence representations that combines the inductive biases of diverse training objectives in a single model.

MULTI-TASK LEARNING NATURAL LANGUAGE INFERENCE PARAPHRASE IDENTIFICATION SEMANTIC TEXTUAL SIMILARITY

A Hierarchical Multi-task Approach for Learning Embeddings from Semantic Tasks

14 Nov 2018 · huggingface/hmtl

The model is trained hierarchically to introduce an inductive bias, supervising a set of low-level tasks at the bottom layers of the model and more complex tasks at the top layers.

MULTI-TASK LEARNING NAMED ENTITY RECOGNITION (NER) RELATION EXTRACTION
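The layer-wise supervision described in this entry amounts to attaching task heads at different depths, so that simple tasks shape the lower layers and complex tasks build on them. A hypothetical NumPy sketch (all layer sizes and task names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two stacked layers; each level of the hierarchy gets its own head.
W1 = rng.normal(size=(16, 8))        # bottom layer
W2 = rng.normal(size=(8, 8))         # top layer
head_low = rng.normal(size=(8, 5))   # e.g. a low-level tagging task on layer 1
head_high = rng.normal(size=(8, 4))  # e.g. a more complex task on layer 2

def forward(x):
    h1 = np.maximum(x @ W1, 0.0)     # supervised by the low-level task
    h2 = np.maximum(h1 @ W2, 0.0)    # supervised by the higher-level task
    return h1 @ head_low, h2 @ head_high

x = rng.normal(size=(4, 16))
low_pred, high_pred = forward(x)
```

Supervising `h1` directly biases the bottom layer toward features the low-level task needs, which the top layer then reuses.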

An Overview of Multi-Task Learning in Deep Neural Networks

15 Jun 2017 · HazyResearch/metal

Multi-task learning (MTL) has led to successes in many applications of machine learning, from natural language processing and speech recognition to computer vision and drug discovery.

DRUG DISCOVERY MULTI-TASK LEARNING SPEECH RECOGNITION

HyperFace: A Deep Multi-task Learning Framework for Face Detection, Landmark Localization, Pose Estimation, and Gender Recognition

3 Mar 2016 · takiyu/hyperface

We present an algorithm for simultaneous face detection, landmark localization, pose estimation, and gender recognition using deep convolutional neural networks (CNNs).

FACE DETECTION MULTI-TASK LEARNING POSE ESTIMATION

Decoupled Classification Refinement: Hard False Positive Suppression for Object Detection

5 Oct 2018 · bowenc0221/Decoupled-Classification-Refinement

In particular, DCR (Decoupled Classification Refinement) places a separate classification network in parallel with the localization network (the base detector).

MULTI-TASK LEARNING OBJECT DETECTION

Linguistically-Informed Self-Attention for Semantic Role Labeling

EMNLP 2018 · strubell/LISA

Unlike previous models which require significant pre-processing to prepare linguistic features, LISA can incorporate syntax using merely raw tokens as input, encoding the sequence only once to simultaneously perform parsing, predicate detection and role labeling for all predicates.

DEPENDENCY PARSING MULTI-TASK LEARNING PART-OF-SPEECH TAGGING PREDICATE DETECTION SEMANTIC ROLE LABELING (PREDICTED PREDICATES) WORD EMBEDDINGS