Metric Learning

558 papers with code • 8 benchmarks • 32 datasets

The goal of Metric Learning is to learn a representation function that maps objects into an embedding space. Distances in the embedding space should preserve the objects' similarity: similar objects end up close together, while dissimilar objects end up far apart. Various loss functions have been developed for Metric Learning. For example, the contrastive loss pulls objects from the same class toward the same point and pushes objects from different classes apart until their distance exceeds a margin. The triplet loss is also popular; it requires the distance between an anchor sample and a positive sample to be smaller, by a margin, than the distance between the anchor and a negative sample.

Source: Road Network Metric Learning for Estimated Time of Arrival
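
As a minimal illustration of the two losses described above, the PyTorch sketch below implements a contrastive loss and a triplet loss over embedding tensors. The margin values, embedding dimensions, and batch sizes are illustrative assumptions, not settings from any particular paper.

```python
import torch
import torch.nn.functional as F

def contrastive_loss(z1, z2, same_class, margin=1.0):
    """Pull same-class pairs together; push different-class pairs
    apart until their distance exceeds the margin."""
    d = F.pairwise_distance(z1, z2)                      # Euclidean distance per pair
    pos = same_class * d.pow(2)                          # same class: minimize distance
    neg = (1 - same_class) * F.relu(margin - d).pow(2)   # different class: enforce margin
    return 0.5 * (pos + neg).mean()

def triplet_loss(anchor, positive, negative, margin=0.2):
    """Require d(anchor, positive) + margin <= d(anchor, negative)."""
    d_ap = F.pairwise_distance(anchor, positive)
    d_an = F.pairwise_distance(anchor, negative)
    return F.relu(d_ap - d_an + margin).mean()

# Toy usage with random 128-d embeddings for a batch of 8 pairs/triplets.
z1, z2 = torch.randn(8, 128), torch.randn(8, 128)
labels = torch.randint(0, 2, (8,)).float()               # 1 = same class, 0 = different
print(contrastive_loss(z1, z2, labels))
print(triplet_loss(torch.randn(8, 128), torch.randn(8, 128), torch.randn(8, 128)))
```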

Most implemented papers

Classification is a Strong Baseline for Deep Metric Learning

azgo14/classification_metric_learning 30 Nov 2018

Deep metric learning aims to learn a function mapping image pixels to embedding feature vectors that model the similarity between images.
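
The title's idea can be sketched roughly as follows: train the embedding network with a plain classification (cross-entropy) head, then discard the classifier and use the L2-normalized embeddings for similarity search. The backbone, layer sizes, and class count below are placeholder assumptions, not the paper's exact architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ClassificationEmbedder(nn.Module):
    """Backbone + embedding layer trained with plain cross-entropy;
    the L2-normalized embedding is what gets used for retrieval."""
    def __init__(self, backbone, feat_dim=512, embed_dim=128, num_classes=100):
        super().__init__()
        self.backbone = backbone                  # any feature extractor -> (B, feat_dim)
        self.embed = nn.Linear(feat_dim, embed_dim)
        self.classifier = nn.Linear(embed_dim, num_classes)

    def forward(self, x):
        z = F.normalize(self.embed(self.backbone(x)), dim=1)  # unit-norm embedding
        return z, self.classifier(z)

# Training step: standard softmax cross-entropy on the class logits.
# At test time only `z` is kept and images are ranked by cosine similarity.
model = ClassificationEmbedder(nn.Flatten(), feat_dim=3 * 32 * 32)
images, labels = torch.randn(4, 3, 32, 32), torch.randint(0, 100, (4,))
z, logits = model(images)
loss = F.cross_entropy(logits, labels)
```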

Ranked List Loss for Deep Metric Learning

XinshaoAmosWang/Ranked-List-Loss-for-DML CVPR 2019

To address this, we propose to build a set-based similarity structure by exploiting all instances in the gallery.
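
As a loose, simplified sketch of a set-based loss (not the paper's exact ranked-list formulation), the snippet below treats the whole batch as the gallery: every positive is pulled inside an inner boundary and every negative is pushed beyond an outer one. The boundary values are arbitrary, and the paper's pair weighting is omitted for brevity.

```python
import torch
import torch.nn.functional as F

def set_based_margin_loss(embeddings, labels, alpha=1.2, margin=0.4):
    """Simplified set-based loss: for each anchor, pull all positives in the
    batch inside (alpha - margin) and push all negatives beyond alpha."""
    d = torch.cdist(embeddings, embeddings)            # all pairwise distances
    same = labels.unsqueeze(0) == labels.unsqueeze(1)
    eye = torch.eye(len(labels), dtype=torch.bool)
    pos = F.relu(d - (alpha - margin))[same & ~eye]    # positives outside the inner boundary
    neg = F.relu(alpha - d)[~same]                     # negatives inside the outer boundary
    return pos.mean() + neg.mean()

emb = F.normalize(torch.randn(16, 64), dim=1)
labels = torch.randint(0, 4, (16,))
print(set_based_margin_loss(emb, labels))
```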

Hardness-Aware Deep Metric Learning

wzzheng/HDML CVPR 2019

This paper presents a hardness-aware deep metric learning (HDML) framework.

Correlation Congruence for Knowledge Distillation

yoshitomo-matsubara/torchdistill ICCV 2019

Most teacher-student frameworks based on knowledge distillation (KD) depend on a strong congruence constraint at the instance level.
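
The alternative to a purely instance-level constraint is to also match the correlation structure between instances. Below is a minimal sketch of that idea, assuming teacher and student feature batches are already extracted; the cosine-similarity matrices and MSE penalty are simplifications, not the paper's exact kernel choice.

```python
import torch
import torch.nn.functional as F

def correlation_congruence_loss(student_feats, teacher_feats):
    """Match the batch-wise similarity (correlation) matrices of student
    and teacher features instead of only their per-instance outputs."""
    s = F.normalize(student_feats, dim=1)
    t = F.normalize(teacher_feats, dim=1)
    corr_s = s @ s.t()                 # (B, B) cosine similarities among student features
    corr_t = t @ t.t()                 # (B, B) cosine similarities among teacher features
    return F.mse_loss(corr_s, corr_t)  # penalize mismatch between the two structures

student = torch.randn(32, 256)    # e.g. penultimate features of the student
teacher = torch.randn(32, 1024)   # teacher features may have a different width
print(correlation_congruence_loss(student, teacher))
```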

Multi-Similarity Loss with General Pair Weighting for Deep Metric Learning

MalongTech/research-ms-loss CVPR 2019

A family of loss functions built on pair-based computation has been proposed in the literature, providing a myriad of solutions for deep metric learning.
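
A compact sketch of a multi-similarity-style pair weighting, where positive and negative pairs are softly weighted by their similarity to the anchor. The pair-mining step is omitted and the hyperparameter values are illustrative defaults, not tuned settings.

```python
import torch
import torch.nn.functional as F

def multi_similarity_loss(embeddings, labels, alpha=2.0, beta=50.0, lam=1.0):
    """Multi-similarity style loss: softly weight positive and negative pairs
    by how (dis)similar they are to the anchor (pair mining omitted here)."""
    s = F.normalize(embeddings, dim=1)
    sim = s @ s.t()                                    # cosine similarity matrix
    same = labels.unsqueeze(0) == labels.unsqueeze(1)
    eye = torch.eye(len(labels), dtype=torch.bool, device=sim.device)
    loss = 0.0
    for i in range(len(labels)):
        pos = sim[i][same[i] & ~eye[i]]                # positives for this anchor
        neg = sim[i][~same[i]]                         # negatives for this anchor
        pos_term = torch.log1p(torch.exp(-alpha * (pos - lam)).sum()) / alpha
        neg_term = torch.log1p(torch.exp(beta * (neg - lam)).sum()) / beta
        loss = loss + pos_term + neg_term
    return loss / len(labels)

emb, labels = torch.randn(16, 64), torch.randint(0, 4, (16,))
print(multi_similarity_loss(emb, labels))
```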

Learning to Approximate a Bregman Divergence

Siahkamari/Learning-Bregman-Divergences NeurIPS 2020

Bregman divergences generalize measures such as the squared Euclidean distance and the KL divergence, and arise throughout many areas of machine learning.
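
For reference, the Bregman divergence generated by a convex function phi is D_phi(x, y) = phi(x) - phi(y) - <grad phi(y), x - y>; choosing phi(v) = ||v||^2 recovers the squared Euclidean distance, and the negative entropy recovers the KL divergence on probability vectors. A small numeric check, with the generator functions chosen purely for illustration:

```python
import torch

def bregman_divergence(phi, x, y):
    """D_phi(x, y) = phi(x) - phi(y) - <grad phi(y), x - y> for a convex phi."""
    y_var = y.detach().clone().requires_grad_(True)
    grad_y = torch.autograd.grad(phi(y_var), y_var)[0]   # grad of phi at y via autograd
    return phi(x) - phi(y) - grad_y @ (x - y)

x, y = torch.tensor([1.0, 2.0]), torch.tensor([3.0, 0.5])

# phi(v) = ||v||^2 recovers the squared Euclidean distance.
sq_norm = lambda v: (v * v).sum()
print(bregman_divergence(sq_norm, x, y), ((x - y) ** 2).sum())   # both are 6.25

# phi(v) = sum(v * log v) (negative entropy) recovers the KL divergence
# when x and y are probability vectors.
neg_entropy = lambda v: (v * torch.log(v)).sum()
p, q = torch.tensor([0.3, 0.7]), torch.tensor([0.6, 0.4])
print(bregman_divergence(neg_entropy, p, q))
```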

MIC: Mining Interclass Characteristics for Improved Metric Learning

Confusezius/metric-learning-mining-interclass-characteristics ICCV 2019

In contrast, we propose to explicitly learn the latent characteristics that are shared by and go across object classes.

R2D2: Reliable and Repeatable Detector and Descriptor

naver/r2d2 NeurIPS 2019

We thus propose to jointly learn keypoint detection and description together with a predictor of the local descriptor discriminativeness.
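
A toy sketch of the jointly learned outputs the sentence above describes: a shared convolutional trunk with a dense descriptor head plus per-pixel keypoint-repeatability and descriptor-reliability predictions. The layer counts and channel widths are arbitrary assumptions, not the paper's architecture.

```python
import torch
import torch.nn as nn

class DetectDescribeNet(nn.Module):
    """Toy fully convolutional sketch: one shared trunk, a dense descriptor
    head, and per-pixel 'keypointness' and descriptor-reliability heads."""
    def __init__(self, dim=128):
        super().__init__()
        self.trunk = nn.Sequential(
            nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, dim, 3, padding=1), nn.ReLU(),
        )
        self.descriptor = nn.Conv2d(dim, dim, 1)       # dense local descriptors
        self.repeatability = nn.Conv2d(dim, 1, 1)      # where keypoints should fire
        self.reliability = nn.Conv2d(dim, 1, 1)        # how trustworthy the descriptor is

    def forward(self, x):
        f = self.trunk(x)
        desc = nn.functional.normalize(self.descriptor(f), dim=1)
        return desc, self.repeatability(f).sigmoid(), self.reliability(f).sigmoid()

img = torch.randn(1, 3, 64, 64)
descriptors, repeatability, reliability = DetectDescribeNet()(img)
```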

The Group Loss for Deep Metric Learning

dvl-tum/group_loss ECCV 2020

Deep metric learning has yielded impressive results in tasks such as clustering and image retrieval by leveraging neural networks to obtain highly discriminative feature embeddings, which can be used to group samples into different classes.

Associative Alignment for Few-shot Image Classification

ML-Bee/associative-alignment-fs ECCV 2020

Few-shot image classification aims at training a model from only a few examples for each of the "novel" classes.