A Probabilistic Approach for Learning Embeddings Without Supervision

17 Dec 2019 · Ujjal Kr Dutta, Mehrtash Harandi, Chandra Sekhar Chellu

For challenging machine learning problems such as zero-shot learning and fine-grained categorization, embedding learning is the machinery of choice because of its ability to learn generic notions of similarity, as opposed to the class-specific concepts learned by standard classification models. Embedding learning aims at learning discriminative representations of data such that similar examples are pulled closer, while dissimilar ones are pushed away. Despite their exemplary performance, supervised embedding learning approaches require a huge number of annotations for training. This restricts their applicability to large datasets in new applications where obtaining labels requires extensive manual effort and domain knowledge. In this paper, we propose to learn an embedding in a completely unsupervised manner, without using any class labels. Using a graph-based clustering approach to obtain pseudo-labels, we form triplet-based constraints following a metric learning paradigm. Our novel embedding learning approach uses a probabilistic notion that intuitively minimizes the chance of each triplet violating a geometric constraint. Due to the nature of the search space, we learn the parameters of our approach using Riemannian geometry. Our proposed approach performs competitively with state-of-the-art approaches.
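As a rough illustration of the kind of objective the abstract describes, the sketch below models the probability that a triplet violates the geometric constraint d(anchor, positive) < d(anchor, negative) with a sigmoid, and minimizes the mean violation probability. This is a minimal, assumed PyTorch surrogate: the function name, the `scale` parameter, and the sigmoid form are illustrative choices, not the paper's exact formulation, and the pseudo-labeling step and Riemannian optimization of the parameters are not reproduced here.

```python
import torch
import torch.nn.functional as F

def probabilistic_triplet_loss(anchor, positive, negative, scale=1.0):
    """Mean probability that a triplet violates d(a, p) < d(a, n).

    Illustrative surrogate only (assumed form, not the paper's loss):
    the violation probability is modeled as sigmoid(scale * (d_ap - d_an)),
    so minimizing it pulls positives closer than negatives.
    """
    d_ap = F.pairwise_distance(anchor, positive)  # anchor-positive distances
    d_an = F.pairwise_distance(anchor, negative)  # anchor-negative distances
    violation_prob = torch.sigmoid(scale * (d_ap - d_an))
    return violation_prob.mean()

# Toy usage: 8 triplets of 64-d embeddings; in practice the triplets would be
# mined from pseudo-labels produced by a graph-based clustering of the data.
a, p, n = (torch.randn(8, 64, requires_grad=True) for _ in range(3))
loss = probabilistic_triplet_loss(a, p, n)
loss.backward()
```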
