Online Clustering
25 papers with code • 0 benchmarks • 0 datasets
Models that learn to label each image (i.e., cluster the dataset into its ground-truth classes) without seeing the ground-truth labels. In the online scenario, data arrives as a stream: the whole dataset cannot be accessed at once, and the model must assign each new data point to a cluster without revisiting previously seen data.
Image Credit: Online Clustering by Penalized Weighted GMM
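The constraint above (assign on arrival, never revisit old points) can be illustrated with online k-means, a classic baseline for this setting. This is a minimal sketch, not the method of any paper listed here; the stream, the cluster count k, and MacQueen's 1/n step size are all illustrative assumptions.

```python
def online_kmeans(stream, k):
    """Assign each point in `stream` to a cluster as it arrives.

    Only running centroids and counts are kept; earlier points are
    never accessed again, matching the online clustering constraint.
    """
    centroids = []    # running centroid per cluster
    counts = []       # number of points absorbed per cluster
    assignments = []  # cluster label emitted for each arriving point
    for x in stream:
        if len(centroids) < k:
            # Seed the first k clusters with the first k points.
            centroids.append(list(x))
            counts.append(1)
            assignments.append(len(centroids) - 1)
            continue
        # Assign to the nearest centroid (squared Euclidean distance).
        j = min(range(k),
                key=lambda i: sum((c - a) ** 2
                                  for c, a in zip(centroids[i], x)))
        counts[j] += 1
        eta = 1.0 / counts[j]  # MacQueen's decaying step size
        # Move the winning centroid a fraction eta toward the new point.
        centroids[j] = [c + eta * (a - c) for c, a in zip(centroids[j], x)]
        assignments.append(j)
    return centroids, assignments
```

For example, a stream of 2D points drawn near (0, 0) and (10, 10) yields alternating labels as the two centroids lock onto the two modes.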
Most implemented papers
Novel Class Discovery for 3D Point Cloud Semantic Segmentation
We are the first to address the problem of novel class discovery (NCD) for point cloud semantic segmentation.
Hard Regularization to Prevent Deep Online Clustering Collapse without Data Augmentation
We propose a method that requires no data augmentation and, unlike existing methods, regularizes the hard cluster assignments.
DinoSR: Self-Distillation and Online Clustering for Self-supervised Speech Representation Learning
In this paper, we introduce DinoSR, a self-supervised speech representation learning method that combines masked language modeling, self-distillation, and online clustering.
Grid Cell-Inspired Fragmentation and Recall for Efficient Map Building
Agents build and use a local map to predict their observations; high surprisal leads to a "fragmentation event" that truncates the local map.
RGMComm: Return Gap Minimization via Discrete Communications in Multi-Agent Reinforcement Learning
This result enables us to recast multi-agent communication into a novel online clustering problem over the local observations at each agent, with messages as cluster labels and the upper bound on the return gap as clustering loss.