1 code implementation • ICCV 2023 • Tong Liang, Jim Davis
There is a recently discovered and intriguing phenomenon called Neural Collapse: at the terminal phase of training a deep neural network for classification, the within-class penultimate feature means and the associated classifier vectors of all flat classes collapse to the vertices of a simplex Equiangular Tight Frame (ETF).
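For reference, the simplex ETF mentioned above has a standard closed-form construction; the minimal NumPy sketch below (illustrative only, not code from the paper) builds the canonical frame for K classes and checks its defining properties: unit-norm vertices with pairwise inner products of -1/(K-1).

```python
import numpy as np

def simplex_etf(K: int) -> np.ndarray:
    """Canonical K-class simplex ETF: M = sqrt(K/(K-1)) * (I - (1/K) * 1 1^T).

    Each column is one vertex; vertices have unit norm and every pair of
    distinct vertices has inner product -1/(K-1) (maximal equiangular spread).
    """
    return np.sqrt(K / (K - 1)) * (np.eye(K) - np.ones((K, K)) / K)

K = 5
M = simplex_etf(K)
G = M.T @ M                                   # Gram matrix of the vertices
print(np.allclose(np.diag(G), 1.0))           # True: unit-norm vertices
off = G[~np.eye(K, dtype=bool)]
print(np.allclose(off, -1.0 / (K - 1)))       # True: equiangular at -1/(K-1)
```

In practice the penultimate features live in a space of dimension d ≥ K-1, so the canonical vertices above are typically rotated into that space by a partial orthogonal matrix; the sketch shows only the frame itself.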
no code implementations • 5 Oct 2021 • Tong Liang, Jim Davis, Roman Ilin
In this work, we propose a method to efficiently compute the label posteriors of a base flat classifier, with only a few validation examples, within a bottom-up hierarchical inference framework.
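To illustrate the bottom-up aggregation that such a framework relies on (a minimal sketch with hypothetical class names, not the authors' posterior-estimation method), an internal node's posterior can be obtained by summing the flat classifier's posteriors over its descendant leaves:

```python
# Hypothetical two-level hierarchy (illustrative class names only):
# each internal node groups some of the flat classifier's leaf classes.
hierarchy = {
    "animal": ["cat", "dog"],
    "vehicle": ["car", "truck"],
}
leaves = ["cat", "dog", "car", "truck"]

def bottom_up_posteriors(leaf_probs):
    """P(internal node) = sum of the flat classifier's posteriors over its leaves."""
    p = dict(zip(leaves, leaf_probs))
    for parent, children in hierarchy.items():
        p[parent] = sum(p[c] for c in children)
    return p

# Example softmax output of the base flat classifier (sums to 1 over the leaves).
print(bottom_up_posteriors([0.50, 0.20, 0.25, 0.05]))
# {'cat': 0.5, 'dog': 0.2, 'car': 0.25, 'truck': 0.05, 'animal': 0.7, 'vehicle': 0.3}
```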
no code implementations • 1 Jan 2021 • Yizhou Chen, Dong Li, Na Li, Tong Liang, Shizhuo Zhang, Bryan Kian Hsiang Low
This paper presents a novel implicit process-based meta-learning (IPML) algorithm that, in contrast to existing works, explicitly represents each task as a continuous latent vector and models its probabilistic belief within the highly expressive IP framework.
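The sketch below is a generic latent-task-variable example in the spirit of neural processes, meant only to illustrate the idea of representing each task as a continuous latent vector with a probabilistic belief over it; the set encoder, the Gaussian form of the belief, and the toy tasks are all assumptions, not the IPML algorithm itself.

```python
import numpy as np

rng = np.random.default_rng(0)

def task_belief(support_x, support_y):
    """Illustrative amortized belief over a task's latent vector z:
    summarize the support set into a Gaussian (mean, std). A learned
    set encoder would replace these simple statistics."""
    feats = np.concatenate([support_x, support_y[:, None]], axis=1)
    mu = feats.mean(axis=0)                               # permutation-invariant summary
    sigma = feats.std(axis=0) / np.sqrt(len(feats)) + 1e-3
    return mu, sigma

# Two toy regression tasks with different underlying slopes.
for slope in (1.0, -2.0):
    xs = rng.uniform(-1.0, 1.0, size=(10, 1))
    ys = slope * xs[:, 0] + 0.05 * rng.normal(size=10)
    mu, sigma = task_belief(xs, ys)
    z = rng.normal(mu, sigma)                             # sample this task's latent vector
    print("task latent sample:", np.round(z, 3))
```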