Set Transformer: A Framework for Attention-based Permutation-Invariant Neural Networks

ICLR 2019 · Juho Lee, Yoonho Lee, Jungtaek Kim, Adam R. Kosiorek, Seungjin Choi, Yee Whye Teh

Many machine learning tasks such as multiple instance learning, 3D shape recognition, and few-shot image classification are defined on sets of instances. Since solutions to such problems do not depend on the order of elements of the set, models used to address them should be permutation invariant...
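To illustrate the permutation-invariance property the abstract refers to, here is a minimal NumPy sketch (not the paper's actual architecture) of attention-based pooling over a set: a learned seed query attends over the set elements, and because the softmax-weighted sum is order-independent, permuting the input leaves the output unchanged. All names (`attention_pool`, the weight matrices) are illustrative assumptions.

```python
import numpy as np

def attention_pool(X, q, Wk, Wv):
    # X: (n, d) set of n elements; q: (d,) learned seed query.
    # Each element's score depends only on that element, so permuting
    # the rows of X permutes the weights identically and the weighted
    # sum (and hence the output) is unchanged.
    K = X @ Wk                          # keys,   shape (n, d)
    V = X @ Wv                          # values, shape (n, d)
    scores = K @ q / np.sqrt(len(q))    # scaled dot-product scores, (n,)
    w = np.exp(scores - scores.max())   # numerically stable softmax
    w /= w.sum()
    return w @ V                        # (d,) permutation-invariant summary

rng = np.random.default_rng(0)
n, d = 5, 4
X = rng.normal(size=(n, d))
q = rng.normal(size=d)
Wk, Wv = rng.normal(size=(d, d)), rng.normal(size=(d, d))

out = attention_pool(X, q, Wk, Wv)
out_perm = attention_pool(X[rng.permutation(n)], q, Wk, Wv)
print(np.allclose(out, out_perm))  # True
```

The Set Transformer builds on this idea with multi-head attention blocks; the single-query pooling above is only the simplest case of the principle.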

