ICLR 2018 • Adi Hayat, Mark Kliger, Shachar Fleishman, Daniel Cohen-Or
We present a simple yet powerful hard distillation method in which the base network is augmented with additional weights to classify the novel classes, while the weights of the base network remain unchanged.
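The core idea can be sketched as follows. This is a minimal, hypothetical NumPy illustration (not the authors' implementation): a frozen base classifier head is extended with extra rows of weights for the novel classes, and only those new rows receive gradient updates. All dimensions and the simple linear-head training loop are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (illustrative only).
feat_dim, n_base, n_novel = 16, 5, 3

# Stand-in for the trained base network's classifier weights: kept frozen.
W_base = rng.normal(size=(n_base, feat_dim))
W_base_frozen = W_base.copy()

# Additional weights appended to classify the novel classes: trainable.
W_novel = np.zeros((n_novel, feat_dim))

def logits(x):
    # Joint classifier over base + novel classes.
    return np.vstack([W_base, W_novel]) @ x

def train_step(x, label, lr=0.1):
    """One softmax cross-entropy gradient step on the novel weights only."""
    global W_novel
    z = logits(x)
    p = np.exp(z - z.max())
    p /= p.sum()
    grad = np.outer(p, x)          # dL/dW for every classifier row
    grad[label] -= x               # subtract the one-hot target
    W_novel -= lr * grad[n_base:]  # update only the novel rows

# Train briefly on a synthetic example of the first novel class.
x = rng.normal(size=feat_dim)
for _ in range(20):
    train_step(x, label=n_base)

# The base weights are untouched by training.
assert np.allclose(W_base, W_base_frozen)
```

Because the base weights never change, the base classes' behavior is preserved exactly while the appended weights learn the novel classes.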