2 code implementations • ICCV 2017 • Philip Haeusser, Thomas Frerix, Alexander Mordvintsev, Daniel Cremers
Our training scheme follows the paradigm that, to effectively derive class labels for the target domain, a network should produce statistically domain-invariant embeddings while minimizing the classification error on the labeled source domain.
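The association-based consistency idea behind this paradigm can be sketched with two auxiliary losses computed from source/target embedding similarities: a "walker" loss that encourages round trips (source → target → source) to end at a source sample of the same class, and a "visit" loss that encourages all target samples to be visited. The NumPy sketch below is an illustrative reconstruction under those assumptions, not the authors' exact implementation; the function name `association_losses` and the small epsilon constant are hypothetical choices.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def association_losses(src_emb, tgt_emb, src_labels, eps=1e-8):
    """Sketch of association-based losses between source and target embeddings.

    src_emb:    (Ns, D) embeddings of labeled source samples
    tgt_emb:    (Nt, D) embeddings of unlabeled target samples
    src_labels: (Ns,)   integer class labels for the source samples
    """
    # Pairwise similarities between source and target embeddings.
    M = src_emb @ tgt_emb.T                    # (Ns, Nt)
    p_st = softmax(M, axis=1)                  # transition probs source -> target
    p_ts = softmax(M.T, axis=1)                # transition probs target -> source
    p_round = p_st @ p_ts                      # round-trip probs source -> source

    # Walker loss: a round trip should land on a source sample of the
    # same class, uniformly over all same-class samples.
    same_class = (src_labels[:, None] == src_labels[None, :]).astype(float)
    target_dist = same_class / same_class.sum(axis=1, keepdims=True)
    walker = -np.mean(np.sum(target_dist * np.log(p_round + eps), axis=1))

    # Visit loss: averaged over source samples, every target sample should
    # be visited with equal probability (cross-entropy against uniform).
    visit_prob = p_st.mean(axis=0)             # (Nt,)
    visit = -np.mean(np.log(visit_prob + eps))

    return walker, visit
```

In a full training loop these two terms would be added, with weighting coefficients, to the usual cross-entropy classification loss on the labeled source batch, which is the "minimizing the classification error on the labeled source domain" part of the paradigm above.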
Ranked #6 on Domain Adaptation on SYNSIG-to-GTSRB
no code implementations • CVPR 2017 • Philip Haeusser, Alexander Mordvintsev, Daniel Cremers
We demonstrate the capabilities of learning by association on several data sets and show that it can substantially improve performance on classification tasks by making use of additionally available unlabeled data.
no code implementations • 23 May 2017 • Karol Kurach, Sylvain Gelly, Michal Jastrzebski, Philip Haeusser, Olivier Teytaud, Damien Vincent, Olivier Bousquet
Generic text embeddings are successfully used in a variety of tasks.