1 code implementation • ICCV 2019 • Mehran Khodabandeh, Arash Vahdat, Mani Ranjbar, William G. Macready
To adapt to the domain shift, the model is trained on the target domain using a set of noisy object bounding boxes that are obtained by a detection model trained only on the source domain.
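The snippet above describes training on noisy boxes produced by a source-only detector. A minimal sketch of the general idea, confidence-filtered pseudo-labeling, is below; the function name, data layout, and threshold are illustrative assumptions, not the paper's actual pipeline.

```python
# Hedged sketch: keep only high-confidence detections from a
# source-trained detector as noisy training labels for the target
# domain. The 0.8 threshold is an arbitrary illustrative choice.

def pseudo_label(detections, min_score=0.8):
    """Filter raw detections down to a noisy label set."""
    return [d for d in detections if d["score"] >= min_score]

# Example: raw detections on a target-domain image (made-up data)
raw = [
    {"box": (10, 10, 50, 50), "label": "car", "score": 0.93},
    {"box": (60, 20, 90, 70), "label": "person", "score": 0.41},
]
labels = pseudo_label(raw)
print(len(labels))  # 1
```

In practice the retained boxes remain noisy, which is why the paper treats them as such rather than as ground truth.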
3 code implementations • ICML 2020 • Arash Vahdat, Evgeny Andriyash, William G. Macready
We extend the class of posterior models that may be learned by using undirected graphical models.
no code implementations • CVPR 2020 • Mostafa S. Ibrahim, Arash Vahdat, Mani Ranjbar, William G. Macready
Building a large image dataset with high-quality object masks for semantic segmentation is costly and time-consuming.
no code implementations • NeurIPS 2018 • Arash Vahdat, Evgeny Andriyash, William G. Macready
Experiments on the MNIST and OMNIGLOT datasets show that these relaxations outperform previous discrete VAEs with Boltzmann priors.
no code implementations • ICML 2018 • Arash Vahdat, William G. Macready, Zhengbing Bian, Amir Khoshaman, Evgeny Andriyash
Training of discrete latent variable models remains challenging because passing gradient information through discrete units is difficult.
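One common workaround for the gradient-through-discrete-units problem mentioned above is the straight-through estimator: sample a hard value on the forward pass, but treat the sampling step as the identity on the backward pass. The sketch below is a framework-free illustration of that trick, not the DVAE++ method itself.

```python
import random

# Hedged illustration of the straight-through (ST) estimator, one
# standard workaround for non-differentiable discrete units. This is
# not the paper's overlapping-transformation method.

def binarize(p):
    """Forward pass: sample a hard 0/1 value from probability p."""
    return 1.0 if random.random() < p else 0.0

def st_gradient(grad_output):
    """Backward pass: pretend the sampler was the identity, so the
    gradient w.r.t. p is the incoming gradient, unchanged."""
    return grad_output

p = 0.7
z = binarize(p)       # non-differentiable forward sample (0.0 or 1.0)
g = st_gradient(2.0)  # gradient passes through untouched
print(g)  # 2.0
```

The ST estimator is biased; papers such as this one instead design continuous relaxations whose gradients are exact for the relaxed objective.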
Ranked #53 on Image Generation on CIFAR-10 (bits/dimension metric)
no code implementations • 14 Nov 2016 • Dmytro Korenkevych, Yanbo Xue, Zhengbing Bian, Fabian Chudak, William G. Macready, Jason Rolfe, Evgeny Andriyash
We argue that this relates to the fact that we are training a quantum rather than classical Boltzmann distribution in this case.
1 code implementation • 10 Jun 2014 • Jun Cai, William G. Macready, Aidan Roy
We present a heuristic algorithm for finding a graph $H$ as a minor of a graph $G$ that is practical for sparse $G$ and $H$ with hundreds of vertices.
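A minor embedding maps each vertex of $H$ to a connected "chain" of vertices in $G$ such that chains are disjoint and every edge of $H$ is realized by some edge of $G$ between the corresponding chains. The sketch below only *verifies* a candidate embedding against that definition (the hard part the paper addresses is *finding* one); the graphs and the function names are made up for illustration.

```python
# Hedged sketch: verifying (not finding) a candidate minor embedding.
# `emb` maps each H-vertex to a chain (set) of G-vertices.

def is_connected(chain, g_edges):
    """Check that the chain induces a connected subgraph of G (DFS)."""
    chain = set(chain)
    start = next(iter(chain))
    seen, stack = {start}, [start]
    while stack:
        u = stack.pop()
        for a, b in g_edges:
            for v in ((b,) if a == u else (a,) if b == u else ()):
                if v in chain and v not in seen:
                    seen.add(v)
                    stack.append(v)
    return seen == chain

def is_minor_embedding(h_edges, emb, g_edges):
    chains = list(emb.values())
    # chains must be pairwise disjoint ...
    if len(set().union(*chains)) != sum(len(c) for c in chains):
        return False
    # ... and each must be connected in G
    if not all(is_connected(c, g_edges) for c in chains):
        return False
    # every H-edge needs at least one G-edge between the two chains
    def linked(c1, c2):
        return any((a in c1 and b in c2) or (a in c2 and b in c1)
                   for a, b in g_edges)
    return all(linked(emb[u], emb[v]) for u, v in h_edges)

# Example: embed a triangle H into a 4-cycle G by giving vertex "c"
# a two-vertex chain {2, 3}.
g_edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
h_edges = [("a", "b"), ("b", "c"), ("a", "c")]
emb = {"a": {0}, "b": {1}, "c": {2, 3}}
print(is_minor_embedding(h_edges, emb, g_edges))  # True
```

The triangle is not a subgraph of the 4-cycle, so the chain {2, 3} is essential, which is exactly the flexibility minors provide over subgraphs.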
Quantum Physics • Data Structures and Algorithms • Combinatorics • MSC classes: 05C83, 81P68
1 code implementation • 12 Apr 2012 • Vadim N. Smelyanskiy, Eleanor G. Rieffel, Sergey I. Knysh, Colin P. Williams, Mark W. Johnson, Murray C. Thom, William G. Macready, Kristen L. Pudenz
We review quantum algorithms for structured learning for multi-label classification and introduce a hybrid classical/quantum approach for learning the weights.
Quantum Physics
2 code implementations • 4 Nov 2008 • Hartmut Neven, Vasil S. Denchev, Geordie Rose, William G. Macready
To bring the problem into a format that allows the application of adiabatic quantum computing (AQC), we first show that the bit precision with which the weights need to be represented grows only logarithmically with the ratio of the number of training examples to the number of weak classifiers.
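The logarithmic-precision claim above can be made concrete with a toy calculation. The exact formula below (base-2 log of the ratio, plus one bit, rounded up) is an assumed stand-in for the paper's bound, chosen only to show the scaling behavior: doubling the number of training examples costs a single extra bit.

```python
import math

# Hedged numeric illustration: assume the required weight precision
# scales like log2(S / K) + 1 bits for S training examples and K weak
# classifiers. The "+1" constant is illustrative, not from the paper.

def bits_needed(num_examples, num_weak):
    return math.ceil(math.log2(num_examples / num_weak)) + 1

print(bits_needed(1024, 16))  # 7
print(bits_needed(2048, 16))  # 8
```

Logarithmic growth is what makes the formulation plausible for hardware with a small, fixed number of bits per weight.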
Quantum Physics