no code implementations • 21 Feb 2024 • Roy Friedman, Yair Weiss
This has led many to believe that "GANs capture the training data manifold".
1 code implementation • 6 Jun 2022 • Rhea Chowers, Yair Weiss
It has previously been reported that the representation that is learned in the first layer of deep Convolutional Neural Networks (CNNs) is highly consistent across initializations and architectures.
2 code implementations • 22 Mar 2022 • Ariel Elnekave, Yair Weiss
On a number of image generation tasks we show that our results are often superior to those of single-image GANs, require no training, and can generate high-quality images in a few seconds.
no code implementations • NeurIPS 2021 • Daniella Horan, Eitan Richardson, Yair Weiss
In this paper, we show that the assumption of local isometry together with non-Gaussianity of the factors, is sufficient to provably recover disentangled representations from data.
no code implementations • CVPR 2021 • Dan Amir, Yair Weiss
Perceptual metrics based on features of deep Convolutional Neural Networks (CNNs) have shown remarkable success when used as loss functions in a range of computer vision problems and significantly outperform classical losses such as L1 or L2 in pixel space.
1 code implementation • 20 Apr 2021 • Roy Friedman, Yair Weiss
Almost all existing methods for image restoration optimize the mean squared error (MSE), even though the MSE-optimal estimate may be a highly atypical image, because any given noisy image has many plausible restorations.
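A minimal toy illustration (not the paper's method) of why the MSE-optimal estimate can be atypical: if the posterior puts equal weight on two plausible restorations, the posterior mean minimizes MSE but resembles neither.

```python
import numpy as np

# Toy posterior: the clean signal is equally likely to be a step-up or a
# step-down edge (two plausible restorations of the same noisy input).
up = np.concatenate([np.zeros(8), np.ones(8)])
down = np.concatenate([np.ones(8), np.zeros(8)])

# The MSE-optimal (posterior-mean) estimate averages the two modes...
mmse = 0.5 * (up + down)

# ...and is flat 0.5 everywhere: minimal expected MSE, yet it matches
# neither of the two plausible restorations.
dist_to_up = np.abs(mmse - up).max()
```

The posterior mean lands exactly between the modes, which is the sense in which it is "atypical".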
2 code implementations • 24 Jul 2020 • Eitan Richardson, Yair Weiss
Unsupervised image-to-image translation is an inherently ill-posed problem.
no code implementations • 20 Feb 2020 • Eitan Richardson, Yair Weiss
Since the discovery of adversarial examples (the ability to fool modern CNN classifiers with tiny perturbations of the input), there has been much discussion of whether they are a "bug" specific to current neural architectures and training methods or an inevitable "feature" of high-dimensional geometry.
1 code implementation • 22 Aug 2018 • Ofer M. Springer, Eran O. Ofek, Yair Weiss, Julian Merten
In this work we report on our initial attempt to reduce statistical errors in weak lensing shear estimation using a machine learning approach -- training a multi-layered convolutional neural network to directly estimate the shear given an observed background galaxy image.
Cosmology and Nongalactic Astrophysics
3 code implementations • NeurIPS 2018 • Eitan Richardson, Yair Weiss
While GMMs have previously been shown to be successful in modeling small patches of images, we show how to train them on full sized images despite the high dimensionality.
4 code implementations • ICLR 2019 • Aharon Azulay, Yair Weiss
Convolutional Neural Networks (CNNs) are commonly assumed to be invariant to small image transformations: either because of the convolutional architecture or because they were trained using data augmentation.
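A small NumPy sketch (not the paper's experiments) of why subsampling breaks the assumed invariance: a stride-2 layer commutes with shifts by a multiple of the stride, but not with a one-sample shift.

```python
import numpy as np

def downsample(x, stride=2):
    # Naive subsampling, as performed by strided convolution or pooling.
    return x[::stride]

x = np.sin(np.linspace(0, 8 * np.pi, 64))
y0 = downsample(x)

# Shifting by a full stride before subsampling just shifts the output...
y_stride = downsample(np.roll(x, 2))

# ...but shifting by a single sample produces an output that is not a
# shifted copy of y0 at all.
y1 = downsample(np.roll(x, 1))
err = np.abs(y1 - np.roll(y0, 1)).max()
```

The single-sample shift samples an entirely different set of input positions, which is the basic mechanism behind the shift-variance of strided CNN layers.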
1 code implementation • 20 Feb 2017 • Ofer Springer, Yair Weiss
Photographs taken through a glass surface often contain an approximately linear superposition of reflected and transmitted layers.
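A tiny sketch of the linear superposition model, with an assumed constant glass reflectivity `alpha`, showing that a single observed image is ambiguous: intensity can be traded between the two layers without changing the observation.

```python
import numpy as np

rng = np.random.default_rng(0)
transmitted = rng.random((4, 4))   # scene behind the glass
reflected = rng.random((4, 4))     # scene mirrored in the glass

alpha = 0.3                        # assumed constant reflectivity
observed = (1 - alpha) * transmitted + alpha * reflected

# A different layer pair that yields the *same* observation:
c = 0.1
transmitted2 = transmitted + c
reflected2 = reflected - (1 - alpha) * c / alpha
observed2 = (1 - alpha) * transmitted2 + alpha * reflected2
```

Since distinct layer pairs explain the image equally well, separating the layers requires additional priors or observations.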
no code implementations • 18 Aug 2016 • Elad Mezuman, Yair Weiss
The likelihood function of a finite mixture model is a non-convex function with multiple local maxima and commonly used iterative algorithms such as EM will converge to different solutions depending on initial conditions.
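A self-contained sketch (with a simplified two-component, unit-variance, equal-weight 1-D GMM, so only the means are estimated) showing EM converging to different stationary points from different initializations.

```python
import numpy as np

def em_gmm_1d(x, mu_init, n_iter=100):
    """EM for a two-component 1-D GMM with unit variances and equal
    mixing weights; only the two component means are estimated."""
    mu = np.array(mu_init, dtype=float)
    for _ in range(n_iter):
        # E-step: responsibilities under unit-variance Gaussians
        d = -0.5 * (x[:, None] - mu[None, :]) ** 2
        r = np.exp(d - d.max(axis=1, keepdims=True))
        r /= r.sum(axis=1, keepdims=True)
        # M-step: responsibility-weighted means
        mu = (r * x[:, None]).sum(axis=0) / r.sum(axis=0)
    # average log-likelihood under the fitted model
    comp = np.exp(-0.5 * (x[:, None] - mu[None, :]) ** 2) / np.sqrt(2 * np.pi)
    ll = np.log(0.5 * comp.sum(axis=1)).mean()
    return mu, ll

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-4, 1, 200), rng.normal(4, 1, 200)])

mu_a, ll_a = em_gmm_1d(x, [-1.0, 1.0])  # init straddling the two clusters
mu_b, ll_b = em_gmm_1d(x, [0.0, 0.0])   # symmetric init: a stationary point
```

The symmetric initialization keeps the responsibilities at 1/2 forever, so both means collapse to the sample mean and EM is stuck at a far worse likelihood than the separating solution.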
no code implementations • 11 Apr 2016 • Dan Rosenbaum, Yair Weiss
Consistent with current practice, we find that robust versions of gradient constancy are better models than simple brightness constancy, but a learned GMM that models the density of patches of warp error gives a much better fit than any existing constancy assumption.
no code implementations • 11 Apr 2016 • Dan Rosenbaum, Yair Weiss
We then use the generative models together with a degradation model and obtain a Bayes Least Squares (BLS) estimator of the D channel given the RGB channels.
no code implementations • NeurIPS 2015 • Dan Rosenbaum, Yair Weiss
In this paper we show how to combine the strengths of both approaches by training a discriminative, feed-forward architecture to predict the state of latent variables in a generative model of natural images.
no code implementations • NeurIPS 2013 • Dan Rosenbaum, Daniel Zoran, Yair Weiss
Motivated by recent progress in natural image statistics, we use newly available datasets with ground truth optical flow to learn the local statistics of optical flow and rigorously compare the learned model to prior models assumed by computer vision optical flow algorithms.
no code implementations • 26 Sep 2013 • Elad Mezuman, Daniel Tarlow, Amir Globerson, Yair Weiss
In this work, we study the LP relaxations that result from enforcing additional consistency constraints between the HOP and the rest of the model.
1 code implementation • 23 Jan 2013 • Kevin Murphy, Yair Weiss, Michael. I. Jordan
Recently, researchers have demonstrated that "loopy belief propagation" -- the use of Pearl's polytree algorithm in a Bayesian network with loops -- can perform well in the context of error-correcting codes. The most dramatic instance of this is the near-Shannon-limit performance of "Turbo Codes", codes whose decoding algorithm is equivalent to loopy belief propagation in a chain-structured Bayesian network.
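For reference, on a loop-free graph belief propagation is exact; the following sketch runs sum-product message passing on a three-node binary chain and checks the resulting marginal against brute-force enumeration (the "loopy" case of the paper simply reuses these same updates on a graph with cycles).

```python
import numpy as np

# Chain x1 - x2 - x3 of binary variables with random potentials.
rng = np.random.default_rng(0)
phi1, phi2, phi3 = rng.random(2), rng.random(2), rng.random(2)   # unary
psi12, psi23 = rng.random((2, 2)), rng.random((2, 2))            # pairwise

# Forward messages.
m12 = (phi1[:, None] * psi12).sum(axis=0)           # x1 -> x2
# Backward messages.
m32 = (psi23 * phi3[None, :]).sum(axis=1)           # x3 -> x2

# BP belief (marginal) at x2.
b2 = phi2 * m12 * m32
b2 /= b2.sum()

# Brute-force marginal of x2 from the full joint.
joint = (phi1[:, None, None] * psi12[:, :, None] *
         phi2[None, :, None] * psi23[None, :, :] * phi3[None, None, :])
p2 = joint.sum(axis=(0, 2))
p2 /= p2.sum()
```

On a chain the two computations agree exactly; the empirical question studied in the paper is how well the same updates approximate marginals when the graph has loops.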
no code implementations • NeurIPS 2012 • Elad Mezuman, Yair Weiss
Our results clearly show that the most likely view in the search engine corresponds to the same view preferred by human subjects in experiments.
no code implementations • NeurIPS 2012 • Daniel Zoran, Yair Weiss
Simple Gaussian Mixture Models (GMMs) learned from pixels of natural image patches have been recently shown to be surprisingly strong performers in modeling the statistics of natural images.
no code implementations • NeurIPS 2009 • Daniel Zoran, Yair Weiss
We propose a new model for natural image statistics.
no code implementations • NeurIPS 2009 • Rob Fergus, Yair Weiss, Antonio Torralba
With the advent of the Internet it is now possible to collect hundreds of millions of images.
no code implementations • NeurIPS 2008 • Yair Weiss, Antonio Torralba, Rob Fergus
Semantic hashing seeks compact binary codes of datapoints so that the Hamming distance between codewords correlates with semantic similarity.
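A hedged sketch of the general idea using random-hyperplane hashing (a classic LSH baseline, not the spectral method of the paper): similar inputs receive binary codes with small Hamming distance.

```python
import numpy as np

rng = np.random.default_rng(0)
d, bits = 32, 64
R = rng.normal(size=(d, bits))           # random hyperplanes

def binary_code(x):
    # Sign of random projections mapped to a {0,1} code.
    return (x @ R > 0).astype(np.uint8)

def hamming(a, b):
    return int((a != b).sum())

base = rng.normal(size=d)
near = base + 0.1 * rng.normal(size=d)   # small perturbation of base
far = rng.normal(size=d)                 # unrelated point

d_near = hamming(binary_code(base), binary_code(near))
d_far = hamming(binary_code(base), binary_code(far))
```

The expected Hamming distance grows with the angle between the inputs, so nearby points collide on most bits while unrelated points disagree on roughly half of them.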
1 code implementation • International Conference on Computer Vision 2001 • Yair Weiss
We focus on a slightly easier problem: given a sequence of T images where the reflectance is constant and the illumination changes, can we recover T illumination images and a single reflectance image?
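A 1-D toy sketch of this kind of approach (assumed illustrative, not the paper's exact pipeline): in the log domain each image is log-reflectance plus log-illumination, and because illumination derivatives are sparse, a temporal median of image derivatives recovers the reflectance derivative.

```python
import numpy as np

rng = np.random.default_rng(0)
T, n = 7, 32
log_R = rng.normal(size=n)                    # fixed log-reflectance (1-D toy)

# Piecewise-constant log-illuminations: derivatives are mostly zero.
log_L = np.cumsum(rng.random((T, n)) < 0.1, axis=1).astype(float)

log_I = log_R[None, :] + log_L                # T observed log-images

# Derivative of each log-image = reflectance derivative + sparse
# illumination edges, so the median over time suppresses the edges.
dI = np.diff(log_I, axis=1)
dR_hat = np.median(dI, axis=0)
dR_true = np.diff(log_R)
```

At each position the median over the T frames equals the true reflectance derivative unless an illumination edge happens to occur there in a majority of frames, which is rare under the sparsity assumption.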