no code implementations • 14 Dec 2023 • Lior Yariv, Omri Puny, Natalia Neverova, Oran Gafni, Yaron Lipman
Current diffusion- or flow-based generative models for 3D shapes divide into two approaches: distilling pre-trained 2D image diffusion models, and training directly on 3D shapes.
no code implementations • 25 Mar 2023 • Albert Pumarola, Artsiom Sanakoyeu, Lior Yariv, Ali Thabet, Yaron Lipman
Surface reconstruction has seen substantial recent progress through the use of Implicit Neural Representations (INRs).
no code implementations • 28 Feb 2023 • Lior Yariv, Peter Hedman, Christian Reiser, Dor Verbin, Pratul P. Srinivasan, Richard Szeliski, Jonathan T. Barron, Ben Mildenhall
We present a method for reconstructing high-quality meshes of large unbounded real-world scenes suitable for photorealistic novel view synthesis.
2 code implementations • 16 Feb 2023 • Omer Bar-Tal, Lior Yariv, Yaron Lipman, Tali Dekel
In this work, we present MultiDiffusion, a unified framework that enables versatile and controllable image generation, using a pre-trained text-to-image diffusion model, without any further training or finetuning.
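The core idea MultiDiffusion describes — fusing several diffusion paths over overlapping image regions — can be sketched as follows. This is a minimal illustration, not the paper's implementation: `toy_step` is a stand-in for a real pre-trained denoiser, and the window/stride values are arbitrary assumptions.

```python
import numpy as np

def fuse_windows(latent, window, stride, denoise_step):
    """One MultiDiffusion-style fusion step (sketch): run the denoiser
    on overlapping crops, then average the per-crop results wherever
    the crops overlap, yielding one consistent global update."""
    h, w = latent.shape
    acc = np.zeros_like(latent)
    cnt = np.zeros_like(latent)
    for i in range(0, h - window + 1, stride):
        for j in range(0, w - window + 1, stride):
            crop = latent[i:i + window, j:j + window]
            acc[i:i + window, j:j + window] += denoise_step(crop)
            cnt[i:i + window, j:j + window] += 1
    return acc / np.maximum(cnt, 1)  # average over overlapping windows

# Toy "denoiser": shrinks values toward zero (stands in for a diffusion step).
toy_step = lambda x: 0.5 * x
latent = np.ones((8, 8))
fused = fuse_windows(latent, window=4, stride=2, denoise_step=toy_step)
```

Because the fusion is a simple per-pixel average, it needs no retraining or finetuning of the underlying model, which matches the training-free claim in the abstract.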
3 code implementations • NeurIPS 2021 • Lior Yariv, Jiatao Gu, Yoni Kasten, Yaron Lipman
Accurate sampling is important to provide a precise coupling of geometry and radiance, and it allows efficient unsupervised disentanglement of shape and appearance in volume rendering.
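The coupling of geometry and radiance in this line of work hinges on turning a signed distance into a volume density. The sketch below uses a Laplace-CDF mapping in the spirit of this approach; the exact `alpha`/`beta` values are placeholder assumptions, not the paper's settings.

```python
import numpy as np

def sdf_to_density(d, alpha=1.0, beta=0.1):
    """Map signed distance d to volume density (sketch): the Laplace CDF
    of -d, so density rises smoothly toward alpha inside the surface
    (d < 0) and decays toward zero outside it (d > 0)."""
    s = -np.asarray(d, dtype=float)
    psi = np.where(s <= 0,
                   0.5 * np.exp(s / beta),
                   1.0 - 0.5 * np.exp(-s / beta))
    return alpha * psi

# Density along a ray crossing the surface: low outside, alpha/2 at the
# surface, near alpha inside.
dens = sdf_to_density(np.array([0.5, 0.0, -0.5]))
```

A smooth, monotone mapping like this is what makes the sampling problem tractable: the density is an analytic function of the distance field, so sample placement can be bounded in terms of the geometry.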
3 code implementations • NeurIPS 2020 • Lior Yariv, Yoni Kasten, Dror Moran, Meirav Galun, Matan Atzmon, Ronen Basri, Yaron Lipman
In this work we address the challenging problem of multiview 3D surface reconstruction.
4 code implementations • ICML 2020 • Amos Gropp, Lior Yariv, Niv Haim, Matan Atzmon, Yaron Lipman
Representing shapes as level sets of neural networks has recently proven useful for a range of shape analysis and reconstruction tasks.
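A key ingredient in level-set representations of this kind is an eikonal term that encourages the network to behave like a signed distance function, i.e. to have unit gradient norm. The sketch below shows the loss on a toy analytic SDF with finite-difference gradients (a real network would use autodiff); the sampling scheme is an assumption for illustration.

```python
import numpy as np

def f(x):
    """Toy implicit function: signed distance to the unit sphere."""
    return np.linalg.norm(x, axis=-1) - 1.0

def grad_f(x, eps=1e-4):
    """Central finite-difference gradient of f."""
    g = np.zeros_like(x)
    for d in range(x.shape[-1]):
        e = np.zeros(x.shape[-1]); e[d] = eps
        g[..., d] = (f(x + e) - f(x - e)) / (2 * eps)
    return g

def eikonal_loss(grad):
    """Penalize deviation of |grad f| from 1, pushing f toward an SDF."""
    return np.mean((np.linalg.norm(grad, axis=-1) - 1.0) ** 2)

# Sample query points away from the origin (where the toy SDF is smooth).
rng = np.random.default_rng(0)
dirs = rng.normal(size=(128, 3))
dirs /= np.linalg.norm(dirs, axis=-1, keepdims=True)
pts = dirs * rng.uniform(0.5, 2.0, size=(128, 1))

loss = eikonal_loss(grad_f(pts))  # ~0, since f is already a distance function
```

For a true distance function the loss vanishes; during training the same term regularizes a network's implicit function toward distance-like behavior.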
2 code implementations • NeurIPS 2019 • Matan Atzmon, Niv Haim, Lior Yariv, Ofer Israelov, Haggai Maron, Yaron Lipman
In turn, the sample network can be used to incorporate the level set samples into a loss function of interest.
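Before level-set samples can enter a loss, one needs points that actually lie on the zero level set. A common way to obtain them, sketched below, is Newton-style projection along the gradient; the unit-sphere SDF and the iteration count are illustrative assumptions, and the differentiability of the resulting samples (the "sample network" idea) would come from implicit differentiation through this projection.

```python
import numpy as np

def project_to_level_set(p, f, grad_f, iters=10):
    """Project points p onto the zero level set of f (sketch):
    Newton steps p <- p - f(p) * grad / |grad|^2 along the gradient.
    The converged points are the level-set samples a loss can use."""
    for _ in range(iters):
        g = grad_f(p)
        p = p - f(p)[..., None] * g / np.sum(g * g, axis=-1, keepdims=True)
    return p

# Toy implicit function: signed distance to the unit sphere.
f = lambda x: np.linalg.norm(x, axis=-1) - 1.0
grad = lambda x: x / np.linalg.norm(x, axis=-1, keepdims=True)

rng = np.random.default_rng(1)
pts = rng.normal(size=(64, 3)) * 3        # random points in space
on_surface = project_to_level_set(pts, f, grad)
```

After projection every point sits on the surface (here, the unit sphere), so a geometric loss evaluated at these samples directly constrains the shape of the level set.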