1 code implementation • 9 Dec 2024 • Yaron Lipman, Marton Havasi, Peter Holderrieth, Neta Shaul, Matt Le, Brian Karrer, Ricky T. Q. Chen, David Lopez-Paz, Heli Ben-Hamu, Itai Gat
Flow Matching (FM) is a recent framework for generative modeling that has achieved state-of-the-art performance across various domains, including image, video, audio, speech, and biological structures.
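The core FM recipe can be illustrated with a toy regression target: sample a noise point and a data point, interpolate between them, and regress the model's velocity onto the path's derivative. A minimal sketch in plain Python, assuming the common linear (conditional OT) path; the names here are illustrative, not the framework's API:

```python
def cfm_loss_sample(model, x0, x1, t):
    """Single-sample conditional flow matching loss for the linear path
    x_t = (1 - t) * x0 + t * x1, whose target velocity is x1 - x0."""
    xt = (1 - t) * x0 + t * x1
    target = x1 - x0
    return (model(xt, t) - target) ** 2

# Toy check: a "model" that predicts the constant velocity of the
# path from x0=0 to x1=1 attains zero loss on that pair at any t.
perfect = lambda x, t: 1.0
loss = cfm_loss_sample(perfect, x0=0.0, x1=1.0, t=0.3)
print(loss)  # 0.0
```

In practice `model` is a neural network and the loss is averaged over random draws of `x0`, `x1`, and `t`.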
no code implementations • 4 Dec 2024 • Neta Shaul, Itai Gat, Marton Havasi, Daniel Severo, Anuroop Sriram, Peter Holderrieth, Brian Karrer, Yaron Lipman, Ricky T. Q. Chen
Through the lens of optimizing the symmetric kinetic energy, we propose velocity formulas that can be applied to any given probability path, completely decoupling the probability and velocity, and giving the user the freedom to specify any desirable probability path based on expert knowledge specific to the data domain.
no code implementations • 27 Oct 2024 • Peter Holderrieth, Marton Havasi, Jason Yim, Neta Shaul, Itai Gat, Tommi Jaakkola, Brian Karrer, Ricky T. Q. Chen, Yaron Lipman
We introduce generator matching, a modality-agnostic framework for generative modeling using arbitrary Markov processes.
no code implementations • 22 Jul 2024 • Itai Gat, Tal Remez, Neta Shaul, Felix Kreuk, Ricky T. Q. Chen, Gabriel Synnaeve, Yossi Adi, Yaron Lipman
Although Flow Matching and diffusion models have emerged as powerful generative paradigms for continuous variables such as images and videos, their application to high-dimensional discrete data, such as language, is still limited.
no code implementations • 2 Mar 2024 • Neta Shaul, Uriel Singer, Ricky T. Q. Chen, Matthew Le, Ali Thabet, Albert Pumarola, Yaron Lipman
This paper introduces Bespoke Non-Stationary (BNS) Solvers, a solver distillation approach to improve sample efficiency of Diffusion and Flow models.
2 code implementations • 21 Feb 2024 • Heli Ben-Hamu, Omri Puny, Itai Gat, Brian Karrer, Uriel Singer, Yaron Lipman
Taming the generation outcome of state-of-the-art Diffusion and Flow-Matching (FM) models without having to re-train a task-specific model unlocks a powerful tool for solving inverse problems, conditional generation, and controlled generation in general.
no code implementations • CVPR 2024 • Lior Yariv, Omri Puny, Natalia Neverova, Oran Gafni, Yaron Lipman
Current diffusion- or flow-based generative models for 3D shapes divide into two categories: distilling pre-trained 2D image diffusion models, and training directly on 3D shapes.
no code implementations • 22 Nov 2023 • Qinqing Zheng, Matt Le, Neta Shaul, Yaron Lipman, Aditya Grover, Ricky T. Q. Chen
Classifier-free guidance is a key component for enhancing the performance of conditional generative models across diverse tasks.
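The guidance rule itself is a one-line extrapolation between the unconditional and conditional predictions. A minimal sketch (the function name and scale convention are illustrative):

```python
def cfg(pred_uncond, pred_cond, w):
    """Classifier-free guidance: extrapolate from the unconditional
    prediction toward the conditional one with guidance scale w."""
    return pred_uncond + w * (pred_cond - pred_uncond)

print(cfg(0.0, 1.0, 1.0))  # 1.0: w=1 recovers the conditional prediction
print(cfg(0.0, 1.0, 2.0))  # 2.0: w>1 pushes further toward the condition
```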
no code implementations • 29 Oct 2023 • Neta Shaul, Juan Perez, Ricky T. Q. Chen, Ali Thabet, Albert Pumarola, Yaron Lipman
For example, a Bespoke solver for a CIFAR10 model produces samples with a Fréchet Inception Distance (FID) of 2.73 with 10 NFE, and gets to within 1% of the Ground Truth (GT) FID (2.59) for this model with only 20 NFE.
1 code implementation • 3 Oct 2023 • Guan-Horng Liu, Yaron Lipman, Maximilian Nickel, Brian Karrer, Evangelos A. Theodorou, Ricky T. Q. Chen
Modern distribution matching algorithms for training diffusion or flow models directly prescribe the time evolution of the marginal distributions between two boundary distributions.
no code implementations • 11 Jun 2023 • Neta Shaul, Ricky T. Q. Chen, Maximilian Nickel, Matt Le, Yaron Lipman
We investigate Kinetic Optimal (KO) Gaussian paths and offer the following observations: (i) We show the kinetic energy (KE) takes a simplified form on the space of Gaussian paths, where the data is incorporated only through a single, one-dimensional scalar function, called the data separation function.
no code implementations • 28 Apr 2023 • Aram-Alexandre Pooladian, Heli Ben-Hamu, Carles Domingo-Enrich, Brandon Amos, Yaron Lipman, Ricky T. Q. Chen
Simulation-free methods for training continuous-time generative models construct probability paths that go between noise distributions and individual data samples.
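How noise is paired with data shapes these paths; one choice studied in this line of work is an optimal-transport minibatch coupling. A hedged sketch for the special case of 1D samples, where sorting both batches is known to give the OT pairing (names are illustrative):

```python
def ot_pairing_1d(noise, data):
    """Minibatch coupling sketch: for equal-size 1D batches, sorting
    both sides yields the optimal-transport pairing, which straightens
    the noise-to-data paths compared to random pairing."""
    return list(zip(sorted(noise), sorted(data)))

pairs = ot_pairing_1d([0.9, -1.2, 0.1], [5.0, 3.0, 4.0])
print(pairs)  # [(-1.2, 3.0), (0.1, 4.0), (0.9, 5.0)]
```

In higher dimensions the exact coupling requires solving an assignment problem; this 1D case is only meant to convey the idea.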
no code implementations • 25 Mar 2023 • Albert Pumarola, Artsiom Sanakoyeu, Lior Yariv, Ali Thabet, Yaron Lipman
Surface reconstruction has been seeing a lot of progress lately by utilizing Implicit Neural Representations (INRs).
no code implementations • 22 Feb 2023 • Omri Puny, Derek Lim, Bobak T. Kiani, Haggai Maron, Yaron Lipman
This paper introduces an alternative expressive power hierarchy based on the ability of GNNs to calculate equivariant polynomials of a certain degree.
3 code implementations • 16 Feb 2023 • Omer Bar-Tal, Lior Yariv, Yaron Lipman, Tali Dekel
In this work, we present MultiDiffusion, a unified framework that enables versatile and controllable image generation, using a pre-trained text-to-image diffusion model, without any further training or finetuning.
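MultiDiffusion's fusion step reconciles per-region denoising predictions into one globally consistent output; for a least-squares reconciliation objective this reduces to averaging overlapping predictions. A hedged 1D sketch (window layout and names are illustrative, not the paper's implementation):

```python
def fuse_windows(length, windows):
    """Fusion sketch: average per-window predictions wherever windows
    overlap. Each window is a (start_index, values) pair."""
    acc = [0.0] * length
    cnt = [0] * length
    for start, vals in windows:
        for i, v in enumerate(vals):
            acc[start + i] += v
            cnt[start + i] += 1
    return [a / c if c else 0.0 for a, c in zip(acc, cnt)]

# Two windows overlap at index 2; the fused value there is their mean.
fused = fuse_windows(5, [(0, [1.0, 1.0, 1.0]), (2, [3.0, 3.0, 3.0])])
print(fused)  # [1.0, 1.0, 2.0, 3.0, 3.0]
```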
4 code implementations • 7 Feb 2023 • Ricky T. Q. Chen, Yaron Lipman
To extend to general geometries, we rely on the use of spectral decompositions to efficiently compute premetrics on the fly.
4 code implementations • 6 Oct 2022 • Yaron Lipman, Ricky T. Q. Chen, Heli Ben-Hamu, Maximilian Nickel, Matt Le
These paths are more efficient than diffusion paths, provide faster training and sampling, and result in better generalization.
Ranked #5 on Image Generation on ImageNet 32x32
1 code implementation • 4 Oct 2022 • Jack Richter-Powell, Yaron Lipman, Ricky T. Q. Chen
We investigate the parameterization of deep neural networks that by design satisfy the continuity equation, a fundamental conservation law.
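A classical special case of satisfying a conservation law by construction is a 2D velocity field derived from a stream function, which is divergence-free no matter what the potential is; the paper's parameterization generalizes this idea to the full continuity equation. A finite-difference sanity check (names are illustrative):

```python
def stream_velocity(psi, x, y, h=1e-5):
    """Divergence-free 2D field from a stream function psi:
    v = (d psi / dy, -d psi / dx), so div v = 0 by construction."""
    dpsi_dy = (psi(x, y + h) - psi(x, y - h)) / (2 * h)
    dpsi_dx = (psi(x + h, y) - psi(x - h, y)) / (2 * h)
    return dpsi_dy, -dpsi_dx

def divergence(v, x, y, h=1e-4):
    """Central-difference estimate of the divergence of v at (x, y)."""
    dvx = (v(x + h, y)[0] - v(x - h, y)[0]) / (2 * h)
    dvy = (v(x, y + h)[1] - v(x, y - h)[1]) / (2 * h)
    return dvx + dvy

psi = lambda x, y: x * x * y          # any smooth scalar potential works
v = lambda x, y: stream_velocity(psi, x, y)
print(abs(divergence(v, 0.7, -0.3)) < 1e-4)  # True: zero up to FD error
```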
no code implementations • 11 Jul 2022 • Heli Ben-Hamu, Samuel Cohen, Joey Bose, Brandon Amos, Aditya Grover, Maximilian Nickel, Ricky T. Q. Chen, Yaron Lipman
Continuous Normalizing Flows (CNFs) are a class of generative models that transform a prior distribution to a model distribution by solving an ordinary differential equation (ODE).
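Sampling from a CNF amounts to numerically integrating that ODE, pushing prior samples to model samples. A minimal fixed-step Euler sketch (real implementations use adaptive solvers; names are illustrative):

```python
import math

def euler_flow(v, x0, t0=0.0, t1=1.0, steps=1000):
    """Push a prior sample through the flow dx/dt = v(x, t) with fixed
    Euler steps, as a CNF does when mapping prior to model samples."""
    x, t = x0, t0
    dt = (t1 - t0) / steps
    for _ in range(steps):
        x += dt * v(x, t)
        t += dt
    return x

# The linear field v(x, t) = -x contracts toward 0: x(1) = x(0) * e^{-1}.
x1 = euler_flow(lambda x, t: -x, x0=1.0)
print(abs(x1 - math.exp(-1)) < 1e-3)  # True
```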
no code implementations • 18 Dec 2021 • Christopher Morris, Yaron Lipman, Haggai Maron, Bastian Rieck, Nils M. Kriege, Martin Grohe, Matthias Fey, Karsten Borgwardt
In recent years, algorithms and neural architectures based on the Weisfeiler-Leman algorithm, a well-known heuristic for the graph isomorphism problem, have emerged as a powerful tool for machine learning with graphs and relational data.
no code implementations • CVPR 2022 • Matan Atzmon, Koki Nagano, Sanja Fidler, Sameh Khamis, Yaron Lipman
A natural way to incorporate symmetries in shape space learning is to ask that the mapping to the shape space (encoder) and mapping from the shape space (decoder) are equivariant to the relevant symmetries.
1 code implementation • ICLR 2022 • Omri Puny, Matan Atzmon, Heli Ben-Hamu, Ishan Misra, Aditya Grover, Edward J. Smith, Yaron Lipman
Examples include Euclidean motion invariant/equivariant graph or point cloud neural networks.
no code implementations • 19 Aug 2021 • Matan Atzmon, David Novotny, Andrea Vedaldi, Yaron Lipman
Implicit neural representation is a recent approach to learn shape collections as zero level-sets of neural networks, where each shape is represented by a latent code.
1 code implementation • NeurIPS 2021 • Noam Rozen, Aditya Grover, Maximilian Nickel, Yaron Lipman
MF also produces a CNF via a solution to the change-of-variables formula; however, unlike other CNF methods, its model (learned) density is parameterized as the source (prior) density minus the divergence of a neural network (NN).
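In 1D the divergence is just a derivative, so this parameterization and its automatic mass conservation can be checked directly. A hedged sketch, assuming a uniform prior on [0, 1] and a field that vanishes at both endpoints (names are illustrative):

```python
import math

def moser_density(mu, div_u, x):
    """Moser Flow sketch: model density = prior density minus the
    divergence of a learned field u (in 1D, div u = u')."""
    return mu(x) - div_u(x)

# Uniform prior on [0, 1]; u(x) = 0.1*sin(2*pi*x) vanishes at both
# endpoints, so its derivative integrates to 0 and the model density
# integrates to 1 automatically.
mu = lambda x: 1.0
div_u = lambda x: 0.1 * 2 * math.pi * math.cos(2 * math.pi * x)
n = 10000
total = sum(moser_density(mu, div_u, (i + 0.5) / n) for i in range(n)) / n
print(abs(total - 1.0) < 1e-6)  # True: mass is conserved by construction
```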
3 code implementations • NeurIPS 2021 • Lior Yariv, Jiatao Gu, Yoni Kasten, Yaron Lipman
Accurate sampling is important to provide a precise coupling of geometry and radiance; and (iii) it allows efficient unsupervised disentanglement of shape and appearance in volume rendering.
1 code implementation • 18 Jun 2021 • Samuel Cohen, Brandon Amos, Yaron Lipman
Modeling distributions on Riemannian manifolds is a crucial component in understanding non-Euclidean data that arises, e.g., in physics and geology.
1 code implementation • 14 Jun 2021 • Yaron Lipman
Representing surfaces as zero level sets of neural networks recently emerged as a powerful modeling paradigm, named Implicit Neural Representations (INRs), serving numerous downstream applications in geometric deep learning and 3D vision.
1 code implementation • 6 Aug 2020 • Jonathan Shlomi, Sanmay Ganguly, Eilam Gross, Kyle Cranmer, Yaron Lipman, Hadar Serviansky, Haggai Maron, Nimrod Segol
Jet classification is an important ingredient in measurements and searches for new physics at particle colliders, and secondary vertex reconstruction is a key intermediate step in building powerful jet classifiers.
High Energy Physics - Experiment • High Energy Physics - Phenomenology
no code implementations • 16 Jun 2020 • Amos Gropp, Matan Atzmon, Yaron Lipman
Two sources of bad generalization are: extrinsic, where the learned manifold possesses extraneous parts that are far from the data; and intrinsic, where the encoder and decoder introduce arbitrary distortion in the low dimensional parameterization.
3 code implementations • 14 Jun 2020 • Omri Puny, Heli Ben-Hamu, Yaron Lipman
This paper advocates incorporating a Low-Rank Global Attention (LRGA) module, a computation- and memory-efficient variant of dot-product attention (Vaswani et al., 2017), into Graph Neural Networks (GNNs) to improve their generalization power.
Ranked #1 on Link Property Prediction on ogbl-ddi
1 code implementation • ICLR 2021 • Matan Atzmon, Yaron Lipman
Learning 3D geometry directly from raw data, such as point clouds, triangle soups, or unoriented meshes is still a challenging task that feeds many downstream computer vision and graphics applications.
3 code implementations • NeurIPS 2020 • Lior Yariv, Yoni Kasten, Dror Moran, Meirav Galun, Matan Atzmon, Ronen Basri, Yaron Lipman
In this work we address the challenging problem of multiview 3D surface reconstruction.
5 code implementations • ICML 2020 • Amos Gropp, Lior Yariv, Niv Haim, Matan Atzmon, Yaron Lipman
Representing shapes as level sets of neural networks has recently proven useful for various shape analysis and reconstruction tasks.
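A property often enforced when learning such level sets, e.g. via the Eikonal regularization popularized by this line of work, is that the network behave like a signed distance function, i.e. have unit gradient norm. A 1D finite-difference sketch (names are illustrative):

```python
def eikonal_penalty(f, x, h=1e-5):
    """Eikonal regularization sketch (1D): penalize the deviation of
    |f'| from 1 so that f behaves like a signed distance function."""
    grad = (f(x + h) - f(x - h)) / (2 * h)
    return (abs(grad) - 1.0) ** 2

sdf = lambda x: x - 0.5   # exact signed distance to the point 0.5
print(eikonal_penalty(sdf, 0.2) < 1e-10)  # True: gradient norm is already 1
```

In training, this penalty is averaged over random sample points and added to a data-fitting loss.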
1 code implementation • NeurIPS 2020 • Hadar Serviansky, Nimrod Segol, Jonathan Shlomi, Kyle Cranmer, Eilam Gross, Haggai Maron, Yaron Lipman
Many problems in machine learning can be cast as learning functions from sets to graphs, or more generally to hypergraphs; in short, Set2Graph functions.
1 code implementation • CVPR 2020 • Matan Atzmon, Yaron Lipman
Recently, neural networks have been used as implicit representations for surface reconstruction, modelling, learning, and generation.
1 code implementation • ICLR 2020 • Nimrod Segol, Yaron Lipman
The key theoretical tool used to prove the above results is an explicit characterization of all permutation equivariant polynomial layers.
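The degree-one instance of such a layer, acting on a set of scalars, is the familiar two-parameter form, and its equivariance can be checked directly. A minimal sketch (names are illustrative):

```python
def equivariant_linear(xs, a, b):
    """The general permutation-equivariant linear layer on a set of
    scalars: y_i = a * x_i + b * sum_j x_j."""
    s = sum(xs)
    return [a * x + b * s for x in xs]

xs = [3.0, 1.0, 2.0]
y = equivariant_linear(xs, a=2.0, b=1.0)
perm = [1, 2, 0]
y_perm = equivariant_linear([xs[i] for i in perm], a=2.0, b=1.0)
print(y_perm == [y[i] for i in perm])  # True: permuting inputs permutes outputs
```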
2 code implementations • NeurIPS 2019 • Matan Atzmon, Niv Haim, Lior Yariv, Ofer Israelov, Haggai Maron, Yaron Lipman
In turn, the sample network can be used to incorporate the level set samples into a loss function of interest.
2 code implementations • NeurIPS 2019 • Haggai Maron, Heli Ben-Hamu, Hadar Serviansky, Yaron Lipman
It was shown that the popular message passing GNN cannot distinguish between graphs that are indistinguishable by the 1-WL test (Morris et al. 2018; Xu et al. 2019).
Ranked #6 on Graph Classification on COLLAB
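The 1-WL test referenced above is a simple color-refinement loop, and its blind spots are easy to exhibit: a 6-cycle and two disjoint triangles are non-isomorphic yet 1-WL-equivalent. A minimal sketch (names are illustrative):

```python
def wl_colors(adj, rounds=3):
    """1-WL color refinement: repeatedly hash each node's color together
    with the sorted multiset of its neighbors' colors."""
    colors = {v: 0 for v in adj}
    for _ in range(rounds):
        colors = {v: hash((colors[v], tuple(sorted(colors[u] for u in adj[v]))))
                  for v in adj}
    return sorted(colors.values())

c6 = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}   # one 6-cycle
two_c3 = {0: [1, 2], 1: [0, 2], 2: [0, 1],
          3: [4, 5], 4: [3, 5], 5: [3, 4]}               # two triangles
print(wl_colors(c6) == wl_colors(two_c3))  # True: 1-WL cannot separate them
```

Both graphs are 2-regular, so every node keeps the same color at every round; a message passing GNN is bound by the same limitation.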
no code implementations • 27 Jan 2019 • Haggai Maron, Ethan Fetaya, Nimrod Segol, Yaron Lipman
We conclude the paper by proving a necessary condition for the universality of $G$-invariant networks that incorporate only first-order tensors.
1 code implementation • ICCV 2019 • Niv Haim, Nimrod Segol, Heli Ben-Hamu, Haggai Maron, Yaron Lipman
Specifically, for the use case of learning spherical signals, our representation provides a low distortion alternative to several popular spherical parameterizations used in deep learning.
no code implementations • ICLR 2019 • Haggai Maron, Heli Ben-Hamu, Nadav Shamir, Yaron Lipman
In this paper we provide a characterization of all permutation invariant and equivariant linear layers for (hyper-)graph data, and show that their dimension, in case of edge-value graph data, is 2 and 15, respectively.
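The dimension-2 claim for invariant layers is concrete enough to verify by hand: the two basis functionals of an edge-value tensor are the diagonal sum and the off-diagonal sum, both unchanged under node relabeling. A minimal sketch (names are illustrative):

```python
def invariant_features(A):
    """The two permutation-invariant linear functionals of an n x n
    edge-value tensor: diagonal sum and off-diagonal sum."""
    n = len(A)
    diag = sum(A[i][i] for i in range(n))
    off = sum(A[i][j] for i in range(n) for j in range(n) if i != j)
    return diag, off

A = [[1, 2, 0], [4, 5, 6], [7, 8, 9]]
perm = [2, 0, 1]                         # relabel the three nodes
A_p = [[A[perm[i]][perm[j]] for j in range(3)] for i in range(3)]
print(invariant_features(A) == invariant_features(A_p))  # True
```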
1 code implementation • 6 Jun 2018 • Heli Ben-Hamu, Haggai Maron, Itay Kezurer, Gal Avineri, Yaron Lipman
The new tensor data representation is used as input to Generative Adversarial Networks for the task of 3D shape generation.
1 code implementation • 27 Mar 2018 • Matan Atzmon, Haggai Maron, Yaron Lipman
This paper presents Point Convolutional Neural Networks (PCNN): a novel framework for applying convolutional neural networks to point clouds.
Ranked #86 on 3D Point Cloud Classification on ModelNet40
no code implementations • 25 Jun 2017 • Ofer Bartal, Nati Ofir, Yaron Lipman, Ronen Basri
We present a novel embedding method that maps pixels to normals on the unit hemisphere.
no code implementations • ICCV 2015 • Meirav Galun, Tal Amir, Tal Hassner, Ronen Basri, Yaron Lipman
This paper focuses on the challenging problem of finding correspondences once approximate epipolar constraints are given.