Search Results for author: Yaron Lipman

Found 42 papers, 22 papers with code

MultiDiffusion: Fusing Diffusion Paths for Controlled Image Generation

2 code implementations 16 Feb 2023 Omer Bar-Tal, Lior Yariv, Yaron Lipman, Tali Dekel

In this work, we present MultiDiffusion, a unified framework that enables versatile and controllable image generation, using a pre-trained text-to-image diffusion model, without any further training or finetuning.

Text-to-Image Generation
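
The fusion step can be pictured with a short sketch. This is not the authors' code: `denoise_step` and the window list are hypothetical stand-ins for applying a pre-trained diffusion model's single denoising step per region, with overlaps reconciled by per-pixel averaging (the closed-form solution of a least-squares fusion).

```python
import torch

def multidiffusion_step(model, x, t, windows, denoise_step):
    """One fused denoising step: denoise each window with the pre-trained
    model, then reconcile overlapping predictions by per-pixel averaging."""
    acc = torch.zeros_like(x)  # accumulated window predictions
    cnt = torch.zeros_like(x)  # how many windows cover each pixel
    for (top, left, h, w) in windows:
        crop = x[..., top:top + h, left:left + w]
        pred = denoise_step(model, crop, t)  # ordinary diffusion step on the crop
        acc[..., top:top + h, left:left + w] += pred
        cnt[..., top:top + h, left:left + w] += 1.0
    return acc / cnt.clamp(min=1.0)  # clamp guards pixels covered by no window
```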

Volume Rendering of Neural Implicit Surfaces

3 code implementations NeurIPS 2021 Lior Yariv, Jiatao Gu, Yoni Kasten, Yaron Lipman

Accurate sampling is important to provide a precise coupling of geometry and radiance, and it allows efficient unsupervised disentanglement of shape and appearance in volume rendering.

Disentanglement · Inductive Bias

Implicit Geometric Regularization for Learning Shapes

4 code implementations ICML 2020 Amos Gropp, Lior Yariv, Niv Haim, Matan Atzmon, Yaron Lipman

Representing shapes as level sets of neural networks has been recently proved to be useful for different shape analysis and reconstruction tasks.
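
For context, the regularization at the heart of the paper is an eikonal term that drives the network's gradient toward unit norm. A minimal PyTorch sketch, omitting the optional normal-fitting term; `f` is any coordinate MLP and the weight `lam` is illustrative:

```python
import torch

def igr_loss(f, x_surface, x_random, lam=0.1):
    """Implicit geometric regularization: fit the zero level set on surface
    samples and push gradient norms toward 1 on random samples (eikonal term)."""
    recon = f(x_surface).abs().mean()  # surface points should lie on the zero level set
    x_random = x_random.requires_grad_(True)
    y = f(x_random)
    (grad,) = torch.autograd.grad(y.sum(), x_random, create_graph=True)
    eikonal = ((grad.norm(dim=-1) - 1.0) ** 2).mean()
    return recon + lam * eikonal
```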

Flow Matching for Generative Modeling

1 code implementation 6 Oct 2022 Yaron Lipman, Ricky T. Q. Chen, Heli Ben-Hamu, Maximilian Nickel, Matt Le

These paths are more efficient than diffusion paths, provide faster training and sampling, and result in better generalization.

Density Estimation
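
A minimal sketch of a conditional flow-matching objective with straight-line (optimal-transport) conditional paths, assuming a velocity network `v(x, t)` that broadcasts over `t`; `sigma_min` is illustrative:

```python
import torch

def cfm_loss(v, x1, sigma_min=1e-4):
    """Conditional flow matching with straight-line conditional paths:
    x_t = (1 - (1 - sigma_min) * t) * x0 + t * x1, whose target velocity
    is x1 - (1 - sigma_min) * x0. Regress the network onto that target."""
    x0 = torch.randn_like(x1)  # noise sample paired with the data sample x1
    t = torch.rand(x1.shape[0], *([1] * (x1.dim() - 1)))
    xt = (1 - (1 - sigma_min) * t) * x0 + t * x1
    target = x1 - (1 - sigma_min) * x0
    return ((v(xt, t) - target) ** 2).mean()
```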

Flow Matching on General Geometries

2 code implementations 7 Feb 2023 Ricky T. Q. Chen, Yaron Lipman

To extend to general geometries, we rely on the use of spectral decompositions to efficiently compute premetrics on the fly.

Controlling Neural Level Sets

2 code implementations NeurIPS 2019 Matan Atzmon, Niv Haim, Lior Yariv, Ofer Israelov, Haggai Maron, Yaron Lipman

In turn, the sample network can be used to incorporate the level set samples into a loss function of interest.

Surface Reconstruction
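
The "sample network" idea can be pictured as a differentiable projection of generic points onto the zero level set. This is an assumption-laden sketch (Newton-style updates, `f` returning shape `(N, 1)`), not the paper's exact construction:

```python
import torch

def project_to_level_set(f, x, n_steps=5):
    """Move samples onto the zero level set of f with generalized Newton
    steps, x <- x - f(x) * grad_f / ||grad_f||^2, kept differentiable so
    the projected samples can enter a downstream loss of interest."""
    for _ in range(n_steps):
        x = x.requires_grad_(True)
        y = f(x)
        (g,) = torch.autograd.grad(y.sum(), x, create_graph=True)
        x = x - y * g / (g.pow(2).sum(dim=-1, keepdim=True) + 1e-12)
    return x
```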

SAL: Sign Agnostic Learning of Shapes from Raw Data

1 code implementation CVPR 2020 Matan Atzmon, Yaron Lipman

Recently, neural networks have been used as implicit representations for surface reconstruction, modelling, learning, and generation.

Surface Reconstruction
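
The sign-agnostic loss itself is compact enough to sketch: compare the network's absolute value to an unsigned distance computed directly from the raw, unoriented data, so no inside/outside labels are needed. `dist_unsigned` (distance of each query point to the raw data) is assumed precomputed; this is a sketch of the L1 variant:

```python
import torch

def sal_loss(f, x, dist_unsigned):
    """Sign agnostic learning: |f(x)| should match the unsigned distance,
    leaving the sign of f free to emerge as a consistent inside/outside."""
    return (f(x).abs().squeeze(-1) - dist_unsigned).abs().mean()
```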

SALD: Sign Agnostic Learning with Derivatives

1 code implementation ICLR 2021 Matan Atzmon, Yaron Lipman

Learning 3D geometry directly from raw data, such as point clouds, triangle soups, or unoriented meshes is still a challenging task that feeds many downstream computer vision and graphics applications.

Regression

Riemannian Convex Potential Maps

1 code implementation 18 Jun 2021 Samuel Cohen, Brandon Amos, Yaron Lipman

Modeling distributions on Riemannian manifolds is a crucial component in understanding non-Euclidean data that arises, e.g., in physics and geology.

Point Convolutional Neural Networks by Extension Operators

1 code implementation 27 Mar 2018 Matan Atzmon, Haggai Maron, Yaron Lipman

This paper presents Point Convolutional Neural Networks (PCNN): a novel framework for applying convolutional neural networks to point clouds.

3D Part Segmentation · 3D Point Cloud Classification +2

Multi-chart Generative Surface Modeling

1 code implementation 6 Jun 2018 Heli Ben-Hamu, Haggai Maron, Itay Kezurer, Gal Avineri, Yaron Lipman

The new tensor data representation is used as input to Generative Adversarial Networks for the task of 3D shape generation.

3D Shape Generation · Translation

Provably Powerful Graph Networks

2 code implementations NeurIPS 2019 Haggai Maron, Heli Ben-Hamu, Hadar Serviansky, Yaron Lipman

It was shown that popular message-passing GNNs cannot distinguish between graphs that are indistinguishable by the 1-WL test (Morris et al., 2018; Xu et al., 2019).

Graph Classification · Graph Learning +1

Neural Conservation Laws: A Divergence-Free Perspective

1 code implementation 4 Oct 2022 Jack Richter-Powell, Yaron Lipman, Ricky T. Q. Chen

We investigate the parameterization of deep neural networks that by design satisfy the continuity equation, a fundamental conservation law.
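
One classical route to divergence-free vector fields, which the paper develops in the general language of differential forms, is to take row divergences of an antisymmetric matrix field: antisymmetry plus symmetry of second derivatives then forces div v = 0 identically. A single-point autograd sketch, with `A` a hypothetical network returning an antisymmetric `(d, d)` matrix:

```python
import torch

def divergence_free_field(A, x):
    """v_i(x) = sum_j d A_ij / d x_j for antisymmetric A(x) = -A(x).T.
    Then div v = sum_ij d^2 A_ij / dx_i dx_j = 0 by construction."""
    x = x.requires_grad_(True)  # x: shape (d,)
    M = A(x)                    # (d, d), assumed antisymmetric
    rows = []
    for i in range(M.shape[0]):
        row = torch.zeros(())
        for j in range(M.shape[1]):
            (g,) = torch.autograd.grad(M[i, j], x,
                                       create_graph=True, retain_graph=True)
            row = row + g[j]    # accumulate d A_ij / d x_j
        rows.append(row)
    return torch.stack(rows)    # divergence-free field value at x
```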

Set2Graph: Learning Graphs From Sets

1 code implementation NeurIPS 2020 Hadar Serviansky, Nimrod Segol, Jonathan Shlomi, Kyle Cranmer, Eilam Gross, Haggai Maron, Yaron Lipman

Many problems in machine learning can be cast as learning functions from sets to graphs, or more generally to hypergraphs; in short, Set2Graph functions.

BIG-bench Machine Learning · Clustering

Generalized Schrödinger Bridge Matching

1 code implementation 3 Oct 2023 Guan-Horng Liu, Yaron Lipman, Maximilian Nickel, Brian Karrer, Evangelos A. Theodorou, Ricky T. Q. Chen

Modern distribution matching algorithms for training diffusion or flow models directly prescribe the time evolution of the marginal distributions between two boundary distributions.

Moser Flow: Divergence-based Generative Modeling on Manifolds

1 code implementation NeurIPS 2021 Noam Rozen, Aditya Grover, Maximilian Nickel, Yaron Lipman

MF also produces a CNF via a solution to the change-of-variables formula; however, unlike other CNF methods, its model (learned) density is parameterized as the source (prior) density minus the divergence of a neural network (NN).

Density Estimation
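
The parameterization in the snippet reads directly as code: the model density is the prior density minus the divergence of a network `u`. A single-point Euclidean sketch (the paper works on manifolds; `prior_density` and `u` are placeholders):

```python
import torch

def moser_density(prior_density, u, x):
    """Model density nu(x) - div u(x), per the Moser Flow parameterization."""
    x = x.requires_grad_(True)  # x: shape (d,)
    v = u(x)                    # vector field value at x, shape (d,)
    div = torch.zeros(())
    for i in range(v.shape[0]):
        (g,) = torch.autograd.grad(v[i], x,
                                   create_graph=True, retain_graph=True)
        div = div + g[i]        # accumulate d v_i / d x_i
    return prior_density(x) - div
```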

Global Attention Improves Graph Networks Generalization

3 code implementations 14 Jun 2020 Omri Puny, Heli Ben-Hamu, Yaron Lipman

This paper advocates incorporating a Low-Rank Global Attention (LRGA) module, a computation and memory efficient variant of the dot-product attention (Vaswani et al., 2017), to Graph Neural Networks (GNNs) for improving their generalization power.

On Universal Equivariant Set Networks

1 code implementation ICLR 2020 Nimrod Segol, Yaron Lipman

The key theoretical tool used to prove the above results is an explicit characterization of all permutation equivariant polynomial layers.

Point Cloud Segmentation
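
The characterization referenced in the snippet is concrete: every permutation-equivariant linear layer on a set decomposes into a per-element map plus a pooled map. A standard PyTorch rendering of that form (dimensions illustrative):

```python
import torch
import torch.nn as nn

class EquivariantLinear(nn.Module):
    """All permutation-equivariant linear maps on sets take the form
    Y_i = A(X_i) + B(mean_j X_j): one per-element term plus one pooled term."""
    def __init__(self, d_in, d_out):
        super().__init__()
        self.a = nn.Linear(d_in, d_out, bias=True)
        self.b = nn.Linear(d_in, d_out, bias=False)

    def forward(self, x):  # x: (batch, n, d_in)
        return self.a(x) + self.b(x.mean(dim=1, keepdim=True))
```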

Surface Networks via General Covers

1 code implementation ICCV 2019 Niv Haim, Nimrod Segol, Heli Ben-Hamu, Haggai Maron, Yaron Lipman

Specifically, for the use case of learning spherical signals, our representation provides a low distortion alternative to several popular spherical parameterizations used in deep learning.

Retrieval

Phase Transitions, Distance Functions, and Implicit Neural Representations

1 code implementation 14 Jun 2021 Yaron Lipman

Representing surfaces as zero level sets of neural networks recently emerged as a powerful modeling paradigm, named Implicit Neural Representations (INRs), serving numerous downstream applications in geometric deep learning and 3D vision.

Inductive Bias · Surface Reconstruction

Secondary Vertex Finding in Jets with Neural Networks

1 code implementation 6 Aug 2020 Jonathan Shlomi, Sanmay Ganguly, Eilam Gross, Kyle Cranmer, Yaron Lipman, Hadar Serviansky, Haggai Maron, Nimrod Segol

Jet classification is an important ingredient in measurements and searches for new physics at particle colliders, and secondary vertex reconstruction is a key intermediate step in building powerful jet classifiers.

High Energy Physics - Experiment · High Energy Physics - Phenomenology

Photometric Stereo by Hemispherical Metric Embedding

no code implementations 25 Jun 2017 Ofer Bartal, Nati Ofir, Yaron Lipman, Ronen Basri

We present a novel embedding method that maps pixels to normals on the unit hemisphere.

Wide baseline stereo matching with convex bounded-distortion constraints

no code implementations 10 Jun 2015 Meirav Galun, Tal Amir, Tal Hassner, Ronen Basri, Yaron Lipman

This paper focuses on the challenging problem of finding correspondences once approximate epipolar constraints are given.

Stereo Matching · Stereo Matching Hand

Invariant and Equivariant Graph Networks

no code implementations ICLR 2019 Haggai Maron, Heli Ben-Hamu, Nadav Shamir, Yaron Lipman

In this paper we provide a characterization of all permutation invariant and equivariant linear layers for (hyper-)graph data, and show that their dimension, in case of edge-value graph data, is 2 and 15, respectively.
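The dimensions 2 and 15 are the Bell numbers b(2) and b(4): the invariant and equivariant bases for edge-value (order-2 tensor) data correspond to partitions of 2 and 4 index slots, respectively. A quick check:

```python
from functools import lru_cache
from math import comb

@lru_cache(None)
def bell(n):
    """Bell numbers via the recurrence B(n) = sum_k C(n-1, k) B(k)."""
    if n == 0:
        return 1
    return sum(comb(n - 1, k) * bell(k) for k in range(n))

print(bell(2), bell(4))  # 2 15 -- invariant / equivariant basis sizes
```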


On the Universality of Invariant Networks

no code implementations 27 Jan 2019 Haggai Maron, Ethan Fetaya, Nimrod Segol, Yaron Lipman

We conclude the paper by proving a necessary condition for the universality of $G$-invariant networks that incorporate only first-order tensors.

Isometric Autoencoders

no code implementations 16 Jun 2020 Amos Gropp, Matan Atzmon, Yaron Lipman

Two sources of bad generalization are: extrinsic, where the learned manifold possesses extraneous parts that are far from the data; and intrinsic, where the encoder and decoder introduce arbitrary distortion in the low dimensional parameterization.

Dimensionality Reduction

Augmenting Implicit Neural Shape Representations with Explicit Deformation Fields

no code implementations 19 Aug 2021 Matan Atzmon, David Novotny, Andrea Vedaldi, Yaron Lipman

Implicit neural representation is a recent approach to learn shape collections as zero level-sets of neural networks, where each shape is represented by a latent code.

Frame Averaging for Equivariant Shape Space Learning

no code implementations CVPR 2022 Matan Atzmon, Koki Nagano, Sanja Fidler, Sameh Khamis, Yaron Lipman

A natural way to incorporate symmetries in shape space learning is to ask that the mapping to the shape space (encoder) and mapping from the shape space (decoder) are equivariant to the relevant symmetries.

Weisfeiler and Leman go Machine Learning: The Story so far

no code implementations 18 Dec 2021 Christopher Morris, Yaron Lipman, Haggai Maron, Bastian Rieck, Nils M. Kriege, Martin Grohe, Matthias Fey, Karsten Borgwardt

In recent years, algorithms and neural architectures based on the Weisfeiler--Leman algorithm, a well-known heuristic for the graph isomorphism problem, have emerged as a powerful tool for machine learning with graphs and relational data.

BIG-bench Machine Learning · Representation Learning
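
For reference, the Weisfeiler-Leman heuristic at the center of this survey is short enough to sketch. Here is 1-WL color refinement on an adjacency-list graph (a plain-Python sketch, not any library's API):

```python
def wl_refine(adj, rounds=3):
    """1-WL color refinement: iteratively hash each node's color together
    with the multiset of its neighbors' colors. adj: node -> neighbor list."""
    color = {v: 0 for v in adj}
    for _ in range(rounds):
        color = {v: hash((color[v], tuple(sorted(color[u] for u in adj[v]))))
                 for v in adj}
    return sorted(color.values())  # equal histograms => 1-WL cannot distinguish

# Example: compare two graphs by comparing their refined color signatures.
g1 = {0: [1], 1: [0, 2], 2: [1]}
print(wl_refine(g1))
```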

Matching Normalizing Flows and Probability Paths on Manifolds

no code implementations 11 Jul 2022 Heli Ben-Hamu, Samuel Cohen, Joey Bose, Brandon Amos, Aditya Grover, Maximilian Nickel, Ricky T. Q. Chen, Yaron Lipman

Continuous Normalizing Flows (CNFs) are a class of generative models that transform a prior distribution to a model distribution by solving an ordinary differential equation (ODE).
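
The ODE view in the snippet can be made concrete with an Euler-discretized sketch that tracks log-density via the instantaneous change of variables, d log p / dt = -div v. Exact divergence as written is only practical in low dimensions, and `v(x, t)` is a placeholder velocity network:

```python
import torch

def cnf_sample(v, x0, n_steps=100):
    """Integrate dx/dt = v(x, t) with Euler steps while accumulating the
    log-density change -div v along the trajectory."""
    x, logp = x0.detach(), torch.zeros(x0.shape[0])
    dt = 1.0 / n_steps
    for k in range(n_steps):
        t = torch.full((x.shape[0], 1), k * dt)
        x = x.requires_grad_(True)
        vel = v(x, t)
        div = sum(torch.autograd.grad(vel[:, i].sum(), x,
                                      retain_graph=True)[0][:, i]
                  for i in range(x.shape[1]))  # exact trace of the Jacobian
        x = (x + dt * vel).detach()
        logp = logp - dt * div.detach()
    return x, logp
```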

Equivariant Polynomials for Graph Neural Networks

no code implementations 22 Feb 2023 Omri Puny, Derek Lim, Bobak T. Kiani, Haggai Maron, Yaron Lipman

This paper introduces an alternative expressive power hierarchy based on the ability of GNNs to calculate equivariant polynomials of a certain degree.

Graph Learning

VisCo Grids: Surface Reconstruction with Viscosity and Coarea Grids

no code implementations 25 Mar 2023 Albert Pumarola, Artsiom Sanakoyeu, Lior Yariv, Ali Thabet, Yaron Lipman

Surface reconstruction has been seeing a lot of progress lately by utilizing Implicit Neural Representations (INRs).

Inductive Bias · Surface Reconstruction

Multisample Flow Matching: Straightening Flows with Minibatch Couplings

no code implementations 28 Apr 2023 Aram-Alexandre Pooladian, Heli Ben-Hamu, Carles Domingo-Enrich, Brandon Amos, Yaron Lipman, Ricky T. Q. Chen

Simulation-free methods for training continuous-time generative models construct probability paths that go between noise distributions and individual data samples.
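
The minibatch-coupling idea can be sketched with an assignment solver: pair each noise sample with a data sample by minibatch optimal transport rather than at random, so conditional paths cross less and the learned flow is straighter. A sketch using SciPy (not the paper's code); tensors are assumed to be on CPU without gradients:

```python
import torch
from scipy.optimize import linear_sum_assignment

def minibatch_coupling(x0, x1):
    """Solve an optimal assignment on squared distances between noise
    samples x0 and data samples x1 within a minibatch."""
    cost = torch.cdist(x0.flatten(1), x1.flatten(1)) ** 2
    rows, cols = linear_sum_assignment(cost.numpy())
    return x0[rows], x1[cols]  # coupled pairs for the flow matching loss
```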

On Kinetic Optimal Probability Paths for Generative Models

no code implementations 11 Jun 2023 Neta Shaul, Ricky T. Q. Chen, Maximilian Nickel, Matt Le, Yaron Lipman

We investigate Kinetic Optimal (KO) Gaussian paths and offer the following observations: (i) we show the KE takes a simplified form on the space of Gaussian paths, where the data is incorporated only through a single, one-dimensional scalar function, called the data separation function.

Bespoke Solvers for Generative Flow Models

no code implementations 29 Oct 2023 Neta Shaul, Juan Perez, Ricky T. Q. Chen, Ali Thabet, Albert Pumarola, Yaron Lipman

For example, a Bespoke solver for a CIFAR10 model produces samples with Fréchet Inception Distance (FID) of 2.73 with 10 NFE, and gets to 1% of the Ground Truth (GT) FID (2.59) for this model with only 20 NFE.

Guided Flows for Generative Modeling and Decision Making

no code implementations 22 Nov 2023 Qinqing Zheng, Matt Le, Neta Shaul, Yaron Lipman, Aditya Grover, Ricky T. Q. Chen

Classifier-free guidance is a key component for enhancing the performance of conditional generative models across diverse tasks.

Conditional Image Generation · Decision Making +3
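
Classifier-free guidance transfers to flows by blending conditional and unconditional velocity predictions. A sketch, where passing `None` as the condition is an assumed convention for the null token and `w` is the guidance weight:

```python
def guided_velocity(v, x, t, cond, w=2.0):
    """Classifier-free guidance on a velocity field: extrapolate from the
    unconditional prediction toward the conditional one."""
    v_cond = v(x, t, cond)
    v_uncond = v(x, t, None)  # null-condition (unconditional) prediction
    return v_uncond + w * (v_cond - v_uncond)
```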

Mosaic-SDF for 3D Generative Models

no code implementations 14 Dec 2023 Lior Yariv, Omri Puny, Natalia Neverova, Oran Gafni, Yaron Lipman

Current diffusion or flow-based generative models for 3D shapes divide into two: distilling pre-trained 2D image diffusion models, and training directly on 3D shapes.

3D Generation · 3D Shape Representation +1

D-Flow: Differentiating through Flows for Controlled Generation

no code implementations 21 Feb 2024 Heli Ben-Hamu, Omri Puny, Itai Gat, Brian Karrer, Uriel Singer, Yaron Lipman

Taming the generation outcome of state-of-the-art Diffusion and Flow-Matching (FM) models without having to re-train a task-specific model unlocks a powerful tool for solving inverse problems, conditional generation, and controlled generation in general.
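
The controlled-generation recipe alluded to here can be sketched end to end: discretize the flow with Euler steps and optimize the source noise by backpropagating a task cost through the whole trajectory. All names (`v`, `cost`) are placeholders and the step counts are illustrative:

```python
import torch

def dflow_optimize(v, cost, x0, n_flow_steps=20, n_opt_steps=50, lr=0.1):
    """Optimize initial noise x0 so the flow's output minimizes a task cost
    (e.g., an inverse-problem data term), differentiating through the ODE."""
    x0 = x0.clone().requires_grad_(True)
    opt = torch.optim.Adam([x0], lr=lr)
    dt = 1.0 / n_flow_steps
    for _ in range(n_opt_steps):
        x = x0
        for k in range(n_flow_steps):
            t = torch.full((x.shape[0], 1), k * dt)
            x = x + dt * v(x, t)  # backprop flows through every Euler step
        loss = cost(x)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return x0.detach()
```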

Bespoke Non-Stationary Solvers for Fast Sampling of Diffusion and Flow Models

no code implementations 2 Mar 2024 Neta Shaul, Uriel Singer, Ricky T. Q. Chen, Matthew Le, Ali Thabet, Albert Pumarola, Yaron Lipman

This paper introduces Bespoke Non-Stationary (BNS) Solvers, a solver distillation approach to improve sample efficiency of Diffusion and Flow models.

Audio Generation · Conditional Image Generation +1
