Search Results for author: Heli Ben-Hamu

Found 10 papers, 7 papers with code

D-Flow: Differentiating through Flows for Controlled Generation

2 code implementations · 21 Feb 2024 · Heli Ben-Hamu, Omri Puny, Itai Gat, Brian Karrer, Uriel Singer, Yaron Lipman

Taming the generation outcome of state-of-the-art Diffusion and Flow-Matching (FM) models without having to re-train a task-specific model unlocks a powerful tool for solving inverse problems, conditional generation, and controlled generation in general.
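The core D-Flow idea of steering generation by differentiating a cost through the flow itself can be illustrated on a toy flow whose map and gradient are available in closed form. Everything below (the linear velocity field `v(x) = a*x`, the target, the step size) is an illustrative assumption, not the paper's actual setup, which differentiates through a full ODE solve of a learned model.

```python
import math

# Toy "flow": integrating dx/dt = a * x from t = 0 to 1 gives x1 = exp(a) * x0.
# D-Flow-style control: optimize the source point x0 by gradient descent on a
# cost defined on the generated sample x1 (here, squared distance to a target).
A = 0.5
TARGET = 2.0

def flow(x0):
    return math.exp(A) * x0  # closed-form solution of the toy ODE

def grad_x0(x0):
    # Chain rule through the flow: dL/dx0 = 2 * (x1 - TARGET) * dx1/dx0.
    return 2.0 * (flow(x0) - TARGET) * math.exp(A)

x0 = 0.0
for _ in range(200):
    x0 -= 0.1 * grad_x0(x0)

print(flow(x0))  # converges to TARGET
```

In the real setting the analytic gradient is replaced by backpropagation through the numerical ODE solver, but the optimization loop has the same shape.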

Multisample Flow Matching: Straightening Flows with Minibatch Couplings

no code implementations · 28 Apr 2023 · Aram-Alexandre Pooladian, Heli Ben-Hamu, Carles Domingo-Enrich, Brandon Amos, Yaron Lipman, Ricky T. Q. Chen

Simulation-free methods for training continuous-time generative models construct probability paths that go between noise distributions and individual data samples.
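The snippet's "paths between noise distributions and individual data samples" pair each data point with an independently drawn noise sample; the multisample variant instead couples a minibatch of noise to a minibatch of data (e.g., via optimal transport) so the resulting linear paths are shorter and straighter. A 1D sketch, where the minibatch OT coupling reduces to sorting; the distributions and batch size here are arbitrary choices for illustration.

```python
import random

random.seed(0)
noise = [random.gauss(0.0, 1.0) for _ in range(8)]  # minibatch from the prior
data = [random.gauss(5.0, 1.0) for _ in range(8)]   # minibatch of "data"

# Independent coupling: pair samples in arbitrary (sampling) order.
indep = list(zip(noise, data))

# Minibatch OT coupling: in 1D, optimal transport pairs sorted samples.
ot = list(zip(sorted(noise), sorted(data)))

def cost(pairs):
    # Total squared path length of the linear paths x_t = (1 - t)*x0 + t*x1,
    # whose (constant) velocity is x1 - x0.
    return sum((x1 - x0) ** 2 for x0, x1 in pairs)

print(cost(ot) <= cost(indep))  # True: the OT coupling shortens the paths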

Flow Matching for Generative Modeling

2 code implementations · 6 Oct 2022 · Yaron Lipman, Ricky T. Q. Chen, Heli Ben-Hamu, Maximilian Nickel, Matt Le

These Optimal Transport (OT) conditional paths are more efficient than diffusion paths, provide faster training and sampling, and result in better generalization.

Density Estimation

Matching Normalizing Flows and Probability Paths on Manifolds

no code implementations · 11 Jul 2022 · Heli Ben-Hamu, Samuel Cohen, Joey Bose, Brandon Amos, Aditya Grover, Maximilian Nickel, Ricky T. Q. Chen, Yaron Lipman

Continuous Normalizing Flows (CNFs) are a class of generative models that transform a prior distribution to a model distribution by solving an ordinary differential equation (ODE).
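The CNF mechanics described in the snippet, i.e. solving an ODE while tracking density via the instantaneous change of variables d(log p)/dt = −div v, can be checked on a 1D linear velocity field, where both the ODE solution and the divergence integral have closed forms. The field, step count, and starting point below are illustrative assumptions.

```python
import math

# Minimal CNF sketch: transport a sample by Euler-integrating dx/dt = v(x, t),
# accumulating the log-density change via d(log p)/dt = -div v(x, t).
# For v(x) = a*x in 1D, div v = a, so x(1) = x0 * e^a and delta log p = -a,
# which we can compare against the numerical integration.
A = 0.7

def v(x, t):
    return A * x

def div_v(x, t):
    return A  # divergence of a linear field is its (constant) slope

def cnf_push(x0, steps=10_000):
    dt = 1.0 / steps
    x, dlogp = x0, 0.0
    for i in range(steps):
        t = i * dt
        dlogp -= div_v(x, t) * dt
        x += v(x, t) * dt
    return x, dlogp

x1, dlogp = cnf_push(1.0)
print(x1, dlogp)  # ≈ e^0.7 ≈ 2.0138 and exactly -0.7
```

With a learned neural velocity field the loop is identical except that the divergence is estimated (e.g., by a trace estimator) rather than known in closed form.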

Global Attention Improves Graph Networks Generalization

3 code implementations · 14 Jun 2020 · Omri Puny, Heli Ben-Hamu, Yaron Lipman

This paper advocates incorporating a Low-Rank Global Attention (LRGA) module, a computation- and memory-efficient variant of dot-product attention (Vaswani et al., 2017), into Graph Neural Networks (GNNs) to improve their generalization power.

Graph Neural Network
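The exact LRGA parameterization is in the paper; the sketch below only shows the generic low-rank trick that makes global attention affordable: with rank-k factors, reassociating the product as U·(Vᵀ·X) avoids ever materializing the n×n attention matrix, turning O(n²) into O(n·k). All matrices here are random placeholders, not the module's learned projections.

```python
import random

random.seed(1)
n, k, d = 120, 8, 16  # nodes, rank, feature dimension

def randmat(r, c):
    return [[random.gauss(0, 1) for _ in range(c)] for _ in range(r)]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

U = randmat(n, k)  # low-rank factors of an n x n global attention matrix
V = randmat(n, k)
X = randmat(n, d)  # node features
Vt = [list(r) for r in zip(*V)]  # k x n transpose

# Dense route: materialize the n x n matrix (O(n^2) memory and compute).
dense = matmul(matmul(U, Vt), X)
# Low-rank route: reassociate as U @ (V^T @ X) — O(n*k*d), no n x n matrix.
lowrank = matmul(U, matmul(Vt, X))

diff = max(abs(a - b) for ra, rb in zip(dense, lowrank)
           for a, b in zip(ra, rb))
print(diff < 1e-8)  # True: same result, very different cost
```

The two routes agree up to floating-point rounding; only the association order, and hence the complexity, differs.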

Provably Powerful Graph Networks

2 code implementations · NeurIPS 2019 · Haggai Maron, Heli Ben-Hamu, Hadar Serviansky, Yaron Lipman

It was shown that popular message-passing GNNs cannot distinguish between graphs that are indistinguishable by the 1-WL test (Morris et al., 2018; Xu et al., 2019).

Graph Classification · Graph Learning · +1
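A concrete instance of the 1-WL limitation motivating this paper: a 6-cycle and two disjoint triangles are both 2-regular, so color refinement assigns identical colors forever and cannot tell them apart, even though the graphs are not isomorphic. A small pure-Python sketch (the graph pair and round count are illustrative choices):

```python
# 1-WL (color refinement): repeatedly relabel each node by its own color
# together with the multiset of its neighbors' colors. Graphs with identical
# final color histograms are indistinguishable to 1-WL — and hence to
# standard message-passing GNNs.
def wl_histogram(adj, rounds=3):
    colors = {v: 0 for v in adj}  # uniform initial coloring
    for _ in range(rounds):
        signatures = {
            v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
            for v in adj
        }
        relabel = {s: i for i, s in enumerate(sorted(set(signatures.values())))}
        colors = {v: relabel[signatures[v]] for v in adj}
    return sorted(colors.values())

cycle6 = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
triangles = {0: [1, 2], 1: [0, 2], 2: [0, 1],
             3: [4, 5], 4: [3, 5], 5: [3, 4]}

print(wl_histogram(cycle6) == wl_histogram(triangles))  # True, yet non-isomorphic
```

Higher-order networks like the ones proposed in the paper are strictly more expressive than this refinement procedure.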

Surface Networks via General Covers

1 code implementation · ICCV 2019 · Niv Haim, Nimrod Segol, Heli Ben-Hamu, Haggai Maron, Yaron Lipman

Specifically, for the use case of learning spherical signals, our representation provides a low-distortion alternative to several popular spherical parameterizations used in deep learning.

Retrieval

Invariant and Equivariant Graph Networks

no code implementations · ICLR 2019 · Haggai Maron, Heli Ben-Hamu, Nadav Shamir, Yaron Lipman

In this paper we provide a characterization of all permutation-invariant and permutation-equivariant linear layers for (hyper-)graph data, and show that their dimensions, in the case of edge-value graph data, are 2 and 15, respectively.
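The dimensions 2 and 15 are Bell numbers: the paper shows each basis element of these linear layer spaces corresponds to a partition of the index set, giving dimension b(2) = 2 for invariant layers on order-2 (edge-value) tensors and b(4) = 15 for equivariant ones (input and output indices combine to order 4). A quick check via the Bell triangle:

```python
def bell(n):
    # Bell triangle: b(n) counts the partitions of an n-element set.
    row = [1]
    for _ in range(n):
        nxt = [row[-1]]
        for x in row:
            nxt.append(nxt[-1] + x)
        row = nxt
    return row[0]

print(bell(2), bell(4))  # 2 15 — the invariant and equivariant dimensions
```

The same formula predicts the dimensions for higher-order tensors, which is what makes the characterization useful beyond plain graphs.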


Multi-chart Generative Surface Modeling

1 code implementation · 6 Jun 2018 · Heli Ben-Hamu, Haggai Maron, Itay Kezurer, Gal Avineri, Yaron Lipman

The new tensor data representation is used as input to Generative Adversarial Networks for the task of 3D shape generation.

3D Shape Generation · Translation
