Search Results for author: Nadav Dym

Found 13 papers, 1 paper with code

Equivariant Frames and the Impossibility of Continuous Canonicalization

no code implementations • 25 Feb 2024 • Nadav Dym, Hannah Lawrence, Jonathan W. Siegel

Canonicalization provides an architecture-agnostic method for enforcing equivariance, with generalizations such as frame-averaging recently gaining prominence as a lightweight and flexible alternative to equivariant architectures.
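Below is a minimal sketch of the symmetrization idea behind canonicalization and frame averaging, in a toy setting: averaging an arbitrary backbone function over the finite rotation group C4 acting on 2D point clouds makes it exactly invariant. The group, the `backbone` function, and the use of a full group average (rather than an input-dependent frame) are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def c4_group():
    """The four planar rotations by multiples of 90 degrees (the group C4)."""
    mats = []
    for k in range(4):
        t = k * np.pi / 2
        mats.append(np.array([[np.cos(t), -np.sin(t)],
                              [np.sin(t),  np.cos(t)]]))
    return mats

def backbone(points):
    """An arbitrary, non-invariant function of an (n, 2) point cloud."""
    return np.sum(points[:, 0] ** 3 + 2.0 * points[:, 1])

def group_average(f, points):
    """Symmetrize f by averaging it over all C4 rotations of the input."""
    return np.mean([f(points @ R.T) for R in c4_group()])

x = np.random.randn(5, 2)
R90 = c4_group()[1]
# The averaged function is invariant to 90-degree rotations of the input.
print(np.isclose(group_average(backbone, x),
                 group_average(backbone, x @ R90.T)))
```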

Weisfeiler Leman for Euclidean Equivariant Machine Learning

no code implementations • 4 Feb 2024 • Snir Hordan, Tal Amir, Nadav Dym

Finally, we show that a simple modification of this PPGN architecture can be used to obtain a universal equivariant architecture that can approximate all continuous equivariant functions uniformly.

Future Directions in Foundations of Graph Machine Learning

no code implementations • 3 Feb 2024 • Christopher Morris, Nadav Dym, Haggai Maron, İsmail İlkan Ceylan, Fabrizio Frasca, Ron Levie, Derek Lim, Michael Bronstein, Martin Grohe, Stefanie Jegelka

Machine learning on graphs, especially using graph neural networks (GNNs), has seen a surge in interest due to the wide availability of graph data across a broad spectrum of disciplines, from life to social and engineering sciences.

Position

Phase retrieval with semi-algebraic and ReLU neural network priors

no code implementations • 15 Nov 2023 • Tamir Bendory, Nadav Dym, Dan Edidin, Arun Suresh

In this paper, we study the phase retrieval problem under the prior that the signal lies in a semi-algebraic set.

Retrieval
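For context, a sketch of the standard real-valued phase retrieval setup that such a prior plugs into: the signal is observed only through magnitudes of linear measurements, so it can at best be recovered up to a global sign, and structural priors (here, membership in a semi-algebraic set) are what make recovery meaningful. The measurement matrix and dimensions below are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 4, 10
A = rng.standard_normal((m, n))   # known measurement vectors (rows of A)
x = rng.standard_normal(n)        # unknown signal

y = np.abs(A @ x)                 # magnitude-only (phaseless) measurements

# The sign of x is lost: x and -x produce identical measurements, which is
# why a structural prior on x (e.g., x lies in a semi-algebraic set) matters.
print(np.allclose(y, np.abs(A @ (-x))))
```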

Equivariant Deep Weight Space Alignment

no code implementations • 20 Oct 2023 • Aviv Navon, Aviv Shamsian, Ethan Fetaya, Gal Chechik, Nadav Dym, Haggai Maron

To accelerate the alignment process and improve its quality, we propose a novel framework aimed at learning to solve the weight alignment problem, which we name Deep-Align.
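A small sketch of the symmetry that makes weight alignment nontrivial (generic NumPy, not the Deep-Align architecture): permuting the hidden units of a one-hidden-layer ReLU MLP, and permuting its weight matrices accordingly, leaves the network function unchanged, so independently trained networks can compute similar functions while having very different raw weights.

```python
import numpy as np

rng = np.random.default_rng(1)
d, h, o = 3, 5, 2
W1, b1 = rng.standard_normal((h, d)), rng.standard_normal(h)
W2, b2 = rng.standard_normal((o, h)), rng.standard_normal(o)

def mlp(x, W1, b1, W2, b2):
    """One-hidden-layer ReLU MLP."""
    return W2 @ np.maximum(W1 @ x + b1, 0.0) + b2

# Permute the hidden units with a permutation matrix P.
P = np.eye(h)[rng.permutation(h)]
W1p, b1p, W2p = P @ W1, P @ b1, W2 @ P.T

x = rng.standard_normal(d)
# The permuted weights define exactly the same function.
print(np.allclose(mlp(x, W1, b1, W2, b2), mlp(x, W1p, b1p, W2p, b2)))
```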

Complete Neural Networks for Euclidean Graphs

no code implementations • 31 Jan 2023 • Snir Hordan, Tal Amir, Steven J. Gortler, Nadav Dym

We propose a 2-WL-like geometric graph isomorphism test and prove it is complete when applied to Euclidean Graphs in $\mathbb{R}^3$.

Property Prediction
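To give a flavor of distance-aware Weisfeiler-Leman refinements, here is a generic 1-WL-style color refinement decorated with pairwise distances. This is illustration only: the paper's test is a 2-WL-like procedure with a completeness guarantee in $\mathbb{R}^3$, which this sketch does not reproduce.

```python
import numpy as np

def geometric_wl_colors(points, rounds=3, decimals=6):
    """Generic distance-decorated WL-style color refinement on a point set.

    Each point starts with a uniform color; at every round a point's new
    color is a hash of the multiset of (other point's color, rounded
    distance) pairs. Illustration only -- not a complete isomorphism test.
    """
    n = len(points)
    dists = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    colors = [0] * n
    for _ in range(rounds):
        new_colors = []
        for i in range(n):
            signature = tuple(sorted(
                (colors[j], round(dists[i, j], decimals))
                for j in range(n) if j != i))
            new_colors.append(hash(signature))
        colors = new_colors
    return sorted(colors)  # compare as a multiset, invariant to point order

A = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
B = A @ np.array([[0.0, -1.0], [1.0, 0.0]])  # a rotated copy of A
print(geometric_wl_colors(A) == geometric_wl_colors(B))
```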

Symmetrized Robust Procrustes: Constant-Factor Approximation and Exact Recovery

no code implementations • 18 Jul 2022 • Tal Amir, Shahar Kovalsky, Nadav Dym

Our relaxation enjoys several theoretical and practical advantages: Theoretically, we prove that our method provides a $\sqrt{2}$-factor approximation to the Robust Procrustes problem, and that, under appropriate assumptions, it exactly recovers the true rigid motion from point correspondences contaminated by outliers.

Translation, Word Translation
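For reference, a sketch of the classical least-squares orthogonal Procrustes solution that this line of work robustifies: the best rotation aligning two matched point sets comes from the SVD of their cross-covariance (the Kabsch solution). This is the textbook non-robust baseline, not the paper's symmetrized robust relaxation.

```python
import numpy as np

def procrustes_rotation(X, Y):
    """Least-squares rotation R minimizing ||R @ X - Y||_F for matched
    (d, n) point sets, via the SVD of the cross-covariance Y @ X.T."""
    U, _, Vt = np.linalg.svd(Y @ X.T)
    # Fix the determinant so R is a proper rotation (det = +1), not a reflection.
    D = np.diag([1.0] * (X.shape[0] - 1) + [float(np.linalg.det(U @ Vt))])
    return U @ D @ Vt

rng = np.random.default_rng(2)
X = rng.standard_normal((3, 20))
theta = 0.7
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
Y = R_true @ X
# In the noiseless, outlier-free case the rotation is recovered exactly.
print(np.allclose(procrustes_rotation(X, Y), R_true))
```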

Low Dimensional Invariant Embeddings for Universal Geometric Learning

no code implementations • 5 May 2022 • Nadav Dym, Steven J. Gortler

We show that when a continuous family of semi-algebraic separating invariants is available, separation can be obtained by randomly selecting $2D+1$ of these invariants.
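A toy instantiation of the dimension-reduction idea (illustrative choices throughout, not the paper's construction): build a large family of permutation-invariant features of a point cloud, here sorted projections along random directions, then keep only $2D+1$ random linear combinations of them; for a finite sample of clouds, the reduced invariants still separate non-equivalent inputs with probability one.

```python
import numpy as np

rng = np.random.default_rng(3)

def invariant_features(X, directions):
    """Permutation-invariant features of a point cloud X of shape (n, d):
    the sorted projections of its points along each given direction."""
    return np.concatenate([np.sort(X @ w) for w in directions])

n, d, n_dirs = 6, 3, 50
directions = rng.standard_normal((n_dirs, d))

# A small sample of point clouds, plus a permuted copy of the first one.
clouds = [rng.standard_normal((n, d)) for _ in range(5)]
clouds.append(clouds[0][rng.permutation(n)])

feats = np.stack([invariant_features(X, directions) for X in clouds])

# Keep only 2D+1 random linear combinations of the many invariants,
# with D standing in (illustratively) for the data dimension n*d.
D = n * d
proj = rng.standard_normal((2 * D + 1, feats.shape[1]))
reduced = feats @ proj.T

print(np.allclose(reduced[0], reduced[-1]))             # permuted copy: identical
print(all(not np.allclose(reduced[i], reduced[j])       # distinct clouds stay
          for i in range(5) for j in range(i + 1, 5)))  # separated
```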

A Simple and Universal Rotation Equivariant Point-cloud Network

1 code implementation • 2 Mar 2022 • Ben Finkelshtein, Chaim Baskin, Haggai Maron, Nadav Dym

Equivariance to permutations and rigid motions is an important inductive bias for various 3D learning problems.

Inductive Bias

Neural Network Approximation of Refinable Functions

no code implementations • 28 Jul 2021 • Ingrid Daubechies, Ronald DeVore, Nadav Dym, Shira Faigenbaum-Golovin, Shahar Z. Kovalsky, Kung-Ching Lin, Josiah Park, Guergana Petrova, Barak Sober

Namely, we show that refinable functions are approximated by the outputs of deep ReLU networks with a fixed width and increasing depth with accuracy exponential in terms of their number of parameters.
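A worked special case for illustration (the paper's result concerns approximating general refinable functions by fixed-width, increasing-depth ReLU networks): the linear B-spline "hat" is itself refinable, satisfying a two-scale relation, and is represented exactly by a width-3, one-hidden-layer ReLU network.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def hat(x):
    """Linear B-spline on [0, 2] with peak 1 at x = 1, written as a
    width-3, one-hidden-layer ReLU network."""
    return relu(x) - 2.0 * relu(x - 1.0) + relu(x - 2.0)

x = np.linspace(-1.0, 3.0, 1001)

# Two-scale (refinement) relation: hat(x) = 0.5*hat(2x) + hat(2x-1) + 0.5*hat(2x-2).
lhs = hat(x)
rhs = 0.5 * hat(2 * x) + hat(2 * x - 1.0) + 0.5 * hat(2 * x - 2.0)
print(np.allclose(lhs, rhs))
```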

On the Universality of Rotation Equivariant Point Cloud Networks

no code implementations • ICLR 2021 • Nadav Dym, Haggai Maron

We first derive two sufficient conditions for an equivariant architecture to have the universal approximation property, based on a novel characterization of the space of equivariant polynomials.

Translation

Expression of Fractals Through Neural Network Functions

no code implementations • 27 May 2019 • Nadav Dym, Barak Sober, Ingrid Daubechies

The combination of this phenomenon with the capacity, demonstrated here, of DNNs to efficiently approximate IFS may contribute to the success of DNNs, particularly striking for image processing tasks, as well as suggest new algorithms for representing self similarities in images based on the DNN mechanism.

Linearly Converging Quasi Branch and Bound Algorithms for Global Rigid Registration

no code implementations • ICCV 2019 • Nadav Dym, Shahar Ziv Kovalsky

In recent years, several branch-and-bound (BnB) algorithms have been proposed to globally optimize rigid registration problems.
