Search Results for author: Ivan Dokmanić

Found 42 papers, 31 papers with code

Joint Graph Rewiring and Feature Denoising via Spectral Resonance

1 code implementation • 13 Aug 2024 • Jonas Linkerhägner, Cheng Shi, Ivan Dokmanić

In graph learning the graph and the node features both contain noisy information about the node labels.

Denoising • Graph Learning • +1

A spring-block theory of feature learning in deep neural networks

1 code implementation • 28 Jul 2024 • Cheng Shi, Liming Pan, Ivan Dokmanić

A central question in deep learning is how deep neural networks (DNNs) learn features.

Ice-Tide: Implicit Cryo-ET Imaging and Deformation Estimation

1 code implementation • 4 Mar 2024 • Valentin Debarnot, Vinith Kishore, Ricardo D. Righetto, Ivan Dokmanić

We introduce ICE-TIDE, a method for cryogenic electron tomography (cryo-ET) that simultaneously aligns observations and reconstructs a high-resolution volume.

Cryogenic Electron Tomography • Electron Tomography

GLIMPSE: Generalized Local Imaging with MLPs

1 code implementation • 1 Jan 2024 • AmirEhsan Khorashadizadeh, Valentin Debarnot, Tianlin Liu, Ivan Dokmanić

Deep learning is the current de facto state of the art in tomographic imaging.

Implicit Reconstructions from Deformed Projections for CryoET

no code implementations • 25 Jul 2023 • Vinith Kishore, Valentin Debarnot, Ivan Dokmanić

Cryo-electron tomography (cryoET) is a technique that captures images of biological samples at different tilts, preserving their native state as much as possible.

Cryogenic Electron Tomography • Electron Tomography

High-Rate Phase Association with Travel Time Neural Fields

1 code implementation • 14 Jul 2023 • Cheng Shi, Maarten V. de Hoop, Ivan Dokmanić

Existing techniques relying on coarsely approximated, fixed wave speed models fail in this unexplored dense regime where the complexity of unknown wave speed cannot be ignored.

A Graph Dynamics Prior for Relational Inference

1 code implementation • 9 Jun 2023 • Liming Pan, Cheng Shi, Ivan Dokmanić

In this work, we propose a \textit{graph dynamics prior} (GDP) for relational inference.

Graph Learning • Graph Neural Network

An Approximation Theory for Metric Space-Valued Functions With A View Towards Deep Learning

no code implementations • 24 Apr 2023 • Anastasis Kratsios, Chong Liu, Matti Lassas, Maarten V. de Hoop, Ivan Dokmanić

Motivated by the developing mathematics of deep learning, we build universal functions approximators of continuous maps between arbitrary Polish metric spaces $\mathcal{X}$ and $\mathcal{Y}$ using elementary functions between Euclidean spaces as building blocks.

Injectivity of ReLU networks: perspectives from statistical physics

1 code implementation • 27 Feb 2023 • Antoine Maillard, Afonso S. Bandeira, David Belius, Ivan Dokmanić, Shuta Nakajima

Recent work connects this problem to spherical integral geometry giving rise to a conjectured sharp injectivity threshold for $\alpha = \frac{m}{n}$ by studying the expected Euler characteristic of a certain random set.

Deep Injective Prior for Inverse Scattering

1 code implementation • 8 Jan 2023 • AmirEhsan Khorashadizadeh, Vahid Khorashadizadeh, Sepehr Eskandari, Guy A. E. Vandenbosch, Ivan Dokmanić

Unlike supervised methods that necessitate both scattered fields and target permittivities, our method only requires the target permittivities for training; it can then be used with any experimental setup.

Uncertainty Quantification

Homophily modulates double descent generalization in graph convolution networks

1 code implementation • 26 Dec 2022 • Cheng Shi, Liming Pan, Hong Hu, Ivan Dokmanić

Motivated by experimental observations of "transductive" double descent in key networks and datasets, we use analytical tools from statistical physics and random matrix theory to precisely characterize generalization in simple graph convolution networks on the contextual stochastic block model.

Graph Learning • Learning Theory • +1
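
The contextual stochastic block model underlying this analysis couples a two-community graph with label-correlated Gaussian node features. The sketch below samples one standard parameterization of such a model purely for illustration; the parameter names (p_in, p_out, mu) are assumptions, not the paper's notation.

```python
import numpy as np

def sample_csbm(n, d, p_in, p_out, mu, seed=0):
    # Two-community contextual stochastic block model: edge probabilities
    # depend on community membership, and Gaussian node features have a
    # mean direction correlated with the labels.
    rng = np.random.default_rng(seed)
    y = rng.choice([-1.0, 1.0], size=n)                      # community labels
    probs = np.where(np.equal.outer(y, y), p_in, p_out)      # edge probabilities
    A = np.triu(rng.random((n, n)) < probs, k=1).astype(float)
    A = A + A.T                                              # symmetric, no self-loops
    u = rng.standard_normal(d) / np.sqrt(d)                  # latent feature direction
    X = np.sqrt(mu / n) * np.outer(y, u) + rng.standard_normal((n, d)) / np.sqrt(d)
    return A, X, y

A, X, y = sample_csbm(n=500, d=100, p_in=0.05, p_out=0.01, mu=2.0)
```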

FunkNN: Neural Interpolation for Functional Generation

1 code implementation • 20 Dec 2022 • AmirEhsan Khorashadizadeh, Anadi Chaman, Valentin Debarnot, Ivan Dokmanić

Our answer is FunkNN -- a new convolutional network which learns how to reconstruct continuous images at arbitrary coordinates and can be applied to any image dataset.

Image Generation

Deep Variational Inverse Scattering

1 code implementation • 8 Dec 2022 • AmirEhsan Khorashadizadeh, Ali Aghababaei, Tin Vlašić, Hieu Nguyen, Ivan Dokmanić

Inverse medium scattering solvers generally reconstruct a single solution without an associated measure of uncertainty.

Differentiable Uncalibrated Imaging

1 code implementation • 18 Nov 2022 • Sidharth Gupta, Konik Kothari, Valentin Debarnot, Ivan Dokmanić

We propose a differentiable imaging framework to address uncertainty in measurement coordinates such as sensor locations and projection angles.

Image Reconstruction

Small Transformers Compute Universal Metric Embeddings

1 code implementation • NeurIPS 2023 • Anastasis Kratsios, Valentin Debarnot, Ivan Dokmanić

We derive embedding guarantees for feature maps implemented by small neural networks called \emph{probabilistic transformers}.

Memorization

Manifold Rewiring for Unlabeled Imaging

no code implementations • 12 Sep 2022 • Valentin Debarnot, Vinith Kishore, Cheng Shi, Ivan Dokmanić

We illustrate our graph denoising framework on regular synthetic graphs and then apply it to single-particle cryo-EM where the measurements are corrupted by very high levels of noise.

Denoising • Link Prediction

Orthogonal Matrix Retrieval with Spatial Consensus for 3D Unknown-View Tomography

1 code implementation • 6 Jul 2022 • Shuai Huang, Mona Zehni, Ivan Dokmanić, Zhizhen Zhao

Unknown-view tomography (UVT) reconstructs a 3D density map from its 2D projections at unknown, random orientations.

Retrieval

Implicit Neural Representation for Mesh-Free Inverse Obstacle Scattering

no code implementations • 4 Jun 2022 • Tin Vlašić, Hieu Nguyen, AmirEhsan Khorashadizadeh, Ivan Dokmanić

In this paper, we introduce an implicit neural representation-based framework for solving the inverse obstacle scattering problem in a mesh-free fashion.

Neural Link Prediction with Walk Pooling

1 code implementation • ICLR 2022 • Liming Pan, Cheng Shi, Ivan Dokmanić

Instead of extracting transition probabilities from the original graph, it computes the transition matrix of a "predictive" latent graph by applying attention to learned features; this may be interpreted as feature-sensitive topology fingerprinting.

Link Prediction
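
For context, classical walk-based link prediction works with the random-walk transition matrix of the observed graph, P = D^{-1} A, and its powers. The sketch below computes that baseline quantity; it is not the paper's attention-based latent-graph construction.

```python
import numpy as np

def walk_transition_profiles(A, k):
    # Row-normalize the adjacency matrix to get the random-walk transition
    # matrix P = D^{-1} A, then return P, P^2, ..., P^k; entry (i, j) of the
    # t-th power is the t-step transition probability from node i to node j.
    deg = A.sum(axis=1, keepdims=True)
    P = A / np.maximum(deg, 1e-12)
    return [np.linalg.matrix_power(P, t) for t in range(1, k + 1)]
```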

Universal Joint Approximation of Manifolds and Densities by Simple Injective Flows

no code implementations • 8 Oct 2021 • Michael Puthawala, Matti Lassas, Ivan Dokmanić, Maarten de Hoop

We show that in general, injective flows between $\mathbb{R}^n$ and $\mathbb{R}^m$ universally approximate measures supported on images of extendable embeddings, which are a subset of standard embeddings: when the embedding dimension $m$ is small, topological obstructions may preclude certain manifolds as admissible targets.

Universal Approximation Under Constraints is Possible with Transformers

no code implementations • ICLR 2022 • Anastasis Kratsios, Behnoosh Zamanlooy, Tianlin Liu, Ivan Dokmanić

Many practical problems need the output of a machine learning model to satisfy a set of constraints, $K$.

Localizing Unsynchronized Sensors with Unknown Sources

1 code implementation • 6 Feb 2021 • Dalia El Badawy, Viktor Larsson, Marc Pollefeys, Ivan Dokmanić

We look at the general case where neither the emission times of the sources nor the reference time frames of the receivers are known.
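
To make the setting concrete, the toy model below generates time-of-arrival measurements in which every source has an unknown emission time and every receiver an unknown clock offset. This is a standard formulation assumed here for illustration only, with hypothetical variable names.

```python
import numpy as np

# Toy time-of-arrival model (illustrative only):
# t[i, j] = ||r_i - s_j|| / c + tau_j (unknown emission time) + delta_i (unknown clock offset)
rng = np.random.default_rng(1)
c = 343.0                                  # speed of sound in m/s
R = rng.uniform(0, 10, size=(5, 3))        # receiver positions
S = rng.uniform(0, 10, size=(4, 3))        # source positions
tau = rng.uniform(0, 0.1, size=4)          # unknown emission times
delta = rng.uniform(0, 0.05, size=5)       # unknown per-receiver clock offsets
T = np.linalg.norm(R[:, None, :] - S[None, :, :], axis=-1) / c + tau[None, :] + delta[:, None]
```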

Total Least Squares Phase Retrieval

1 code implementation • 1 Feb 2021 • Sidharth Gupta, Ivan Dokmanić

We address the phase retrieval problem with errors in the sensing vectors.

Retrieval

Truly shift-invariant convolutional neural networks

2 code implementations • CVPR 2021 • Anadi Chaman, Ivan Dokmanić

Thanks to the use of convolution and pooling layers, convolutional neural networks were for a long time thought to be shift-invariant.

Data Augmentation
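
The culprit behind the loss of exact shift invariance is strided downsampling inside pooling layers. The minimal demo below illustrates the problem (not the paper's remedy): shifting the input by one sample changes the output of a stride-2 max pool.

```python
import numpy as np

def maxpool1d(x, stride=2):
    # Strided max pooling: the downsampling step that breaks exact shift invariance.
    return np.max(x[: len(x) // stride * stride].reshape(-1, stride), axis=1)

x = np.array([0., 0., 1., 5., 1., 0., 0., 0.])
print(maxpool1d(x))              # [0. 5. 1. 0.]
print(maxpool1d(np.roll(x, 1)))  # [0. 1. 5. 0.] -- not a shifted copy of the first output
```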

Geometry of Similarity Comparisons

no code implementations • 17 Jun 2020 • Puoya Tabaghi, Jianhao Peng, Olgica Milenkovic, Ivan Dokmanić

To study this question, we introduce the notions of the \textit{ordinal capacity} of a target space form and \emph{ordinal spread} of the similarity measurements.

Globally Injective ReLU Networks

no code implementations • 15 Jun 2020 • Michael Puthawala, Konik Kothari, Matti Lassas, Ivan Dokmanić, Maarten de Hoop

Injectivity plays an important role in generative models where it enables inference; in inverse problems and compressed sensing with generative priors it is a precursor to well posedness.

Learning the geometry of wave-based imaging

1 code implementation • NeurIPS 2020 • Konik Kothari, Maarten de Hoop, Ivan Dokmanić

We propose a general physics-based deep learning architecture for wave-based imaging problems.

Inductive Bias • Position

Hyperbolic Distance Matrices

no code implementations • 18 May 2020 • Puoya Tabaghi, Ivan Dokmanić

Hyperbolic space is a natural setting for mining and visualizing data with hierarchical structure.
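
For reference, distances in the hyperboloid model of hyperbolic space are obtained from the Lorentzian inner product; the sketch below uses that standard formula, an assumption about conventions rather than a statement of the paper's algorithm.

```python
import numpy as np

def lorentz_inner(x, y):
    # Lorentzian inner product [x, y] = -x_0 y_0 + x_1 y_1 + ... + x_d y_d.
    return -x[0] * y[0] + np.dot(x[1:], y[1:])

def hyperbolic_distance(x, y):
    # Geodesic distance between points on the hyperboloid {x : [x, x] = -1, x_0 > 0}.
    return np.arccosh(np.maximum(-lorentz_inner(x, y), 1.0))

def lift(z):
    # Lift a Euclidean point z onto the hyperboloid: x = (sqrt(1 + ||z||^2), z).
    return np.concatenate(([np.sqrt(1.0 + z @ z)], z))

print(hyperbolic_distance(lift(np.array([0.3, 0.1])), lift(np.array([-0.5, 0.7]))))
```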

Fast Optical System Identification by Numerical Interferometry

1 code implementation • 4 Nov 2019 • Sidharth Gupta, Rémi Gribonval, Laurent Daudet, Ivan Dokmanić

Our method simplifies the calibration of optical transmission matrices from a quadratic to a linear inverse problem by first recovering the phase of the measurements.

Retrieval
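
To illustrate the quadratic-to-linear reduction: once the measurement phases are available, estimating a transmission matrix from known probe inputs is an ordinary linear least-squares problem. The sketch below is a toy version with hypothetical dimensions, not the paper's calibration pipeline.

```python
import numpy as np

# Toy calibration: with phase-recovered (complex) outputs Y for known probe
# inputs X, the transmission matrix A solves a linear least-squares problem.
rng = np.random.default_rng(0)
N, M, K = 32, 64, 128                      # input dim, output dim, number of probes
A_true = (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N))) / np.sqrt(2)
X = rng.standard_normal((N, K))            # known calibration inputs
Y = A_true @ X                             # phase-recovered measurements
A_hat = np.linalg.lstsq(X.T.astype(complex), Y.T, rcond=None)[0].T
print(np.linalg.norm(A_hat - A_true) / np.linalg.norm(A_true))  # ~0 in the noiseless case
```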

The fastest $\ell_{1,\infty}$ prox in the west

1 code implementation • 9 Oct 2019 • Benjamín Béjar, Ivan Dokmanić, René Vidal

In this paper we study the proximal operator of the mixed $\ell_{1,\infty}$ matrix norm and show that it can be computed in closed form by applying the well-known soft-thresholding operator to each column of the matrix.
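
As a worked illustration, assume the convention $\|X\|_{1,\infty} = \sum_j \|x_j\|_\infty$ (an assumption made here). The prox then acts column-wise and, by the Moreau decomposition, each column's prox is the column minus its projection onto an $\ell_1$ ball, which amounts to soft-thresholding with a column-dependent threshold. The sketch below implements this generic route and is not necessarily the paper's closed form.

```python
import numpy as np

def project_l1_ball(v, radius):
    # Euclidean projection of v onto {u : ||u||_1 <= radius}; this is
    # soft-thresholding with a data-dependent threshold.
    if np.abs(v).sum() <= radius:
        return v.copy()
    u = np.sort(np.abs(v))[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > (css - radius))[0][-1]
    theta = (css[rho] - radius) / (rho + 1.0)
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def prox_l1_inf(V, lam):
    # Column-wise prox via the Moreau decomposition:
    # prox_{lam*||.||_inf}(v) = v - proj_{||.||_1 <= lam}(v).
    out = np.empty_like(V)
    for j in range(V.shape[1]):
        out[:, j] = V[:, j] - project_l1_ball(V[:, j], lam)
    return out
```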

Don't take it lightly: Phasing optical random projections with unknown operators

1 code implementation • NeurIPS 2019 • Sidharth Gupta, Rémi Gribonval, Laurent Daudet, Ivan Dokmanić

A signal of interest $\mathbf{\xi} \in \mathbb{R}^N$ is mixed by a random scattering medium to compute the projection $\mathbf{y} = \mathbf{A} \mathbf{\xi}$, with $\mathbf{A} \in \mathbb{C}^{M \times N}$ being a realization of a standard complex Gaussian iid random matrix.

Quantization • Retrieval
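
A minimal simulation of the forward model stated above, with a standard complex Gaussian iid matrix; in the optical setting only the magnitudes of $\mathbf{y}$ are recorded, which is why the phase must be recovered, but no phasing is performed here.

```python
import numpy as np

# Forward model y = A xi with a standard complex Gaussian iid matrix A
# (illustrative simulation of the measurement setup described above).
rng = np.random.default_rng(0)
M, N = 256, 128
A = (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N))) / np.sqrt(2)
xi = rng.standard_normal(N)
y = A @ xi
magnitudes = np.abs(y)  # an optical sensor records only magnitudes; the phase is what must be recovered
```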

Solving Complex Quadratic Systems with Full-Rank Random Matrices

1 code implementation • 14 Feb 2019 • Shuai Huang, Sidharth Gupta, Ivan Dokmanić

We tackle the problem of recovering a complex signal $\boldsymbol x\in\mathbb{C}^n$ from quadratic measurements of the form $y_i=\boldsymbol x^*\boldsymbol A_i\boldsymbol x$, where $\boldsymbol A_i$ is a full-rank, complex random measurement matrix whose entries are generated from a rotation-invariant sub-Gaussian distribution.

Information Theory
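
The measurement model can be simulated directly. The sketch below draws full-rank complex Gaussian matrices $\boldsymbol A_i$ (one rotation-invariant sub-Gaussian choice) and forms $y_i=\boldsymbol x^*\boldsymbol A_i\boldsymbol x$, purely as an illustration of the problem setup.

```python
import numpy as np

# Simulate y_i = x^* A_i x with full-rank complex Gaussian measurement
# matrices (an illustrative instance of the model above).
rng = np.random.default_rng(0)
n, m = 16, 200
x = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
A = (rng.standard_normal((m, n, n)) + 1j * rng.standard_normal((m, n, n))) / np.sqrt(2)
y = np.einsum('i,kij,j->k', x.conj(), A, x)   # m quadratic measurements
```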

Learning Schatten--von Neumann Operators

no code implementations • 29 Jan 2019 • Puoya Tabaghi, Maarten de Hoop, Ivan Dokmanić

We study the learnability of a class of compact operators known as Schatten--von Neumann operators.

Learning Theory

Geometric Invariants for Sparse Unknown View Tomography

1 code implementation • 25 Nov 2018 • Mona Zehni, Shuai Huang, Ivan Dokmanić, Zhizhen Zhao

For a point source model, we show that these features reveal geometric information about the model such as the radial and pairwise distances.

Random mesh projectors for inverse problems

1 code implementation • ICLR 2019 • Sidharth Gupta, Konik Kothari, Maarten V. de Hoop, Ivan Dokmanić

We show that in this case the common approach to directly learn the mapping from the measured data to the reconstruction becomes unstable.

Reconstructing Point Sets from Distance Distributions

1 code implementation • 6 Apr 2018 • Shuai Huang, Ivan Dokmanić

Our method is the first practical approach to solve the large-scale noisy beltway problem where the points lie on a loop.

Pyroomacoustics: A Python package for audio room simulations and array processing algorithms

2 code implementations • 11 Oct 2017 • Robin Scheibler, Eric Bezzam, Ivan Dokmanić

We present pyroomacoustics, a software package aimed at the rapid development and testing of audio array processing algorithms.

Sound • Audio and Speech Processing
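
A minimal usage sketch based on the package's documented shoebox-room workflow; argument names may differ between versions, so treat it as an approximation rather than authoritative API documentation.

```python
import numpy as np
import pyroomacoustics as pra

fs = 16000
signal = np.random.default_rng(0).standard_normal(fs)   # 1 s of noise as a test source

# Shoebox room simulated with the image-source model up to reflection order 10.
room = pra.ShoeBox([6.0, 5.0, 3.0], fs=fs, max_order=10)
room.add_source([2.0, 3.0, 1.5], signal=signal)

# Two-microphone array; positions are passed as a (3, n_mics) array.
mic_locs = np.c_[[1.0, 1.0, 1.2], [1.1, 1.0, 1.2]]
room.add_microphone_array(pra.MicrophoneArray(mic_locs, fs))

room.simulate()                     # convolve the source with the simulated room impulse responses
recorded = room.mic_array.signals   # shape: (n_mics, n_samples)
```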

Inverse Problems with Invariant Multiscale Statistics

no code implementations • 18 Sep 2016 • Ivan Dokmanić, Joan Bruna, Stéphane Mallat, Maarten de Hoop

We propose a new approach to linear ill-posed inverse problems.

Computational Engineering, Finance, and Science
