Search Results for author: Eldad Haber

Found 31 papers, 12 papers with code

Every Node Counts: Improving the Training of Graph Neural Networks on Node Classification

no code implementations29 Nov 2022 Moshe Eliasof, Eldad Haber, Eran Treister

In this paper, we propose novel objective terms for the training of GNNs for node classification, aiming to exploit all the available data and improve accuracy.

Node Classification

Neural DAEs: Constrained neural networks

no code implementations25 Nov 2022 Tue Boesen, Eldad Haber, Uri M. Ascher

In this article we investigate the effect of explicitly adding auxiliary trajectory information to neural networks for dynamical systems.

Estimating a potential without the agony of the partition function

no code implementations19 Aug 2022 Eldad Haber, Moshe Eliasof, Luis Tenorio

In this paper we propose an alternative approach based on Maximum A-Posteriori (MAP) estimators, which we name Maximum Recovery MAP (MR-MAP), to derive estimators that do not require computing the partition function, and we reformulate the problem as an optimization problem.

pathGCN: Learning General Graph Spatial Operators from Paths

no code implementations15 Jul 2022 Moshe Eliasof, Eldad Haber, Eran Treister

In the context of GCNs, unlike in CNNs, a pre-determined spatial operator based on the graph Laplacian is often chosen, allowing only the point-wise operations to be learned.
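The fixed-operator design described in this excerpt can be sketched in a few lines (an illustrative toy, not the pathGCN code): the spatial operator S is pre-determined from the graph Laplacian/adjacency, and only the channel-mixing weights W would be learned.

```python
import numpy as np

# Minimal GCN-style layer (illustrative sketch, not the pathGCN code):
# the spatial operator is fixed from the graph structure, and only the
# point-wise (channel-mixing) weights W are learnable.
def gcn_layer(X, A, W):
    # Symmetrically normalized adjacency with self-loops: D^{-1/2}(A+I)D^{-1/2}.
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    S = D_inv_sqrt @ A_hat @ D_inv_sqrt   # fixed spatial operator
    return np.maximum(S @ X @ W, 0.0)     # ReLU(S X W)

# Toy graph: 4 nodes in a path, 3 input channels, 2 output channels.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.arange(12, dtype=float).reshape(4, 3)
W = 0.1 * np.ones((3, 2))
H = gcn_layer(X, A, W)
print(H.shape)  # (4, 2)
```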

A-Optimal Active Learning

2 code implementations18 Oct 2021 Tue Boesen, Eldad Haber

The first is based on a Bayesian interpretation of the semi-supervised learning problem, with the graph Laplacian used for the prior distribution; the second is based on a frequentist approach that updates the estimate of the bias term based on the recovery of the labels.

Active Learning Experimental Design
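The Bayesian interpretation mentioned above, with a graph-Laplacian prior, can be illustrated by a minimal MAP label-recovery sketch (the toy graph, parameter names, and values are assumptions, not the paper's code):

```python
import numpy as np

# Bayesian semi-supervised sketch: with a Gaussian prior N(0, (L + tau I)^{-1})
# built from the graph Laplacian L, the MAP label estimate given a few
# observed nodes solves (P^T P / sigma^2 + L + tau I) u = P^T y / sigma^2.
A = np.array([[0, 1, 1, 0, 0, 0],
              [1, 0, 1, 0, 0, 0],
              [1, 1, 0, 1, 0, 0],
              [0, 0, 1, 0, 1, 1],
              [0, 0, 0, 1, 0, 1],
              [0, 0, 0, 1, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A          # graph Laplacian
y = np.array([1.0, -1.0])               # labels observed at nodes 0 and 5
P = np.zeros((2, 6)); P[0, 0] = 1.0; P[1, 5] = 1.0
sigma2, tau = 1e-2, 1e-3
H = P.T @ P / sigma2 + L + tau * np.eye(6)
u = np.linalg.solve(H, P.T @ y / sigma2)
print(np.sign(u))  # cluster {0,1,2} -> +1, cluster {3,4,5} -> -1
```

The two triangles connected by a single edge form two clusters; the Laplacian prior propagates the two observed labels to all unlabeled nodes.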

PDE-GCN: Novel Architectures for Graph Neural Networks Motivated by Partial Differential Equations

1 code implementation NeurIPS 2021 Moshe Eliasof, Eldad Haber, Eran Treister

Moreover, as we demonstrate using an extensive set of experiments, our PDE-motivated networks can generalize and be effective for various types of problems from different fields.

An Introduction to Deep Generative Modeling

1 code implementation9 Mar 2021 Lars Ruthotto, Eldad Haber

Developing DGMs has become one of the most hotly researched fields in artificial intelligence in recent years.

Mimetic Neural Networks: A unified framework for Protein Design and Folding

no code implementations7 Feb 2021 Moshe Eliasof, Tue Boesen, Eldad Haber, Chen Keasar, Eran Treister

Recent advancements in machine learning techniques for protein folding motivate better results in its inverse problem -- protein design.

BIG-bench Machine Learning Protein Folding

Segmentation of Pulmonary Opacification in Chest CT Scans of COVID-19 Patients

1 code implementation7 Jul 2020 Keegan Lensink, Issam Laradji, Marco Law, Paolo Emilio Barbano, Savvas Nicolaou, William Parker, Eldad Haber

In this work we provide open source models for the segmentation of patterns of pulmonary opacification on chest Computed Tomography (CT) scans which have been correlated with various stages and severities of infection.

Computed Tomography (CT) Domain Adaptation

Fully reversible neural networks for large-scale surface and sub-surface characterization via remote sensing

no code implementations16 Mar 2020 Bas Peters, Eldad Haber, Keegan Lensink

The large spatial/frequency scale of hyperspectral and airborne magnetic and gravitational data causes memory issues when using convolutional neural networks for (sub-) surface characterization.

Change Detection

Symmetric block-low-rank layers for fully reversible multilevel neural networks

no code implementations14 Dec 2019 Bas Peters, Eldad Haber, Keegan Lensink

Factors that limit the size of the input and output of a neural network include memory requirements for the network states/activations to compute gradients, as well as memory for the convolutional kernels or other weights.

Video Segmentation Video Semantic Segmentation
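The memory limitation described above is what reversible architectures address: if each block is exactly invertible, activations can be recomputed during the backward pass instead of stored. A minimal additive-coupling sketch (illustrative only; f and g stand in for learned layers):

```python
import numpy as np

# Reversible block (illustrative): split channels into (x1, x2) and apply
#   y1 = x1 + f(x2);  y2 = x2 + g(y1)
# The input can be recovered exactly from the output, so intermediate
# activations need not be kept in memory for gradient computation.
def f(z): return np.tanh(z)
def g(z): return 0.5 * z

def forward(x1, x2):
    y1 = x1 + f(x2)
    y2 = x2 + g(y1)
    return y1, y2

def inverse(y1, y2):
    x2 = y2 - g(y1)
    x1 = y1 - f(x2)
    return x1, x2

rng = np.random.default_rng(0)
x1, x2 = rng.standard_normal(8), rng.standard_normal(8)
y1, y2 = forward(x1, x2)
r1, r2 = inverse(y1, y2)
print(np.allclose(r1, x1) and np.allclose(r2, x2))  # True
```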

LeanConvNets: Low-cost Yet Effective Convolutional Neural Networks

no code implementations29 Oct 2019 Jonathan Ephrath, Moshe Eliasof, Lars Ruthotto, Eldad Haber, Eran Treister

In practice, the input data and the hidden features consist of a large number of channels, which in most CNNs are fully coupled by the convolution operators.

Image Classification Semantic Segmentation +2

Fluid Flow Mass Transport for Generative Networks

no code implementations3 Oct 2019 Jingrong Lin, Keegan Lensink, Eldad Haber

Generative Adversarial Networks have been shown to be powerful in generating content.

Fully Hyperbolic Convolutional Neural Networks

no code implementations24 May 2019 Keegan Lensink, Bas Peters, Eldad Haber

However, their application to problems with high dimensional input and output, such as high-resolution image and video segmentation or 3D medical imaging, has been limited by various factors.

Depth Estimation General Classification +5

LeanResNet: A Low-cost Yet Effective Convolutional Residual Networks

no code implementations15 Apr 2019 Jonathan Ephrath, Lars Ruthotto, Eldad Haber, Eran Treister

Convolutional Neural Networks (CNNs) filter the input data using spatial convolution operators with compact stencils.

General Classification Image Classification

Neural-networks for geophysicists and their application to seismic data interpretation

no code implementations27 Mar 2019 Bas Peters, Eldad Haber, Justin Granek

Neural-networks have seen a surge of interest for the interpretation of seismic images during the last few years.

IMEXnet: A Forward Stable Deep Neural Network

1 code implementation6 Mar 2019 Eldad Haber, Keegan Lensink, Eran Treister, Lars Ruthotto

Deep convolutional neural networks have revolutionized many machine learning and computer vision tasks, however, some remaining key challenges limit their wider use.

Semantic Segmentation

ADMM-SOFTMAX : An ADMM Approach for Multinomial Logistic Regression

1 code implementation27 Jan 2019 Samy Wu Fung, Sanna Tyrväinen, Lars Ruthotto, Eldad Haber

Solution of the least-squares problem can be accelerated by pre-computing a factorization or preconditioner, and the separable, smooth, convex problem can be easily parallelized across examples.

General Classification Image Classification +2
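The speedup from pre-computing a factorization, as described above, can be sketched as follows (an illustrative toy; rho and the sizes are made-up values, not the paper's code): in ADMM the least-squares matrix stays fixed across iterations, so it is factorized once and only cheap triangular solves remain.

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

# The matrix A^T A + rho * I does not change over ADMM iterations,
# so its Cholesky factorization is computed once and reused.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 10))
rho = 1.0
M = A.T @ A + rho * np.eye(10)
factor = cho_factor(M)            # O(n^3) Cholesky, done once

for _ in range(5):                # each iteration: only O(n^2) triangular solves
    b = rng.standard_normal(10)
    x = cho_solve(factor, b)
print(np.allclose(M @ x, b))      # True
```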

Automatic classification of geologic units in seismic images using partially interpreted examples

no code implementations12 Jan 2019 Bas Peters, Justin Granek, Eldad Haber

Tests on seismic images and interpretation information from the Sea of Ireland show that we obtain high-quality predicted interpretations from a small number of large seismic images.

General Classification Seismic Interpretation +1

Multi-resolution neural networks for tracking seismic horizons from few training images

no code implementations26 Dec 2018 Bas Peters, Justin Granek, Eldad Haber

Our networks learn from a small number of large seismic images without creating patches.

GlymphVIS: Visualizing Glymphatic Transport Pathways Using Regularized Optimal Transport

no code implementations24 Aug 2018 Rena Elkin, Saad Nadeem, Eldad Haber, Klara Steklova, Hedok Lee, Helene Benveniste, Allen Tannenbaum

The glymphatic system (GS) is a transit passage that facilitates brain metabolic waste removal and its dysfunction has been associated with neurodegenerative diseases such as Alzheimer's disease.

Never look back - A modified EnKF method and its application to the training of neural networks without back propagation

no code implementations21 May 2018 Eldad Haber, Felix Lucka, Lars Ruthotto

Further, we provide numerical examples that demonstrate the potential of our method for training deep neural networks.

Deep Neural Networks Motivated by Partial Differential Equations

1 code implementation12 Apr 2018 Lars Ruthotto, Eldad Haber

In the latter area, PDE-based approaches interpret image data as discretizations of multivariate functions and the output of image processing algorithms as solutions to certain PDEs.

Denoising Image Classification +2
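The PDE interpretation described above can be made concrete with a classic example (an illustrative sketch, not the paper's networks): treating an image as a discretized function u(x, y), explicit steps of the heat equation u_t = Δu act as a denoiser.

```python
import numpy as np

def heat_step(u, h):
    # 5-point discrete Laplacian with replicated (Neumann-like) boundaries.
    up = np.pad(u, 1, mode='edge')
    lap = up[:-2, 1:-1] + up[2:, 1:-1] + up[1:-1, :-2] + up[1:-1, 2:] - 4 * u
    return u + h * lap

rng = np.random.default_rng(0)
clean = np.zeros((16, 16))
clean[4:12, 4:12] = 1.0                      # a simple square image
noisy = clean + 0.4 * rng.standard_normal(clean.shape)
u = noisy
for _ in range(3):
    u = heat_step(u, h=0.2)                  # h <= 0.25 keeps explicit Euler stable
print(np.mean((u - clean) ** 2) < np.mean((noisy - clean) ** 2))  # True
```

A few diffusion steps reduce the mean-squared error against the clean image; too many steps would blur the edges away, which is why learned or nonlinear variants are preferred in practice.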

Multi-level Residual Networks from Dynamical Systems View

no code implementations ICLR 2018 Bo Chang, Lili Meng, Eldad Haber, Frederick Tung, David Begert

Deep residual networks (ResNets) and their variants are widely used in many computer vision applications and natural language processing tasks.

General Classification Image Classification

Reversible Architectures for Arbitrarily Deep Residual Neural Networks

2 code implementations12 Sep 2017 Bo Chang, Lili Meng, Eldad Haber, Lars Ruthotto, David Begert, Elliot Holtham

In this work, we interpret deep residual networks as ordinary differential equations (ODEs), which have long been studied in mathematics and physics with rich theoretical and empirical success.

Image Classification
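The ODE interpretation described above can be sketched in a few lines (an illustrative toy, not the paper's architectures): a residual update x_{k+1} = x_k + h·f(x_k) is one forward-Euler step of dx/dt = f(x), with depth playing the role of integration time.

```python
import numpy as np

def f(x, W):
    # A simple residual function: tanh nonlinearity after a linear map.
    return np.tanh(W @ x)

def resnet_forward(x, W, h, depth):
    # Each residual block is one forward-Euler step with step size h.
    for _ in range(depth):
        x = x + h * f(x, W)
    return x

rng = np.random.default_rng(0)
W = 0.1 * rng.standard_normal((4, 4))
x0 = rng.standard_normal(4)
# Doubling the depth while halving h traverses the same ODE trajectory,
# so the two networks compute nearly the same map.
x_coarse = resnet_forward(x0, W, h=0.2, depth=10)
x_fine = resnet_forward(x0, W, h=0.1, depth=20)
print(float(np.linalg.norm(x_coarse - x_fine)))
```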

A numerical method for efficient 3D inversions using Richards equation

no code implementations11 Jun 2017 Rowan Cockett, Lindsey J. Heagy, Eldad Haber

Fluid flow in the vadose zone is governed by Richards equation; it is parameterized by hydraulic conductivity, which is a nonlinear function of pressure head.

Stable Architectures for Deep Neural Networks

4 code implementations9 May 2017 Eldad Haber, Lars Ruthotto

While our new architectures restrict the solution space, several numerical experiments show their competitiveness with state-of-the-art networks.

Learning across scales - A multiscale method for Convolution Neural Networks

1 code implementation6 Mar 2017 Eldad Haber, Lars Ruthotto, Elliot Holtham, Seong-Hwan Jun

In this work we establish the relation between optimal control and training deep Convolution Neural Networks (CNNs).

jInv -- a flexible Julia package for PDE parameter estimation

3 code implementations23 Jun 2016 Lars Ruthotto, Eran Treister, Eldad Haber

Estimating parameters of Partial Differential Equations (PDEs) from noisy and indirect measurements often requires solving ill-posed inverse problems.

Mathematical Software
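The ill-posedness mentioned above is typically handled by regularization. A minimal Tikhonov sketch (the toy forward operator, noise level, and alpha are assumptions, unrelated to jInv's actual API):

```python
import numpy as np

# Recover a parameter m from noisy, indirect data d = J m + noise,
# where the forward operator J is smoothing and badly conditioned.
rng = np.random.default_rng(0)
n = 20
J = np.tril(np.ones((n, n))) / n                  # discrete integration operator
m_true = np.sin(np.linspace(0.0, np.pi, n))
d = J @ m_true + 1e-2 * rng.standard_normal(n)    # noisy measurements

def tikhonov(J, d, alpha):
    # Solve min_m ||J m - d||^2 + alpha * ||m||^2 via the normal equations.
    k = J.shape[1]
    return np.linalg.solve(J.T @ J + alpha * np.eye(k), J.T @ d)

m_naive = np.linalg.solve(J, d)                   # unregularized: amplifies noise
m_reg = tikhonov(J, d, alpha=1e-2)
print(np.linalg.norm(m_reg - m_true) < np.linalg.norm(m_naive - m_true))  # True
```

Direct inversion amounts to numerical differentiation of the noisy data and amplifies the noise; the regularized solve trades a small bias for a much smaller variance.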
