no code implementations • 12 Aug 2024 • Holger Boche, Vit Fojtik, Adalbert Fono, Gitta Kutyniok

The unwavering success of deep learning in the past decade led to the increasing prevalence of deep learning methods in various application fields.

no code implementations • 4 Apr 2024 • Sohir Maskey, Gitta Kutyniok, Ron Levie

In this more realistic and challenging scenario, we provide a generalization bound that decreases as the average number of nodes in the graphs increases.

1 code implementation • 20 Mar 2024 • Raffaele Paolino, Sohir Maskey, Pascal Welke, Gitta Kutyniok

We introduce $r$-loopy Weisfeiler-Leman ($r$-$\ell{}$WL), a novel hierarchy of graph isomorphism tests and a corresponding GNN framework, $r$-$\ell{}$MPNN, that can count cycles up to length $r + 2$.

no code implementations • 11 Feb 2024 • Beatrice Lorenz, Aras Bacho, Gitta Kutyniok

This paper provides rigorous error bounds for physics-informed neural networks approximating the semilinear wave equation.

no code implementations • 18 Jan 2024 • Holger Boche, Adalbert Fono, Gitta Kutyniok

Motivated by the observation that the current evolution of deep learning models necessitates a change in computing technology, we derive a mathematical framework which enables us to analyze whether a transparent implementation in a computing model is feasible.

no code implementations • 16 Dec 2023 • Stefan Kolek, Aditya Chattopadhyay, Kwan Ho Ryan Chan, Hector Andrade-Loarca, Gitta Kutyniok, René Vidal

To solve the optimization problem, we propose a new query dictionary learning algorithm inspired by classical sparse dictionary learning.

1 code implementation • 25 Oct 2023 • Gabriel Mukobi, Peter Chatain, Su Fong, Robert Windesheim, Gitta Kutyniok, Kush Bhatia, Silas Alberti

Here, we focus on two prevalent methods used to align these models, Supervised Fine-Tuning (SFT) and Reinforcement Learning from Human Feedback (RLHF).

no code implementations • 25 Oct 2023 • Philipp Scholl, Maged Iskandar, Sebastian Wolf, Jinoh Lee, Aras Bacho, Alexander Dietrich, Alin Albu-Schäffer, Gitta Kutyniok

Subsequently, to adapt to more complex asymmetric settings, we train a second network on a small dataset, focusing on predicting the residual of the initial network's output.

no code implementations • 11 Oct 2023 • Çağkan Yapar, Fabian Jaensch, Ron Levie, Gitta Kutyniok, Giuseppe Caire

To foster research and facilitate fair comparisons among recently proposed pathloss radio map prediction methods, we have launched the ICASSP 2023 First Pathloss Radio Map Prediction Challenge.

1 code implementation • 9 Oct 2023 • Philipp Scholl, Katharina Bieker, Hillary Hauger, Gitta Kutyniok

The problem of symbolic regression (SR) arises in many different applications, such as identifying physical laws or deriving mathematical equations describing the behavior of financial markets from given data.

no code implementations • 16 Aug 2023 • Manjot Singh, Adalbert Fono, Gitta Kutyniok

The synergy between spiking neural networks and neuromorphic hardware holds promise for the development of energy-efficient AI applications.

1 code implementation • 3 Aug 2023 • Hector Andrade-Loarca, Julius Hege, Daniel Cremers, Gitta Kutyniok

Overall, the neural Poisson surface reconstruction not only improves upon the limitations of classical deep neural networks in shape reconstruction but also achieves superior results in terms of reconstruction quality, running time, and resolution agnosticism.

no code implementations • 5 Jul 2023 • Silas Alberti, Niclas Dern, Laura Thesing, Gitta Kutyniok

Natural language processing (NLP) made an impressive jump with the introduction of Transformers.

no code implementations • 3 Jul 2023 • Aras Bacho, Holger Boche, Gitta Kutyniok

The cause of these computability problems is rooted in the fact that digital hardware is based on the computing model of the Turing machine, which is inherently discrete.

no code implementations • 15 Jun 2023 • Niklas Breustedt, Paolo Climaco, Jochen Garcke, Jan Hamaekers, Gitta Kutyniok, Dirk A. Lorenz, Rick Oerder, Chirag Varun Shukla

However, learning on large datasets is strongly limited by the availability of computational resources and can be infeasible in some scenarios.

1 code implementation • NeurIPS 2023 • Sohir Maskey, Raffaele Paolino, Aras Bacho, Gitta Kutyniok

In this paper, we generalize the concept of oversmoothing from undirected to directed graphs.

no code implementations • 26 Jan 2023 • Christian Koke, Gitta Kutyniok

This work develops a flexible and mathematically sound framework for the design and analysis of graph scattering networks with variable branching ratios and generic functional calculus filters.

no code implementations • 15 Jan 2023 • Yunseok Lee, Holger Boche, Gitta Kutyniok

Optimization problems are a staple of today's scientific and technical landscape.

1 code implementation • CVPR 2023 • Stefan Kolek, Robert Windesheim, Hector Andrade Loarca, Gitta Kutyniok, Ron Levie

However, the smoothness of a mask limits its ability to separate fine-detail patterns that are relevant for the classifier from nearby nuisance patterns that do not affect it.

1 code implementation • 18 Nov 2022 • Çağkan Yapar, Ron Levie, Gitta Kutyniok, Giuseppe Caire

In this article, we present a collection of radio map datasets in a dense urban setting, which we generated and made publicly available.

1 code implementation • 15 Oct 2022 • Philipp Scholl, Aras Bacho, Holger Boche, Gitta Kutyniok

Finally, we provide extensive numerical experiments showing that our algorithms, in combination with common approaches for learning physical laws, indeed allow us to guarantee that a unique governing differential equation is learnt, without assuming any knowledge about the function, thereby ensuring reliability.

no code implementations • 15 Oct 2022 • Raffaele Paolino, Aleksandar Bojchevski, Stephan Günnemann, Gitta Kutyniok, Ron Levie

A powerful framework for studying graphs is to consider them as geometric graphs: nodes are randomly sampled from an underlying metric space, and any pair of nodes is connected if their distance is less than a specified neighborhood radius.
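The random geometric graph model described above can be sketched in a few lines. The unit square with the Euclidean metric is an illustrative assumption here, not necessarily the metric space considered in the paper:

```python
import numpy as np

def random_geometric_graph(n, radius, dim=2, seed=0):
    """Sample n nodes uniformly from the unit cube (a simple stand-in
    for the underlying metric space) and connect every pair of nodes
    whose distance is below the neighborhood radius."""
    rng = np.random.default_rng(seed)
    points = rng.random((n, dim))
    # Pairwise Euclidean distances between all nodes.
    diffs = points[:, None, :] - points[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)
    # Adjacency: connected iff distance < radius, excluding self-loops.
    adj = (dists < radius) & ~np.eye(n, dtype=bool)
    return points, adj

points, adj = random_geometric_graph(50, 0.3)
```

By construction the adjacency matrix is symmetric, so the resulting graph is undirected, matching the "any pair of nodes is connected if their distance is less than a specified radius" rule.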

1 code implementation • 11 Jun 2022 • Duc Anh Nguyen, Ron Levie, Julian Lienen, Gitta Kutyniok, Eyke Hüllermeier

The notion of neural collapse refers to several emergent phenomena that have been empirically observed across various canonical classification problems.

1 code implementation • 30 May 2022 • Yangze Zhou, Gitta Kutyniok, Bruno Ribeiro

This work provides the first theoretical study on the ability of graph Message Passing Neural Networks (gMPNNs) -- such as Graph Neural Networks (GNNs) -- to perform inductive out-of-distribution (OOD) link prediction tasks, where deployment (test) graph sizes are larger than training graphs.

no code implementations • 5 Apr 2022 • Holger Boche, Adalbert Fono, Gitta Kutyniok

For this, we focus on the class of inverse problems, which, in particular, encompasses any task to reconstruct data from measurements.

no code implementations • 16 Mar 2022 • Gitta Kutyniok

We currently witness the spectacular success of artificial intelligence in both science and public life.

no code implementations • 28 Feb 2022 • Holger Boche, Adalbert Fono, Gitta Kutyniok

Deep neural networks have seen tremendous success over the last years.

1 code implementation • 1 Feb 2022 • Çağkan Yapar, Ron Levie, Gitta Kutyniok, Giuseppe Caire

We present LocUNet: a deep learning method for localization based merely on Received Signal Strength (RSS) from Base Stations (BSs). Unlike methods that rely on time-of-arrival or angle-of-arrival information, it requires no increase in computational complexity at the user device beyond its standard operations.

no code implementations • 1 Feb 2022 • Sohir Maskey, Ron Levie, Yunseok Lee, Gitta Kutyniok

Message passing neural networks (MPNN) have seen a steep rise in popularity since their introduction as generalizations of convolutional neural networks to graph-structured data, and are now considered state-of-the-art tools for solving a large variety of graph-focused problems.

1 code implementation • 1 Feb 2022 • Mariia Seleznova, Gitta Kutyniok

We derive exact expressions for the NTK dispersion in the infinite-depth-and-width limit in all three phases and conclude that the NTK variability grows exponentially with depth at the EOC and in the chaotic phase but not in the ordered phase.

no code implementations • 12 Oct 2021 • Stefan Kolek, Duc Anh Nguyen, Ron Levie, Joan Bruna, Gitta Kutyniok

We present the Rate-Distortion Explanation (RDE) framework, a mathematically well-founded method for explaining black-box model decisions.

1 code implementation • 7 Oct 2021 • Stefan Kolek, Duc Anh Nguyen, Ron Levie, Joan Bruna, Gitta Kutyniok

We present CartoonX (Cartoon Explanation), a novel model-agnostic explanation method tailored towards image classifiers and based on the rate-distortion explanation (RDE) framework.

no code implementations • 21 Sep 2021 • Sohir Maskey, Ron Levie, Gitta Kutyniok

Our main contributions can be summarized as follows: 1) we prove that any fixed GCNN with continuous filters is transferable under graphs that approximate the same graphon, 2) we prove transferability for graphs that approximate unbounded graphon shift operators, which are defined in this paper, and 3) we obtain non-asymptotic approximation results, proving linear stability of GCNNs.

no code implementations • 12 Aug 2021 • Héctor Andrade-Loarca, Gitta Kutyniok, Ozan Öktem, Philipp Petersen

We present a deep learning-based algorithm to jointly solve a reconstruction problem and a wavefront set extraction problem in tomographic imaging.

1 code implementation • 23 Jun 2021 • Çağkan Yapar, Ron Levie, Gitta Kutyniok, Giuseppe Caire

Global Navigation Satellite Systems typically perform poorly in urban environments, where the likelihood of line-of-sight conditions between devices and satellites is low.

no code implementations • 9 May 2021 • Julius Berner, Philipp Grohs, Gitta Kutyniok, Philipp Petersen

We describe the new field of mathematical analysis of deep learning.

no code implementations • 8 Dec 2020 • Mariia Seleznova, Gitta Kutyniok

We find that whether a network is in the NTK regime depends on the hyperparameters of random initialization and the network's depth.

no code implementations • 9 Jul 2020 • Ingo Gühring, Mones Raslan, Gitta Kutyniok

In this review paper, we give a comprehensive overview of the large variety of approximation results for neural networks.

no code implementations • 1 Jul 2020 • Cosmas Heiß, Ron Levie, Cinjon Resnick, Gitta Kutyniok, Joan Bruna

It is widely recognized that the predictions of deep neural networks are difficult to parse relative to simpler approaches.

no code implementations • 1 Jul 2020 • Alex Goeßmann, Gitta Kutyniok

In case of the NeuRIP event, we then provide bounds on the expected risk, which hold for networks in any sublevel set of the empirical risk.

no code implementations • 9 Jun 2020 • Çağkan Yapar, Ron Levie, Gitta Kutyniok, Giuseppe Caire

Using the approximations of the pathloss functions of all base stations and the reported signal strengths, we are able to extract a very accurate approximation of the location of the user.

1 code implementation • 25 Apr 2020 • Moritz Geist, Philipp Petersen, Mones Raslan, Reinhold Schneider, Gitta Kutyniok

Here, approximation theory predicts that the performance of the model should depend only very mildly on the dimension of the parameter space and is determined by the intrinsic dimension of the solution manifold of the parametric partial differential equation.

1 code implementation • 25 Mar 2020 • Luis Oala, Cosmas Heiß, Jan Macdonald, Maximilian März, Wojciech Samek, Gitta Kutyniok

We propose a fast, non-Bayesian method for producing uncertainty scores in the output of pre-trained deep neural networks (DNNs) using a data-driven interval propagating network.
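As a minimal illustration of the general idea of propagating intervals through a network (standard interval arithmetic for one affine-plus-ReLU layer, not the authors' specific trained architecture), consider:

```python
import numpy as np

def interval_affine_relu(lo, hi, W, b):
    """Propagate an input box [lo, hi] through x -> relu(W x + b)
    with interval arithmetic: positive weights carry lower bounds to
    lower bounds, negative weights swap the roles of lo and hi."""
    W_pos = np.maximum(W, 0.0)
    W_neg = np.minimum(W, 0.0)
    out_lo = W_pos @ lo + W_neg @ hi + b
    out_hi = W_pos @ hi + W_neg @ lo + b
    # ReLU is monotone, so it can be applied to both bounds directly.
    return np.maximum(out_lo, 0.0), np.maximum(out_hi, 0.0)

# One-layer example: inputs anywhere in [0, 1]^2, weights [1, -1].
lo, hi = interval_affine_relu(np.zeros(2), np.ones(2),
                              np.array([[1.0, -1.0]]), np.zeros(1))
```

Composing such layers yields a guaranteed output interval for every input in the box; a data-driven variant as in the paper would instead learn how the interval widths should propagate.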

1 code implementation • 27 Nov 2019 • Héctor Andrade-Loarca, Gitta Kutyniok, Ozan Öktem

This is based on the fact that edges in images contain most of the semantic information.

1 code implementation • 17 Nov 2019 • Ron Levie, Çağkan Yapar, Gitta Kutyniok, Giuseppe Caire

In this paper we propose a highly efficient and very accurate deep learning method for estimating the propagation pathloss from a point $x$ (transmitter location) to any point $y$ on a planar domain.

no code implementations • 30 Jul 2019 • Ron Levie, Wei Huang, Lorenzo Bucci, Michael M. Bronstein, Gitta Kutyniok

Transferability, which is a certain type of generalization capability, can be loosely defined as follows: if two graphs describe the same phenomenon, then a single filter or ConvNet should have similar repercussions on both graphs.

2 code implementations • 27 May 2019 • Jan Macdonald, Stephan Wäldchen, Sascha Hauch, Gitta Kutyniok

We formalise the widespread idea of interpreting neural network decisions as an explicit optimisation problem in a rate-distortion framework.

no code implementations • 3 May 2019 • Rémi Gribonval, Gitta Kutyniok, Morten Nielsen, Felix Voigtlaender

We study the expressivity of deep neural networks.

no code implementations • 31 Mar 2019 • Gitta Kutyniok, Philipp Petersen, Mones Raslan, Reinhold Schneider

We derive upper bounds on the complexity of ReLU neural networks approximating the solution maps of parametric partial differential equations.

no code implementations • 21 Feb 2019 • Ingo Gühring, Gitta Kutyniok, Philipp Petersen

We analyze approximation rates of deep ReLU neural networks for Sobolev-regular functions with respect to weaker Sobolev norms.

no code implementations • 29 Jan 2019 • Ron Levie, Elvin Isufi, Gitta Kutyniok

For filters in this space, the perturbation in the filter is bounded by a constant times the perturbation in the graph, and filters in the Cayley smoothness space are thus termed linearly stable.

no code implementations • 17 Jan 2019 • Dominik Alfke, Weston Baines, Jan Blechschmidt, Mauricio J. del Razo Sarmina, Amnon Drory, Dennis Elbrächter, Nando Farchmin, Matteo Gambara, Silke Glas, Philipp Grohs, Peter Hinz, Danijel Kivaranovic, Christian Kümmerle, Gitta Kutyniok, Sebastian Lunz, Jan Macdonald, Ryan Malthaner, Gregory Naisat, Ariel Neufeld, Philipp Christian Petersen, Rafael Reisenhofer, Jun-Da Sheng, Laura Thesing, Philipp Trunschke, Johannes von Lindheim, David Weber, Melanie Weber

We present a novel technique based on deep learning and set theory which yields exceptional classification and prediction results.

1 code implementation • 5 Jan 2019 • Héctor Andrade-Loarca, Gitta Kutyniok, Ozan Öktem, Philipp Petersen

Microlocal analysis provides deep insight into singularity structures and is often crucial for solving inverse problems, predominately, in imaging sciences.

no code implementations • 20 Aug 2018 • Martin Genzel, Gitta Kutyniok

We study the estimation capacity of the generalized Lasso, i.e., least squares minimization combined with a (convex) structural constraint.
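The generalized Lasso named above, least squares over a convex constraint set, can be sketched with projected gradient descent. The nonnegativity cone used in the example is an illustrative choice of convex constraint, not the structured constraint studied in the paper:

```python
import numpy as np

def generalized_lasso(A, y, project, steps=500, lr=None):
    """Minimize ||Ax - y||^2 over a convex set K via projected
    gradient descent; `project` is the Euclidean projection onto K."""
    _, n = A.shape
    if lr is None:
        # Step size from the Lipschitz constant of the gradient.
        lr = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(n)
    for _ in range(steps):
        grad = A.T @ (A @ x - y)
        x = project(x - lr * grad)  # gradient step, then project onto K
    return x

# Example with nonnegativity as the structural constraint.
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 5))
x_true = np.array([1.0, 0.0, 2.0, 0.0, 0.5])
y = A @ x_true
x_hat = generalized_lasso(A, y, project=lambda v: np.maximum(v, 0.0))
```

Swapping in a projection onto an $\ell_1$-ball recovers the classical constrained Lasso formulation.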

no code implementations • 4 May 2017 • Helmut Bölcskei, Philipp Grohs, Gitta Kutyniok, Philipp Petersen

Specifically, all function classes that are optimally approximated by a general class of representation systems, so-called affine systems, can be approximated by deep neural networks with minimal connectivity and memory requirements.

no code implementations • 1 May 2017 • Jackie Ma, Maximilian März, Stephanie Funk, Jeanette Schulz-Menger, Gitta Kutyniok, Tobias Schaeffter, Christoph Kolbitsch

High-resolution three-dimensional (3D) cardiovascular magnetic resonance (CMR) is a valuable medical imaging technique, but its widespread application in clinical practice is hampered by long acquisition times.

no code implementations • 31 Aug 2016 • Martin Genzel, Gitta Kutyniok

In this paper, we study the challenge of feature selection based on a relatively small collection of sample pairs $\{(x_i, y_i)\}_{1 \leq i \leq m}$.

no code implementations • 20 Jul 2016 • Rafael Reisenhofer, Sebastian Bosse, Gitta Kutyniok, Thomas Wiegand

In most practical situations, the compression or transmission of images and videos creates distortions that will eventually be perceived by a human observer.

Ranked #12 on Video Quality Assessment on MSU FR VQA Database

no code implementations • 11 Jun 2015 • Tim Conrad, Martin Genzel, Nada Cvetkovic, Niklas Wulkow, Alexander Leichtle, Jan Vybiral, Gitta Kutyniok, Christof Schütte

Results: We present a new algorithm, Sparse Proteomics Analysis (SPA), based on the theory of compressed sensing, that allows us to identify a minimal discriminating set of features from mass spectrometry datasets.
