Search Results for author: Mihály Petreczky

Found 5 papers, 1 paper with code

Optimization dependent generalization bound for ReLU networks based on sensitivity in the tangent bundle

1 code implementation · 26 Oct 2023 · Dániel Rácz, Mihály Petreczky, András Csertán, Bálint Daróczy

Recent advances in deep learning have produced promising results on the generalization ability of deep neural networks; however, the literature still lacks a comprehensive theory explaining why heavily over-parametrized models are able to generalize well while fitting the training data.

Theoretical Evaluation of Asymmetric Shapley Values for Root-Cause Analysis

no code implementations · 15 Oct 2023 · Domokos M. Kelen, Mihály Petreczky, Péter Kersch, András A. Benczúr

In this work, we examine Asymmetric Shapley Values (ASV), a variant of the popular SHAP additive local explanation method.
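The snippet above does not describe the paper's construction, but the core idea behind Asymmetric Shapley Values is well known: standard Shapley values average a feature's marginal contribution over all feature orderings, while ASV restricts the average to orderings consistent with a known causal order. A minimal sketch, using a toy product model and a simple baseline-substitution value function (both hypothetical choices, not from the paper):

```python
import numpy as np
from itertools import permutations

def shapley_values(f, x, baseline, orderings):
    """Average marginal contributions of each feature over the
    given orderings. v(S) is approximated by evaluating f with
    features in S taken from x and the rest from a baseline."""
    n = len(x)
    phi = np.zeros(n)

    def v(S):
        z = baseline.copy()
        for i in S:
            z[i] = x[i]
        return f(z)

    for order in orderings:
        S = []
        for i in order:
            phi[i] += v(S + [i]) - v(S)  # marginal contribution of i
            S.append(i)
    return phi / len(orderings)

# toy model with an interaction between the two features
f = lambda z: z[0] * z[1]
x = np.array([1.0, 1.0])
b = np.array([0.0, 0.0])

sym = shapley_values(f, x, b, list(permutations(range(2))))  # all orderings
asv = shapley_values(f, x, b, [[0, 1]])  # only the causal order 0 -> 1
# sym ≈ [0.5, 0.5]; asv ≈ [0.0, 1.0]
```

Restricting the orderings shifts how the interaction term is split between the two features, which is exactly the asymmetry the method introduces.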

Additive models

PAC bounds of continuous Linear Parameter-Varying systems related to neural ODEs

no code implementations · 7 Jul 2023 · Dániel Rácz, Mihály Petreczky, Bálint Daróczy

We consider the problem of learning Neural Ordinary Differential Equations (neural ODEs) within the context of Linear Parameter-Varying (LPV) systems in continuous-time.
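For readers unfamiliar with the LPV formalism the snippet refers to: a continuous-time LPV system has linear dynamics whose matrices depend on a time-varying scheduling signal, dx/dt = A(p(t)) x(t). A minimal simulation sketch with an illustrative A(p) and scheduling trajectory chosen here for the example (not taken from the paper):

```python
import numpy as np

def A(p):
    """Interpolate between two stable matrices (hypothetical choice)."""
    A0 = np.array([[-1.0, 0.5], [0.0, -2.0]])
    A1 = np.array([[-2.0, 0.0], [0.3, -1.0]])
    return (1 - p) * A0 + p * A1

def simulate(x0, T=5.0, dt=1e-3):
    """Forward-Euler integration of dx/dt = A(p(t)) x(t)."""
    x, t = np.array(x0, dtype=float), 0.0
    while t < T:
        p = 0.5 * (1 + np.sin(t))  # scheduling signal in [0, 1]
        x = x + dt * A(p) @ x
        t += dt
    return x

x_final = simulate([1.0, 1.0])
```

Neural ODEs with piecewise-linear activations fit naturally into this picture because their vector field can be rewritten as state-dependent linear dynamics, with the scheduling signal playing the role of the activation pattern.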

Learning stability of partially observed switched linear systems

no code implementations · 19 Jan 2023 · Zheming Wang, Raphaël M. Jungers, Mihály Petreczky, Bo Chen, Li Yu

In this paper, we propose an algorithm for deciding stability of switched linear systems under arbitrary switching based purely on observed output data.
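The paper's data-driven algorithm is not described in this snippet. For context only, a classical model-based sufficient check for stability under arbitrary switching is to bound the joint spectral radius of the mode matrices: if the maximum norm of all length-k products, taken to the power 1/k, is below 1, the switched system is stable for every switching signal. A brute-force sketch (this is a standard textbook bound, not the authors' output-data method):

```python
import numpy as np
from itertools import product

def jsr_upper_bound(matrices, depth):
    """Crude upper bound on the joint spectral radius:
    max over all products P of `depth` matrices of ||P||_2^(1/depth).
    A bound below 1 certifies stability of x_{k+1} = A_{sigma(k)} x_k
    under arbitrary switching."""
    best = 0.0
    for word in product(matrices, repeat=depth):
        P = np.linalg.multi_dot(word) if depth > 1 else word[0]
        best = max(best, np.linalg.norm(P, 2) ** (1.0 / depth))
    return best

# two hypothetical discrete-time modes
A1 = np.array([[0.5, 0.1], [0.0, 0.4]])
A2 = np.array([[0.3, 0.0], [0.2, 0.6]])
bound = jsr_upper_bound([A1, A2], depth=6)  # bound < 1 here
```

The cost grows exponentially in the product length, which is one reason data-driven approaches such as the one proposed in the paper are attractive when the mode matrices are not even available.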

Learning Theory

LPV Modeling of Nonlinear Systems: A Multi-Path Feedback Linearization Approach

no code implementations · 26 Mar 2021 · Hossam S. Abbas, Roland Tóth, Mihály Petreczky, Nader Meskin, Javad Mohammadpour Velni, Patrick J. W. Koelewijn

In the single-input single-output (SISO) case, all nonlinearities of the original system are embedded into one nonlinear (NL) function, which is then factorized, using a proposed algorithm, to construct an LPV representation of the original NL system.

Scheduling
