no code implementations • 22 Feb 2024 • Haotian Lin, Matthew Reimherr
We first prove that employing a misspecified fixed-bandwidth Gaussian kernel in target-only KRR learning can achieve minimax optimality, and we derive a procedure that adapts to the unknown Sobolev smoothness.
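As a hedged illustration of the setting above (not the paper's adaptive procedure), here is a minimal kernel ridge regression with a fixed-bandwidth Gaussian kernel; the bandwidth `h` and ridge parameter `lam` are illustrative choices.

```python
import numpy as np

def gaussian_kernel(X, Z, h):
    """Gaussian kernel matrix k(x, z) = exp(-(x - z)^2 / (2 h^2)) for 1-D inputs."""
    d2 = (X[:, None] - Z[None, :]) ** 2
    return np.exp(-d2 / (2.0 * h ** 2))

def krr_fit(X, y, h, lam):
    """Solve (K + n*lam*I) alpha = y for the KRR coefficients."""
    n = len(X)
    K = gaussian_kernel(X, X, h)
    return np.linalg.solve(K + n * lam * np.eye(n), y)

def krr_predict(X_train, alpha, X_new, h):
    """Evaluate the fitted function at new points."""
    return gaussian_kernel(X_new, X_train, h) @ alpha

# Fit a smooth target function on [0, 1]; h and lam are fixed, not adapted.
X = np.linspace(0.0, 1.0, 50)
y = np.sin(2 * np.pi * X)
alpha = krr_fit(X, y, h=0.1, lam=1e-6)
y_hat = krr_predict(X, alpha, X, h=0.1)
```

The point of the paper's result is that even when such a fixed bandwidth is misspecified relative to the target's true Sobolev smoothness, the estimator can still attain the minimax rate.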
no code implementations • 31 Aug 2023 • Haotian Lin, Matthew Reimherr
Many existing mechanisms to achieve differential privacy (DP) on infinite-dimensional functional summaries often involve embedding these summaries into finite-dimensional subspaces and applying traditional DP techniques.
1 code implementation • 26 Mar 2023 • Tobia Boschi, Lorenzo Testa, Francesca Chiaromonte, Matthew Reimherr
Functional regression analysis is an established tool for many contemporary scientific applications.
no code implementations • 21 Sep 2022 • Carlos Soto, Karthik Bharath, Matthew Reimherr, Aleksandra Slavkovic
It is common for data structures such as images and shapes of 2D objects to be represented as points on a manifold.
no code implementations • 9 Jun 2022 • Haotian Lin, Matthew Reimherr
We study transfer learning (TL) for functional linear regression (FLR) under the reproducing kernel Hilbert space (RKHS) framework, observing that TL techniques from existing high-dimensional linear regression are not compatible with truncation-based FLR methods, as functional data are intrinsically infinite-dimensional and generated by smooth underlying processes.
no code implementations • NeurIPS 2021 • Tobia Boschi, Matthew Reimherr, Francesca Chiaromonte
Feature Selection and Functional Data Analysis are two dynamic areas of research, with important applications in the analysis of large and complex data sets.
no code implementations • 29 Jul 2021 • Aniruddha Rajendra Rao, Matthew Reimherr
We introduce a new class of non-linear function-on-function regression models for functional data using neural networks.
no code implementations • 19 Apr 2021 • Aniruddha Rajendra Rao, Matthew Reimherr
We introduce a new class of non-linear models for functional data based on neural networks.
no code implementations • 25 Nov 2020 • Aniruddha Rajendra Rao, Matthew Reimherr
This work considers the problem of fitting functional models with sparsely and irregularly sampled functional data.
1 code implementation • 6 Jun 2020 • Tobia Boschi, Matthew Reimherr, Francesca Chiaromonte
Our new algorithm exploits both the sparsity induced by the Elastic Net penalty and the sparsity due to the second-order information of the augmented Lagrangian.
1 code implementation • 30 Sep 2019 • Dominik Liebl, Matthew Reimherr
Quantifying uncertainty using confidence regions is a central goal of statistical inference.
Methodology • Statistics Theory
no code implementations • NeurIPS 2019 • Matthew Reimherr, Jordan Awan
This paper presents a new mechanism for producing sanitized statistical summaries that achieve differential privacy, called the K-Norm Gradient mechanism, or KNG.
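As a heavily hedged illustration of the KNG idea in its simplest 1-D case: for releasing a mean under squared-error loss, the gradient at theta is theta minus the sample mean, and sampling with density proportional to exp(-eps * |gradient| / (2 * sensitivity)) reduces to a Laplace draw centered at the sample mean. The data, bounds, and parameters below are illustrative; the general mechanism covers richer losses and norms.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, size=100)  # data assumed to lie in [0, 1]
eps = 1.0
n = len(x)
delta = 1.0 / n  # sensitivity of the gradient (theta - mean) for data in [0, 1]

# Density proportional to exp(-eps * |theta - x.mean()| / (2 * delta))
# is a Laplace distribution with scale 2 * delta / eps.
scale = 2.0 * delta / eps
theta_private = x.mean() + rng.laplace(0.0, scale)
```

In this special case KNG recovers a Laplace-type release of the mean; its appeal is that the same gradient-based recipe extends beyond settings where such closed forms exist.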
no code implementations • 30 Jan 2019 • Jordan Awan, Ana Kenney, Matthew Reimherr, Aleksandra Slavković
We study its extension to settings with summaries based on infinite dimensional outputs such as with functional data analysis, shape analysis, and nonparametric statistics.