Search Results for author: Matthew Reimherr

Found 13 papers, 3 papers with code

Smoothness Adaptive Hypothesis Transfer Learning

no code implementations 22 Feb 2024 Haotian Lin, Matthew Reimherr

We first prove that employing a misspecified fixed-bandwidth Gaussian kernel in target-only KRR learning can achieve minimax optimality, and we derive a procedure that adapts to the unknown Sobolev smoothness (a minimal KRR sketch follows this entry).

Transfer Learning
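For context, below is a minimal sketch of target-only kernel ridge regression with a fixed-bandwidth Gaussian kernel, the baseline referenced in the abstract above; the bandwidth and regularization values are illustrative assumptions, not the paper's adaptive choices.

import numpy as np

def gaussian_kernel(X, Y, bandwidth=1.0):
    """Gaussian (RBF) kernel matrix K[i, j] = exp(-||x_i - y_j||^2 / (2 * bandwidth^2))."""
    sq_dists = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2 * X @ Y.T
    return np.exp(-sq_dists / (2 * bandwidth**2))

def krr_fit_predict(X_train, y_train, X_test, bandwidth=1.0, lam=1e-2):
    """Target-only kernel ridge regression with a fixed-bandwidth Gaussian kernel."""
    n = X_train.shape[0]
    K = gaussian_kernel(X_train, X_train, bandwidth)
    # Solve (K + n * lam * I) alpha = y for the representer coefficients.
    alpha = np.linalg.solve(K + n * lam * np.eye(n), y_train)
    return gaussian_kernel(X_test, X_train, bandwidth) @ alpha

# Toy usage: noisy observations of a smooth function on [0, 1].
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(100, 1))
y = np.sin(2 * np.pi * X[:, 0]) + 0.1 * rng.standard_normal(100)
X_grid = np.linspace(0, 1, 50)[:, None]
y_hat = krr_fit_predict(X, y, X_grid, bandwidth=0.2, lam=1e-3)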

Pure Differential Privacy for Functional Summaries via a Laplace-like Process

no code implementations 31 Aug 2023 Haotian Lin, Matthew Reimherr

Many existing mechanisms to achieve differential privacy (DP) on infinite-dimensional functional summaries often involve embedding these summaries into finite-dimensional subspaces and applying traditional DP techniques.
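For context, below is a minimal sketch of the finite-dimensional embedding approach described above: project a functional summary onto a truncated basis and apply the standard Laplace mechanism to the coefficients. The basis, truncation level, and sensitivity bound are illustrative assumptions, not any specific prior mechanism.

import numpy as np

def embed_and_privatize(curves, grid, n_basis=5, epsilon=1.0, coef_sensitivity=None):
    """Sketch of the finite-dimensional embedding approach: project the summary
    (here, a mean curve) onto a truncated cosine basis and add Laplace noise to
    each coefficient (pure epsilon-DP via the Laplace mechanism)."""
    n, m = curves.shape
    mean_curve = curves.mean(axis=0)
    # Truncated cosine basis evaluated on the grid (orthonormal on [0, 1]).
    basis = np.vstack([np.ones_like(grid)] +
                      [np.sqrt(2) * np.cos(np.pi * k * grid) for k in range(1, n_basis)])
    dx = grid[1] - grid[0]
    coefs = basis @ mean_curve * dx            # numeric basis coefficients of the summary
    if coef_sensitivity is None:
        # Illustrative L1 sensitivity: curves assumed bounded in [0, 1], so changing
        # one of n records moves each coefficient by at most sqrt(2) / n (rough bound).
        coef_sensitivity = np.sqrt(2) * n_basis / n
    rng = np.random.default_rng()
    noisy_coefs = coefs + rng.laplace(scale=coef_sensitivity / epsilon, size=n_basis)
    return noisy_coefs @ basis                 # reconstruct the privatized curve on the grid

# Usage with toy curves bounded in [0, 1].
grid = np.linspace(0, 1, 200)
curves = np.random.default_rng(1).uniform(size=(50, 200))
private_mean = embed_and_privatize(curves, grid, n_basis=5, epsilon=1.0)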

Shape And Structure Preserving Differential Privacy

no code implementations 21 Sep 2022 Carlos Soto, Karthik Bharath, Matthew Reimherr, Aleksandra Slavkovic

It is common for data structures such as images and shapes of 2D objects to be represented as points on a manifold.

On Hypothesis Transfer Learning of Functional Linear Models

no code implementations 9 Jun 2022 Haotian Lin, Matthew Reimherr

We study transfer learning (TL) for functional linear regression (FLR) under the Reproducing Kernel Hilbert Space (RKHS) framework, observing that the TL techniques developed for high-dimensional linear regression are not compatible with truncation-based FLR methods, since functional data are intrinsically infinite-dimensional and generated by smooth underlying processes (a sketch of truncation-based FLR follows this entry).

Regression, Transfer Learning
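For context, below is a minimal sketch of the truncation-based functional linear regression the abstract contrasts with: regress the scalar response on a fixed number of functional principal component scores. The basis and truncation level are illustrative assumptions.

import numpy as np

def truncated_flr(curves, y, n_components=4):
    """Truncation-based functional linear regression: functional principal
    component scores of the predictor curves, then ordinary least squares."""
    mean_curve = curves.mean(axis=0)
    centered = curves - mean_curve
    # FPCA via SVD of the discretized, centered curves.
    U, s, Vt = np.linalg.svd(centered, full_matrices=False)
    scores = U[:, :n_components] * s[:n_components]        # truncated FPC scores
    design = np.column_stack([np.ones(len(y)), scores])
    beta, *_ = np.linalg.lstsq(design, y, rcond=None)
    # Coefficient function reconstructed from the first n_components eigenfunctions.
    coef_function = Vt[:n_components].T @ beta[1:]
    return beta[0], coef_function

# Toy usage with discretized predictor curves and a scalar response.
rng = np.random.default_rng(0)
grid = np.linspace(0, 1, 100)
curves = rng.standard_normal((80, 100)).cumsum(axis=1) / 10
y = curves @ np.sin(2 * np.pi * grid) / 100 + 0.1 * rng.standard_normal(80)
intercept, beta_hat = truncated_flr(curves, y, n_components=4)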

A Highly-Efficient Group Elastic Net Algorithm with an Application to Function-On-Scalar Regression

no code implementations NeurIPS 2021 Tobia Boschi, Matthew Reimherr, Francesca Chiaromonte

Feature Selection and Functional Data Analysis are two dynamic areas of research, with important applications in the analysis of large and complex data sets.

Feature Selection, Regression

Modern Non-Linear Function-on-Function Regression

no code implementations 29 Jul 2021 Aniruddha Rajendra Rao, Matthew Reimherr

We introduce a new class of non-linear function-on-function regression models for functional data using neural networks (a generic discretization-based sketch follows this entry).

Regression
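Below is a generic, discretization-based sketch of a function-on-function neural network, assuming both the predictor and response curves are observed on fixed grids; it is not the authors' architecture, only an illustration of the modeling idea.

import torch
import torch.nn as nn

class DiscretizedFoFNet(nn.Module):
    """Generic function-on-function regressor (not the paper's architecture):
    the predictor curve is observed on a grid of m_in points and the model
    predicts the response curve on a grid of m_out points."""
    def __init__(self, m_in=100, m_out=50, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(m_in, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, m_out),
        )

    def forward(self, x):        # x: (batch, m_in) discretized input curves
        return self.net(x)       # (batch, m_out) discretized predicted curves

# Usage: train with a standard MSE loss over the response grid.
model = DiscretizedFoFNet()
x = torch.randn(8, 100)          # 8 toy input curves on a 100-point grid
y_hat = model(x)                 # predicted response curves on a 50-point grid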

Non-linear Functional Modeling using Neural Networks

no code implementations 19 Apr 2021 Aniruddha Rajendra Rao, Matthew Reimherr

We introduce a new class of non-linear models for functional data based on neural networks.

Modern Multiple Imputation with Functional Data

no code implementations 25 Nov 2020 Aniruddha Rajendra Rao, Matthew Reimherr

This work considers the problem of fitting functional models with sparsely and irregularly sampled functional data.

Imputation

An Efficient Semi-smooth Newton Augmented Lagrangian Method for Elastic Net

1 code implementation 6 Jun 2020 Tobia Boschi, Matthew Reimherr, Francesca Chiaromonte

Our new algorithm exploits both the sparsity induced by the Elastic Net penalty and the sparsity arising from the second-order information of the augmented Lagrangian (the penalty-induced sparsity is illustrated in the sketch after this entry).

Feature Selection
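The penalty-induced sparsity mentioned above can be illustrated with the Elastic Net proximal operator, which soft-thresholds coefficients to exact zeros; the semi-smooth Newton augmented Lagrangian solver itself is not reproduced here.

import numpy as np

def elastic_net_prox(v, step, lam, alpha=0.5):
    """Proximal operator of the Elastic Net penalty
    lam * (alpha * ||b||_1 + 0.5 * (1 - alpha) * ||b||_2^2),
    evaluated at v with step size `step`. Soft-thresholding produces exact zeros,
    which is the penalty-induced sparsity the algorithm exploits."""
    soft = np.sign(v) * np.maximum(np.abs(v) - step * lam * alpha, 0.0)
    return soft / (1.0 + step * lam * (1.0 - alpha))

# Example: small coordinates are set exactly to zero, large ones are shrunk.
v = np.array([3.0, -0.2, 0.05, -1.5, 0.01])
print(elastic_net_prox(v, step=1.0, lam=0.5, alpha=0.8))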

Fast and Fair Simultaneous Confidence Bands for Functional Parameters

1 code implementation 30 Sep 2019 Dominik Liebl, Matthew Reimherr

Quantifying uncertainty using confidence regions is a central goal of statistical inference.

Methodology, Statistics Theory

KNG: The K-Norm Gradient Mechanism

no code implementations NeurIPS 2019 Matthew Reimherr, Jordan Awan

This paper presents a new mechanism for producing sanitized statistical summaries that achieve differential privacy, called the K-Norm Gradient Mechanism, or KNG.
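Below is a toy one-dimensional illustration of the KNG idea, simplified well beyond the paper's general construction: for mean estimation with squared-error loss on data assumed to lie in [0, 1], the KNG density reduces to Laplace noise around the sample mean.

import numpy as np

def kng_mean_release(x, epsilon, rng=None):
    """Toy 1-D illustration of KNG (a simplification, not the paper's general
    mechanism). For the squared-error loss L(theta) = sum_i (x_i - theta)^2 with
    data in [0, 1], the gradient is 2 * n * (theta - xbar) and its sensitivity is 2,
    so the density exp(-eps * |grad L| / (2 * sensitivity)) is a Laplace
    distribution centered at the sample mean with scale 2 / (eps * n)."""
    rng = rng or np.random.default_rng()
    x = np.clip(np.asarray(x, dtype=float), 0.0, 1.0)   # enforce the boundedness assumption
    n = len(x)
    return x.mean() + rng.laplace(scale=2.0 / (epsilon * n))

# Usage: a privatized mean of 1000 bounded observations at eps = 1.
print(kng_mean_release(np.random.default_rng(1).uniform(size=1000), epsilon=1.0))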

Benefits and Pitfalls of the Exponential Mechanism with Applications to Hilbert Spaces and Functional PCA

no code implementations 30 Jan 2019 Jordan Awan, Ana Kenney, Matthew Reimherr, Aleksandra Slavković

We study its extension to settings with summaries based on infinite dimensional outputs such as with functional data analysis, shape analysis, and nonparametric statistics.
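For reference, below is a minimal sketch of the classic exponential mechanism over a finite candidate set, the finite-dimensional baseline whose infinite-dimensional extension the paper studies; the utility function and sensitivity are toy assumptions.

import numpy as np

def exponential_mechanism(candidates, utilities, epsilon, sensitivity, rng=None):
    """Classic exponential mechanism over a finite candidate set: select candidate r
    with probability proportional to exp(eps * u(D, r) / (2 * sensitivity))."""
    rng = rng or np.random.default_rng()
    utilities = np.asarray(utilities, dtype=float)
    logits = epsilon * utilities / (2.0 * sensitivity)
    logits -= logits.max()                       # numerical stability
    probs = np.exp(logits)
    probs /= probs.sum()
    return candidates[rng.choice(len(candidates), p=probs)]

# Usage: privately pick the most common label; utility = count, sensitivity = 1.
labels = ["a", "b", "c"]
counts = [40, 35, 5]
print(exponential_mechanism(labels, counts, epsilon=1.0, sensitivity=1.0))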
