Search Results for author: Peter Bühlmann

Found 39 papers, 21 papers with code

Structural Intervention Distance (SID) for Evaluating Causal Graphs

4 code implementations • 5 Jun 2013 • Jonas Peters, Peter Bühlmann

To quantify such differences, we propose a (pre-) distance between DAGs, the structural intervention distance (SID).

Causal Inference

Random Forests for Change Point Detection

2 code implementations • 10 May 2022 • Malte Londschien, Peter Bühlmann, Solt Kovács

However, the method can be paired with any classifier that yields class probability predictions, which we illustrate by also using a k-nearest neighbor classifier.

Change Point Detection
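
One way to read the classifier-based recipe: turn each candidate split into a binary classification problem (before vs. after the split) and score it by how much better cross-fitted class probabilities explain the labels than the class proportions alone. The sketch below illustrates that idea with scikit-learn's k-nearest neighbor classifier; it is not the paper's implementation, and the candidate grid, number of folds and clipping constant are arbitrary choices.

```python
# Illustrative sketch (not the paper's implementation): classifier-based
# pseudo-log-likelihood gain for a single candidate split s.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_predict

def classifier_gain(X, s, n_neighbors=5):
    """Gain for splitting X[:s] versus X[s:]."""
    n = len(X)
    y = (np.arange(n) >= s).astype(int)             # pseudo-labels: before/after split
    proba = cross_val_predict(KNeighborsClassifier(n_neighbors=n_neighbors),
                              X, y, cv=5, method="predict_proba")
    p = np.clip(proba[np.arange(n), y], 1e-6, 1 - 1e-6)
    prior = np.where(y == 1, (n - s) / n, s / n)    # class proportions under "no change"
    return np.sum(np.log(p) - np.log(prior))

# rng = np.random.default_rng(0)
# X = np.concatenate([rng.normal(0, 1, (100, 3)), rng.normal(1, 1, (100, 3))])
# best_split = max(range(20, len(X) - 20), key=lambda s: classifier_gain(X, s))
```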

Domain adaptation under structural causal models

1 code implementation • 29 Oct 2020 • Yuansi Chen, Peter Bühlmann

Domain adaptation (DA) arises as an important problem in statistical machine learning when the source data used to train a model is different from the target data used to test the model.

Domain Adaptation

repliclust: Synthetic Data for Cluster Analysis

1 code implementation • 24 Mar 2023 • Michael J. Zellinger, Peter Bühlmann

We present repliclust (from repli-cate and clust-er), a Python package for generating synthetic data sets with clusters.

Robustifying Independent Component Analysis by Adjusting for Group-Wise Stationary Noise

3 code implementations • 4 Jun 2018 • Niklas Pfister, Sebastian Weichwald, Peter Bühlmann, Bernhard Schölkopf

We introduce coroICA, confounding-robust independent component analysis, a novel ICA algorithm which decomposes linearly mixed multivariate observations into independent components that are corrupted (and rendered dependent) by hidden group-wise stationary confounding.

Causal Inference • EEG +1
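
To make the setting concrete, the sketch below simulates linearly mixed sources that are corrupted by group-wise stationary confounding and runs plain FastICA as a non-robust baseline; the group sizes, source distributions and confounding strength are arbitrary illustrative choices, and this is not the coroICA package's API.

```python
# Toy simulation of the confounded mixing model X = (S + H) A^T, where H is
# group-wise stationary confounding; plain FastICA serves as a baseline that
# ignores the group structure, unlike coroICA.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n_per_group, n_groups, d = 1000, 3, 4
A = rng.normal(size=(d, d))                     # unknown mixing matrix

X_groups = []
for g in range(n_groups):
    S = rng.laplace(size=(n_per_group, d))      # independent non-Gaussian sources
    C = rng.normal(size=(d, d)) / np.sqrt(d)    # confounding covariance differs per group
    H = rng.normal(size=(n_per_group, d)) @ C   # stationary within the group
    X_groups.append((S + H) @ A.T)
X = np.vstack(X_groups)

S_hat = FastICA(n_components=d, random_state=0).fit_transform(X)
```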

Anchor regression: heterogeneous data meets causality

2 code implementations • 18 Jan 2018 • Dominik Rothenhäusler, Nicolai Meinshausen, Peter Bühlmann, Jonas Peters

If anchor regression and least squares provide the same answer (anchor stability), we establish that OLS parameters are invariant under certain distributional changes.

Methodology
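
For reference, the anchor regression estimator with penalty parameter gamma can be computed as ordinary least squares after transforming the data with the projection onto the anchor variables; a minimal numpy sketch (with illustrative argument names) is given below.

```python
# Minimal sketch: anchor regression via the data-transformation trick,
# regressing W y on W X with W = I - (1 - sqrt(gamma)) * P_A.
import numpy as np

def anchor_regression(X, y, A, gamma):
    n = len(y)
    P = A @ np.linalg.pinv(A)                       # projection onto col(A)
    W = np.eye(n) - (1.0 - np.sqrt(gamma)) * P
    beta, *_ = np.linalg.lstsq(W @ X, W @ y, rcond=None)
    return beta

# gamma = 1 recovers ordinary least squares; large gamma increasingly enforces
# orthogonality of the residuals with the anchors (an IV-type constraint).
```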

On the Identifiability and Estimation of Causal Location-Scale Noise Models

1 code implementation • 13 Oct 2022 • Alexander Immer, Christoph Schultheiss, Julia E. Vogt, Bernhard Schölkopf, Peter Bühlmann, Alexander Marx

We study the class of location-scale or heteroscedastic noise models (LSNMs), in which the effect $Y$ can be written as a function of the cause $X$ and a noise source $N$ independent of $X$, which may be scaled by a positive function $g$ over the cause, i.e., $Y = f(X) + g(X)N$.

Causal Discovery • Causal Inference
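
A toy simulation of such a location-scale noise model, with the hypothetical choices f(x) = sin(x) and g(x) = 0.5 + x^2 used purely for illustration, is sketched below.

```python
# Simulate Y = f(X) + g(X) * N with noise N independent of X.
import numpy as np

rng = np.random.default_rng(1)
n = 2000
X = rng.uniform(-2, 2, size=n)
N = rng.normal(size=n)            # noise, independent of X
f = np.sin(X)                     # location (conditional mean) function
g = 0.5 + X ** 2                  # positive scale function
Y = f + g * N
# Identifiability asks when the causal direction X -> Y can be distinguished
# from Y -> X given only samples of (X, Y) from such a model.
```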

Change point detection for graphical models in the presence of missing values

1 code implementation • 11 Jul 2019 • Malte Londschien, Solt Kovács, Peter Bühlmann

We propose estimation methods for change points in high-dimensional covariance structures with an emphasis on challenging scenarios with missing values.

Change Point Detection • Imputation +3

Goodness-of-fit testing in high-dimensional generalized linear models

2 code implementations • 9 Aug 2019 • Jana Janková, Rajen D. Shah, Peter Bühlmann, Richard J. Samworth

We propose a family of tests to assess the goodness-of-fit of a high-dimensional generalized linear model.

Methodology • Statistics Theory

Stabilizing Variable Selection and Regression

1 code implementation • 5 Nov 2019 • Niklas Pfister, Evan G. Williams, Jonas Peters, Ruedi Aebersold, Peter Bühlmann

In particular, it is useful to distinguish between stable and unstable predictors, i.e., predictors which have a fixed or a changing functional dependence on the response, respectively.

Methodology • Applications

Doubly Debiased Lasso: High-Dimensional Inference under Hidden Confounding

1 code implementation • 8 Apr 2020 • Zijian Guo, Domagoj Ćevid, Peter Bühlmann

Inferring causal relationships or related associations from observational data can be invalidated by the existence of hidden confounding.

Methodology • Statistics Theory
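
One building block in this line of work is a spectral transformation that shrinks the leading singular values of the design matrix, which tend to absorb dense hidden confounding, before running the lasso. The sketch below shows such a trim transform; the median cap and the LassoCV fit are illustrative choices, and the paper's doubly debiased estimator adds a further debiasing step to obtain confidence intervals.

```python
# Rough sketch of a spectral "trim" transform followed by a lasso fit.
import numpy as np
from sklearn.linear_model import LassoCV

def trim_transform(X, y):
    n = X.shape[0]
    U, d, _ = np.linalg.svd(X, full_matrices=False)
    tau = np.median(d)                              # cap singular values at their median
    scale = np.minimum(d, tau) / d
    F = np.eye(n) - U @ np.diag(1.0 - scale) @ U.T  # shrinks only the large directions
    return F @ X, F @ y

# X, y = ...                                        # high-dimensional design and response
# Xt, yt = trim_transform(X, y)
# beta_init = LassoCV(cv=5).fit(Xt, yt).coef_
```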

Structure Learning for Directed Trees

1 code implementation • 19 Aug 2021 • Martin Emil Jakobsen, Rajen D. Shah, Peter Bühlmann, Jonas Peters

Furthermore, we study the identifiability gap, which quantifies how much better the true causal model fits the observational distribution, and prove that it is lower bounded by local properties of the causal model.
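
A generic recipe for tree structure learning of this kind is to score every possible directed edge and extract a maximum-weight spanning arborescence with the Chu-Liu/Edmonds algorithm; the sketch below uses a simple linear Gaussian likelihood gain as an illustrative edge weight (the paper allows flexible nonlinear regressions) and networkx for the arborescence step.

```python
# Hedged sketch: pairwise edge scores + maximum-weight spanning arborescence.
import numpy as np
import networkx as nx

def learn_tree(X):
    n, d = X.shape
    G = nx.DiGraph()
    for u in range(d):
        for v in range(d):
            if u == v:
                continue
            coeffs = np.polyfit(X[:, u], X[:, v], 1)        # regress v on u
            resid = X[:, v] - np.polyval(coeffs, X[:, u])
            gain = 0.5 * n * (np.log(X[:, v].var()) - np.log(resid.var()))
            G.add_edge(u, v, weight=gain)                   # likelihood gain of edge u -> v
    return nx.maximum_spanning_arborescence(G, attr="weight")
```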


Distributional Anchor Regression

1 code implementation • 20 Jan 2021 • Lucas Kook, Beate Sick, Peter Bühlmann

In a causally inspired perspective on OOD generalization, the test data arise from a specific class of interventions on exogenous random variables of the DGP, called anchors.

Methodology

Regularizing Double Machine Learning in Partially Linear Endogenous Models

1 code implementation • 29 Jan 2021 • Corinne Emmenegger, Peter Bühlmann

The linear coefficient in a partially linear model with confounding variables can be estimated using double machine learning (DML).

Methodology • Statistics Theory
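
As background, the exogenous DML recipe for the partially linear model Y = theta * D + g(W) + noise partials out W from both Y and D with cross-fitted machine learning and then regresses residuals on residuals. A compact sketch is below; random forests and 5-fold cross-fitting are illustrative choices, and the endogenous, regularized setting studied in the paper is not covered by this snippet.

```python
# Background sketch: standard (exogenous) DML partialling-out estimator.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_predict

def dml_plr(Y, D, W, cv=5):
    # cross-fitted nuisance estimates of E[Y | W] and E[D | W]
    m_hat = cross_val_predict(RandomForestRegressor(random_state=0), W, Y, cv=cv)
    r_hat = cross_val_predict(RandomForestRegressor(random_state=0), W, D, cv=cv)
    res_Y, res_D = Y - m_hat, D - r_hat
    theta = np.sum(res_D * res_Y) / np.sum(res_D ** 2)   # residual-on-residual OLS
    psi = (res_Y - theta * res_D) * res_D                # estimating-equation values
    se = np.sqrt(np.mean(psi ** 2) / np.mean(res_D ** 2) ** 2 / len(Y))
    return theta, se
```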

Double Machine Learning for Partially Linear Mixed-Effects Models with Repeated Measurements

1 code implementation • 31 Aug 2021 • Corinne Emmenegger, Peter Bühlmann

Traditionally, spline or kernel approaches in combination with parametric estimation are used to infer the linear coefficient (fixed effects) in a partially linear mixed-effects model for repeated measurements.

BIG-bench Machine Learning

Causality-oriented robustness: exploiting general additive interventions

1 code implementation • 18 Jul 2023 • Xinwei Shen, Peter Bühlmann, Armeen Taeb

In a linear setting, we prove that DRIG yields predictions that are robust among a data-dependent class of distribution shifts.

Causal Inference • Domain Adaptation +1

Invariant Probabilistic Prediction

1 code implementation • 18 Sep 2023 • Alexander Henzi, Xinwei Shen, Michael Law, Peter Bühlmann

In recent years, there has been a growing interest in statistical methods that exhibit robust performance under distribution changes between training and test data.

Kernel-based Tests for Joint Independence

no code implementations • 1 Mar 2016 • Niklas Pfister, Peter Bühlmann, Bernhard Schölkopf, Jonas Peters

Based on an empirical estimate of dHSIC, we define three different non-parametric hypothesis tests: a permutation test, a bootstrap test and a test based on a Gamma approximation.

Causal Discovery
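
The empirical dHSIC statistic has a closed form in terms of the kernel Gram matrices of the individual variables, and the permutation test permutes the samples of all variables but one. The numpy sketch below uses Gaussian kernels with a median-heuristic bandwidth; it is an illustration, not the reference implementation, and the bootstrap and Gamma-approximation tests are omitted.

```python
# Sketch: empirical dHSIC with Gaussian kernels plus a permutation test.
import numpy as np

def gaussian_gram(x):
    x = np.asarray(x, dtype=float).reshape(len(x), -1)
    sq = np.sum((x[:, None, :] - x[None, :, :]) ** 2, axis=-1)
    bw2 = np.median(sq[sq > 0])                  # median heuristic
    return np.exp(-sq / bw2)

def dhsic(*samples):
    grams = [gaussian_gram(s) for s in samples]
    term1 = np.mean(np.prod(grams, axis=0))
    term2 = np.prod([K.mean() for K in grams])
    term3 = 2 * np.mean(np.prod([K.mean(axis=1) for K in grams], axis=0))
    return term1 + term2 - term3

def dhsic_permutation_test(*samples, B=200, seed=0):
    rng = np.random.default_rng(seed)
    stat = dhsic(*samples)
    null = [dhsic(samples[0], *[s[rng.permutation(len(s))] for s in samples[1:]])
            for _ in range(B)]
    p_value = (1 + np.sum(np.asarray(null) >= stat)) / (B + 1)
    return stat, p_value
```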

Score-based Causal Learning in Additive Noise Models

no code implementations • 25 Nov 2013 • Christopher Nowzohour, Peter Bühlmann

Given data sampled from a number of variables, one is often interested in the underlying causal relationships in the form of a directed acyclic graph.

Model Selection

High-dimensional learning of linear causal networks via inverse covariance estimation

no code implementations • 14 Nov 2013 • Po-Ling Loh, Peter Bühlmann

We establish a new framework for statistical estimation of directed acyclic graphs (DAGs) when data are generated from a linear, possibly non-Gaussian structural equation model.

Vocal Bursts Intensity Prediction

Identifiability of Gaussian structural equation models with equal error variances

no code implementations • 11 May 2012 • Jonas Peters, Peter Bühlmann

In this work, we prove full identifiability if all noise variables have the same variances: the directed acyclic graph can be recovered from the joint Gaussian distribution.

Causal Inference

Causal inference using invariant prediction: identification and confidence intervals

no code implementations • 6 Jan 2015 • Jonas Peters, Peter Bühlmann, Nicolai Meinshausen

In contrast, predictions from a non-causal model can potentially be very wrong if we actively intervene on variables.

Methodology
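
The invariance idea can be prototyped by checking, for every candidate set S of predictors, whether pooled-regression residuals look identically distributed across environments, and then intersecting all accepted sets. The sketch below uses crude mean and variance comparisons (ANOVA and Levene tests) as stand-ins for the paper's invariance tests and ignores its multiple-testing and confidence-interval machinery.

```python
# Simplified prototype of invariant causal prediction; tests are illustrative.
import itertools
import numpy as np
from scipy import stats

def invariant_sets(X, y, env, alpha=0.05):
    n, d = X.shape
    accepted = []
    for k in range(d + 1):
        for S in itertools.combinations(range(d), k):
            Z = np.column_stack([np.ones(n)] + [X[:, j] for j in S])
            beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
            res = y - Z @ beta
            groups = [res[env == e] for e in np.unique(env)]
            p_mean = stats.f_oneway(*groups).pvalue      # equal residual means?
            p_var = stats.levene(*groups).pvalue         # equal residual variances?
            if min(p_mean, p_var) > alpha / 2:
                accepted.append(set(S))
    # the causal predictors are estimated by the intersection of accepted sets
    return set.intersection(*accepted) if accepted else set()
```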

Deconfounding and Causal Regularization for Stability and External Validity

no code implementations • 14 Aug 2020 • Peter Bühlmann, Domagoj Ćevid

We review some recent work on removing hidden confounding and causal regularization from a unified viewpoint.

Seeded intervals and noise level estimation in change point detection: A discussion of Fryzlewicz (2020)

no code implementations • 23 Jun 2020 • Solt Kovács, Housen Li, Peter Bühlmann

In this discussion, we compare the choice of seeded intervals and that of random intervals for change point segmentation from practical, statistical and computational perspectives.

Methodology • Computation
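
For concreteness, one seeded-interval construction with decay 1/2 (interval lengths roughly halving from layer to layer, consecutive intervals overlapping by about half their length) can be generated deterministically as sketched below; rounding and boundary conventions are illustrative and may differ from the paper.

```python
# Sketch of a deterministic seeded-interval collection with decay 1/2.
import math

def seeded_intervals(n, min_len=2):
    intervals = []
    k = 1
    while n / 2 ** (k - 1) >= min_len:
        length = math.ceil(n / 2 ** (k - 1))
        n_k = 2 ** k - 1                          # number of intervals in layer k
        shift = (n - length) / max(n_k - 1, 1)    # evenly spaced start points
        for i in range(n_k):
            start = round(i * shift)
            intervals.append((start, min(start + length, n)))
        k += 1
    return sorted(set(intervals))

# In contrast to random intervals, the collection above is the same for every
# run and has O(n) intervals in total.
```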

Optimistic search: Change point estimation for large-scale data via adaptive logarithmic queries

no code implementations • 20 Oct 2020 • Solt Kovács, Housen Li, Lorenz Haubner, Axel Munk, Peter Bühlmann

Change point estimation is often formulated as a search for the maximum of a gain function describing improved fits when segmenting the data.

Change Point Detection
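
The core idea can be caricatured as a golden-section-style search over the gain curve that needs only O(log n) gain evaluations instead of a full scan; the stylized version below assumes a reasonably well-behaved (roughly unimodal) gain and is not one of the paper's exact optimistic search variants.

```python
# Stylized sketch of locating the gain maximum with logarithmically many queries.
def optimistic_argmax(gain, left, right, min_width=5):
    """Approximate argmax of gain over the integer splits {left, ..., right}."""
    while right - left > min_width:
        w = right - left
        m1, m2 = left + w // 4, right - w // 4
        if gain(m1) >= gain(m2):
            right = m2          # discard the part next to the smaller probe value
        else:
            left = m1
    return max(range(left, right + 1), key=gain)   # finish with a short scan

# Example gain for a single change in mean of a 1-d series x:
# gain = lambda s: abs(x[:s].mean() - x[s:].mean()) * (s * (len(x) - s) / len(x)) ** 0.5
```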

Graphical Elastic Net and Target Matrices: Fast Algorithms and Software for Sparse Precision Matrix Estimation

no code implementations • 6 Jan 2021 • Solt Kovács, Tobias Ruckstuhl, Helena Obrist, Peter Bühlmann

We consider estimation of undirected Gaussian graphical models and inverse covariances in high-dimensional scenarios by penalizing the corresponding precision matrix.

Methodology • Computation
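
For orientation, the purely l1-penalized special case (the graphical lasso) is available in scikit-learn, as sketched below; the elastic-net penalty and the target matrices are what the paper's own software adds on top of this baseline.

```python
# Baseline only: l1-penalized precision matrix estimation (graphical lasso).
import numpy as np
from sklearn.covariance import GraphicalLassoCV

rng = np.random.default_rng(0)
X = rng.multivariate_normal(mean=np.zeros(5), cov=np.eye(5), size=200)
model = GraphicalLassoCV().fit(X)
Theta_hat = model.precision_        # sparse estimate of the precision matrix
```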

Confidence and Uncertainty Assessment for Distributional Random Forests

no code implementations • 11 Feb 2023 • Jeffrey Näf, Corinne Emmenegger, Peter Bühlmann, Nicolai Meinshausen

The Distributional Random Forest (DRF) is a recently introduced Random Forest algorithm to estimate multivariate conditional distributions.

Distributionally Robust Machine Learning with Multi-source Data

no code implementations • 5 Sep 2023 • Zhenyu Wang, Peter Bühlmann, Zijian Guo

Classical machine learning methods may lead to poor prediction performance when the target distribution differs from the source populations.

Federated Learning

Assessing the overall and partial causal well-specification of nonlinear additive noise models

no code implementations • 25 Oct 2023 • Christoph Schultheiss, Peter Bühlmann

We propose a method to detect model misspecifications in nonlinear causal additive and potentially heteroscedastic noise models.

Extrapolation-Aware Nonparametric Statistical Inference

1 code implementation • 15 Feb 2024 • Niklas Pfister, Peter Bühlmann

In this work, we extend the nonparametric statistical model to explicitly allow for extrapolation and introduce a class of extrapolation assumptions that can be combined with existing inference techniques to draw extrapolation-aware conclusions.

Uncertainty Quantification
