Search Results for author: Sebastian Weichwald

Found 15 papers, 8 papers with code

Compositional Abstraction Error and a Category of Causal Models

no code implementations · 29 Mar 2021 · Eigil F. Rischel, Sebastian Weichwald

Interventional causal models specify several joint distributions over the variables describing a system, one for each intervention setting.

Beware of the Simulated DAG! Causal Discovery Benchmarks May Be Easy To Game

1 code implementation · NeurIPS 2021 · Alexander G. Reisach, Christof Seiler, Sebastian Weichwald

Here, we show that marginal variance tends to increase along the causal order for generically sampled additive noise models.

Causal Discovery
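The variance effect described above can be checked with a short simulation. This is an illustrative sketch (not the paper's code, and the chain structure and weight range are my choices): in a linear additive noise model x1 → x2 → x3 with edge weights of magnitude at least one, marginal variance grows along the causal order, so simply sorting variables by variance recovers the order and can "game" such benchmarks.

```python
import numpy as np

# Sketch: marginal variance increases along the causal order in a
# generically weighted linear additive noise model x1 -> x2 -> x3.
rng = np.random.default_rng(0)
n = 100_000
w12, w23 = rng.uniform(1.0, 2.0, size=2)  # generically sampled weights
x1 = rng.normal(size=n)
x2 = w12 * x1 + rng.normal(size=n)
x3 = w23 * x2 + rng.normal(size=n)
variances = [x.var() for x in (x1, x2, x3)]
print(variances)  # increasing along the causal order x1 -> x2 -> x3
```

Since Var(x2) = w12² + 1 and Var(x3) = w23² · Var(x2) + 1, weights of magnitude ≥ 1 force the variances to be strictly increasing.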

Causal structure learning from time series: Large regression coefficients may predict causal links better in practice than small p-values

1 code implementation · 21 Feb 2020 · Sebastian Weichwald, Martin E Jakobsen, Phillip B Mogensen, Lasse Petersen, Nikolaj Thams, Gherardo Varando

In this article, we describe the algorithms for causal structure learning from time series data that won the Causality 4 Climate competition at the 2019 Conference on Neural Information Processing Systems (NeurIPS).

Time Series
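The idea flagged in the title can be sketched in a few lines. This is a hedged illustration of the general principle, not the competition code (the simulation setup and variable names are mine): fit lagged regressions and score a candidate causal link by the absolute regression coefficient rather than by the smallness of its p-value.

```python
import numpy as np

# Simulate a bivariate lag-1 system where x drives y but not vice versa.
rng = np.random.default_rng(1)
T = 2000
x = np.zeros(T)
y = np.zeros(T)
for t in range(1, T):
    x[t] = 0.5 * x[t - 1] + rng.normal()
    y[t] = 0.8 * x[t - 1] + 0.5 * y[t - 1] + rng.normal()  # x drives y

# Regress each series on the lagged values of both series.
lagged = np.column_stack([x[:-1], y[:-1]])
coef_y, *_ = np.linalg.lstsq(lagged, y[1:], rcond=None)  # predict y_t
coef_x, *_ = np.linalg.lstsq(lagged, x[1:], rcond=None)  # predict x_t
score_x_to_y = abs(coef_y[0])  # coefficient of x_{t-1} when predicting y_t
score_y_to_x = abs(coef_x[1])  # coefficient of y_{t-1} when predicting x_t
print(score_x_to_y, score_y_to_x)  # the true link x -> y scores higher
```

Ranking candidate links by coefficient magnitude places the true link x → y above the absent link y → x.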

Robustifying Independent Component Analysis by Adjusting for Group-Wise Stationary Noise

3 code implementations · 4 Jun 2018 · Niklas Pfister, Sebastian Weichwald, Peter Bühlmann, Bernhard Schölkopf

We introduce coroICA, confounding-robust independent component analysis, a novel ICA algorithm which decomposes linearly mixed multivariate observations into independent components that are corrupted (and rendered dependent) by hidden group-wise stationary confounding.

Causal Inference · EEG

A note on the expected minimum error probability in equientropic channels

no code implementations · 23 May 2016 · Sebastian Weichwald, Tatiana Fomina, Bernhard Schölkopf, Moritz Grosse-Wentrup

While the channel capacity reflects a theoretical upper bound on the achievable information transmission rate in the limit of infinitely many bits, it does not characterise the information transfer of a given encoding routine with finitely many bits.

Pymanopt: A Python Toolbox for Optimization on Manifolds using Automatic Differentiation

1 code implementation · 10 Mar 2016 · James Townsend, Niklas Koep, Sebastian Weichwald

Optimization on manifolds is a class of methods for optimizing an objective function subject to constraints that are smooth, in the sense that the set of points satisfying the constraints admits the structure of a differentiable manifold.

Riemannian optimization
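The core idea Pymanopt automates can be illustrated in plain NumPy (this is a sketch of Riemannian gradient ascent, not Pymanopt's API): maximize the Rayleigh quotient xᵀAx over the unit sphere, a smooth manifold constraint, by projecting the Euclidean gradient onto the tangent space and retracting each iterate back onto the sphere.

```python
import numpy as np

# A diagonal matrix is used so the answer is known: the maximizer of
# x^T A x on the unit sphere is the top eigenvector, with value 5.
A = np.diag([1.0, 2.0, 3.0, 5.0])

rng = np.random.default_rng(2)
x = rng.normal(size=4)
x /= np.linalg.norm(x)  # start on the sphere
for _ in range(500):
    egrad = 2 * A @ x                # Euclidean gradient of x^T A x
    rgrad = egrad - (x @ egrad) * x  # project onto the tangent space at x
    x = x + 0.01 * rgrad             # ascent step
    x /= np.linalg.norm(x)           # retract back onto the sphere

print(x @ A @ x)  # converges to the top eigenvalue, 5.0
```

Pymanopt packages exactly these ingredients (manifold, cost, gradient) behind a solver interface and uses automatic differentiation so the gradient need not be derived by hand.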

Causal and anti-causal learning in pattern recognition for neuroimaging

no code implementations · 15 Dec 2015 · Sebastian Weichwald, Bernhard Schölkopf, Tonio Ball, Moritz Grosse-Wentrup

Pattern recognition in neuroimaging distinguishes between two types of models: encoding models and decoding models.

Causal Inference

Decoding index finger position from EEG using random forests

no code implementations · 14 Dec 2015 · Sebastian Weichwald, Timm Meyer, Bernhard Schölkopf, Tonio Ball, Moritz Grosse-Wentrup

While invasively recorded brain activity is known to provide detailed information on motor commands, it is an open question at what level of detail information about positions of body parts can be decoded from non-invasively acquired signals.

EEG

MERLiN: Mixture Effect Recovery in Linear Networks

1 code implementation · 3 Dec 2015 · Sebastian Weichwald, Moritz Grosse-Wentrup, Arthur Gretton

Causal inference concerns the identification of cause-effect relationships between variables, e.g., establishing whether a stimulus affects activity in a certain brain region.

Causal Inference · EEG

Causal interpretation rules for encoding and decoding models in neuroimaging

no code implementations · 15 Nov 2015 · Sebastian Weichwald, Timm Meyer, Ozan Özdenizci, Bernhard Schölkopf, Tonio Ball, Moritz Grosse-Wentrup

Causal terminology is often introduced in the interpretation of encoding and decoding models trained on neuroimaging data.

EEG
