Search Results for author: Ioannis Tsamardinos

Found 11 papers, 0 papers with code

On Predictive Explanation of Data Anomalies

no code implementations · 18 Oct 2021 · Nikolaos Myrtakis, Ioannis Tsamardinos, Vassilis Christophides

PROTEUS is designed to return an accurate estimate of out-of-sample predictive performance to serve as a metric of the quality of the approximation.

AutoML · Feature Importance +1

Inference of Stochastic Dynamical Systems from Cross-Sectional Population Data

no code implementations · 9 Dec 2020 · Anastasios Tsourtis, Yannis Pantazis, Ioannis Tsamardinos

Inferring the driving equations of a dynamical system from population or time-course data is important in several scientific fields such as biochemistry, epidemiology, financial mathematics and many others.

Epidemiology

A generalised OMP algorithm for feature selection with application to gene expression data

no code implementations · 1 Apr 2020 · Michail Tsagris, Zacharias Papadovasilakis, Kleanthi Lakiotaki, Ioannis Tsamardinos

Feature selection for predictive analytics is the problem of identifying a minimal-size subset of features that is maximally predictive of an outcome of interest.

feature selection
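The paper generalises orthogonal matching pursuit (OMP) to this minimal-subset selection problem. As a rough illustration of the underlying greedy scheme only, here is a minimal OMP-style forward selector for a continuous outcome with plain least squares; the stopping rule and names are illustrative, not the paper's generalised algorithm:

```python
import numpy as np

def omp_feature_selection(X, y, max_features=5, tol=1e-6):
    """OMP-style greedy forward selection (illustrative sketch): at each
    step, add the feature most correlated with the current residual,
    then refit on the selected subset by least squares."""
    n, p = X.shape
    selected = []
    residual = y - y.mean()
    for _ in range(max_features):
        # absolute correlation of every feature with the residual
        scores = np.abs(X.T @ residual)
        scores[selected] = -np.inf          # never re-pick a feature
        best = int(np.argmax(scores))
        if scores[best] < tol:
            break                           # nothing left to explain
        selected.append(best)
        # refit on the selected subset and update the residual
        beta, *_ = np.linalg.lstsq(X[:, selected], y, rcond=None)
        residual = y - X[:, selected] @ beta
    return selected
```

The key OMP property this keeps is that the residual is orthogonal to the span of the selected features after each refit, so already-explained signal cannot drive the next pick.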

Bootstrapping the Out-of-sample Predictions for Efficient and Accurate Cross-Validation

no code implementations · 23 Aug 2017 · Ioannis Tsamardinos, Elissavet Greasidou, Michalis Tsagris, Giorgos Borboudakis

BBC-CV's main idea is to bootstrap the whole process of selecting the best-performing configuration on the out-of-sample predictions of each configuration, without additional training of models.
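A minimal sketch of that idea for classification, assuming the pooled out-of-sample predictions of every configuration are already collected; the resampling scheme and accuracy metric here are illustrative simplifications, not the paper's exact procedure:

```python
import numpy as np

def bbc_cv_estimate(oos_preds, y, n_boot=500, seed=0):
    """Bootstrap-bias-corrected estimate (sketch of the BBC-CV idea):
    oos_preds[c, i] is the pooled out-of-sample prediction of
    configuration c for sample i.  On each bootstrap resample, the
    configuration that looks best on the in-bag samples is scored on
    the out-of-bag samples; averaging those scores corrects for the
    optimism of picking the winner -- with no extra model training."""
    rng = np.random.default_rng(seed)
    n_conf, n = oos_preds.shape
    scores = []
    for _ in range(n_boot):
        in_bag = rng.integers(0, n, size=n)         # resample samples
        oob = np.setdiff1d(np.arange(n), in_bag)    # held-out samples
        if oob.size == 0:
            continue
        # select the best-looking configuration on the in-bag samples
        acc = (oos_preds[:, in_bag] == y[in_bag]).mean(axis=1)
        best = int(np.argmax(acc))
        # ...and score that selection on the out-of-bag samples
        scores.append((oos_preds[best, oob] == y[oob]).mean())
    return float(np.mean(scores))
```

Because only the stored predictions are resampled, the cost is a few hundred cheap bootstrap passes rather than any retraining.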

Massively-Parallel Feature Selection for Big Data

no code implementations · 23 Aug 2017 · Ioannis Tsamardinos, Giorgos Borboudakis, Pavlos Katsogridakis, Polyvios Pratikakis, Vassilis Christophides

We present the Parallel, Forward-Backward with Pruning (PFBP) algorithm for feature selection (FS) in Big Data settings (high dimensionality and/or sample size).

feature selection

Forward-Backward Selection with Early Dropping

no code implementations · 30 May 2017 · Giorgos Borboudakis, Ioannis Tsamardinos

In experiments we show that the proposed heuristic increases computational efficiency by about two orders of magnitude in high-dimensional problems, while selecting fewer variables and retaining predictive performance.

feature selection
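A toy version of the Early Dropping heuristic for a continuous outcome, using a partial-correlation t-test as the conditional independence test; the paper's method covers more general tests and settings, so everything below is an illustrative simplification:

```python
import numpy as np
from scipy import stats

def forward_early_dropping(X, y, alpha=0.05, max_features=10):
    """Forward selection with Early Dropping (sketch): a candidate whose
    association with y given the selected set is not significant is
    dropped from the pool for the rest of the run -- shrinking the pool
    is what gives the speed-up the abstract describes."""
    n, p = X.shape
    selected, remaining = [], list(range(p))
    while remaining and len(selected) < max_features:
        # residualize y on the current selection
        if selected:
            S = X[:, selected]
            ry = y - S @ np.linalg.lstsq(S, y, rcond=None)[0]
        else:
            ry = y - y.mean()
        keep, best, best_p = [], None, 1.0
        for j in remaining:
            # residualize candidate j on the current selection
            if selected:
                S = X[:, selected]
                rx = X[:, j] - S @ np.linalg.lstsq(S, X[:, j], rcond=None)[0]
            else:
                rx = X[:, j] - X[:, j].mean()
            r = np.corrcoef(rx, ry)[0, 1]
            # t-test on the (partial) correlation coefficient
            dof = n - len(selected) - 2
            t = r * np.sqrt(dof / max(1e-12, 1 - r * r))
            pval = 2 * stats.t.sf(abs(t), dof)
            if pval < alpha:
                keep.append(j)              # still a candidate
                if pval < best_p:
                    best, best_p = j, pval
            # else: Early Dropping -- j leaves the pool for good
        if best is None:
            break
        selected.append(best)
        keep.remove(best)
        remaining = keep
    return selected
```

In high dimensions most candidates are dropped in the first pass, so later iterations test a small pool instead of all p features, matching the two-orders-of-magnitude speed-up claimed above.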

Feature Selection with the R Package MXM: Discovering Statistically-Equivalent Feature Subsets

no code implementations · 10 Nov 2016 · Vincenzo Lagani, Giorgos Athineou, Alessio Farcomeni, Michail Tsagris, Ioannis Tsamardinos

The statistically equivalent signature (SES) algorithm is a method for feature selection inspired by the principles of constraint-based learning of Bayesian Networks.

feature selection · Survival Analysis

Scoring and Searching over Bayesian Networks with Causal and Associative Priors

no code implementations · 9 Aug 2014 · Giorgos Borboudakis, Ioannis Tsamardinos

A significant theoretical advantage of search-and-score methods for learning Bayesian Networks is that they can accept informative prior beliefs for each possible network, thus complementing the data.
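As a minimal illustration of how such prior beliefs can enter the score, the hypothetical helper below turns independent per-edge beliefs into a network-level log prior that could be added to a data score such as BIC during search; the names and the edge-independence assumption are ours, not the paper's formulation:

```python
import math

def network_log_prior(variables, edges, edge_belief, default=0.5):
    """Illustrative sketch: every ordered pair of variables contributes
    log p if the directed edge is present in the network and log(1 - p)
    if it is absent, where p = edge_belief[(a, b)] is the prior
    probability that a -> b exists (unstated pairs get `default`).
    The result is added to the data score inside search-and-score."""
    edges = set(edges)
    logp = 0.0
    for a in variables:
        for b in variables:
            if a == b:
                continue
            p = edge_belief.get((a, b), default)
            logp += math.log(p) if (a, b) in edges else math.log(1.0 - p)
    return logp
```

With `default=0.5` an uninformative pair is penalised equally whether the edge is present or not, so only the stated beliefs tilt the search.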

Discovering and Exploiting Entailment Relationships in Multi-Label Learning

no code implementations · 15 Apr 2014 · Christina Papagiannopoulou, Grigorios Tsoumakas, Ioannis Tsamardinos

Marginal probabilities are entered as soft evidence in the network and adjusted through probabilistic inference.

Multi-Label Learning
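The paper adjusts those marginals through probabilistic inference in a network; as a much simpler stand-in for that machinery, the toy sketch below only enforces the monotonicity an entailment implies (if label a entails label b, then p(b) should be at least p(a)):

```python
def enforce_entailments(probs, entails):
    """Toy post-processing for multi-label entailments (a deliberately
    simpler stand-in for the soft-evidence inference the abstract
    describes): whenever label a entails label b, raise b's marginal
    to at least a's, and propagate until nothing changes."""
    probs = dict(probs)                 # do not mutate the caller's dict
    changed = True
    while changed:                      # iterate to a fixed point
        changed = False
        for a, b in entails:            # a -> b  implies  p(b) >= p(a)
            if probs[b] < probs[a]:
                probs[b] = probs[a]
                changed = True
    return probs
```

Chained entailments (a entails b entails c) are handled by the fixed-point loop rather than by any ordering of the pairs.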

Constraint-based Causal Discovery from Multiple Interventions over Overlapping Variable Sets

no code implementations · 10 Mar 2014 · Sofia Triantafillou, Ioannis Tsamardinos

In this work, we present algorithm COmbINE, which accepts a collection of data sets over overlapping variable sets under different experimental conditions; COmbINE then outputs a summary of all causal models indicating the invariant and variant structural characteristics of all models that simultaneously fit all of the input data sets.

Causal Discovery

Scoring and Searching over Bayesian Networks with Causal and Associative Priors

no code implementations · 28 Sep 2012 · Giorgos Borboudakis, Ioannis Tsamardinos

A significant theoretical advantage of search-and-score methods for learning Bayesian Networks is that they can accept informative prior beliefs for each possible network, thus complementing the data.
