no code implementations • 18 Oct 2021 • Nikolaos Myrtakis, Ioannis Tsamardinos, Vassilis Christophides
PROTEUS is designed to return an accurate estimate of out-of-sample predictive performance to serve as a metric of the quality of the approximation.
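As a rough illustration of what such an out-of-sample performance estimate looks like (not the PROTEUS protocol itself), the sketch below uses plain K-fold cross-validation with scikit-learn; the dataset, model, and metric are arbitrary placeholders.

```python
# Minimal sketch (not the PROTEUS pipeline): estimating out-of-sample
# predictive performance with K-fold cross-validation.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
model = RandomForestClassifier(random_state=0)

# Mean AUC over held-out folds approximates performance on unseen data.
scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print(f"Estimated out-of-sample AUC: {scores.mean():.3f} +/- {scores.std():.3f}")
```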
no code implementations • 9 Dec 2020 • Anastasios Tsourtis, Yannis Pantazis, Ioannis Tsamardinos
Inferring the driving equations of a dynamical system from population or time-course data is important in several scientific fields, such as biochemistry, epidemiology, and financial mathematics.
no code implementations • 1 Apr 2020 • Michail Tsagris, Zacharias Papadovasilakis, Kleanthi Lakiotaki, Ioannis Tsamardinos
Feature selection for predictive analytics is the problem of identifying a minimal-size subset of features that is maximally predictive of an outcome of interest.
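The sketch below illustrates this goal with a generic greedy forward search via scikit-learn's SequentialFeatureSelector; it is not the method proposed in the paper, and the target subset size is fixed arbitrarily for the demo.

```python
# Illustrative sketch of the feature-selection goal: greedily pick a small
# feature subset that preserves predictive performance (not the paper's method).
from sklearn.datasets import make_classification
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=300, n_features=30, n_informative=5,
                           random_state=0)
selector = SequentialFeatureSelector(
    LogisticRegression(max_iter=1000),
    n_features_to_select=5,      # assumed subset size, chosen for the demo
    direction="forward",
    cv=5,
)
selector.fit(X, y)
print("Selected feature indices:", selector.get_support(indices=True))
```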
no code implementations • 23 Aug 2017 • Ioannis Tsamardinos, Elissavet Greasidou, Michalis Tsagris, Giorgos Borboudakis
BBC-CV's main idea is to bootstrap the whole process of selecting the best-performing configuration on the out-of-sample predictions of each configuration, without additional training of models.
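A simplified sketch of that idea, assuming a hypothetical matrix of pooled out-of-sample predictions with one column per configuration: repeatedly bootstrap the samples, pick the best configuration on the in-bag rows, and score that pick on the out-of-bag rows. The data below are synthetic stand-ins.

```python
# Simplified sketch of the BBC-CV idea: bootstrap the selection step over the
# pooled out-of-sample predictions to get a roughly bias-corrected estimate,
# with no additional model training.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n, n_configs = 400, 20
y = rng.integers(0, 2, size=n)                      # true labels (synthetic)
# Hypothetical pooled out-of-sample predictions, one column per configuration.
preds = rng.random((n, n_configs)) * 0.5 + 0.5 * y[:, None] * rng.random((n, n_configs))

oob_scores = []
for _ in range(500):                                # bootstrap iterations
    in_bag = rng.integers(0, n, size=n)
    out_bag = np.setdiff1d(np.arange(n), in_bag)
    if len(np.unique(y[in_bag])) < 2 or len(np.unique(y[out_bag])) < 2:
        continue
    # Select the best configuration on the in-bag samples...
    best = np.argmax([roc_auc_score(y[in_bag], preds[in_bag, c])
                      for c in range(n_configs)])
    # ...and score that choice on the out-of-bag samples.
    oob_scores.append(roc_auc_score(y[out_bag], preds[out_bag, best]))

print(f"Bias-corrected AUC estimate: {np.mean(oob_scores):.3f}")
```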
no code implementations • 23 Aug 2017 • Ioannis Tsamardinos, Giorgos Borboudakis, Pavlos Katsogridakis, Polyvios Pratikakis, Vassilis Christophides
We present the Parallel, Forward-Backward with Pruning (PFBP) algorithm for feature selection (FS) in Big Data settings (high dimensionality and/or large sample size).
no code implementations • 30 May 2017 • Giorgos Borboudakis, Ioannis Tsamardinos
In experiments we show that the proposed heuristic increases computational efficiency by about two orders of magnitude in high-dimensional problems, while selecting fewer variables and retaining predictive performance.
no code implementations • 10 Nov 2016 • Vincenzo Lagani, Giorgos Athineou, Alessio Farcomeni, Michail Tsagris, Ioannis Tsamardinos
The statistically equivalent signature (SES) algorithm is a method for feature selection inspired by the principles of constraint-based learning of Bayesian Networks.
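As a rough sketch of that constraint-based flavour (not SES itself, which returns multiple statistically equivalent signatures), the toy filter below keeps a feature only if it appears conditionally dependent on the outcome given the features already selected, judged by an approximate partial-correlation test on synthetic data.

```python
# Rough sketch of a constraint-based style filter (not the SES algorithm):
# keep a feature only if it remains conditionally dependent on the outcome
# given the features selected so far.
import numpy as np
from scipy import stats

def partial_corr_pvalue(x, y, Z):
    """Approximate p-value for corr(x, y | Z) via least-squares residuals."""
    if Z.shape[1] > 0:
        Z1 = np.column_stack([Z, np.ones(len(Z))])
        x = x - Z1 @ np.linalg.lstsq(Z1, x, rcond=None)[0]
        y = y - Z1 @ np.linalg.lstsq(Z1, y, rcond=None)[0]
    _, p = stats.pearsonr(x, y)
    return p

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))                      # synthetic predictors
y = X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=200)  # synthetic outcome

selected = []
for j in range(X.shape[1]):
    if partial_corr_pvalue(X[:, j], y, X[:, selected]) < 0.05:
        selected.append(j)
print("Selected features:", selected)
```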
no code implementations • 9 Aug 2014 • Giorgos Borboudakis, Ioannis Tsamardinos
A significant theoretical advantage of search-and-score methods for learning Bayesian Networks is that they can accept informative prior beliefs for each possible network, thus complementing the data.
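A toy illustration of how such priors complement the data score, assuming independent per-edge prior beliefs (the paper's causal and associative prior formulation is richer; all names and numbers below are made up).

```python
# Toy illustration (not the paper's prior formulation): a posterior-style
# score combines a precomputed data score, e.g. BIC, with a log structure
# prior built from per-edge beliefs.
import math

# Hypothetical prior beliefs: probability that each directed edge is present.
edge_prior = {("Smoking", "Cancer"): 0.9, ("Cancer", "Smoking"): 0.05}

def log_structure_prior(edges, prior):
    """Sum of log-probabilities under independent per-edge prior beliefs."""
    return sum(math.log(p if e in edges else 1.0 - p) for e, p in prior.items())

# Assumed precomputed data scores for two candidate structures.
candidates = {
    frozenset({("Smoking", "Cancer")}): -1201.3,
    frozenset({("Cancer", "Smoking")}): -1200.8,
}
best = max(candidates,
           key=lambda g: candidates[g] + log_structure_prior(g, edge_prior))
print("Preferred structure:", set(best))
# Here the prior outweighs the small data-score gap and favours Smoking -> Cancer.
```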
no code implementations • 15 Apr 2014 • Christina Papagiannopoulou, Grigorios Tsoumakas, Ioannis Tsamardinos
Marginal probabilities are entered as soft evidence in the network and adjusted through probabilistic inference.
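A minimal numeric sketch of soft evidence on an assumed two-label toy network (not the paper's setup): a classifier's predicted marginal enters as a likelihood vector and the network's marginals are adjusted by inference.

```python
# Toy sketch of soft (virtual) evidence: an externally predicted marginal
# reweights the prior, and the update propagates through the network.
# Two binary labels A -> B with an assumed CPT, for illustration only.
import numpy as np

p_a = np.array([0.6, 0.4])            # prior P(A)
p_b_given_a = np.array([[0.9, 0.1],   # P(B | A=0)
                        [0.2, 0.8]])  # P(B | A=1)

# Hypothetical marginal from an independent classifier for A, used as
# soft evidence: a likelihood proportional to its predicted probabilities.
soft_evidence_a = np.array([0.3, 0.7])

# Posterior over A after weighing the prior by the soft evidence.
post_a = p_a * soft_evidence_a
post_a /= post_a.sum()

# Adjusted marginal of B, propagated through the network.
post_b = post_a @ p_b_given_a
print("Adjusted P(A):", post_a, " Adjusted P(B):", post_b)
```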
no code implementations • 10 Mar 2014 • Sofia Triantafillou, Ioannis Tsamardinos
In this work, we present the algorithm COmbINE, which accepts a collection of data sets measured over overlapping variable sets under different experimental conditions, and outputs a summary of all causal models that simultaneously fit the input data sets, indicating their invariant and variant structural characteristics.
no code implementations • 28 Sep 2012 • Giorgos Borboudakis, Ioannis Tsamardinos
A significant theoretical advantage of search-and-score methods for learning Bayesian Networks is that they can accept informative prior beliefs for each possible network, thus complementing the data.