1 code implementation • 15 Aug 2022 • Florian Busch, Moritz Kulessa, Eneldo Loza Mencía, Hendrik Blockeel
A common approach to aggregating classification estimates in an ensemble of decision trees is either to use voting or to average the probabilities for each class.
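The two aggregation schemes can disagree even on the same ensemble. A minimal sketch, with invented per-tree probability estimates for illustration:

```python
import numpy as np

# Hypothetical class-probability estimates of 3 trees for one instance
# over 2 classes; the values are made up for illustration.
tree_probs = np.array([
    [0.9, 0.1],
    [0.4, 0.6],
    [0.4, 0.6],
])

# Majority voting: each tree casts one vote for its most probable class.
votes = np.bincount(tree_probs.argmax(axis=1), minlength=2)
vote_prediction = votes.argmax()   # class 1 wins two votes to one

# Probability averaging: mean the distributions, then take the argmax.
avg = tree_probs.mean(axis=0)      # the confident first tree dominates
avg_prediction = avg.argmax()      # class 0 wins on averaged mass
```

Here the confident first tree outweighs two weakly opposing trees under averaging but not under voting, which is exactly the kind of difference such a comparison examines.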
1 code implementation • 13 Dec 2021 • Eneldo Loza Mencía, Moritz Kulessa, Simon Bohlender, Johannes Fürnkranz
However, the method requires a fixed, static order of the labels.
1 code implementation • 18 Oct 2021 • Michael Rapp, Moritz Kulessa, Eneldo Loza Mencía, Johannes Fürnkranz
Early outbreak detection is a key aspect in the containment of infectious diseases, as it enables the identification and isolation of infected individuals before the disease can spread to a larger population.
no code implementations • 22 Jun 2021 • Michael Rapp, Eneldo Loza Mencía, Johannes Fürnkranz, Eyke Hüllermeier
Based on the derivatives computed during training, we dynamically group the labels into a predefined number of bins to impose an upper bound on the dimensionality of the linear system.
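The grouping step described above can be sketched as follows; the gradient values and the equal-size binning strategy are assumptions made for illustration, not the paper's exact procedure:

```python
import numpy as np

# Hypothetical per-label first derivatives of the loss, as they might be
# computed during a boosting iteration; the values are invented.
gradients = np.array([0.02, 0.90, 0.15, 0.85, 0.10, 0.05])

# Predefined number of bins: the upper bound on the dimensionality
# of the linear system solved per iteration.
num_bins = 2

# Sort labels by their derivative and split them into bins, so labels
# with similar derivatives share one aggregate variable.
order = np.argsort(gradients)
bins = np.array_split(order, num_bins)
```

Each bin is then treated as a single variable, so the linear system has at most `num_bins` dimensions regardless of the number of labels.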
1 code implementation • 28 Jan 2021 • Moritz Kulessa, Eneldo Loza Mencía, Johannes Fürnkranz
Infectious disease surveillance is of great importance for the prevention of major outbreaks.
no code implementations • 8 Dec 2020 • Johannes Fürnkranz, Eyke Hüllermeier, Eneldo Loza Mencía, Michael Rapp
Arguably the key reason for the success of deep neural networks is their ability to autonomously form non-linear combinations of the input features, which can be used in subsequent layers of the network.
1 code implementation • 23 Jun 2020 • Michael Rapp, Eneldo Loza Mencía, Johannes Fürnkranz, Vu-Linh Nguyen, Eyke Hüllermeier
In multi-label classification, where the evaluation of predictions is less straightforward than in single-label classification, various meaningful, though different, loss functions have been proposed.
no code implementations • 21 Jun 2020 • Vu-Linh Nguyen, Eyke Hüllermeier, Michael Rapp, Eneldo Loza Mencía, Johannes Fürnkranz
While a variety of ensemble methods for multi-label classification have been proposed in the literature, the question of how to aggregate the predictions of the individual members of the ensemble has received little attention so far.
no code implementations • 11 Nov 2019 • Michael Rapp, Eneldo Loza Mencía, Johannes Fürnkranz
We analyze the trade-off between model complexity and accuracy for random forests by breaking the trees up into individual classification rules and selecting a subset of them.
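Breaking a tree into rules means turning each root-to-leaf path into one IF-THEN classification rule. A minimal sketch on a hand-built tree; the structure, feature names, and class labels are invented for illustration:

```python
# A toy decision tree as a nested dict: internal nodes test a feature
# against a threshold, leaves predict a class.
tree = {
    "feature": "x0", "threshold": 0.5,
    "left":  {"leaf": "A"},
    "right": {"feature": "x1", "threshold": 2.0,
              "left":  {"leaf": "B"},
              "right": {"leaf": "A"}},
}

def extract_rules(node, conditions=()):
    """Recursively collect (condition list, predicted class) pairs,
    one per root-to-leaf path."""
    if "leaf" in node:
        return [(list(conditions), node["leaf"])]
    f, t = node["feature"], node["threshold"]
    return (extract_rules(node["left"],  conditions + ((f, "<=", t),)) +
            extract_rules(node["right"], conditions + ((f, ">",  t),)))

rules = extract_rules(tree)
```

Selecting a subset of such rules (for instance the shortest or most accurate ones) then trades model complexity against accuracy.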
1 code implementation • 19 Aug 2019 • Yannik Klein, Michael Rapp, Eneldo Loza Mencía
Although the number of possible label combinations increases exponentially with the number of available labels, it has been shown that rules with multiple labels in their heads, a natural form for modeling local label dependencies, can be induced efficiently by exploiting certain properties of rule evaluation measures and by pruning the label search space accordingly.
1 code implementation • 8 Aug 2019 • Michael Rapp, Eneldo Loza Mencía, Johannes Fürnkranz
Many rule learning algorithms employ a heuristic-guided search for rules that model regularities contained in the training data, and it is commonly accepted that the choice of the heuristic has a significant impact on the predictive performance of the learner.
no code implementations • 17 Jul 2019 • Moritz Kulessa, Eneldo Loza Mencía, Johannes Fürnkranz
Our results on synthetic data show that it is challenging to improve the performance with a trainable fusion method based on machine learning.
1 code implementation • 14 Dec 2018 • Michael Rapp, Eneldo Loza Mencía, Johannes Fürnkranz
Exploiting dependencies between labels is considered to be crucial for multi-label classification.
1 code implementation • 30 Nov 2018 • Eneldo Loza Mencía, Johannes Fürnkranz, Eyke Hüllermeier, Michael Rapp
Multi-label classification (MLC) is a supervised learning problem in which, contrary to standard multiclass classification, an instance can be associated with several class labels simultaneously.
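A toy illustration of this setting, with invented document and label names: each instance may carry several labels at once, or none, and the labels are usually encoded as a binary indicator matrix with one column per label.

```python
# Hypothetical label space and label assignments for three documents.
labels = ["sports", "politics", "economy"]
assignments = {
    "doc1": {"sports"},
    "doc2": {"politics", "economy"},  # several labels simultaneously
    "doc3": set(),                    # or none at all
}

# The usual binary indicator encoding: one row per instance,
# one column per label.
Y = [[int(label in assignments[doc]) for label in labels]
     for doc in ["doc1", "doc2", "doc3"]]
```

This is the representation on which loss functions such as Hamming loss or subset 0/1 loss are defined.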
no code implementations • 2 Jul 2018 • Patryk Hopner, Eneldo Loza Mencía
Recently, a strong poker-playing algorithm called DeepStack was published, which is able to find an approximate Nash equilibrium during gameplay by using heuristic values of future states predicted by deep neural networks.
no code implementations • NeurIPS 2017 • Jinseok Nam, Eneldo Loza Mencía, Hyunwoo J. Kim, Johannes Fürnkranz
Multi-label classification is the task of predicting a set of labels for a given input instance.
no code implementations • 19 Dec 2013 • Jinseok Nam, Jungi Kim, Eneldo Loza Mencía, Iryna Gurevych, Johannes Fürnkranz
Neural networks have recently been proposed for multi-label classification because they are able to capture and model label dependencies in the output layer.