Search Results for author: Eibe Frank

Found 23 papers, 13 papers with code

Building Ensembles of Adaptive Nested Dichotomies with Random-Pair Selection

no code implementations7 Apr 2016 Tim Leathart, Bernhard Pfahringer, Eibe Frank

A system of nested dichotomies is a method of decomposing a multi-class problem into a collection of binary problems.

General Classification
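
Since several entries below build on nested dichotomies, a minimal sketch may help. This is an illustrative toy (scikit-learn assumed) using plain random class splits, not the random-pair selection the paper proposes; all class and method names are hypothetical.

```python
# Toy nested dichotomy: recursively split the class set in two and train a
# binary classifier at each internal node. Random splits only; the paper's
# random-pair selection is NOT implemented here.
import numpy as np
from sklearn.linear_model import LogisticRegression

class NestedDichotomy:
    def __init__(self, classes, rng):
        self.classes = list(classes)
        self.rng = rng
        if len(self.classes) > 1:
            order = rng.permutation(len(self.classes))
            half = len(self.classes) // 2
            self.left_classes = {self.classes[i] for i in order[:half]}
            self.right_classes = {self.classes[i] for i in order[half:]}
            self.clf = LogisticRegression(max_iter=1000)

    def fit(self, X, y):
        if len(self.classes) == 1:
            return self
        is_left = np.isin(y, list(self.left_classes))
        self.clf.fit(X, is_left.astype(int))  # binary problem at this node
        self.left = NestedDichotomy(self.left_classes, self.rng).fit(X[is_left], y[is_left])
        self.right = NestedDichotomy(self.right_classes, self.rng).fit(X[~is_left], y[~is_left])
        return self

    def predict_proba(self, x):
        # A class's probability is the product of branch probabilities
        # along its root-to-leaf path.
        if len(self.classes) == 1:
            return {self.classes[0]: 1.0}
        p_left = self.clf.predict_proba(x.reshape(1, -1))[0, 1]
        probs = {c: p_left * p for c, p in self.left.predict_proba(x).items()}
        probs.update({c: (1.0 - p_left) * p for c, p in self.right.predict_proba(x).items()})
        return probs
```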

Improving Naive Bayes for Regression with Optimised Artificial Surrogate Data

no code implementations16 Jul 2017 Michael Mayo, Eibe Frank

To investigate this question we use population-based optimisation algorithms to generate artificial surrogate training data for naive Bayes for regression.

BIG-bench Machine Learning, regression

Regularisation of Neural Networks by Enforcing Lipschitz Continuity

1 code implementation12 Apr 2018 Henry Gouk, Eibe Frank, Bernhard Pfahringer, Michael J. Cree

We investigate the effect of explicitly enforcing the Lipschitz continuity of neural networks with respect to their inputs.
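
A hedged sketch of one way to enforce such a constraint, assuming PyTorch: after each optimizer step, rescale any weight matrix whose spectral norm exceeds a target constant k. The paper's projection strategy may differ; in practice power iteration is typically used instead of an exact norm computation.

```python
# Constrain each linear layer's spectral norm to at most k. For 1-Lipschitz
# activations, the product of layer norms bounds the network's Lipschitz
# constant with respect to its inputs.
import torch

@torch.no_grad()
def project_lipschitz(model: torch.nn.Module, k: float = 1.0):
    for module in model.modules():
        if isinstance(module, torch.nn.Linear):
            w = module.weight
            spec_norm = torch.linalg.matrix_norm(w, ord=2)  # largest singular value
            if spec_norm > k:
                w.mul_(k / spec_norm)

# Usage: call project_lipschitz(model) right after optimizer.step().
```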

MaxGain: Regularisation of Neural Networks by Constraining Activation Magnitudes

no code implementations16 Apr 2018 Henry Gouk, Bernhard Pfahringer, Eibe Frank, Michael Cree

Effective regularisation of neural networks is essential to combat overfitting due to the large number of parameters involved.

XGBoost: Scalable GPU Accelerated Learning

1 code implementation29 Jun 2018 Rory Mitchell, Andrey Adinets, Thejaswi Rao, Eibe Frank

We describe the multi-GPU gradient boosting algorithm implemented in the XGBoost library (https://github.com/dmlc/xgboost).

Cloud Computing, Data Compression
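
A brief usage sketch of GPU-accelerated training in XGBoost; the data here is synthetic and the parameter choices are illustrative.

```python
# Train an XGBoost model with the GPU histogram algorithm.
import numpy as np
import xgboost as xgb

X = np.random.rand(10_000, 20).astype(np.float32)
y = (X[:, 0] + X[:, 1] > 1.0).astype(np.float32)
dtrain = xgb.DMatrix(X, label=y)

params = {
    "objective": "binary:logistic",
    "tree_method": "gpu_hist",  # in XGBoost >= 2.0: tree_method="hist", device="cuda"
    "max_depth": 6,
}
booster = xgb.train(params, dtrain, num_boost_round=100)
```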

Probability Calibration Trees

no code implementations31 Jul 2018 Tim Leathart, Eibe Frank, Geoffrey Holmes, Bernhard Pfahringer

Obtaining accurate and well calibrated probability estimates from classifiers is useful in many applications, for example, when minimising the expected cost of classifications.

regression
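
The paper fits logistic calibration models in the leaves of a decision tree. As a hedged point of reference only, the sketch below shows a standard global calibration baseline (Platt/sigmoid scaling in scikit-learn), not the paper's calibration trees.

```python
# Calibrate a random forest's probability estimates with sigmoid (Platt) scaling.
from sklearn.calibration import CalibratedClassifierCV
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_classes=3, n_informative=6, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

calibrated = CalibratedClassifierCV(
    RandomForestClassifier(random_state=0), method="sigmoid", cv=3
).fit(X_train, y_train)
probs = calibrated.predict_proba(X_test)  # better-calibrated probability estimates
```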

On the Calibration of Nested Dichotomies for Large Multiclass Tasks

no code implementations8 Sep 2018 Tim Leathart, Eibe Frank, Bernhard Pfahringer, Geoffrey Holmes

Nested dichotomies are used as a method of transforming a multiclass classification problem into a series of binary problems.

Binary Classification, General Classification

Ensembles of Nested Dichotomies with Multiple Subset Evaluation

no code implementations8 Sep 2018 Tim Leathart, Eibe Frank, Bernhard Pfahringer, Geoffrey Holmes

A system of nested dichotomies is a method of decomposing a multi-class problem into a collection of binary problems.

Stochastic Gradient Trees

1 code implementation23 Jan 2019 Henry Gouk, Bernhard Pfahringer, Eibe Frank

We present an algorithm for learning decision trees using stochastic gradient information as the source of supervision.

Classification, General Classification +3
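
To make the idea concrete, the sketch below scores a candidate split from accumulated first- and second-order gradient statistics, as in gradient-boosting-style tree induction; the paper adapts gradient-based criteria of this kind to incremental, stochastic updates. All names here are illustrative.

```python
# Score a split from gradient statistics: the familiar (sum g)^2 / (sum h + lambda)
# gain used in second-order gradient tree induction.
import numpy as np

def split_gain(grad, hess, mask, reg_lambda=1.0):
    """Gain of splitting instances into mask / ~mask groups."""
    def score(g, h):
        return g.sum() ** 2 / (h.sum() + reg_lambda)
    return 0.5 * (score(grad[mask], hess[mask])
                  + score(grad[~mask], hess[~mask])
                  - score(grad, hess))

# Example with squared error: grad = prediction - target, hess = 1.
rng = np.random.default_rng(0)
x = rng.normal(size=100)
target = (x > 0).astype(float)
grad = np.zeros(100) - target  # predictions start at 0
hess = np.ones(100)
print(split_gain(grad, hess, x > 0))  # large positive gain for this split
```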

Classifier Chains: A Review and Perspectives

no code implementations26 Dec 2019 Jesse Read, Bernhard Pfahringer, Geoff Holmes, Eibe Frank

This performance prompted further study of how exactly the method works and how it can be improved. Over the past decade, numerous studies have explored the mechanisms of classifier chains at a theoretical level, and many improvements have been made to the training and inference procedures, so that the method remains among the state-of-the-art options for multi-label learning.

Multi-Label Classification, Multi-Label Learning
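
A minimal usage sketch with scikit-learn's ClassifierChain, which feeds each label's classifier the input features plus the predictions for the labels earlier in the chain; data and base model are illustrative.

```python
# Classifier chain for multi-label classification.
from sklearn.datasets import make_multilabel_classification
from sklearn.linear_model import LogisticRegression
from sklearn.multioutput import ClassifierChain

X, Y = make_multilabel_classification(n_samples=500, n_labels=3, random_state=0)
chain = ClassifierChain(LogisticRegression(max_iter=1000), order="random", random_state=0)
chain.fit(X, Y)
pred = chain.predict(X)  # one binary prediction per label, chained left to right
```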

Embedding Java Classes with code2vec: Improvements from Variable Obfuscation

1 code implementation6 Apr 2020 Rhys Compton, Eibe Frank, Panos Patros, Abigail Koay

code2vec is a recently released embedding approach that uses the proxy task of method name prediction to map Java methods to feature vectors.

Code Classification, Method name prediction
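
A toy illustration of the variable-obfuscation idea: identifier names are replaced with neutral placeholders so the embedding model cannot rely on them. A real pipeline would use a Java parser; this regex version and its names are purely hypothetical. Note that the method name itself is left intact, since it is the prediction target.

```python
# Replace known variable names in a Java snippet with neutral placeholders.
import re

java_method = """
int sumOfSquares(int[] values) {
    int total = 0;
    for (int v : values) total += v * v;
    return total;
}
"""
for i, name in enumerate(["values", "total", "v"]):
    java_method = re.sub(rf"\b{name}\b", f"var{i}", java_method)
print(java_method)
```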

Adaptive XGBoost for Evolving Data Streams

1 code implementation15 May 2020 Jacob Montiel, Rory Mitchell, Eibe Frank, Bernhard Pfahringer, Talel Abdessalem, Albert Bifet

The proposed method creates new members of the ensemble from mini-batches of data as new data becomes available.

General Classification
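
A hedged sketch of the mini-batch ensemble idea (not the paper's implementation): fit a small booster on each incoming mini-batch and retire the oldest member once the ensemble is full. Names and parameters are illustrative.

```python
# Sliding ensemble of boosters updated from streaming mini-batches.
import numpy as np
import xgboost as xgb

ENSEMBLE_SIZE = 5
ensemble = []

def update(X_batch, y_batch):
    dtrain = xgb.DMatrix(X_batch, label=y_batch)
    booster = xgb.train({"objective": "binary:logistic", "max_depth": 3},
                        dtrain, num_boost_round=10)
    ensemble.append(booster)
    if len(ensemble) > ENSEMBLE_SIZE:
        ensemble.pop(0)  # retire the oldest member

def predict(X):
    dtest = xgb.DMatrix(X)
    return np.mean([b.predict(dtest) for b in ensemble], axis=0)
```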

GPUTreeShap: Massively Parallel Exact Calculation of SHAP Scores for Tree Ensembles

4 code implementations27 Oct 2020 Rory Mitchell, Eibe Frank, Geoffrey Holmes

SHAP (SHapley Additive exPlanation) values provide a game theoretic interpretation of the predictions of machine learning models based on Shapley values.

BIG-bench Machine Learning
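
A usage sketch for computing SHAP values on a tree ensemble: XGBoost's predict supports pred_contribs=True, which returns per-feature contributions (GPUTreeShap backs the GPU code path). Data here is synthetic.

```python
# Per-feature SHAP contributions from an XGBoost model.
import numpy as np
import xgboost as xgb

X = np.random.rand(1000, 10).astype(np.float32)
y = (X[:, 0] > 0.5).astype(np.float32)
booster = xgb.train({"objective": "binary:logistic"},
                    xgb.DMatrix(X, label=y), num_boost_round=50)

# Each row: one SHAP value per feature, plus a bias term in the last column.
shap_values = booster.predict(xgb.DMatrix(X), pred_contribs=True)
assert shap_values.shape == (1000, 11)
```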

Sampling Permutations for Shapley Value Estimation

no code implementations25 Apr 2021 Rory Mitchell, Joshua Cooper, Eibe Frank, Geoffrey Holmes

Game-theoretic attribution techniques based on Shapley values are used to interpret black-box machine learning models, but their exact calculation is generally NP-hard, requiring approximation methods for non-trivial models.
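
For context, the sketch below implements the standard Monte Carlo permutation estimator that such work builds on: sample random feature orderings and average each feature's marginal contribution as it joins the coalition of features before it. The toy value function is illustrative.

```python
# Monte Carlo Shapley value estimation via sampled permutations.
import numpy as np

def shapley_mc(value_fn, n_features, n_samples=1000, rng=None):
    rng = rng or np.random.default_rng(0)
    phi = np.zeros(n_features)
    for _ in range(n_samples):
        perm = rng.permutation(n_features)
        coalition = set()
        v_prev = value_fn(coalition)
        for f in perm:
            coalition.add(f)
            v_new = value_fn(coalition)
            phi[f] += v_new - v_prev  # marginal contribution of feature f
            v_prev = v_new
    return phi / n_samples

# Toy additive game: a coalition's value is the sum of its members' weights,
# so the exact Shapley values equal the weights.
weights = np.array([1.0, 2.0, 3.0])
print(shapley_mc(lambda s: sum(weights[list(s)]), 3))  # ~ [1. 2. 3.]
```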

Transfer of Pretrained Model Weights Substantially Improves Semi-Supervised Image Classification

2 code implementations2 Sep 2021 Attaullah Sahito, Eibe Frank, Bernhard Pfahringer

Deep neural networks produce state-of-the-art results when trained on a large number of labeled examples but tend to overfit when small amounts of labeled examples are used for training.

Metric Learning, Self-Learning +2
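
A minimal usage sketch of weight transfer, assuming torchvision: initialise from ImageNet weights and replace the classification head before fine-tuning on the target task.

```python
# Start from pretrained ImageNet weights, then swap in a new output layer.
import torch.nn as nn
import torchvision

model = torchvision.models.resnet18(weights="IMAGENET1K_V1")
model.fc = nn.Linear(model.fc.in_features, 10)  # e.g. 10 target classes
```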

Better Self-training for Image Classification through Self-supervision

3 code implementations2 Sep 2021 Attaullah Sahito, Eibe Frank, Bernhard Pfahringer

Self-training is a simple semi-supervised learning approach: unlabelled examples that attract high-confidence predictions are labelled with those predictions and added to the training set, and the process is repeated multiple times.

Classification, Image Classification
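
A hedged sketch of the basic self-training loop described above, using scikit-learn; the paper adds self-supervision on top of this, and the threshold and names here are illustrative.

```python
# Self-training: iteratively pseudo-label confident unlabelled examples.
import numpy as np
from sklearn.linear_model import LogisticRegression

def self_train(X_lab, y_lab, X_unlab, threshold=0.95, rounds=5):
    model = LogisticRegression(max_iter=1000)
    for _ in range(rounds):
        model.fit(X_lab, y_lab)
        if len(X_unlab) == 0:
            break
        probs = model.predict_proba(X_unlab)
        confident = probs.max(axis=1) >= threshold
        if not confident.any():
            break
        # Pseudo-label the confident examples and move them to the labelled set.
        pseudo = model.classes_[probs[confident].argmax(axis=1)]
        X_lab = np.vstack([X_lab, X_unlab[confident]])
        y_lab = np.concatenate([y_lab, pseudo])
        X_unlab = X_unlab[~confident]
    return model
```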

Semi-Supervised Learning using Siamese Networks

2 code implementations2 Sep 2021 Attaullah Sahito, Eibe Frank, Bernhard Pfahringer

This work explores a new training method for semi-supervised learning that is based on similarity function learning using a Siamese network to obtain a suitable embedding.
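
A minimal sketch of the main ingredient, assuming PyTorch: a small embedding network trained with a contrastive loss that pulls same-class pairs together and pushes different-class pairs at least a margin apart. The architecture and margin are illustrative.

```python
# Siamese embedding with a contrastive pairwise loss.
import torch
import torch.nn as nn
import torch.nn.functional as F

embed = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 64))

def contrastive_loss(x1, x2, same_class, margin=1.0):
    # same_class: float tensor of 1s (same label) and 0s (different label)
    d = F.pairwise_distance(embed(x1), embed(x2))
    return torch.mean(same_class * d.pow(2)
                      + (1 - same_class) * F.relu(margin - d).pow(2))
```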

Closed-loop Control for Online Continual Learning

no code implementations29 Sep 2021 Yaqian Zhang, Eibe Frank, Bernhard Pfahringer, Albert Bifet, Nick Jin Sean Lim, Alvin Jia

To address the non-stationarity in the continual learning environment, we employ a Q function with task-specific and task-shared components to support fast adaptation.

Continual Learning

Hitting the Target: Stopping Active Learning at the Cost-Based Optimum

1 code implementation7 Oct 2021 Zac Pullar-Strecker, Katharina Dost, Eibe Frank, Jörg Wicker

This work enables practitioners to employ active learning by providing actionable recommendations for which stopping criteria are best for a given real-world scenario.

Active Learning

Feature Extractor Stacking for Cross-domain Few-shot Learning

1 code implementation12 May 2022 Hongyu Wang, Eibe Frank, Bernhard Pfahringer, Michael Mayo, Geoffrey Holmes

Recently published CDFSL methods generally construct a universal model that combines knowledge of multiple source domains into one feature extractor.

cross-domain few-shot learning, Image Classification
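
As a rough illustration only (PyTorch assumed): one simple way to combine several frozen extractors is to concatenate their outputs, as below. The paper's feature extractor stacking combines extractor snapshots via a fitted combination rather than plain concatenation.

```python
# Concatenate feature vectors from several frozen extractors.
import torch

def stacked_features(extractors, x):
    # extractors: list of frozen nn.Modules mapping inputs to feature vectors
    with torch.no_grad():
        return torch.cat([e(x) for e in extractors], dim=1)
```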

A simple but strong baseline for online continual learning: Repeated Augmented Rehearsal

1 code implementation28 Sep 2022 Yaqian Zhang, Bernhard Pfahringer, Eibe Frank, Albert Bifet, Nick Jin Sean Lim, Yunzhe Jia

Despite its strong empirical performance, rehearsal methods still suffer from a poor approximation of the loss landscape of past data with memory samples.

Continual Learning, Reinforcement Learning (RL)
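
A hedged sketch of repeated augmented rehearsal, assuming PyTorch: each incoming batch is interleaved with memory samples, and the rehearsal step is repeated several times with fresh augmentations. All names and the memory policy are illustrative.

```python
# One online training step with repeated augmented rehearsal.
import random
import torch

def train_step(model, optimizer, loss_fn, augment, batch, memory, n_repeats=3):
    x_new, y_new = batch
    for _ in range(n_repeats):  # repeat rehearsal with fresh augmentations
        x, y = x_new, y_new
        if memory:  # interleave a random draw of stored examples
            sample = random.sample(memory, k=min(len(memory), len(y_new)))
            x_mem, y_mem = zip(*sample)
            x = torch.cat([x_new, torch.stack(x_mem)])
            y = torch.cat([y_new, torch.stack(y_mem)])
        optimizer.zero_grad()
        loss_fn(model(augment(x)), y).backward()
        optimizer.step()
    memory.extend(zip(x_new, y_new))  # replacement policy (e.g. reservoir) omitted
```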
