In a semi-supervised setting, unlabelled data is used to improve the accuracy and generalization of models trained on small labelled datasets.
This is particularly helpful for decision makers, especially when facing changing environments.
In practice, however, the unlabelled data is often drawn from a different distribution than the labelled data, resulting in a distribution mismatch between the two datasets.
We evaluate the use of two popular and publicly available datasets (INbreast and CBIS-DDSM) as source data for training and testing the models on the novel target dataset.
This document describes the Generalized Moving Peaks Benchmark (GMPB) and how it can be used to generate problem instances for continuous large-scale dynamic optimization problems.
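As an illustration of the idea behind such benchmarks, the classic moving-peaks landscape can be sketched as a set of cones whose centres drift between environments. The cone shape, bounds, and parameter ranges below are illustrative choices, not GMPB's exact components, which are more general:

```python
import numpy as np

def moving_peaks_fitness(x, centers, heights, widths):
    """Conical moving-peaks landscape: the fitness at x is the value of the
    highest peak, where peak i is a cone of height heights[i] with slope
    widths[i], centred at centers[i]. GMPB generalizes this shape; the
    simple cone form here is illustrative only."""
    dists = np.linalg.norm(centers - x, axis=1)  # distance of x to each peak centre
    return np.max(heights - widths * dists)

def shift_peaks(centers, severity, rng, bounds=(0.0, 100.0)):
    """One environmental change: move every peak centre by a random vector
    of length `severity`, then clip to the search bounds."""
    step = rng.normal(size=centers.shape)
    step *= severity / np.linalg.norm(step, axis=1, keepdims=True)
    return np.clip(centers + step, *bounds)

rng = np.random.default_rng(0)
centers = rng.uniform(0, 100, size=(5, 2))   # 5 peaks in a 2-D search space
heights = rng.uniform(30, 70, size=5)
widths = rng.uniform(1, 12, size=5)

f_before = moving_peaks_fitness(np.array([50.0, 50.0]), centers, heights, widths)
centers = shift_peaks(centers, severity=1.0, rng=rng)   # environment changes
f_after = moving_peaks_fitness(np.array([50.0, 50.0]), centers, heights, widths)
```

Repeatedly calling `shift_peaks` yields a sequence of environments in which an optimizer must track the moving optimum rather than converge once.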
In this work, we propose MixMOOD, a systematic approach to mitigate the effect of class distribution mismatch in semi-supervised deep learning (SSDL) with MixMatch.
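As a hedged sketch of the data-selection idea behind such approaches, one can rank candidate unlabelled datasets by a feature-space dissimilarity to the labelled set and prefer the most similar one. The mean-distance measure and dataset names below are illustrative stand-ins, not MixMOOD's actual deep dataset dissimilarity measures:

```python
import numpy as np

def feature_mean_distance(labelled_feats, unlabelled_feats):
    """Euclidean distance between the mean feature vectors of two datasets;
    a deliberately simple proxy for a dataset dissimilarity measure."""
    return float(np.linalg.norm(labelled_feats.mean(axis=0)
                                - unlabelled_feats.mean(axis=0)))

def rank_unlabelled_sets(labelled_feats, candidates):
    """Rank candidate unlabelled datasets from most to least similar to the
    labelled data; the top-ranked set is the safest choice for SSDL."""
    scores = {name: feature_mean_distance(labelled_feats, feats)
              for name, feats in candidates.items()}
    return sorted(scores, key=scores.get)

# Synthetic example: one candidate matches the labelled distribution,
# the other is shifted (a class/distribution mismatch).
rng = np.random.default_rng(1)
labelled = rng.normal(0.0, 1.0, size=(200, 16))
candidates = {
    "in_distribution": rng.normal(0.0, 1.0, size=(500, 16)),
    "mismatched": rng.normal(3.0, 1.0, size=(500, 16)),
}
ranking = rank_unlabelled_sets(labelled, candidates)
```

In practice the features would come from a trained encoder rather than raw inputs, and richer distances can replace the mean-distance proxy.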
The proposed framework is combined with new strategies, such as reference adaptation and adaptive local mating, to solve different types of problems.
Empirical studies demonstrate that the proposed test suite is more challenging for the dynamic multiobjective optimisation algorithms found in the literature.
In the evolutionary computation research community, the performance of most evolutionary algorithms (EAs) depends strongly on their implemented coordinate system.
Balancing convergence and diversity plays a key role in evolutionary multiobjective optimization (EMO).
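One widely used diversity-preservation mechanism in EMO is NSGA-II's crowding distance, which estimates how isolated each solution is within its front. The minimal implementation below is a textbook sketch of that mechanism, not the specific method proposed here:

```python
import numpy as np

def crowding_distance(objectives):
    """NSGA-II crowding distance for a front of objective vectors
    (rows = solutions, columns = objectives). Boundary solutions get
    infinite distance so they are always retained, preserving spread."""
    n, m = objectives.shape
    dist = np.zeros(n)
    for j in range(m):
        order = np.argsort(objectives[:, j])          # sort by objective j
        fmin, fmax = objectives[order[0], j], objectives[order[-1], j]
        dist[order[0]] = dist[order[-1]] = np.inf     # keep boundary points
        if fmax > fmin:
            for k in range(1, n - 1):
                # normalized gap between each solution's two neighbours
                dist[order[k]] += (objectives[order[k + 1], j]
                                   - objectives[order[k - 1], j]) / (fmax - fmin)
    return dist

# A uniformly spread 2-objective front: interior points share one distance,
# boundary points are infinite.
front = np.array([[0.0, 1.0], [0.25, 0.75], [0.5, 0.5], [0.75, 0.25], [1.0, 0.0]])
d = crowding_distance(front)
```

Selecting survivors by (non-domination rank, then descending crowding distance) is how NSGA-II trades off convergence pressure against diversity.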