1 code implementation • 11 Oct 2022 • Martin Khannouz, Tristan Glatard
Supervised learning algorithms generally assume the availability of enough memory to store their data models during the training and test phases.
1 code implementation • 28 Jun 2021 • Marc Vicuna, Martin Khannouz, Gregory Kiar, Yohan Chatelain, Tristan Glatard
Mondrian Forests are a powerful data stream classification method, but their large memory footprint makes them ill-suited for low-resource platforms such as connected objects.
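To make the memory argument concrete: tree ensembles grow nodes as the stream progresses, whereas some incremental learners keep state whose size is fixed by the number of classes and features. The sketch below is not the paper's method, just a toy streaming Gaussian naive Bayes (Welford-style running statistics) whose footprint is O(classes × features) regardless of how many samples it sees; all names are illustrative.

```python
import math
from collections import defaultdict

class StreamingNaiveBayes:
    """Toy incremental Gaussian naive Bayes. Memory is O(classes * features),
    independent of the number of samples seen -- unlike tree ensembles, whose
    node count can grow with the stream. Illustrative sketch only."""

    def __init__(self):
        self.n = defaultdict(int)      # samples seen per class
        self.mean = defaultdict(dict)  # running mean per class/feature
        self.m2 = defaultdict(dict)    # running sum of squared diffs (Welford)

    def learn_one(self, x, y):
        """Update per-class running mean/variance from one sample."""
        self.n[y] += 1
        for i, v in enumerate(x):
            m = self.mean[y].get(i, 0.0)
            d = v - m
            m += d / self.n[y]
            self.mean[y][i] = m
            self.m2[y][i] = self.m2[y].get(i, 0.0) + d * (v - m)

    def predict_one(self, x):
        """Return the class with the highest Gaussian log-likelihood."""
        best, best_lp = None, -math.inf
        total = sum(self.n.values())
        for y, n in self.n.items():
            lp = math.log(n / total)  # class prior
            for i, v in enumerate(x):
                var = (self.m2[y].get(i, 0.0) / n) if n > 1 else 1.0
                var = max(var, 1e-9)  # clamp degenerate variance
                lp -= 0.5 * (math.log(2 * math.pi * var)
                             + (v - self.mean[y][i]) ** 2 / var)
            if lp > best_lp:
                best, best_lp = y, lp
        return best
```

A Mondrian Forest would instead allocate new tree nodes online, which is where the footprint pressure on connected objects comes from.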
2 code implementations • 27 Aug 2020 • Martin Khannouz, Tristan Glatard
We measure both classification performance and resource consumption (runtime, memory, and power) of five common stream classification algorithms, implemented in a consistent library, applied to two real human-activity datasets and three synthetic datasets.
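The measurement loop such a benchmark implies can be sketched in a few lines. This is not the paper's harness: it uses a trivial majority-class baseline as a stand-in classifier, Python's `time.perf_counter` for runtime and `tracemalloc` for peak heap usage, in a prequential (test-then-train) loop; power measurement needs platform-specific tooling and is omitted.

```python
import time
import tracemalloc
from collections import Counter

class MajorityClassBaseline:
    """Trivial stream classifier: predicts the most frequent label seen so far."""
    def __init__(self):
        self.counts = Counter()

    def learn_one(self, y):
        self.counts[y] += 1

    def predict_one(self):
        return self.counts.most_common(1)[0][0] if self.counts else None

def benchmark(stream):
    """Prequential loop: test on each sample, then train on it, while
    recording wall-clock runtime and peak Python heap usage."""
    clf = MajorityClassBaseline()
    correct = total = 0
    tracemalloc.start()
    t0 = time.perf_counter()
    for y in stream:
        if clf.predict_one() == y:
            correct += 1
        clf.learn_one(y)
        total += 1
    runtime = time.perf_counter() - t0
    _, peak_bytes = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return correct / total, runtime, peak_bytes
```

For example, `benchmark(["walk"] * 80 + ["run"] * 20)` yields an accuracy of 0.79: the first sample is unseen and every "run" sample is misclassified, since "walk" stays the majority.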