Search Results for author: Matthias Schonlau

Found 8 papers, 5 papers with code

Multi-label classification of open-ended questions with BERT

no code implementations · 6 Apr 2023 · Matthias Schonlau, Julia Weiß, Jan Marquardt

As expected, it is much easier to correctly predict answer texts that correspond to a single label (7.1% loss) than those that correspond to multiple labels ($\sim$50% loss).

Classification · Multi-Label Classification

One Line To Rule Them All: Generating LO-Shot Soft-Label Prototypes

1 code implementation · 15 Feb 2021 · Ilia Sucholutsky, Nam-Hwui Kim, Ryan P. Browne, Matthias Schonlau

We propose a novel, modular method for generating soft-label prototypical lines that still maintains representational accuracy even when there are fewer prototypes than the number of classes in the data.

One-Shot Learning
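The core idea of a soft-label prototypical line can be sketched in a toy form: a single line segment whose two endpoints carry soft labels (class distributions) can induce decision regions for three classes, i.e. fewer prototypes than classes. The points, distributions, and interpolation rule below are illustrative assumptions, not the paper's full modular method.

```python
import numpy as np

# Illustrative sketch (not the authors' exact construction): one prototypical
# line segment from a to b; each endpoint carries a soft label over 3 classes.
a, b = np.array([0.0, 0.0]), np.array([4.0, 0.0])
label_a = np.array([0.6, 0.4, 0.0])   # endpoint a leans toward class 0
label_b = np.array([0.0, 0.4, 0.6])   # endpoint b leans toward class 2

def classify(x):
    # Project x onto the segment, then linearly interpolate the endpoint
    # distributions; predict the class with the highest interpolated mass.
    ab = b - a
    t = np.clip(np.dot(x - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    soft = (1 - t) * label_a + t * label_b
    return int(np.argmax(soft))
```

Near endpoint `a` the prediction is class 0, near `b` it is class 2, and around the middle of the segment class 1 wins, even though no prototype is dedicated to it.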

Optimal 1-NN Prototypes for Pathological Geometries

1 code implementation · 31 Oct 2020 · Ilia Sucholutsky, Matthias Schonlau

Using prototype methods to reduce the size of training datasets can drastically reduce the computational cost of classification with instance-based learning algorithms like the k-Nearest Neighbour classifier.
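As a baseline illustration of what a prototype method does (not the paper's optimal construction), the sketch below collapses each class to its mean and classifies new points with 1-NN against those prototypes instead of the full training set:

```python
import numpy as np

# Illustrative baseline: reduce the training set to one prototype per class
# (the class mean), then classify with 1-NN over the prototypes only.
def class_mean_prototypes(X, y):
    classes = np.unique(y)
    protos = np.array([X[y == c].mean(axis=0) for c in classes])
    return protos, classes

def predict_1nn(protos, classes, x):
    d = np.linalg.norm(protos - x, axis=1)
    return classes[np.argmin(d)]

X = np.array([[0.0, 0.0], [0.2, 0.1], [3.0, 3.0], [3.2, 2.9]])
y = np.array([0, 0, 1, 1])
protos, classes = class_mean_prototypes(X, y)
pred = predict_1nn(protos, classes, np.array([2.8, 3.1]))  # nearest prototype is class 1's mean
```

The computational saving is that 1-NN now compares against one point per class rather than every training instance; the paper studies how to place such prototypes optimally even for pathological class geometries.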

SecDD: Efficient and Secure Method for Remotely Training Neural Networks

1 code implementation · 19 Sep 2020 · Ilia Sucholutsky, Matthias Schonlau

We leverage what are typically considered the worst qualities of deep learning algorithms - high computational cost, requirement for large data, no explainability, high dependence on hyper-parameter choice, overfitting, and vulnerability to adversarial perturbations - in order to create a method for the secure and efficient training of remotely deployed neural networks over unsecured channels.

'Less Than One'-Shot Learning: Learning N Classes From M<N Samples

3 code implementations · 17 Sep 2020 · Ilia Sucholutsky, Matthias Schonlau

We propose the `less than one'-shot learning task where models must learn $N$ new classes given only $M<N$ examples and we show that this is achievable with the help of soft labels.

One-Shot Learning
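A toy version of the 'less than one'-shot idea: $M=2$ prototypes carrying soft labels over $N=3$ classes can still carve out a decision region for the third class. The prototype positions, distributions, and inverse-distance weighting below are illustrative assumptions, not the authors' exact setup.

```python
import numpy as np

# Two 1-D prototypes, each with a soft label over three classes.
prototypes = np.array([-1.0, 1.0])
soft_labels = np.array([
    [0.6, 0.0, 0.4],   # mostly class 0, some mass on class 2
    [0.0, 0.6, 0.4],   # mostly class 1, some mass on class 2
])

def predict(x):
    # Distance-weighted soft-label vote: near a prototype its own class wins;
    # between the two prototypes the shared mass on class 2 dominates.
    d = np.abs(prototypes - x)
    w = 1.0 / (d + 1e-9)
    scores = (w[:, None] * soft_labels).sum(axis=0)
    return int(np.argmax(scores))
```

Here `predict(-1.0)` gives class 0 and `predict(1.0)` gives class 1, while inputs near 0 are assigned class 2, a class no single prototype represents.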

Soft-Label Dataset Distillation and Text Dataset Distillation

3 code implementations · 6 Oct 2019 · Ilia Sucholutsky, Matthias Schonlau

We propose to simultaneously distill both images and their labels, thus assigning each synthetic sample a `soft' label (a distribution of labels).

Data Summarization · Image Classification · +1

Deep Learning for System Trace Restoration

no code implementations · 10 Apr 2019 · Ilia Sucholutsky, Apurva Narayan, Matthias Schonlau, Sebastian Fischmeister

The output of the model will be a close reconstruction of the true data, and can be fed to algorithms that rely on clean data.

Anomaly Detection

Nearest Labelset Using Double Distances for Multi-label Classification

no code implementations · 15 Feb 2017 · Hyukjun Gweon, Matthias Schonlau, Stefan Steiner

In this paper we propose a novel approach, Nearest Labelset using Double Distances (NLDD), that predicts the labelset observed in the training data that minimizes a weighted sum of the distances in both the feature space and the label space to the new instance.

Classification · General Classification · +1
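The NLDD selection rule can be sketched as follows. Among labelsets observed in training, pick the one minimizing a weighted sum of (a) feature-space distance to the nearest training point with that labelset and (b) label-space distance between the labelset and a per-label probability estimate for the new instance. The fixed weight `w` and the probability vector here are illustrative placeholders; the paper estimates its weighting from the data.

```python
import numpy as np

# Hedged sketch of nearest-labelset selection with double distances.
def nldd_predict(X_train, Y_train, x_new, p_new, w=0.5):
    best, best_cost = None, np.inf
    for ls in np.unique(Y_train, axis=0):
        mask = (Y_train == ls).all(axis=1)
        d_x = np.linalg.norm(X_train[mask] - x_new, axis=1).min()  # feature space
        d_y = np.linalg.norm(ls - p_new)                           # label space
        cost = w * d_x + (1 - w) * d_y
        if cost < best_cost:
            best, best_cost = ls, cost
    return best

X_train = np.array([[0.0, 0.0], [1.0, 1.0]])
Y_train = np.array([[1, 0], [0, 1]])
pred = nldd_predict(X_train, Y_train,
                    x_new=np.array([0.1, 0.0]),
                    p_new=np.array([0.9, 0.1]))
```

Restricting predictions to labelsets seen in training is what distinguishes this from independent per-label thresholding: the output is always a label combination that actually occurred.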
