1 code implementation • 23 May 2023 • Julian Lienen, Eyke Hüllermeier
Label noise poses an important challenge in machine learning, especially in deep learning, in which large models with high expressive power dominate the field.
no code implementations • 30 Apr 2023 • Svenja Uhlemeyer, Julian Lienen, Eyke Hüllermeier, Hanno Gottschalk
We thereafter extend the DNN by $k$ empty classes and fine-tune it on the OoD samples.
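The head-extension step can be sketched in plain Python; this is an illustrative toy (the function name `extend_head` and the list-based weight representation are assumptions, not the authors' implementation), showing how $k$ zero-initialized output units are appended before fine-tuning on OoD samples:

```python
# Hypothetical sketch: extend a linear classifier head by k "empty" classes
# that are later fitted to out-of-distribution (OoD) samples.
# Names and representation are illustrative, not the authors' API.

def extend_head(W, b, k):
    """Append k zero-initialized output units to a linear head.

    W: list of per-class weight vectors, b: list of per-class biases.
    Returns a head with len(W) + k classes; the new rows start at zero
    and are meant to be learned during fine-tuning on the OoD data.
    """
    dim = len(W[0])
    W_ext = [row[:] for row in W] + [[0.0] * dim for _ in range(k)]
    b_ext = b[:] + [0.0] * k
    return W_ext, b_ext

# A 3-class head over 4-dimensional features, extended by k=2 empty classes.
W = [[0.1, 0.2, 0.0, -0.1],
     [0.0, 0.3, 0.1, 0.2],
     [-0.2, 0.0, 0.4, 0.1]]
b = [0.0, 0.1, -0.1]
W5, b5 = extend_head(W, b, k=2)
print(len(W5), len(b5))  # 5 5
```

In a real DNN the same idea amounts to widening the final linear layer while keeping the existing rows' weights intact.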
1 code implementation • 11 Jun 2022 • Duc Anh Nguyen, Ron Levie, Julian Lienen, Gitta Kutyniok, Eyke Hüllermeier
The notion of neural collapse refers to several emergent phenomena that have been empirically observed across various canonical classification problems.
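One of the phenomena commonly grouped under neural collapse is the vanishing of within-class variability of last-layer features during the terminal phase of training. A minimal pure-Python sketch of that statistic (helper names are illustrative, not from the paper):

```python
# Hedged sketch: measure within-class variability of feature vectors,
# one of the quantities that tends toward zero under neural collapse.
# Helper names and toy data are illustrative assumptions.

def class_means(features, labels, num_classes):
    """Per-class mean of feature vectors given as nested lists."""
    dim = len(features[0])
    sums = [[0.0] * dim for _ in range(num_classes)]
    counts = [0] * num_classes
    for x, y in zip(features, labels):
        counts[y] += 1
        for j in range(dim):
            sums[y][j] += x[j]
    return [[s / c for s in row] for row, c in zip(sums, counts)]

def within_class_variability(features, labels, means):
    """Average squared distance of each feature to its class mean."""
    total = 0.0
    for x, y in zip(features, labels):
        total += sum((xi - mi) ** 2 for xi, mi in zip(x, means[y]))
    return total / len(features)

# Toy "fully collapsed" features: every sample sits on its class mean.
feats = [[1.0, 0.0], [1.0, 0.0], [0.0, 1.0], [0.0, 1.0]]
labels = [0, 0, 1, 1]
mu = class_means(feats, labels, 2)
print(within_class_variability(feats, labels, mu))  # 0.0
```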
1 code implementation • 30 May 2022 • Julian Lienen, Caglar Demir, Eyke Hüllermeier
One such method, so-called credal self-supervised learning, maintains pseudo-supervision in the form of sets of (instead of single) probability distributions over labels, thereby allowing for flexible yet uncertainty-aware labeling.
1 code implementation • 13 May 2022 • Caglar Demir, Julian Lienen, Axel-Cyrille Ngonga Ngomo
Our experiments suggest that applying Kronecker decomposition to embedding matrices improves parameter efficiency on all benchmark datasets.
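The parameter saving comes from representing a large embedding matrix as the Kronecker product of two much smaller factors. A minimal pure-Python sketch (the factor shapes here are toy choices, not the paper's configuration):

```python
def kron(A, B):
    """Kronecker product of two matrices given as nested lists:
    entry (i, j) is A[i // rb][j // cb] * B[i % rb][j % cb]."""
    ra, ca = len(A), len(A[0])
    rb, cb = len(B), len(B[0])
    return [[A[i // rb][j // cb] * B[i % rb][j % cb]
             for j in range(ca * cb)]
            for i in range(ra * rb)]

# A 4x4 embedding matrix reconstructed from two 2x2 factors:
# 8 stored parameters instead of 16. The saving grows quickly with size,
# e.g. factors of shape (100, 10) and (100, 10) yield a 10000 x 100 matrix
# from 2000 parameters instead of 1000000.
A = [[1.0, 2.0], [3.0, 4.0]]
B = [[0.5, 0.0], [0.0, 0.5]]
E = kron(A, B)
```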
1 code implementation • NeurIPS 2021 • Julian Lienen, Eyke Hüllermeier
In our approach, we therefore allow the learner to label instances in the form of credal sets, that is, sets of (candidate) probability distributions.
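A credal label of this kind induces lower and upper probabilities for each class. The following is an illustrative sketch (not the authors' implementation) with a finite credal set given as a list of probability vectors:

```python
# Illustrative sketch: a credal label as a finite set of candidate
# probability distributions over classes, with the induced lower and
# upper probability of each class. Names are assumptions for exposition.

def lower_upper(credal_set, cls):
    """Lower and upper probability of class `cls` over a credal set,
    given as a list of probability vectors (one entry per class)."""
    probs = [p[cls] for p in credal_set]
    return min(probs), max(probs)

# Two candidate distributions over 3 classes: the learner is certain the
# instance is not class 2, but remains undecided between classes 0 and 1.
credal = [[0.7, 0.3, 0.0],
          [0.4, 0.6, 0.0]]
print(lower_upper(credal, 0))  # (0.4, 0.7)
print(lower_upper(credal, 2))  # (0.0, 0.0)
```

A singleton credal set recovers ordinary probabilistic labeling, while a wider set expresses greater labeling uncertainty.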
1 code implementation • CVPR 2021 • Julian Lienen, Eyke Hüllermeier, Ralph Ewerth, Nils Nommensen
In many real-world applications, the relative depth of objects in an image is crucial for scene understanding.