no code implementations • 22 Dec 2023 • Juliano Pinto, Georg Hess, Yuxuan Xia, Henk Wymeersch, Lennart Svensson
Multi-object tracking (MOT) is the task of estimating the state trajectories of an unknown and time-varying number of objects over a certain time window.
1 code implementation • 16 Feb 2022 • Juliano Pinto, Georg Hess, William Ljungbergh, Yuxuan Xia, Henk Wymeersch, Lennart Svensson
Multi-object tracking (MOT) is the problem of tracking the state of an unknown and time-varying number of objects using noisy measurements, with important applications in autonomous driving, animal-behavior tracking, and defense systems, among others.
no code implementations • 10 Aug 2021 • Juliano Pinto, Yuxuan Xia, Lennart Svensson, Henk Wymeersch
Evaluating the performance of multi-object tracking (MOT) methods is not straightforward, and existing performance measures fail to consider all the available uncertainty information in the MOT context.
1 code implementation • 1 Apr 2021 • Juliano Pinto, Georg Hess, William Ljungbergh, Yuxuan Xia, Lennart Svensson, Henk Wymeersch
We show that the proposed model outperforms state-of-the-art Bayesian filters in complex scenarios, while matching their performance in simpler cases, which validates the applicability of deep learning also in the model-based regime.
1 code implementation • 17 Jul 2020 • Wilhelm Tranheden, Viktor Olsson, Juliano Pinto, Lennart Svensson
In this paper we address the problem of unsupervised domain adaptation (UDA), in which a model is trained on labelled data from one domain (the source domain) while simultaneously learning from unlabelled data in the domain of interest (the target domain).
Ranked #14 on Domain Adaptation on Cityscapes to ACDC
2 code implementations • 15 Jul 2020 • Viktor Olsson, Wilhelm Tranheden, Juliano Pinto, Lennart Svensson
A key challenge is that common augmentations used in semi-supervised classification are less effective for semantic segmentation.
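One family of segmentation-aware augmentations pastes the pixels of a subset of classes from one image onto another, mixing the labels with the same mask. The sketch below illustrates this idea; the function name, shapes, and class counts are hypothetical, not the paper's implementation.

```python
import numpy as np

def class_mix(img_a, lbl_a, img_b, lbl_b, rng=None):
    """Illustrative segmentation-based mixing augmentation (a sketch,
    not the authors' code): pixels belonging to half of the classes
    present in image A are pasted onto image B, and the label maps are
    mixed with the same binary mask."""
    if rng is None:
        rng = np.random.default_rng(0)
    classes = np.unique(lbl_a)
    chosen = rng.choice(classes, size=max(1, len(classes) // 2), replace=False)
    mask = np.isin(lbl_a, chosen)                 # H x W binary mask
    mixed_img = np.where(mask[..., None], img_a, img_b)  # paste A onto B
    mixed_lbl = np.where(mask, lbl_a, lbl_b)             # mix labels too
    return mixed_img, mixed_lbl
```

Because the mask follows predicted object boundaries rather than a fixed rectangle, the mixed image keeps semantically coherent regions, which is what makes this style of augmentation better suited to segmentation than classification-oriented augmentations.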
no code implementations • 14 Dec 2019 • John Moberg, Lennart Svensson, Juliano Pinto, Henk Wymeersch
A simple approach to obtaining uncertainty-aware neural networks for regression is to do Bayesian linear regression (BLR) on the representation from the last hidden layer.
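This last-layer BLR idea can be sketched in a few lines: treat the trained network body as a fixed feature map phi(x) and place a Gaussian prior on the final linear weights. The prior precision `alpha` and noise precision `beta` below are illustrative hyperparameters, not values from the paper.

```python
import numpy as np

def blr_posterior(Phi, y, alpha=1.0, beta=25.0):
    """Posterior mean and covariance of the last-layer weights, given
    an N x d feature matrix Phi (last-hidden-layer representations)
    and targets y. Standard Gaussian-prior BLR update."""
    d = Phi.shape[1]
    S_inv = alpha * np.eye(d) + beta * Phi.T @ Phi   # posterior precision
    S = np.linalg.inv(S_inv)                         # posterior covariance
    m = beta * S @ Phi.T @ y                         # posterior mean
    return m, S

def blr_predict(phi_star, m, S, beta=25.0):
    """Predictive mean and variance for one new feature vector."""
    mean = phi_star @ m
    var = 1.0 / beta + phi_star @ S @ phi_star       # noise + model uncertainty
    return mean, var
```

The predictive variance grows for inputs whose features lie far from the training features, which is what makes the resulting network uncertainty-aware at negligible extra cost.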