no code implementations • 5 Sep 2024 • Felix Sattler, Borja Carrillo Perez, Maurice Stephan, Sarah Barnes
We introduce a novel method for updating 3D geospatial models, specifically targeting occlusion removal in large-scale maritime environments.
1 code implementation • 30 Aug 2024 • Edgardo Solano-Carrillo, Felix Sattler, Antje Alex, Alexander Klein, Bruno Pereira Costa, Angel Bueno Rodriguez, Jannis Stoppe
The tracking-by-detection paradigm is the mainstream in multi-object tracking, associating tracks to the predictions of an object detector.
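The association step described above is commonly implemented by matching existing tracks to new detections via bounding-box overlap. A minimal sketch of greedy IoU-based association (the function names and the 0.3 threshold are illustrative assumptions, not the paper's method):

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def associate(tracks, detections, thresh=0.3):
    """Greedily match each track to the unused detection with the
    highest IoU overlap, keeping only matches above the threshold."""
    matches, used = [], set()
    for ti, t in enumerate(tracks):
        scores = [(iou(t, d), di) for di, d in enumerate(detections)
                  if di not in used]
        if not scores:
            continue
        best, di = max(scores)
        if best >= thresh:
            matches.append((ti, di))
            used.add(di)
    return matches

# one track, two detections: only the overlapping one is matched
matches = associate([(0, 0, 10, 10)], [(1, 1, 11, 11), (50, 50, 60, 60)])
```

Real trackers typically replace the greedy loop with optimal assignment (e.g. the Hungarian algorithm) and richer cost terms, but the data flow is the same.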
no code implementations • 23 Nov 2023 • Benjamin Kiefer, Lojze Žust, Matej Kristan, Janez Perš, Matija Teršek, Arnold Wiliem, Martin Messmer, Cheng-Yen Yang, Hsiang-Wei Huang, Zhongyu Jiang, Heng-Cheng Kuo, Jie Mei, Jenq-Neng Hwang, Daniel Stadler, Lars Sommer, Kaer Huang, Aiguo Zheng, Weitu Chong, Kanokphan Lertniphonphan, Jun Xie, Feng Chen, Jian Li, Zhepeng Wang, Luca Zedda, Andrea Loddo, Cecilia Di Ruberto, Tuan-Anh Vu, Hai Nguyen-Truong, Tan-Sang Ha, Quan-Dung Pham, Sai-Kit Yeung, Yuan Feng, Nguyen Thanh Thien, Lixin Tian, Sheng-Yao Kuan, Yuan-Hao Ho, Angel Bueno Rodriguez, Borja Carrillo-Perez, Alexander Klein, Antje Alex, Yannik Steiniger, Felix Sattler, Edgardo Solano-Carrillo, Matej Fabijanić, Magdalena Šumunec, Nadir Kapetanović, Andreas Michel, Wolfgang Gross, Martin Weinmann
The 2nd Workshop on Maritime Computer Vision (MaCVi) 2024 addresses maritime computer vision for Unmanned Aerial Vehicles (UAV) and Unmanned Surface Vehicles (USV).
Ranked #1 on Semantic Segmentation on LaRS
no code implementations • 27 Jun 2021 • Leon Witt, Usama Zafar, KuoYeh Shen, Felix Sattler, Dan Li, Wojciech Samek
The recent advent of various forms of Federated Knowledge Distillation (FD) paves the way for a new generation of robust and communication-efficient Federated Learning (FL), in which only soft labels are aggregated, rather than the full gradients of Deep Neural Networks (DNNs) as in previous FL schemes.
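The soft-label aggregation this entry describes can be sketched in a few lines: each client evaluates its local model on a shared public dataset and the server averages the resulting probability vectors instead of model gradients. This is a minimal illustration under assumed shapes, not the paper's exact protocol:

```python
import numpy as np

def aggregate_soft_labels(client_probs):
    """Average per-client soft labels (class-probability vectors) computed
    on a shared public dataset, instead of averaging full model gradients."""
    # client_probs: list of arrays, each of shape (num_public_samples, num_classes)
    stacked = np.stack(client_probs)   # (num_clients, N, C)
    return stacked.mean(axis=0)        # consensus soft labels, shape (N, C)

# toy example: two clients, 3 public samples, 2 classes
c1 = np.array([[0.9, 0.1], [0.2, 0.8], [0.5, 0.5]])
c2 = np.array([[0.7, 0.3], [0.4, 0.6], [0.5, 0.5]])
consensus = aggregate_soft_labels([c1, c2])
```

The communication saving comes from the shapes: a probability vector per public sample is typically far smaller than the gradient of a full DNN.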
1 code implementation • 4 Feb 2021 • Felix Sattler, Tim Korjakow, Roman Rischke, Wojciech Samek
Federated Distillation (FD) is a popular new algorithmic paradigm for Federated Learning that achieves training performance competitive with prior parameter-averaging-based methods while additionally allowing clients to train different model architectures, by distilling the client predictions on an unlabeled auxiliary dataset into a student model.
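The distillation step mentioned above amounts to training the student against the averaged client predictions on the auxiliary set. A minimal sketch of that loss, with illustrative function names (the real training loop, temperature scaling, and optimizer are omitted):

```python
import numpy as np

def softmax(z):
    """Numerically stable row-wise softmax."""
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def distillation_loss(student_logits, consensus_probs):
    """Cross-entropy of the student's predictions against the averaged
    client predictions (consensus soft labels) on the auxiliary data."""
    p = softmax(student_logits)
    return -np.mean(np.sum(consensus_probs * np.log(p + 1e-12), axis=1))

# toy example: one auxiliary sample, two classes
loss = distillation_loss(np.array([[2.0, 0.0]]),   # student logits
                         np.array([[1.0, 0.0]]))   # consensus soft label
```

Because only predictions on the auxiliary set are exchanged, each client's architecture can differ freely, which parameter averaging cannot accommodate.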
no code implementations • 1 Dec 2020 • Felix Sattler, Arturo Marban, Roman Rischke, Wojciech Samek
Communication constraints are one of the major challenges preventing the widespread adoption of Federated Learning systems.
no code implementations • 22 Apr 2020 • Felix Sattler, Jackie Ma, Patrick Wagner, David Neumann, Markus Wenzel, Ralf Schäfer, Wojciech Samek, Klaus-Robert Müller, Thomas Wiegand
Digital contact tracing approaches based on Bluetooth low energy (BLE) have the potential to efficiently contain and delay outbreaks of infectious diseases such as the ongoing SARS-CoV-2 pandemic.
no code implementations • 6 Mar 2020 • Felix Sattler, Thomas Wiegand, Wojciech Samek
Due to their strong performance and scalability, neural networks have become ubiquitous building blocks of many applications.
2 code implementations • 4 Oct 2019 • Felix Sattler, Klaus-Robert Müller, Wojciech Samek
Federated Learning (FL) is currently the most widely adopted framework for collaborative training of (deep) machine learning models under privacy constraints.
1 code implementation • 7 Mar 2019 • Felix Sattler, Simon Wiedemann, Klaus-Robert Müller, Wojciech Samek
Federated Learning allows multiple parties to jointly train a deep learning model on their combined data, without any of the participants having to reveal their local data to a centralized server.
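The joint training described here is commonly realized by FedAvg-style parameter averaging: each client trains locally and the server combines the resulting weights, weighted by local dataset size, so raw data never leaves a client. A minimal sketch (names and shapes are assumptions for illustration):

```python
import numpy as np

def federated_average(client_weights, client_sizes):
    """Weighted average of client model parameters: each client's
    contribution is proportional to its local dataset size."""
    total = sum(client_sizes)
    return sum(w * (n / total)
               for w, n in zip(client_weights, client_sizes))

# toy example: two clients with a 2-parameter "model"
avg = federated_average(
    [np.array([1.0, 1.0]), np.array([3.0, 3.0])],  # local weights
    [1, 3],                                        # local dataset sizes
)
```

In a full system this averaging runs once per communication round, between phases of local SGD on each client.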
no code implementations • 22 May 2018 • Felix Sattler, Simon Wiedemann, Klaus-Robert Müller, Wojciech Samek
A major issue in distributed training is the limited communication bandwidth between contributing nodes and, more generally, prohibitive communication costs.
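One common remedy for this bandwidth bottleneck is gradient sparsification: transmit only the k largest-magnitude gradient entries as (index, value) pairs instead of the dense vector. This is a generic top-k sketch, not necessarily the compression scheme of this particular paper:

```python
import numpy as np

def topk_sparsify(grad, k):
    """Keep only the k largest-magnitude gradient entries (set the rest
    to zero); only the surviving (index, value) pairs need be sent."""
    idx = np.argsort(np.abs(grad))[-k:]   # indices of the k largest magnitudes
    sparse = np.zeros_like(grad)
    sparse[idx] = grad[idx]
    return sparse

g = np.array([0.1, -2.0, 0.3, 1.5])
sparse = topk_sparsify(g, 2)   # keeps -2.0 and 1.5, zeroes the rest
```

Practical schemes usually add error feedback (accumulating the dropped residual locally) so the discarded coordinates are not lost, merely delayed.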