Search Results for author: Nick Schneider

Found 5 papers, 1 paper with code

Semantically Guided Depth Upsampling

no code implementations • 2 Aug 2016 • Nick Schneider, Lukas Schneider, Peter Pinggera, Uwe Franke, Marc Pollefeys, Christoph Stiller

We present a novel method for accurate and efficient upsampling of sparse depth data, guided by high-resolution imagery.

Edge Detection • Scene Labeling
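The abstract describes image-guided upsampling of sparse depth. As an illustrative sketch only (this is a common joint-bilateral baseline for the task, not the paper's semantically guided method; all parameter names and values are hypothetical), each missing pixel can average nearby valid depths, weighted by spatial distance and by similarity in the high-resolution guidance image:

```python
import numpy as np

def guided_depth_upsample(depth, valid, guide, sigma_s=2.0, sigma_r=0.1, radius=4):
    """Fill missing depth with a joint bilateral filter: average nearby valid
    depths, weighted by spatial closeness and guidance-image similarity."""
    H, W = depth.shape
    out = depth.copy()
    for i in range(H):
        for j in range(W):
            if valid[i, j]:
                continue  # keep measured depths untouched
            num = den = 0.0
            for di in range(-radius, radius + 1):
                for dj in range(-radius, radius + 1):
                    y, x = i + di, j + dj
                    if 0 <= y < H and 0 <= x < W and valid[y, x]:
                        # spatial weight * range (guidance similarity) weight
                        w = np.exp(-(di * di + dj * dj) / (2 * sigma_s ** 2)
                                   - (guide[i, j] - guide[y, x]) ** 2 / (2 * sigma_r ** 2))
                        num += w * depth[y, x]
                        den += w
            if den > 0:
                out[i, j] = num / den
    return out

# Hypothetical example: two guidance regions, one sparse depth sample in each.
guide = np.zeros((5, 6)); guide[:, 3:] = 1.0
depth = np.zeros((5, 6)); valid = np.zeros((5, 6), dtype=bool)
depth[2, 1], valid[2, 1] = 2.0, True   # left region
depth[2, 4], valid[2, 4] = 8.0, True   # right region
out = guided_depth_upsample(depth, valid, guide)
```

Because the guidance term suppresses samples from across the region boundary, filled pixels inherit the depth of their own region rather than blurring across the edge.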

RegNet: Multimodal Sensor Registration Using Deep Neural Networks

no code implementations • 11 Jul 2017 • Nick Schneider, Florian Piewak, Christoph Stiller, Uwe Franke

In this paper, we present RegNet, the first deep convolutional neural network (CNN) to infer a 6 degrees of freedom (DOF) extrinsic calibration between multimodal sensors, exemplified using a scanning LiDAR and a monocular camera.

Translation
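The 6-DOF extrinsic calibration that RegNet infers is what lets LiDAR points be projected into the camera image. The sketch below is not RegNet itself, only the standard pinhole projection such a calibration feeds into; the intrinsics and points are hypothetical examples:

```python
import numpy as np

def project_lidar_to_image(points, R, t, K):
    """Project LiDAR points into the image plane given an extrinsic
    calibration (rotation R, translation t) and camera intrinsics K.
    points: (N, 3) array in the LiDAR frame."""
    cam = points @ R.T + t            # LiDAR frame -> camera frame
    in_front = cam[:, 2] > 0          # keep only points in front of the camera
    uv = cam[in_front] @ K.T          # apply pinhole intrinsics
    return uv[:, :2] / uv[:, 2:3], in_front  # perspective divide

# Hypothetical example: identity extrinsics, focal length 700, center (640, 360)
K = np.array([[700., 0., 640.], [0., 700., 360.], [0., 0., 1.]])
pts = np.array([[0., 0., 5.], [1., 0., 10.]])
uv, valid = project_lidar_to_image(pts, np.eye(3), np.zeros(3), K)
```

A point on the optical axis lands on the principal point regardless of its range; miscalibrated (R, t) shifts every projection, which is the error signal a calibration network must learn to undo.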

Sparsity Invariant CNNs

1 code implementation • 22 Aug 2017 • Jonas Uhrig, Nick Schneider, Lukas Schneider, Uwe Franke, Thomas Brox, Andreas Geiger

In this paper, we consider convolutional neural networks operating on sparse inputs with an application to depth upsampling from sparse laser scan data.

Depth Completion • Depth Estimation +1
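The core idea of a sparsity-invariant convolution is to convolve only over valid input pixels, normalize by how many valid pixels each window contains, and propagate the validity mask forward. A minimal NumPy sketch of one such layer, assuming a single channel and a uniform-weight example (an illustration of the mechanism, not the paper's full network):

```python
import numpy as np

def sparsity_invariant_conv(feat, mask, weight, eps=1e-8):
    """One sparsity-invariant convolution step: mask out invalid pixels,
    normalize each window by its count of valid entries, and mark an output
    pixel valid if any input pixel in its window was valid."""
    k = weight.shape[0]
    pad = k // 2
    f = np.pad(feat * mask, pad)          # zero out invalid inputs
    m = np.pad(mask.astype(float), pad)
    H, W = feat.shape
    out = np.zeros((H, W))
    new_mask = np.zeros((H, W))
    for i in range(H):
        for j in range(W):
            mw = m[i:i + k, j:j + k]
            out[i, j] = (weight * f[i:i + k, j:j + k]).sum() / (mw.sum() + eps)
            new_mask[i, j] = mw.max()     # any valid pixel makes the window valid
    return out, new_mask

# A single valid depth measurement passes through the filter unchanged,
# regardless of how many invalid zeros surround it:
feat = np.array([[0., 0., 0.], [0., 5., 0.], [0., 0., 0.]])
mask = np.array([[0., 0., 0.], [0., 1., 0.], [0., 0., 0.]])
out, new_mask = sparsity_invariant_conv(feat, mask, np.ones((3, 3)))
```

The normalization is what makes the output invariant to the input's sparsity level: a plain convolution would dilute the measurement with the surrounding invalid zeros.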

Boosting LiDAR-based Semantic Labeling by Cross-Modal Training Data Generation

no code implementations • 26 Apr 2018 • Florian Piewak, Peter Pinggera, Manuel Schäfer, David Peter, Beate Schwarz, Nick Schneider, David Pfeiffer, Markus Enzweiler, Marius Zöllner

The effectiveness of the proposed network architecture as well as the automated data generation process is demonstrated on a manually annotated ground truth dataset.

Autonomous Vehicles

Learning Cascaded Detection Tasks with Weakly-Supervised Domain Adaptation

no code implementations • 9 Jul 2021 • Niklas Hanselmann, Nick Schneider, Benedikt Ortelt, Andreas Geiger

Deep learning has proven crucial for handling the challenges of autonomous driving, tackling increasingly complex tasks such as 3D detection and instance segmentation.

Autonomous Driving • Domain Adaptation +2
