Search Results for author: Nikolai Smolyanskiy

Found 6 papers, 3 papers with code

NVRadarNet: Real-Time Radar Obstacle and Free Space Detection for Autonomous Driving

no code implementations · 29 Sep 2022 · Alexander Popov, Patrik Gebhardt, Ke Chen, Ryan Oldja, Heeseok Lee, Shane Murray, Ruchi Bhargava, Nikolai Smolyanskiy

We present NVRadarNet, a deep neural network (DNN) that detects dynamic obstacles and drivable free space using automotive RADAR sensors.

Autonomous Driving

MVLidarNet: Real-Time Multi-Class Scene Understanding for Autonomous Driving Using Multiple Views

no code implementations · 9 Jun 2020 · Ke Chen, Ryan Oldja, Nikolai Smolyanskiy, Stan Birchfield, Alexander Popov, David Wehr, Ibrahim Eden, Joachim Pehserl

We show that our multi-view, multi-stage, multi-class approach is able to detect and classify objects while simultaneously determining the drivable space using a single LiDAR scan as input, in challenging scenes with more than one hundred vehicles and pedestrians at a time.

Autonomous Driving, object-detection, +2

On the Importance of Stereo for Accurate Depth Estimation: An Efficient Semi-Supervised Deep Neural Network Approach

4 code implementations · 26 Mar 2018 · Nikolai Smolyanskiy, Alexey Kamenev, Stan Birchfield

Despite the progress on monocular depth estimation in recent years, we show that the gap between monocular and stereo depth accuracy remains large, a particularly relevant result given the prevalent reliance upon monocular cameras by vehicles that are expected to be self-driving.

Autonomous Vehicles, Stereo Depth Estimation

Toward Low-Flying Autonomous MAV Trail Navigation using Deep Neural Networks for Environmental Awareness

4 code implementations · 7 May 2017 · Nikolai Smolyanskiy, Alexey Kamenev, Jeffrey Smith, Stan Birchfield

We present a micro aerial vehicle (MAV) system, built with inexpensive off-the-shelf hardware, for autonomously following trails in unstructured, outdoor environments such as forests.

Robotics
