Simultaneous Localization and Mapping

163 papers with code • 0 benchmarks • 21 datasets

Simultaneous localization and mapping (SLAM) is the task of constructing or updating a map of an unknown environment while simultaneously keeping track of an agent's location within it.

(Image credit: ORB-SLAM2)
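
The coupling in that definition (a map is needed to localize, and a pose estimate is needed to map) is easiest to see in a toy example. The following is a minimal, hedged sketch, not taken from any paper listed below: a 2D robot dead-reckons its pose from odometry, places range-bearing landmark observations into a world-frame map using that pose, and nudges the pose back toward agreement with landmarks it has already mapped. All data, update rules, and function names are invented for illustration.

```python
"""Toy 2D landmark SLAM sketch (illustration only, not from any cited paper)."""
import math
import numpy as np

def motion_step(pose, v, w, dt=1.0):
    """Propagate an (x, y, heading) pose with a unicycle odometry model."""
    x, y, th = pose
    return np.array([x + v * math.cos(th) * dt,
                     y + v * math.sin(th) * dt,
                     th + w * dt])

def observe_world(pose, rng, brg):
    """Convert a range-bearing measurement into world coordinates."""
    x, y, th = pose
    return np.array([x + rng * math.cos(th + brg),
                     y + rng * math.sin(th + brg)])

landmark_map = {}                    # landmark id -> (mean position, #observations)
pose = np.array([0.0, 0.0, 0.0])     # unknown start pose, assumed to be the origin

# Per-step (odometry, observations) log; the numbers are made up for illustration.
log = [((1.0, 0.1), [(0, 2.0, 0.5)]),
       ((1.0, 0.0), [(0, 1.4, 0.9), (1, 3.0, -0.3)]),
       ((1.0, -0.1), [(1, 2.2, -0.5)])]

for (v, w), observations in log:
    pose = motion_step(pose, v, w)                   # localization: predict from odometry
    for lid, rng, brg in observations:
        guess = observe_world(pose, rng, brg)        # mapping: place landmark via current pose
        if lid in landmark_map:
            mean, n = landmark_map[lid]
            pose[:2] += 0.5 * (mean - guess)         # crude correction against the existing map
            refined = observe_world(pose, rng, brg)  # re-project with the corrected pose
            landmark_map[lid] = ((mean * n + refined) / (n + 1), n + 1)
        else:
            landmark_map[lid] = (guess, 1)

print("estimated pose:", pose.round(2))
print("estimated landmarks:", {k: v[0].round(2) for k, v in landmark_map.items()})
```

Real systems replace the crude averaging above with probabilistic estimators (EKF, particle filters, or factor-graph optimization), but the alternation between localizing against the map and extending the map is the same.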

Most implemented papers

Visual-Inertial Monocular SLAM with Map Reuse

ZuoJiaxing/Learn-ORB-VIO-Stereo-Mono 19 Oct 2016

In recent years, Visual-Inertial Odometry techniques, which aim to compute the incremental motion of the sensor with high accuracy and robustness, have achieved excellent results.

Kimera: an Open-Source Library for Real-Time Metric-Semantic Localization and Mapping

MIT-SPARK/Kimera 6 Oct 2019

We provide an open-source C++ library for real-time metric-semantic visual-inertial Simultaneous Localization And Mapping (SLAM).

ORB-SLAM2: an Open-Source SLAM System for Monocular, Stereo and RGB-D Cameras

raulmur/ORB_SLAM2 20 Oct 2016

We present ORB-SLAM2, a complete SLAM system for monocular, stereo and RGB-D cameras, including map reuse, loop closing and relocalization capabilities.

Sparse-to-Dense: Depth Prediction from Sparse Depth Samples and a Single Image

fangchangma/sparse-to-dense 21 Sep 2017

We consider the problem of dense depth prediction from a sparse set of depth measurements and a single RGB image.
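
As a rough illustration of that input formulation, and not the authors' network or training code, the sketch below samples a sparse subset of depth pixels, stacks them with the RGB image into a 4-channel tensor, and runs a toy convolutional regressor on it; the architecture, sample count, and tensors are placeholders.

```python
"""Hedged sketch of a sparse-depth-plus-RGB input pipeline (toy network, not the paper's)."""
import torch
import torch.nn as nn

def sample_sparse_depth(dense_depth, num_samples=200):
    """Keep a random subset of depth pixels and zero out the rest."""
    sparse = torch.zeros_like(dense_depth)
    h, w = dense_depth.shape[-2:]
    idx = torch.randperm(h * w)[:num_samples]
    ys, xs = idx // w, idx % w
    sparse[..., ys, xs] = dense_depth[..., ys, xs]
    return sparse

class ToyDepthNet(nn.Module):
    """Tiny stand-in for an encoder-decoder; consumes a 4-channel RGB + sparse-depth input."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(4, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 3, padding=1),
        )

    def forward(self, rgb, sparse_depth):
        return self.net(torch.cat([rgb, sparse_depth], dim=1))

rgb = torch.rand(1, 3, 64, 64)              # dummy RGB image
gt_depth = torch.rand(1, 1, 64, 64) * 10.0  # dummy ground-truth depth
sparse = sample_sparse_depth(gt_depth)      # e.g. from a LiDAR scan or SLAM point cloud
pred = ToyDepthNet()(rgb, sparse)
print(pred.shape)                           # torch.Size([1, 1, 64, 64]): dense depth map
```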

Online Spatial Concept and Lexical Acquisition with Simultaneous Localization and Mapping

EmergentSystemLabStudent/SpCoSLAM 15 Apr 2017

We have proposed a nonparametric Bayesian spatial concept acquisition model (SpCoA).

Incremental Visual-Inertial 3D Mesh Generation with Structural Regularities

MIT-SPARK/Kimera 4 Mar 2019

We propose instead to tightly couple mesh regularization and state estimation by detecting and enforcing structural regularities in a novel factor-graph formulation.
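
Kimera's back end builds on the GTSAM factor-graph library. The sketch below shows only the generic flavor of a factor-graph formulation: a tiny 2D pose graph with odometry and loop-closure factors, optimized through GTSAM's Python bindings. It omits the paper's mesh and structural-regularity factors, and all poses and noise values are invented.

```python
"""Minimal 2D pose-graph sketch with GTSAM (generic factors only)."""
import numpy as np
import gtsam

graph = gtsam.NonlinearFactorGraph()
prior_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.1, 0.1, 0.05]))
odom_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.2, 0.2, 0.1]))

# Anchor the first pose, then chain odometry factors between consecutive poses.
graph.add(gtsam.PriorFactorPose2(1, gtsam.Pose2(0.0, 0.0, 0.0), prior_noise))
graph.add(gtsam.BetweenFactorPose2(1, 2, gtsam.Pose2(1.0, 0.0, 0.0), odom_noise))
graph.add(gtsam.BetweenFactorPose2(2, 3, gtsam.Pose2(1.0, 0.0, np.pi / 2), odom_noise))
# A loop-closure factor ties a later pose back to an earlier one.
graph.add(gtsam.BetweenFactorPose2(3, 1, gtsam.Pose2(0.0, 2.0, -np.pi / 2), odom_noise))

# Deliberately perturbed initial guesses; the optimizer pulls them into agreement.
initial = gtsam.Values()
initial.insert(1, gtsam.Pose2(0.1, -0.1, 0.05))
initial.insert(2, gtsam.Pose2(1.2, 0.1, -0.05))
initial.insert(3, gtsam.Pose2(2.1, 0.2, np.pi / 2 + 0.1))

result = gtsam.LevenbergMarquardtOptimizer(graph, initial).optimize()
for k in (1, 2, 3):
    p = result.atPose2(k)
    print(k, p.x(), p.y(), p.theta())
```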

Semi-Dense 3D Reconstruction with a Stereo Event Camera

HKUST-Aerial-Robotics/ESVO ECCV 2018

Event cameras are bio-inspired sensors that offer several advantages, such as low latency, high speed, and high dynamic range, for tackling challenging scenarios in computer vision.

LiDARTag: A Real-Time Fiducial Tag System for Point Clouds

UMich-BipedLab/LiDARTag 23 Aug 2019

Because LiDAR sensors actively emit their own light, rapidly changing ambient lighting does not affect the detection of a LiDARTag; hence, the proposed fiducial marker can operate in a completely dark environment.

3D Dynamic Scene Graphs: Actionable Spatial Perception with Places, Objects, and Humans

MIT-SPARK/Kimera 15 Feb 2020

Our second contribution is to provide the first fully automatic Spatial PerceptIon eNgine (SPIN) to build a DSG from visual-inertial data.