Paper: Complementary Perception for Handheld SLAM (ICRA 2018, April 2018)
We present a novel method for mapping general three-dimensional environments in which neither geometric nor visual information is guaranteed to be available everywhere, and in which the device motion is unconstrained, as with handheld systems.
The continuous-time simultaneous localization and mapping algorithm integrates a lidar, camera, and inertial measurement unit in a complementary fashion whereby all sensors contribute constraints to the optimization.
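To illustrate the complementary structure described above, the following is a minimal sketch of a joint cost in which each sensor contributes its own residual block to a single optimization, so the solver can lean on whichever modality is informative in the current environment. The function names, residual forms, and weighting are illustrative assumptions, not the paper's actual formulation.

```python
# Hypothetical sketch: all three sensors add constraints to one objective.
# Residual callables here are placeholders standing in for, e.g.,
# point-to-plane lidar errors, visual reprojection errors, and
# preintegrated IMU errors over a continuous-time trajectory state.

def total_cost(state, lidar_residuals, visual_residuals, imu_residuals):
    """Sum of squared residuals from all three sensors at one state."""
    cost = 0.0
    for r in lidar_residuals:    # geometric constraints from lidar
        cost += r(state) ** 2
    for r in visual_residuals:   # reprojection constraints from camera
        cost += r(state) ** 2
    for r in imu_residuals:      # inertial constraints from the IMU
        cost += r(state) ** 2
    return cost

# Toy usage with scalar residuals of a scalar state:
cost = total_cost(1.0,
                  [lambda s: s - 1.0],   # lidar term, zero at s = 1
                  [lambda s: 0.5 * s],   # visual term
                  [lambda s: 0.0])       # inertial term
```

Because the modalities enter as independent residual blocks, degradation of one sensor (e.g. a geometrically featureless corridor for lidar) leaves the remaining blocks to constrain the trajectory.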
The proposed algorithm is designed to expand the domain of mappable environments and therefore increase the reliability and utility of general purpose mobile mapping.
A key component of the proposed algorithm is the incorporation of depth uncertainty into visual features, which is effective for noisy surfaces and allows features with and without depth estimates to be modeled in a unified manner.
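One way to realize the unified treatment of features with and without depth, sketched here under assumed names and an assumed inverse-depth parameterization (not necessarily the paper's), is to whiten each feature's residual by its depth variance: features lacking a depth estimate receive an effectively infinite variance, so their depth term vanishes and they act as bearing-only constraints.

```python
import math

# Hypothetical sketch: a feature stores an inverse-depth estimate and its
# variance. Whitening divides each error by its standard deviation, so
# features on noisy surfaces (large variance) contribute proportionally
# less, and features with no depth at all (infinite variance) reduce to
# bearing-only constraints within the same model.

def feature_residual(rho_est, rho_meas, var_rho, bearing_err, var_bearing):
    """Return whitened (depth, bearing) residual terms for one feature."""
    if math.isinf(var_rho):
        depth_term = 0.0  # no depth estimate: bearing-only feature
    else:
        depth_term = (rho_est - rho_meas) / math.sqrt(var_rho)
    bearing_term = bearing_err / math.sqrt(var_bearing)
    return depth_term, bearing_term

# A feature with a reliable depth measurement:
with_depth = feature_residual(0.52, 0.50, 1e-4, 0.002, 1e-6)
# The same feature with no depth measurement at all:
without_depth = feature_residual(0.52, 0.50, float("inf"), 0.002, 1e-6)
```

The design choice this illustrates: rather than maintaining two separate feature types, a single residual form degrades gracefully as depth uncertainty grows, which matches the abstract's claim of modeling both cases in a unified manner.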
Results demonstrate a wider mappable domain in challenging environments than state-of-the-art lidar-based or vision-based localization and mapping algorithms.