ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Copernicus Publications
Articles | Volume V-1-2022
17 May 2022


W. Zhang, S. Wang, and N. Haala

Keywords: Visual SLAM, Dense Reconstruction, Multi-Sensor Fusion, TSDF Map, Mobile Robot

Abstract. Mobile robots are increasingly employed in various indoor scenarios. A fundamental prerequisite is that the robot can reconstruct an accurate and complete map of the observed environment and estimate its trajectory within this map. Current visual SLAM methods can perform this task reasonably well, but mostly in small spaces such as a single room, and they are often tested only in well-textured environments. In real-world applications involving large indoor scenes, they lack robustness and fail to build a globally consistent map. To this end, we propose a novel system that robustly addresses the problems encountered by existing visual SLAM methods, such as weak texture and long-term drift. By incorporating information from a wheel odometer, robot poses can be predicted smoothly even in the absence of texture. Geometric cues are leveraged by aligning Truncated Signed Distance Function (TSDF) based submaps to minimize long-term drift. To reconstruct a more complete and accurate dense map, we refine the sensor depth maps using color information and the optimization result of global bundle adjustment. As a result, the system provides precise trajectory estimation and a globally consistent map for downstream tasks. We validate the accuracy and robustness of the proposed method on both public and self-collected datasets and show the complementary nature of each module. Evaluation against high-precision ground truth shows an improvement in mean Absolute Trajectory Error (ATE) from 21 cm to 2 cm for trajectory estimation, and the reconstructed map has a mean accuracy of 8 cm.
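The TSDF mapping that the abstract builds on can be illustrated with a minimal voxel-grid fusion sketch. This is not the authors' implementation; the function name, grid layout, camera model, and all parameters below are illustrative assumptions. Each depth map is fused into the volume by projecting voxel centers into the camera, computing a truncated signed distance along the viewing ray, and updating a per-voxel weighted running average.

```python
import numpy as np

def integrate_depth(tsdf, weights, depth, K, T_cw, voxel_size, trunc):
    """Fuse one depth map into a TSDF volume via a weighted running average.

    tsdf/weights: (nx, ny, nz) voxel grids; the volume is anchored at the
    world origin (illustrative convention). T_cw maps world points into the
    camera frame; K is the 3x3 pinhole intrinsic matrix.
    """
    nx, ny, nz = tsdf.shape
    ii, jj, kk = np.meshgrid(np.arange(nx), np.arange(ny), np.arange(nz),
                             indexing="ij")
    # world coordinates of voxel centers
    pts_w = np.stack([ii, jj, kk], -1).reshape(-1, 3) * voxel_size + voxel_size / 2
    pts_c = (T_cw[:3, :3] @ pts_w.T + T_cw[:3, 3:4]).T   # into camera frame
    z = pts_c[:, 2]
    z_safe = np.where(z > 1e-6, z, 1.0)                  # avoid divide-by-zero
    uv = (K @ pts_c.T).T
    u = np.round(uv[:, 0] / z_safe).astype(int)
    v = np.round(uv[:, 1] / z_safe).astype(int)
    h, w = depth.shape
    valid = (z > 1e-6) & (u >= 0) & (u < w) & (v >= 0) & (v < h)
    d = np.zeros_like(z)
    d[valid] = depth[v[valid], u[valid]]                 # observed depth per voxel
    sdf = d - z                                          # signed distance along ray
    upd = valid & (d > 0) & (sdf > -trunc)               # skip deeply occluded voxels
    tsdf_new = np.clip(sdf / trunc, -1.0, 1.0)
    ft, fw = tsdf.reshape(-1), weights.reshape(-1)       # views; writes go through
    w_new = fw[upd] + 1.0
    ft[upd] = (ft[upd] * fw[upd] + tsdf_new[upd]) / w_new
    fw[upd] = w_new
    return tsdf, weights
```

In the full system described by the abstract, many such volumes would be built as submaps and aligned against each other, which is what provides the geometric constraint used to reduce long-term drift.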