Multi-sensor Fusion SLAM from the Nadir View for UAV Localization and Mapping
Keywords: UAV, Multi-sensor fusion SLAM, Nadir view, State estimation, Factor graph
Abstract. A single sensor on unmanned aerial vehicles (UAVs) cannot provide stable and accurate trajectory estimation in outdoor low-altitude environments. Moreover, most UAV datasets primarily focus on the low-altitude forward-facing view, with limited coverage of the nadir view. To address these problems, this study presents a real-time multi-sensor fusion SLAM system for UAV localization and mapping from the nadir view. The method integrates monocular images, IMU measurements, and GNSS coordinates, combining the strengths of each sensor to achieve accurate and reliable state estimation. First, the sensors are initialized and aligned to ensure a consistent reference frame. Subsequently, tracking and local mapping are performed to establish the system’s mid-term stability. Finally, the optimization objective is formulated as a factor graph that integrates visual, inertial, GNSS, and keyframe proximity factors together with a purpose-designed yaw factor. The system is evaluated on the MARS dataset, and the experimental results demonstrate reduced drift and improved positioning accuracy.
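As a sketch of the factor-graph optimization summarized above, the joint state estimate can be written as a nonlinear least-squares problem in which each factor contributes a covariance-weighted residual; the notation here is illustrative and not taken from the paper:

\[
\hat{\mathcal{X}} = \arg\min_{\mathcal{X}} \Bigg(
\sum_{(i,j)\in\mathcal{C}} \big\| r_{\mathcal{C}}(i,j,\mathcal{X}) \big\|^{2}_{\Sigma_{\mathcal{C}}}
+ \sum_{k\in\mathcal{I}} \big\| r_{\mathcal{I}}(k,\mathcal{X}) \big\|^{2}_{\Sigma_{\mathcal{I}}}
+ \sum_{k\in\mathcal{G}} \big\| r_{\mathcal{G}}(k,\mathcal{X}) \big\|^{2}_{\Sigma_{\mathcal{G}}}
+ \sum_{(k,l)\in\mathcal{P}} \big\| r_{\mathcal{P}}(k,l,\mathcal{X}) \big\|^{2}_{\Sigma_{\mathcal{P}}}
+ \sum_{k} \big\| r_{\psi}(k,\mathcal{X}) \big\|^{2}_{\sigma_{\psi}^{2}}
\Bigg)
\]

where \(\mathcal{X}\) collects the keyframe states, and \(r_{\mathcal{C}}\), \(r_{\mathcal{I}}\), \(r_{\mathcal{G}}\), \(r_{\mathcal{P}}\), and \(r_{\psi}\) denote the visual reprojection, IMU preintegration, GNSS position, keyframe proximity, and yaw residuals, respectively, each weighted by its covariance \(\Sigma\). How the paper defines each residual and weighting is specified in its method section; this formulation only captures the standard MAP form of such a factor graph.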
