UAV-Based Collaborative Mapping Framework with Environmental Semantic Extraction
Keywords: UAV, SLAM, Collaborative Mapping, 3D LiDAR, Environmental Semantics
Abstract. The evolving low-altitude economy enables Unmanned Aerial Vehicles (UAVs) to gather diverse sensor data, including RGB images, 3D point clouds, and inertial measurements, offering untapped potential for environmental mapping. Traditional urban 3D modeling methods often suffer from delayed updates and limited scalability. This paper introduces a novel UAV-based collaborative mapping framework that integrates heterogeneous data from multiple UAVs to efficiently reconstruct dynamic urban environments. The framework employs advanced visual recognition and optical character recognition (OCR) for semantic feature extraction, complemented by LiDAR-inertial odometry for precise map construction. To address the challenges posed by sparse LiDAR data in indoor settings, a temporal alignment mechanism generates synchronized keyframes, enhancing data coherence. In addition, camera-LiDAR calibration, combined with cross-modal registration and semantic-guided point cloud stitching, improves system robustness. Experimental results demonstrate that feature-guided point cloud registration, reinforced by semantic alignment, outperforms traditional methods, achieving efficient mapping and offering a scalable solution for urban 3D modeling applicable to real-time urban planning and smart city development.
