Skyline matching based camera orientation from images and mobile mapping point clouds
Keywords: Laser scanning, Mobile Mapping, Point cloud, Camera, Registration, Matching, Automation
Abstract. Mobile mapping is widely used for collecting large amounts of geo-referenced data. Sensor fusion plays an important role, as it allows multiple sensors, such as laser scanners and cameras, to be evaluated jointly. This requires determining the relative orientation between the sensors. Based on data from a RIEGL VMX-250 mobile mapping system equipped with two laser scanners, four optional cameras, and a highly precise GNSS/IMU system, we propose an approach to improve the camera orientations. A manually determined orientation serves as an initial approximation for matching a large number of points in the optical images and the corresponding projected scan images. The search space for point correspondences is reduced to the skylines found in both the optical and the scan image. The skylines are extracted using alpha shapes, and the actual matching is carried out with an adapted ICP algorithm. The resulting approximate values of the relative orientation are then used as starting values for an iterative resection process. Outliers are removed at several stages of the process. Our approach is fully automatic and improves the camera orientation significantly.
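The core matching step described above can be illustrated with a minimal sketch: a plain 2D point-to-point ICP that aligns the skyline extracted from the optical image with the skyline of the projected scan image. This is not the authors' implementation; the function name `icp_2d` and the parameters `max_iter` and `tol` are illustrative assumptions, and the outlier rejection here is a simple median-distance threshold rather than the multi-stage removal used in the paper.

```python
# Illustrative sketch only: 2D rigid ICP between two skyline point sets.
import numpy as np
from scipy.spatial import cKDTree

def icp_2d(src, dst, max_iter=50, tol=1e-6):
    """Rigidly align 2D skyline points `src` (Nx2) to `dst` (Mx2)."""
    R, t = np.eye(2), np.zeros(2)
    prev_err = np.inf
    tree = cKDTree(dst)
    cur = src.copy()
    for _ in range(max_iter):
        # 1. Nearest-neighbour correspondences between current source and target skyline.
        dists, idx = tree.query(cur)
        # 2. Crude outlier rejection: drop correspondences beyond 3x the median distance.
        keep = dists < 3.0 * np.median(dists)
        p, q = cur[keep], dst[idx[keep]]
        # 3. Closed-form rigid transform (Kabsch/Procrustes) for the kept correspondences.
        p0, q0 = p - p.mean(axis=0), q - q.mean(axis=0)
        U, _, Vt = np.linalg.svd(p0.T @ q0)
        D = np.diag([1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R_step = Vt.T @ D @ U.T
        t_step = q.mean(axis=0) - R_step @ p.mean(axis=0)
        # 4. Apply the incremental transform and accumulate the total transform.
        cur = cur @ R_step.T + t_step
        R, t = R_step @ R, R_step @ t + t_step
        err = dists[keep].mean()
        if abs(prev_err - err) < tol:
            break
        prev_err = err
    return R, t
```

In the full pipeline, the transform estimated from such a skyline alignment would only provide approximate correspondences; the refined relative orientation is obtained afterwards by the iterative resection mentioned in the abstract.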