ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Articles | Volume X-2-2024
https://doi.org/10.5194/isprs-annals-X-2-2024-193-2024
10 Jun 2024

Deep Learning-based DSM Generation from Dual-Aspect SAR Data

Michael Recla and Michael Schmitt

Keywords: Deep Learning, Synthetic Aperture Radar (SAR), 3D Reconstruction, Radargrammetry, DSM Generation

Abstract. Rapid mapping demands efficient methods for fast extraction of information from satellite data while minimizing data requirements. This paper explores the potential of deep learning for the generation of high-resolution urban elevation data from Synthetic Aperture Radar (SAR) imagery. To mitigate occlusion effects caused by the side-looking nature of SAR remote sensing, two SAR images from opposing aspects are leveraged and processed in an end-to-end deep neural network. The presented approach is the first of its kind to implicitly handle the transition from the SAR-specific slant range geometry to a ground-based mapping geometry within the model architecture. Comparative experiments demonstrate the superiority of dual-aspect fusion over single-image methods in terms of reconstruction quality and geolocation accuracy. Notably, the model exhibits robust performance across diverse acquisition modes and geometries, showcasing its generalizability and suitability for height mapping applications. The study's findings underscore the potential of deep learning-driven SAR techniques for generating high-quality urban surface models efficiently and economically.