ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Volume X-2/W2-2025
https://doi.org/10.5194/isprs-annals-X-2-W2-2025-31-2025
29 Oct 2025

Ending Overfitting for UAV Applications - Self-Supervised Pretraining on Multispectral UAV Data

Jurrian Doornbos and Önder Babur

Keywords: foundation models, self-supervised learning, drones, UAV, multispectral

Abstract. While UAVs have revolutionized data collection for remote sensing, the practical application of deep learning remains severely limited by the scarcity of labelled training data, creating a stark contrast between laboratory success and field performance. This research investigates whether transfer learning can overcome this "small data problem" by enabling UAV-based deep learning models to generalize across diverse environments without prohibitive amounts of labelled examples. We present an efficient self-supervised learning framework (FastSiam) tailored to multispectral UAV imagery to close this generalization gap. Our approach enables effective feature learning without extensive labelled data, bridging the gap between the potential of foundation models and the resource constraints of UAV remote sensing applications. We evaluate the method on a vineyard segmentation task across multiple geographic locations and show that models with FastSiam-pretrained backbones significantly outperform their end-to-end trained counterparts, even with extremely limited labelled data. The most sophisticated architecture tested, Swin-T with a pretrained backbone, achieved an average F1 score of 0.80 across diverse test sites, demonstrating robust generalization. Importantly, our results show that pretrained models benefit more from diversity in training samples than from sheer volume, suggesting new pathways for efficient model development in UAV applications. This work establishes that self-supervised pretraining serves as an effective regularizer for remote sensing tasks: it limits overfitting and improves generalization across varying environmental conditions whilst requiring only modest computational resources, making advanced deep learning techniques more accessible for practical UAV applications.
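The core idea of FastSiam-style pretraining is that, for each augmented view of an image tile, a predictor output is pulled toward the averaged stop-gradient projection of the remaining views, which stabilises SimSiam-type training at small batch sizes. The following is a minimal sketch of that idea for multispectral tiles in PyTorch; the five-band input, ResNet-18 encoder, four views, and all hyperparameters are illustrative assumptions, not the configuration used in the paper.

```python
# Minimal FastSiam-style pretraining sketch for multispectral UAV tiles.
# Assumptions (not from the paper): 5 spectral bands, ResNet-18 encoder,
# 4 views per image, symmetric loss over all views.
import torch
import torch.nn as nn
import torchvision


def mlp(in_dim, hidden_dim, out_dim):
    return nn.Sequential(
        nn.Linear(in_dim, hidden_dim),
        nn.BatchNorm1d(hidden_dim),
        nn.ReLU(inplace=True),
        nn.Linear(hidden_dim, out_dim),
    )


class FastSiamSketch(nn.Module):
    def __init__(self, num_bands=5, proj_dim=2048, pred_hidden=512):
        super().__init__()
        backbone = torchvision.models.resnet18(weights=None)
        # Replace the RGB stem so the encoder accepts multispectral input.
        backbone.conv1 = nn.Conv2d(num_bands, 64, kernel_size=7,
                                   stride=2, padding=3, bias=False)
        feat_dim = backbone.fc.in_features
        backbone.fc = nn.Identity()
        self.backbone = backbone
        self.projector = mlp(feat_dim, proj_dim, proj_dim)
        self.predictor = mlp(proj_dim, pred_hidden, proj_dim)

    def forward(self, views):
        # views: list of T augmented tensors, each (B, num_bands, H, W)
        z = [self.projector(self.backbone(v)) for v in views]
        p = [self.predictor(zi) for zi in z]
        return z, p


def neg_cosine(p, z):
    # Negative cosine similarity with stop-gradient on the target.
    z = z.detach()
    return -nn.functional.cosine_similarity(p, z, dim=-1).mean()


def fastsiam_loss(z, p):
    # Each prediction is pulled toward the mean projection of the
    # remaining views (the multi-view target).
    T = len(z)
    loss = 0.0
    for i in range(T):
        target = torch.stack([z[j] for j in range(T) if j != i]).mean(0)
        loss = loss + neg_cosine(p[i], target)
    return loss / T


# Usage: one optimisation step on a batch of 4 randomly augmented views.
model = FastSiamSketch(num_bands=5)
optim = torch.optim.SGD(model.parameters(), lr=0.05, momentum=0.9)
views = [torch.randn(8, 5, 224, 224) for _ in range(4)]
z, p = model(views)
loss = fastsiam_loss(z, p)
loss.backward()
optim.step()
```

After pretraining, the backbone weights would be transferred into a segmentation model and fine-tuned on the labelled vineyard data; only the encoder is reused, the projector and predictor are discarded.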
