ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Volume X-2-2024
https://doi.org/10.5194/isprs-annals-X-2-2024-129-2024
10 Jun 2024

Uncertainty-aware Cross-Entropy for Semantic Segmentation

Steven Landgraf, Markus Hillemann, Kira Wursthorn, and Markus Ulrich

Keywords: Semantic Segmentation, Uncertainty Quantification, Monte Carlo Dropout, Vision Metrology

Abstract. Deep neural networks have shown exceptional performance in various tasks, but their lack of robustness, reliability, and tendency to be overconfident pose challenges for their deployment in safety-critical applications like autonomous driving. In this regard, quantifying the uncertainty inherent to a model’s prediction is a promising endeavour to address these shortcomings. In this work, we present a novel Uncertainty-aware Cross-Entropy loss (U-CE) that incorporates dynamic predictive uncertainties into the training process by pixel-wise weighting of the well-known cross-entropy loss (CE). Through extensive experimentation, we demonstrate the superiority of U-CE over regular CE training on two benchmark datasets, Cityscapes and ACDC, using two common backbone architectures, ResNet-18 and ResNet-101. With U-CE, we manage to train models that not only improve their segmentation performance but also provide meaningful uncertainties after training. Consequently, we contribute to the development of more robust and reliable segmentation models, ultimately advancing the state-of-the-art in safety-critical applications and beyond.
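The abstract describes U-CE as a pixel-wise weighting of the standard cross-entropy loss by dynamic predictive uncertainties obtained via Monte Carlo Dropout. The sketch below illustrates this idea under assumptions not stated in the abstract: the per-pixel uncertainty is taken to be the predictive entropy of averaged MC Dropout softmax outputs, and each pixel's cross-entropy term is hypothetically scaled by (1 + u). It is a minimal illustration of the concept, not the authors' exact formulation.

```python
# Minimal sketch of an uncertainty-weighted cross-entropy loss for semantic
# segmentation. The exact U-CE weighting is not specified in the abstract;
# the (1 + uncertainty) scaling below is a hypothetical choice.
import torch
import torch.nn.functional as F


def mc_dropout_uncertainty(model, image, num_samples=8):
    """Per-pixel predictive entropy from Monte Carlo Dropout (assumed estimator)."""
    model.train()  # keep dropout layers active while sampling
    with torch.no_grad():
        probs = torch.stack(
            [F.softmax(model(image), dim=1) for _ in range(num_samples)]
        ).mean(dim=0)                                               # (B, C, H, W)
    entropy = -(probs * probs.clamp_min(1e-12).log()).sum(dim=1)    # (B, H, W)
    return entropy


def uncertainty_aware_ce(logits, targets, uncertainty, ignore_index=255):
    """Cross-entropy weighted pixel-wise by predictive uncertainty (sketch)."""
    ce = F.cross_entropy(logits, targets,
                         ignore_index=ignore_index, reduction="none")  # (B, H, W)
    weights = 1.0 + uncertainty.detach()      # hypothetical weighting scheme
    mask = (targets != ignore_index).float()  # exclude ignored pixels from the mean
    return (weights * ce * mask).sum() / mask.sum().clamp_min(1.0)
```

In such a scheme, pixels the model is uncertain about contribute more to the loss, which is one plausible way to realize the dynamic, uncertainty-driven training described in the abstract.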