Multimodal Fusion for Deforestation Detection: Integrating Weather and Satellite Alerts with Deep Learning
Keywords: deforestation, multimodal fusion, deep learning, weather data, satellite alerts
Abstract. This paper presents a multimodal deep learning framework for deforestation detection that integrates satellite-based deforestation alerts (DETER) with weather and atmospheric features (WF). We hypothesized that WF could provide complementary signals for short-term deforestation prediction; however, our experiments show that fusion models yield only isolated and modest gains, with no consistent improvement over DETER-only baselines. The WF-only model highlights structurally vulnerable regions but lacks the precision to identify which specific pixels will be deforested, an ability retained by the alert-based models. Our findings confirm the dominant role of satellite alerts in precise deforestation monitoring, with WF signals offering limited added value for operational systems.
