Sat-SINR: High-Resolution Species Distribution Models Through Satellite Imagery
Keywords: species distribution modeling, deep learning, sentinel-2, multi-modal fusion
Abstract. We propose a deep learning approach for high-resolution species distribution modeling (SDM) at large scale, combining point-wise, crowd-sourced species observation data and environmental data with Sentinel-2 satellite imagery. What makes this task challenging is the great variety of factors controlling species distributions, such as habitat conditions, human intervention, competition, disturbances, and evolutionary history. Experts either incorporate these factors into complex mechanistic models based on presence-absence data collected in field campaigns, or train machine learning models to learn the relationship between environmental data and presence-only species occurrence records. We extend the latter approach and learn deep SDMs end-to-end from point-wise, crowd-sourced presence-only data in combination with satellite imagery. Our method, dubbed Sat-SINR, jointly models the spatial distributions of 5.6k plant species across Europe and increases the spatial resolution by a factor of 100 compared to the current state of the art. We exhaustively test and ablate multiple variations of combining geo-referenced point data with satellite imagery and find that our deep learning-based SDM consistently improves performance, by up to 3 percentage points across three metrics. We make all code publicly available at https://github.com/ecovision-uzh/sat-sinr.
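To make the kind of multi-modal fusion described in the abstract concrete, the following is a minimal sketch of a model that combines a point-wise location/environment encoder with a Sentinel-2 patch encoder to predict per-species presence probabilities. All module names, layer sizes, the number of input bands, and the late-fusion design are illustrative assumptions, not the actual Sat-SINR architecture; the real implementation is in the linked repository.

```python
import torch
import torch.nn as nn


class MultiModalSDM(nn.Module):
    """Hypothetical late-fusion sketch (not the Sat-SINR implementation):
    a point-wise location/environment encoder is combined with a small
    Sentinel-2 patch encoder to predict multi-label species presence."""

    def __init__(self, num_env_features: int, num_species: int,
                 embed_dim: int = 256, s2_bands: int = 12):
        super().__init__()
        # Encoder for point-wise inputs (e.g. coordinates + environmental covariates).
        self.point_encoder = nn.Sequential(
            nn.Linear(num_env_features, embed_dim), nn.ReLU(),
            nn.Linear(embed_dim, embed_dim), nn.ReLU(),
        )
        # Small CNN encoder for a multi-band Sentinel-2 patch.
        self.image_encoder = nn.Sequential(
            nn.Conv2d(s2_bands, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, embed_dim), nn.ReLU(),
        )
        # Joint head producing one presence logit per species.
        self.classifier = nn.Linear(2 * embed_dim, num_species)

    def forward(self, env_features: torch.Tensor,
                s2_patch: torch.Tensor) -> torch.Tensor:
        # Concatenate the two modality embeddings (late fusion) and classify.
        z = torch.cat([self.point_encoder(env_features),
                       self.image_encoder(s2_patch)], dim=-1)
        return torch.sigmoid(self.classifier(z))  # per-species presence probabilities


# Toy usage: a batch of 4 observations with 20 environmental features
# and 64x64-pixel, 12-band Sentinel-2 patches.
model = MultiModalSDM(num_env_features=20, num_species=5600)
probs = model(torch.randn(4, 20), torch.randn(4, 12, 64, 64))
print(probs.shape)  # torch.Size([4, 5600])
```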