ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Volume X-3/W4-2025
https://doi.org/10.5194/isprs-annals-X-3-W4-2025-15-2026
13 Mar 2026

Automatic urban tree detection from airborne LiDAR data using a 3D descriptor and intensity

Cleber Junior Alencar, Mauricio Galo, and Renato César dos Santos

Keywords: Urban forests, Point cloud processing, LiDAR, Photogrammetry, Remote sensing

Abstract. Urban trees play an important role in improving city liveability, as they help reduce heat, air pollution, and flood risk while supporting a balanced and sustainable microclimate. Detecting and monitoring urban trees is therefore vital for effective city management and environmental conservation. Traditional remote sensing methods rely on imagery from optical sensors, but these face limitations in capturing the inner structural information of trees. In this context, LiDAR (Light Detection And Ranging) data can be a suitable alternative. Although point-cloud-based approaches directly exploit the three-dimensional (3D) information inherent in raw LiDAR data, the effectiveness of 3D descriptors and intensity values for tree detection remains underexplored, particularly in heterogeneous urban environments with mixed tree compositions. This work introduces an automatic, unsupervised approach for urban tree detection from airborne LiDAR data, combining intensity information with omnivariance, a 3D descriptor calculated from eigenvalues. A two-step K-means clustering method is applied: first to identify potential tree points using intensity, then to detect actual trees using the omnivariance feature. A morphological guided filtering step then reduces misclassification. Tests were carried out on six areas selected from datasets in Brazil and New Zealand, and the evaluation was based on manually labelled reference data. The results show an overall accuracy of 89% and low omission errors (6%), indicating the method's robustness across varied urban scenarios.
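For illustration, the minimal sketch below shows how such a pipeline could look: omnivariance is computed from the eigenvalues of each point's local covariance matrix, and a two-step K-means clustering is applied as described in the abstract. This is not the authors' implementation; the neighbourhood size k, the use of scikit-learn, and the cluster-selection rules (candidate tree points taken from the low-intensity cluster, detected trees from the high-omnivariance cluster) are assumptions made for the sketch.

```python
# Illustrative sketch only, not the authors' code. Assumes numpy, scipy,
# and scikit-learn; neighbourhood size k and cluster-selection rules are
# assumptions.
import numpy as np
from scipy.spatial import cKDTree
from sklearn.cluster import KMeans


def omnivariance(points, k=20):
    """Omnivariance per point: cube root of the product of the eigenvalues
    of the local 3x3 covariance matrix over the k nearest neighbours."""
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k)               # k nearest neighbours
    feats = np.empty(len(points))
    for i, nbrs in enumerate(idx):
        cov = np.cov(points[nbrs].T)               # 3x3 local covariance
        eig = np.clip(np.linalg.eigvalsh(cov), 0.0, None)
        feats[i] = np.cbrt(eig.prod())             # (l1 * l2 * l3) ** (1/3)
    return feats


def detect_tree_points(points, intensity, k=20, seed=0):
    """Two-step K-means: intensity first, then omnivariance on candidates."""
    # Step 1: cluster on intensity; keep the low-intensity cluster as
    # candidate vegetation (assumed rule for this sketch).
    km1 = KMeans(n_clusters=2, n_init=10, random_state=seed)
    km1.fit(intensity.reshape(-1, 1))
    low = np.argmin(km1.cluster_centers_.ravel())
    cand = np.where(km1.labels_ == low)[0]

    # Step 2: cluster the candidates on omnivariance; keep the
    # high-omnivariance cluster (volumetric, tree-like neighbourhoods).
    omni = omnivariance(points[cand], k=k)
    km2 = KMeans(n_clusters=2, n_init=10, random_state=seed)
    km2.fit(omni.reshape(-1, 1))
    high = np.argmax(km2.cluster_centers_.ravel())
    return cand[km2.labels_ == high]               # indices of tree points
```

The morphological guided filtering step that the paper applies afterwards to reduce misclassification is not sketched here, as the abstract does not specify its operators.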
