A FULLY AUTOMATED AND FAST APPROACH FOR CANOPY COVER ESTIMATION USING SUPER HIGH-RESOLUTION REMOTE SENSING IMAGERY
Keywords: plant phenotyping, Unmanned Aerial Vehicle (UAV), machine learning, classification, fractional vegetation cover
Abstract. Canopy cover is a key agronomic variable for understanding plant growth and crop development status. Rapid, accurate, and fully automated estimation of canopy cover is therefore important for high-throughput plant phenotyping. In this work, we propose a simple, robust, and fully automated approach, namely a rule-based method, that leverages the unique spectral pattern of green vegetation in the visible (VIS) and near-infrared (NIR) spectral regions to distinguish green vegetation from the background (i.e., soil, plant residue, non-photosynthetic leaves, etc.), and then derives canopy cover. The proposed method was applied to high-resolution hyperspectral and multispectral imagery collected from gantry-based scanner and Unmanned Aerial Vehicle (UAV) platforms to estimate canopy cover. Additionally, machine learning methods, i.e., Support Vector Machine (SVM) and Random Forest (RF), were employed as benchmarks. The results show that the rule-based method achieved classification accuracies comparable to SVM and RF on both the hyperspectral and multispectral datasets. Although the rule-based method is more sensitive to mixed pixels and shaded canopy regions, which led to classification errors and underestimation of canopy cover in some cases, it detected small leaves better than SVM and RF. Most importantly, the rule-based method substantially outperformed the machine learning methods in processing speed, indicating its greater potential for high-throughput plant phenotyping applications.
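To make the idea of a rule-based canopy cover estimator concrete, the sketch below classifies each pixel as green vegetation using simple per-pixel spectral rules and reports the vegetation fraction. The specific rules and thresholds here (an NDVI cutoff plus a green-greater-than-red check) are illustrative assumptions, not the paper's actual decision rules.

```python
import numpy as np

def canopy_cover(green, red, nir, ndvi_thresh=0.4):
    """Estimate canopy cover as the fraction of pixels classified as
    green vegetation by per-pixel spectral rules.

    green, red, nir: 2-D reflectance arrays of the same shape.
    The rules are illustrative, not the paper's exact thresholds:
      1. NDVI above a threshold (vegetation: bright in NIR, dark in red)
      2. green reflectance exceeds red (green peak of healthy vegetation)
    """
    ndvi = (nir - red) / (nir + red + 1e-9)      # small epsilon avoids /0
    vegetation = (ndvi > ndvi_thresh) & (green > red)
    return vegetation.mean()                      # vegetation pixel fraction
```

Because the classification is a handful of vectorized array operations per pixel, with no training step, this style of rule runs much faster than fitting and applying an SVM or RF, which is the speed advantage the abstract highlights.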