ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Articles | Volume X-4/W3-2022
https://doi.org/10.5194/isprs-annals-X-4-W3-2022-181-2022
14 Oct 2022

ROBUST AND SCALABLE REAL-TIME VEHICLE CLASSIFICATION AND TRACKING: A CASE STUDY OF THAILAND

B. Neupane, T. Horanont, P. Pattarapongsin, and A. Thapa

Keywords: Vehicle Classification, Multi-vehicle Tracking, Intelligent Transport Systems, Spatial Information, YOLOv5

Abstract. Accurate detection, classification, and tracking of vehicles are highly important for intelligent transport systems (ITS) and road maintenance. In recent years, deep learning (DL)-based approaches have been highly regarded for real-time vehicle classification from surveillance cameras. However, the practical implementation of such approaches is affected by adverse lighting conditions and the positioning of the cameras. In this research, we develop a DL-based method for near real-time multi-vehicle counting, classification, and tracking on individual lanes of the road. First, we train a DL network of the You Only Look Once (YOLO) family on a custom dataset that we have curated. The dataset consists of nearly 30,000 training samples for classifying vehicles into seven classes, more than in the existing benchmark datasets. Second, we fine-tune the trained model on another small dataset collected from the surveillance cameras used during implementation. Third, we connect the trained model to a tracking algorithm that we have developed to produce a per-lane report with calculations of vehicle speed and mobility. We test the robustness of the system on different faces of the vehicles and under adverse lighting conditions. The overall accuracy (OA) of classification ranges from 91% to 99% across the four faces of vehicles (back, front, driver side, and passenger side). Similarly, in an experiment on adverse lighting conditions, OAs of 93.7% and 99.6% are observed under noisy and clear lighting conditions, respectively. These results will assist road maintenance through spatial information management and sensing for intelligent transport planning.
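The abstract describes connecting a detector to a tracking algorithm that assigns vehicles to lanes and estimates their speed. The paper's actual tracker and camera calibration are not given here, so the sketch below is only a hypothetical illustration of the idea: a minimal centroid-based tracker that matches per-frame detections by nearest distance, bins them into lanes by image x-coordinate, and converts per-frame pixel displacement into km/h using an assumed metres-per-pixel scale and frame rate. All class and parameter names are invented for this sketch.

```python
from math import hypot

class LaneTracker:
    """Minimal centroid tracker sketch (hypothetical, not the paper's method).

    Matches detections frame-to-frame by nearest centroid, assigns each
    track to a lane by x-coordinate, and estimates speed from the pixel
    displacement between consecutive frames.
    """

    def __init__(self, lane_edges, metres_per_pixel, fps, max_dist=50.0):
        self.lane_edges = lane_edges   # x-boundaries between lanes, in pixels
        self.mpp = metres_per_pixel    # assumed flat-ground camera calibration
        self.fps = fps                 # video frame rate
        self.max_dist = max_dist       # max matching distance in pixels
        self.tracks = {}               # track id -> last centroid (x, y)
        self.speeds = {}               # track id -> last speed estimate (km/h)
        self.next_id = 0

    def lane_of(self, x):
        # Lane index is the number of lane edges to the left of x.
        for i, edge in enumerate(self.lane_edges):
            if x < edge:
                return i
        return len(self.lane_edges)

    def update(self, centroids):
        """Process one frame of detection centroids; return {track id: lane}."""
        assigned = {}
        unmatched = list(self.tracks.items())
        for cx, cy in centroids:
            best, best_d = None, self.max_dist
            for tid, (px, py) in unmatched:
                d = hypot(cx - px, cy - py)
                if d < best_d:
                    best, best_d = tid, d
            if best is None:
                # No track close enough: start a new one.
                best = self.next_id
                self.next_id += 1
            else:
                unmatched = [(t, p) for t, p in unmatched if t != best]
                # pixels/frame -> metres/second -> km/h
                self.speeds[best] = best_d * self.mpp * self.fps * 3.6
            self.tracks[best] = (cx, cy)
            assigned[best] = self.lane_of(cx)
        return assigned
```

For example, with two lanes split at x = 320 px, a scale of 0.05 m/px, and 25 fps, a vehicle whose centroid moves 10 px between frames is estimated at 10 × 0.05 × 25 × 3.6 = 45 km/h. A production system would instead use a learned or motion-model tracker (e.g. Kalman-filter based) and a proper homography for speed, but the per-lane bookkeeping follows the same pattern.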