ADAPTABLE AUTOREGRESSIVE MOVING AVERAGE FILTER TRIGGERING CONVOLUTIONAL NEURAL NETWORKS FOR CHOREOGRAPHIC MODELING
Keywords: Deep learning, Dynamic Scene Analysis, Intangible Cultural Heritage, Choreographic Modeling
Abstract. Choreographic modeling, i.e., the identification of key choreographic primitives, is a significant element of Intangible Cultural Heritage (ICH) performing-art modeling. Recently, deep learning architectures, such as LSTMs and CNNs, have been utilized for choreographic identification and modeling. However, such approaches are sensitive to capturing errors and fail to model the dynamic characteristics of a dance, since they assume stationarity between the input and output data. To address these limitations, in this paper we introduce an AutoRegressive Moving Average (ARMA) filter into a conventional CNN model, so that the classification output is fed back to the input layer. In addition, an adaptive implementation algorithm is introduced, exploiting a first-order Taylor series expansion, to update the network response so that it fits the dynamic characteristics of the dance. In this way, the network parameters (e.g., the weights) are dynamically modified, improving overall classification accuracy. Experimental results on real-life dance sequences indicate that the proposed approach outperforms conventional deep learning mechanisms.
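To make the ARMA-feedback idea concrete, the following is a minimal PyTorch sketch (not the authors' implementation) in which the classifier's previous output is fed back and fused with the current input frame, so that the prediction at time t depends on both the current input and the previous output. The layer sizes, the pose feature dimensionality, and the fusion scheme are illustrative assumptions.

```python
import torch
import torch.nn as nn

class ARMAFeedbackCNN(nn.Module):
    """Hypothetical CNN classifier with ARMA-style feedback of the previous posterior."""
    def __init__(self, in_channels=1, num_classes=8, feat_dim=64):
        super().__init__()
        # 1-D CNN over per-frame pose features (assumed input: batch x channels x joints)
        self.encoder = nn.Sequential(
            nn.Conv1d(in_channels, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv1d(32, feat_dim, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        # Fuse the CNN feature with the previous class posterior (the feedback term)
        self.classifier = nn.Linear(feat_dim + num_classes, num_classes)
        self.num_classes = num_classes

    def forward(self, frames):
        # frames: (batch, time, channels, joints)
        batch, time = frames.shape[:2]
        y_prev = torch.zeros(batch, self.num_classes, device=frames.device)
        outputs = []
        for t in range(time):
            feat = self.encoder(frames[:, t]).squeeze(-1)           # (batch, feat_dim)
            logits = self.classifier(torch.cat([feat, y_prev], 1))  # y_{t-1} re-enters the input
            y_prev = logits.softmax(dim=1).detach()                 # classification output fed back
            outputs.append(logits)
        return torch.stack(outputs, dim=1)                          # (batch, time, num_classes)

# Example usage with random data: 4 sequences, 10 frames, 1 channel, 25 joints.
model = ARMAFeedbackCNN()
scores = model(torch.randn(4, 10, 1, 25))
print(scores.shape)  # torch.Size([4, 10, 8])
```

The Taylor-series-based adaptation of the network weights described above is not shown here, as it depends on details given in the body of the paper.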