ADAPTIVE CONVOLUTIONALLY ENHANCED BI-DIRECTIONAL LSTM NETWORKS FOR CHOREOGRAPHIC MODELING
Nikolaos Bakalos, Ioannis Rallis, Nikolaos Doulamis, Anastasios Doulamis, Athanasios Voulodimos, Eftychios Protopapadakis
In this paper, we present a deep learning scheme for the classification of choreographic primitives from RGB images. The proposed framework combines the representational power of feature maps extracted by Convolutional Neural Networks with the long-term dependency modeling capabilities of Long Short-Term Memory recurrent neural networks. In addition, it incorporates an AutoRegressive Moving Average (ARMA) filter into the convolutionally enriched LSTM to capture the dynamic characteristics of dance. Finally, an adaptive weight updating strategy is introduced to improve classification performance. The framework is applied to the recognition of dance primitives (basic dance postures) and is experimentally validated on real-world sequences of traditional Greek folk dances.
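The sketch below illustrates the general CNN-to-bidirectional-LSTM pipeline the abstract describes, assuming a PyTorch implementation; the layer sizes, class count, and frame resolution are illustrative rather than the authors' configuration, and the ARMA filtering and adaptive weight updating components are not reproduced here.

```python
# Minimal sketch (assumed PyTorch): per-frame CNN features fed to a
# bidirectional LSTM classifier. Hyperparameters are hypothetical.
import torch
import torch.nn as nn

class ConvBiLSTMClassifier(nn.Module):
    def __init__(self, num_classes=10, hidden_size=128):
        super().__init__()
        # Per-frame convolutional feature extractor for RGB input.
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),          # -> (B*T, 32, 1, 1)
        )
        # Bidirectional LSTM models temporal dependencies across frames.
        self.lstm = nn.LSTM(input_size=32, hidden_size=hidden_size,
                            batch_first=True, bidirectional=True)
        self.fc = nn.Linear(2 * hidden_size, num_classes)

    def forward(self, x):
        # x: (batch, time, channels, height, width)
        b, t, c, h, w = x.shape
        feats = self.cnn(x.view(b * t, c, h, w)).view(b, t, -1)
        out, _ = self.lstm(feats)             # (b, t, 2 * hidden_size)
        return self.fc(out[:, -1])            # classify from the last step

# Example: a batch of 2 clips, 16 frames each, 64x64 RGB.
logits = ConvBiLSTMClassifier()(torch.randn(2, 16, 3, 64, 64))
print(logits.shape)  # torch.Size([2, 10])
```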