Detecting a Child's Stimming Behaviours for Autism Spectrum Disorder Diagnosis Using RGBPose-SlowFast Network
Jeba Berlin S, Deepak Pandian, Shyam Sundar Rajagopalan, Dinesh Babu Jayagopi
Network pruning enables the use of deep neural networks in low-resource environments by removing redundant elements from a pre-trained network. To appraise a pruning method, two evaluation metrics are generally adopted: final accuracy and accuracy drop. Final accuracy is the ultimate performance of the pruned sub-network after pruning completes. Accuracy drop, the more traditional measure, is the accuracy difference between the baseline model and the final pruned model. In this work, we present several surprising observations that reveal the unfairness of both metrics when assessing the efficacy of pruning approaches. Depending on the choice of baseline network, the apparent merit of a pruning method can change completely: a lower baseline tends to favour the accuracy drop, whereas a higher baseline usually yields a higher final accuracy. To reduce this undesirable dependency on the baseline network, we propose a new, more reliable averaging method, Average from Scratches, which uses multiple distinct baselines rather than a single one. Our investigations point to the need for a more thorough analysis of network pruning metrics.
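To make the two metrics and the multi-baseline idea concrete, here is a minimal sketch, not the authors' code: the abstract gives no formula for Average from Scratches, so the simple mean over independently trained baselines below, along with all function names and the accuracy figures, is an illustrative assumption.

```python
# Illustrative sketch (not the paper's implementation) of the two
# standard pruning metrics and a multi-baseline average in the spirit
# of the proposed "Average from Scratches". All names and numbers here
# are assumptions for illustration.

from statistics import mean


def final_accuracy(pruned_acc: float) -> float:
    """Final accuracy: the pruned sub-network's accuracy as-is."""
    return pruned_acc


def accuracy_drop(baseline_acc: float, pruned_acc: float) -> float:
    """Accuracy drop: baseline accuracy minus pruned accuracy."""
    return baseline_acc - pruned_acc


def average_from_scratches(pruned_accs: list[float]) -> float:
    """Assumed reading of the method: average the final accuracies of
    sub-networks pruned from multiple independently trained ("from
    scratch") baselines, reducing dependence on any single baseline."""
    return mean(pruned_accs)


# Example: the same pruning method applied to three distinct baselines.
baselines = [76.1, 76.8, 77.3]  # baseline top-1 accuracies (%)
pruned = [75.4, 75.9, 76.2]     # corresponding pruned accuracies (%)

for b, p in zip(baselines, pruned):
    print(f"final acc = {final_accuracy(p):.1f}%, "
          f"drop = {accuracy_drop(b, p):.1f}%")
print(f"Average from Scratches: {average_from_scratches(pruned):.1f}%")
```

Note how the example reflects the abstract's observation: a method starting from the weakest baseline (76.1%) shows the smallest accuracy drop, while the strongest baseline (77.3%) yields the highest final accuracy, so either single-baseline metric can flatter a method; averaging over the baselines sidesteps that dependency.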