CYEDA: Cycle-Object Edge Consistency Domain Adaptation
Jing Chong Beh, Kam Woh Ng, Jie Long Kew, Che-Tsung Lin, Chee Seng Chan, Shang-Hong Lai, Christopher Zach
SPS
Length: 00:12:27
Conventional network pruning methods require multiple stages to identify and train a single compact pruned model. This approach incurs high computational overhead, and obtaining several pruned models requires repeating the entire pipeline, further increasing the total cost. In this work, we present a single-stage Random Pruning Online Distillation (RAPID) framework that identifies and trains multiple pruned models at once. We randomly prune the original network from scratch, then leverage distillation in the later training epochs to transfer information online from the original network (teacher) to the multiple pruned models (students). Extensive experiments on several datasets and architectures demonstrate the effectiveness of the RAPID framework compared with other pruning methods.
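The two ingredients the abstract describes, random pruning at initialization and online teacher-to-student distillation, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names (`random_prune_mask`, `distillation_loss`), the mask representation, and the temperature value are all assumptions, and the distillation loss shown is the standard softened-softmax KL divergence rather than whatever loss RAPID actually uses.

```python
import math
import random

def random_prune_mask(num_weights, sparsity, seed=0):
    # Randomly zero out a fraction of weights at initialization
    # ("pruned from scratch"); each student gets a different seed.
    rng = random.Random(seed)
    n_pruned = int(num_weights * sparsity)
    mask = [1.0] * num_weights
    for i in rng.sample(range(num_weights), n_pruned):
        mask[i] = 0.0
    return mask

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; higher T yields softer distributions.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    # Standard knowledge-distillation loss: KL(teacher || student) on
    # softened distributions, scaled by T^2 (assumed, not RAPID-specific).
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl
```

In a single-stage setup, one would apply a differently seeded mask per student, train teacher and students jointly, and only enable `distillation_loss` after some epoch threshold, matching the "later training epochs" schedule mentioned above.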