IEEE Resource Center
12 May 2022
Compressing Transformer-based ASR Model by Task-driven Loss and Attention-based Multi-level Feature Distillation
SPS Members: Free
IEEE Members: $11.00
Non-members: $15.00