Abstract
Knowledge Distillation (KD) aims to distill knowledge from a large teacher model into a lightweight student model. Mainstream KD methods can be......
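For context, a minimal sketch of the classic logit-based KD objective (Hinton et al., 2015) that mainstream methods build on; the temperature `T` and weight `alpha` are generic hyperparameters chosen for illustration, not specifics from this paper:

```python
# Illustrative sketch of the classic logit-based KD loss (Hinton et al., 2015).
# T and alpha are assumed hyperparameters, not values from this paper.
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, targets, T=4.0, alpha=0.5):
    # Soften both output distributions with temperature T.
    log_p_student = F.log_softmax(student_logits / T, dim=-1)
    p_teacher = F.softmax(teacher_logits / T, dim=-1)
    # KL divergence between the softened teacher and student outputs;
    # the T**2 factor keeps gradient magnitudes comparable across temperatures.
    soft_loss = F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (T ** 2)
    # Standard cross-entropy against the ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, targets)
    # The student is trained on a weighted sum of the two terms.
    return alpha * soft_loss + (1 - alpha) * hard_loss
```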