- Pruning Neural Networks: Two Recent Papers (inFERENCe)
- The Lottery Ticket Hypothesis - Paper Recommendation (inFERENCe)
- Model Compression (Medium blog); magnitude-pruning sketch below
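
All three links above center on the same basic operation: zero out the low-magnitude weights and fine-tune what remains. A minimal sketch of one-shot global magnitude pruning in PyTorch (the toy model, layer types, and 90% sparsity target are illustrative assumptions, not taken from the posts):

```python
import torch
import torch.nn as nn

def magnitude_prune(model: nn.Module, sparsity: float = 0.9):
    """Zero out the smallest-magnitude weights across Linear/Conv layers, globally."""
    # Pool all weight magnitudes to pick a single global threshold.
    all_weights = torch.cat([
        m.weight.detach().abs().flatten()
        for m in model.modules()
        if isinstance(m, (nn.Linear, nn.Conv2d))
    ])
    threshold = torch.quantile(all_weights, sparsity)

    masks = {}
    for name, m in model.named_modules():
        if isinstance(m, (nn.Linear, nn.Conv2d)):
            mask = (m.weight.detach().abs() > threshold).float()
            m.weight.data.mul_(mask)  # apply the mask in place
            masks[name] = mask        # keep masks to re-apply after fine-tuning steps
    return masks

# Hypothetical toy model, just to show the call.
model = nn.Sequential(nn.Linear(784, 300), nn.ReLU(), nn.Linear(300, 10))
masks = magnitude_prune(model, sparsity=0.9)
```

The lottery ticket papers add one twist on top of this: after pruning, reset the surviving weights to their original initialization and retrain, rather than fine-tuning the pruned values.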
-
Knowledge Distillation:
Distilling the Knowledge in a Neural Network (Hinton et al.):
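
Core idea of the paper: train a small student to match the teacher's temperature-softened class probabilities in addition to the usual hard labels. A minimal sketch of that loss (the temperature T=4 and mixing weight alpha=0.5 are illustrative assumptions; the paper treats both as hyperparameters to tune):

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Soft targets: KL divergence between temperature-softened teacher and student
    # distributions. The T^2 factor keeps soft-target gradients on a comparable
    # scale to the hard-label gradients as T varies (as noted in the paper).
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: standard cross-entropy against the true labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Dummy batch, just to show the expected shapes.
student_logits = torch.randn(32, 10)
teacher_logits = torch.randn(32, 10)
labels = torch.randint(0, 10, (32,))
loss = distillation_loss(student_logits, teacher_logits, labels)
```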
-
Asynchronous: