Submitted by fredlafrite t3_106no9h in MachineLearning
nmfisher t1_j3l4ipq wrote
Reply to comment by suflaj in [D] Have you ever used Knowledge Distillation in practice? by fredlafrite
Echoing this, KD is also very useful for taking a heavyweight GPU model and training a student model that's light enough to run on mobile. Small sacrifice in quality for huge performance gains.
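For anyone unfamiliar, the usual KD setup trains the student against the teacher's temperature-softened output distribution. A minimal sketch of that soft-target loss (in the style of Hinton et al.'s distillation term; function names here are illustrative, not from any specific library):

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: higher T produces softer targets.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # Cross-entropy between the teacher's soft targets and the
    # student's temperature-scaled predictions.
    teacher_probs = softmax(teacher_logits, temperature)
    student_probs = softmax(student_logits, temperature)
    return -sum(t * math.log(s) for t, s in zip(teacher_probs, student_probs))
```

In practice this term is combined with the ordinary hard-label cross-entropy, and the loss is lowest when the student's distribution matches the teacher's.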
fredlafrite OP t1_j3l65ju wrote
Interesting! Following up on this, do you know what kinds of companies work on this in an applied setting?
Think_Olive_1000 t1_j3qynuz wrote
Neural Magic does work in this space, though I'm not sure about KD specifically.