Supplementary Material for Effective relationship between characteristics of training data and learning progress on knowledge distillation
In image recognition, knowledge distillation is a valuable approach for training a compact model to high accuracy by exploiting the outputs of a highly accurate large model as target labels. In knowledge distillation, studies have shown the usefulness of high-entropy outputs generated by image-mix data augmentation techniques. Other strategies, such as curriculum learning, have also been proposed to improve model generalization by controlling the difficulty of the training data over the course of learning.
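As a rough illustration of the setup the abstract describes, the sketch below shows a standard distillation loss (a temperature-softened KL term against the teacher plus a hard-label cross-entropy term) alongside a mixup-style image-mix helper that tends to raise the entropy of the teacher's outputs. The function names, the temperature `T`, and the weight `alpha` are illustrative assumptions, not the paper's exact configuration.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, targets, T=4.0, alpha=0.9):
    """Knowledge-distillation loss: weighted sum of the soft-label KL term
    and the hard-label cross-entropy term. T and alpha are illustrative."""
    # Teacher distribution softened by temperature T (higher T -> higher entropy).
    soft_teacher = F.softmax(teacher_logits / T, dim=1)
    log_soft_student = F.log_softmax(student_logits / T, dim=1)
    # KL divergence between softened distributions; the T^2 factor keeps
    # gradient magnitudes comparable across temperatures.
    kd_term = F.kl_div(log_soft_student, soft_teacher, reduction="batchmean") * (T * T)
    ce_term = F.cross_entropy(student_logits, targets)
    return alpha * kd_term + (1.0 - alpha) * ce_term

def mixup_batch(x, y, beta=1.0):
    """Image-mix augmentation (mixup-style, a hypothetical helper): convex
    combination of shuffled pairs; returns both label sets and the mix ratio."""
    lam = torch.distributions.Beta(beta, beta).sample().item()
    idx = torch.randperm(x.size(0))
    return lam * x + (1.0 - lam) * x[idx], y, y[idx], lam
```

In a training step, one would mix a batch, query the frozen teacher on the mixed images, and apply `distillation_loss` to each of the two label sets weighted by the mix ratio; how such high-entropy samples interact with learning progress is precisely the relationship the paper studies.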