In image recognition, knowledge distillation is a valuable approach for training a compact model to high accuracy by using the outputs of a large, highly accurate model as soft target labels. Studies have shown that training data with high-entropy teacher outputs, generated by image-mixing data augmentation techniques such as Mixup, are useful for knowledge distillation. Other strategies, such as curriculum learning, have also been proposed to improve model generalization by controlling the difficulty of the training data over the course of learning.
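
As a minimal illustration (not the paper's implementation), the following PyTorch sketch combines the two ingredients described above: a standard temperature-scaled distillation loss and Mixup-style image mixing, whose blended inputs typically draw high-entropy outputs from the teacher. The names and hyperparameters here (`temperature`, `alpha`, `teacher`, `student`) are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    """KL divergence between temperature-softened teacher and student outputs."""
    log_p_student = F.log_softmax(student_logits / temperature, dim=1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=1)
    # Scaling by T^2 keeps gradient magnitudes comparable across temperatures.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * temperature ** 2

def mixup_images(images, alpha=1.0):
    """Mixup: convex combination of each image with a shuffled partner.

    Mixed inputs tend to elicit high-entropy (soft) outputs from the teacher,
    which is the property the abstract refers to.
    """
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    perm = torch.randperm(images.size(0))
    return lam * images + (1.0 - lam) * images[perm]

def training_step(teacher, student, images):
    """One distillation step on mixed images; `teacher`/`student` are
    hypothetical classification models returning logits."""
    mixed = mixup_images(images)
    with torch.no_grad():
        teacher_logits = teacher(mixed)  # soft targets, no gradient to teacher
    student_logits = student(mixed)
    return distillation_loss(student_logits, teacher_logits)
```

In this sketch the student is trained only against the teacher's softened distribution; in practice this term is often combined with a standard cross-entropy loss on the ground-truth labels.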
