1 of 7

PDF-Distil: Including Prediction Disagreements in Feature-based Distillation for Object Detection

Heng ZHANG, Elisa FROMONT, Sébastien LEFEVRE, Bruno AVIGNON

IRISA Laboratory, ATERMES Company

{heng.zhang, elisa.fromont, sebastien.lefevre}@irisa.fr bavignon@atermes.fr

2 of 7

Introduction to knowledge distillation

Knowledge distillation is a practical solution for deep model compression.

The idea is to transfer the knowledge learned by an accurate but cumbersome model (the teacher) to a compact model (the student).

Logits-based and feature-based distillation are the two major knowledge transfer strategies in the literature.
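As a minimal illustration of the two strategies (a sketch in PyTorch, not tied to any specific model; the tensor names and the temperature value are assumptions for the example):

import torch
import torch.nn.functional as F

def logits_distillation_loss(student_logits, teacher_logits, temperature=4.0):
    """Logits-based transfer: match the teacher's softened class distribution."""
    soft_targets = F.softmax(teacher_logits / temperature, dim=-1)
    log_probs = F.log_softmax(student_logits / temperature, dim=-1)
    # KL divergence, rescaled by T^2 to keep gradient magnitudes comparable
    return F.kl_div(log_probs, soft_targets, reduction="batchmean") * temperature ** 2

def feature_distillation_loss(student_feat, teacher_feat):
    """Feature-based transfer: make intermediate feature maps mimic the teacher's."""
    return F.mse_loss(student_feat, teacher_feat)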

3 of 7

Knowledge distillation on object detection

The foreground-background imbalance inherent in object detection greatly reduces the efficiency of knowledge transfer in feature-based distillation.

Previous works assigned distillation weights according to the foreground-background distinction or to the feature-mimicking uncertainty, thereby discarding the initial motivation of knowledge distillation: minimizing the prediction difference between the teacher and the student models.
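A minimal sketch of one such foreground-masked weighting scheme, assuming hypothetical (N, C, H, W) feature maps and a binary mask over ground-truth boxes; actual prior methods differ in how the mask and weights are constructed:

import torch
import torch.nn.functional as F

def masked_feature_distillation(student_feat, teacher_feat, fg_mask):
    """
    student_feat, teacher_feat: (N, C, H, W) feature maps.
    fg_mask: (N, 1, H, W) binary mask, 1 inside ground-truth boxes.
    """
    per_pixel = F.mse_loss(student_feat, teacher_feat,
                           reduction="none").mean(dim=1, keepdim=True)
    # Distill only foreground pixels to counter the imbalance
    return (per_pixel * fg_mask).sum() / fg_mask.sum().clamp(min=1)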

4 of 7

Our proposed method: PDF-Distil

The main contribution of PDF-Distil is the addition of a prediction-disagreement-aware feedback branch to a conventional feature-based detection distillation framework.
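The sketch below illustrates the idea only (it is not the authors' exact formulation): the pixel-wise disagreement between the teacher's and student's classification outputs is fed back as a weight on the feature-mimicking loss, so the weights evolve as the student improves. The head outputs and their shapes are assumptions:

import torch
import torch.nn.functional as F

def disagreement_weighted_distillation(student_feat, teacher_feat,
                                       student_cls, teacher_cls):
    """
    student_feat, teacher_feat: (N, C, H, W) backbone/FPN feature maps.
    student_cls, teacher_cls:   (N, A, H, W) raw classification logits from
                                the detection heads, spatially aligned with
                                the feature maps.
    """
    # Pixel-wise prediction disagreement, detached so it only acts as a weight
    with torch.no_grad():
        weight = (teacher_cls.sigmoid()
                  - student_cls.sigmoid()).abs().mean(dim=1, keepdim=True)
    per_pixel = F.mse_loss(student_feat, teacher_feat,
                           reduction="none").mean(dim=1, keepdim=True)
    # Focus feature mimicking where the two models' predictions diverge most
    return (per_pixel * weight).sum() / weight.sum().clamp(min=1e-6)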

5 of 7

Visualization of sampling strategy

Our method adaptively locates areas that are challenging for the student detector, such as unknown objects, reflections in water, object junctions and ambiguous objects.
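One hypothetical way to produce such a visualization from the disagreement weight sketched above, assuming matplotlib and the same (logit-valued) head outputs:

import matplotlib.pyplot as plt
import torch
import torch.nn.functional as F

def show_disagreement(image, student_cls, teacher_cls):
    """image: (3, H, W) tensor in [0, 1]; *_cls: (1, A, h, w) head logits."""
    with torch.no_grad():
        diff = (teacher_cls.sigmoid()
                - student_cls.sigmoid()).abs().mean(dim=1, keepdim=True)
        # Upsample the low-resolution disagreement map to image size
        heat = F.interpolate(diff, size=image.shape[1:],
                             mode="bilinear", align_corners=False)
    plt.imshow(image.permute(1, 2, 0).cpu().numpy())
    plt.imshow(heat[0, 0].cpu().numpy(), alpha=0.5, cmap="jet")
    plt.axis("off")
    plt.show()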

6 of 7

Experimental results

Comparisons with SOTA detection distillation methods on MS COCO

7 of 7

Code & trained models

Thanks for your attention

Heng ZHANG, Elisa FROMONT, Sébastien LEFEVRE, Bruno AVIGNON

IRISA Laboratory, ATERMES Company

{heng.zhang, elisa.fromont, sebastien.lefevre}@irisa.fr bavignon@atermes.fr