1 of 26

Source-free Domain Adaptation

Xin Zhao

CVML Lab@SMU Group Seminar

25 February 2022

2 of 26

Unsupervised Domain Adaptation (UDA)

Transfer the knowledge from the labeled source domain to an unlabeled target domain to learn an accurate model for target data.

Domain Shift

  • Main methods for UDA
      • Adversarial Training
      • Discrepancy-based methods

3 of 26

Unsupervised Domain Adaptation (UDA)

Conventional UDA methods normally assume that the source data is available during training.

  • Limitations
      • Source datasets such as videos or high-resolution images may be so large that it is impractical to transfer or store them on different platforms.
      • Many datasets are withheld due to privacy concerns.

4 of 26

Source-free Domain Adaptation (SFDA)

  • Definition
      • Adapt a source-trained model to the target domain without accessing the source data.

5 of 26

Two Papers

  • Source Hypothesis Transfer (SHOT) [ICML 2020]
  • Historical Contrastive Learning (HCL) [NeurIPS 2021]

6 of 26

SHOT: Source Model Generation

  • Classification loss:
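The slide's equation did not survive extraction. In the SHOT paper the source model is trained with a label-smoothed cross-entropy; below is a minimal numpy sketch of that standard formulation (the smoothing factor `alpha=0.1` is an assumed default):

```python
import numpy as np

def smoothed_cross_entropy(logits, labels, alpha=0.1):
    """Label-smoothed cross-entropy for source-model training.

    Hard one-hot targets are mixed with a uniform distribution:
    q_k = (1 - alpha) * 1[k == y] + alpha / K.
    """
    num_classes = logits.shape[1]
    # log-softmax with max subtraction for numerical stability
    z = logits - logits.max(axis=1, keepdims=True)
    log_p = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    one_hot = np.eye(num_classes)[labels]
    q = (1.0 - alpha) * one_hot + alpha / num_classes  # smoothed targets
    return float(-(q * log_p).sum(axis=1).mean())
```

Smoothing keeps the source classifier's outputs from saturating, which makes the frozen hypothesis easier to reuse on the target domain.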

7 of 26

SHOT: Motivation

  • What can we learn from the source model?

8 of 26

SHOT: Source Hypothesis Transfer

  • Framework

9 of 26

SHOT: Source Hypothesis Transfer

  • Information Maximization (IM)

10 of 26

SHOT: Source Hypothesis Transfer

  • Information Maximization (IM)
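The IM objective combines an entropy term (each prediction should be confident) with a diversity term (the batch-level marginal prediction should stay near uniform, preventing collapse to one class). A numpy sketch of this standard formulation:

```python
import numpy as np

def information_maximization_loss(probs, eps=1e-5):
    """IM loss: per-sample entropy (confidence) plus the negative
    entropy of the batch-mean prediction (diversity)."""
    # entropy term: push each prediction towards a confident one-hot
    ent = -(probs * np.log(probs + eps)).sum(axis=1).mean()
    # diversity term: keep the marginal prediction close to uniform
    mean_p = probs.mean(axis=0)
    div = (mean_p * np.log(mean_p + eps)).sum()
    return float(ent + div)
```

A batch of confident predictions spread evenly across classes minimizes the loss, while a batch collapsed onto a single class is penalized by the diversity term.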

11 of 26

SHOT: Source Hypothesis Transfer

  • Self-supervised Pseudo-labeling
      • Exploit target-specific centroids to obtain accurate pseudo labels.
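The centroid step can be sketched as follows: class centroids are first computed from the network's soft predictions, then each sample is reassigned to its nearest centroid (cosine distance) and the centroids are refined from the hard assignments. Details such as the number of refinement rounds are assumptions here:

```python
import numpy as np

def centroid_pseudo_labels(feats, probs, n_iter=2, eps=1e-8):
    """Centroid-based pseudo-labels on target features.

    The first pass weights centroids by the network's soft predictions;
    later passes use the hard nearest-centroid assignments.
    """
    # L2-normalize features so the dot product is cosine similarity
    f = feats / (np.linalg.norm(feats, axis=1, keepdims=True) + eps)
    num_classes = probs.shape[1]
    weights = probs
    for _ in range(n_iter):
        centroids = (weights.T @ f) / (weights.sum(axis=0)[:, None] + eps)
        c = centroids / (np.linalg.norm(centroids, axis=1, keepdims=True) + eps)
        labels = (f @ c.T).argmax(axis=1)      # nearest centroid
        weights = np.eye(num_classes)[labels]  # hard assignments for refinement
    return labels
```

Because centroids pool evidence over the whole target set, a sample whose own prediction is noisy can still be pulled to the correct cluster.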

12 of 26

SHOT: Source Hypothesis Transfer

  • Complete objective
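Putting the pieces together, SHOT's full objective is the IM loss plus a weighted cross-entropy on the pseudo labels; the trade-off weight `beta=0.3` below is an assumed default, not necessarily the paper's exact value:

```python
import numpy as np

def shot_objective(probs, pseudo_labels, beta=0.3, eps=1e-5):
    """IM loss plus beta-weighted cross-entropy on pseudo labels."""
    ent = -(probs * np.log(probs + eps)).sum(axis=1).mean()
    mean_p = probs.mean(axis=0)
    div = (mean_p * np.log(mean_p + eps)).sum()
    n = probs.shape[0]
    # cross-entropy against the (fixed) pseudo labels
    pl_ce = -np.log(probs[np.arange(n), pseudo_labels] + eps).mean()
    return float(ent + div + beta * pl_ce)
```

Only the feature extractor is updated with this objective; the source classifier (hypothesis) stays frozen.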

13 of 26

SHOT: Classification Results

  • Office-Home
  • Digits
  • Observations
      • SHOT and SHOT-IM outperform the compared UDA methods.
      • The proposed self-supervised pseudo-labeling method is very helpful.

14 of 26

SHOT: Analysis

  • Ablation Study
  • Observations
      • The diversity term is very important.

15 of 26

SHOT: Analysis

  • Ablation Study
  • Observations
      • Self-supervised PL is more helpful than naïve PL.
      • The diversity term is very important.

16 of 26

Two Papers

  • Source Hypothesis Transfer (SHOT) [ICML 2020]
  • Historical Contrastive Learning (HCL) [NeurIPS 2021]

17 of 26

HCL

  • Main idea: learn instance-discriminative and category-discriminative representations without forgetting the source-domain hypothesis.
  • Two main components:
      • Historical Contrastive Instance Discrimination (HCID)
      • Historical Contrastive Category Discrimination (HCCD)

18 of 26

HCL: Historical Contrastive Instance Discrimination (HCID)

  • Main idea: learn from unlabeled target samples via contrastive learning over embeddings generated by the current model (as queries) and historical models (as keys): positive pairs are pulled close while negative pairs are pushed apart.
  • Compared with InfoNCE:
      • Reliability term.
      • Key generation.
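A minimal sketch of the historical-contrastive idea: queries come from the current model, keys from a frozen historical model, and each positive pair's InfoNCE term is scaled by a reliability weight. How reliability is computed is abstracted away here (it is passed in as a score), which is an assumption rather than the paper's exact formulation:

```python
import numpy as np

def hcid_loss(queries, keys, pos_idx, reliability, tau=0.07, eps=1e-8):
    """InfoNCE between current-model queries and historical-model keys,
    with each positive pair scaled by a reliability weight."""
    q = queries / (np.linalg.norm(queries, axis=1, keepdims=True) + eps)
    k = keys / (np.linalg.norm(keys, axis=1, keepdims=True) + eps)
    logits = (q @ k.T) / tau                      # cosine similarity / temperature
    logits = logits - logits.max(axis=1, keepdims=True)  # numerical stability
    log_p = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    n = queries.shape[0]
    nll = -log_p[np.arange(n), pos_idx]           # positive-pair negative log-lik.
    return float((np.asarray(reliability) * nll).mean())
```

Because the keys come from a historical snapshot, the current model is anchored to what it learned earlier, which counteracts forgetting of the source hypothesis.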

19 of 26

HCL: Historical Contrastive Category Discrimination (HCCD)

  • HCCD loss
  • Main idea: re-weight pseudo labels by historical consistency, i.e., the prediction consistency between the current and historical models.
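The re-weighting can be sketched as below; the consistency proxy used here (the historical model's probability on the current pseudo label) is an assumption, not necessarily the paper's exact definition:

```python
import numpy as np

def hccd_loss(p_cur, p_hist, eps=1e-8):
    """Pseudo-label cross-entropy re-weighted by historical consistency.

    Assumption: consistency is approximated by the probability the
    historical model assigns to the current model's pseudo label.
    """
    n = p_cur.shape[0]
    pl = p_cur.argmax(axis=1)          # current pseudo labels
    w = p_hist[np.arange(n), pl]       # agreement with the historical model
    ce = -np.log(p_cur[np.arange(n), pl] + eps)
    return float((w * ce).mean())
```

Samples where current and historical predictions disagree get small weights, so unreliable pseudo labels contribute little gradient.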

20 of 26

HCL: Semantic Segmentation Results

  • Observations

21 of 26

HCL: Semantic Segmentation Results

  • Observations
  • HCL outperforms state-of-the-art SFDA methods

22 of 26

HCL: Semantic Segmentation Results

  • Observations
  • HCL outperforms state-of-the-art SFDA methods
  • HCL achieves competitive performance as compared with state-of-the-art UDA methods

23 of 26

HCL: Semantic Segmentation Results

  • Observations
  • HCL is complementary to existing SFDA methods
  • HCL outperforms state-of-the-art SFDA methods
  • HCL achieves competitive performance as compared with state-of-the-art UDA methods

24 of 26

HCL: Semantic Segmentation Results

  • Observations
  • Both HCID and HCCD are helpful for the performance
  • HCL is complementary to existing SFDA methods
  • HCL outperforms state-of-the-art SFDA methods
  • HCL achieves competitive performance as compared with state-of-the-art UDA methods

25 of 26

HCL: Image Classification Results

  • VisDA
  • Office31

26 of 26

Thank you! Any questions?