This repo is the official implementation of the ICML 2022 paper "Learning Domain Adaptive Object Detection with Probabilistic Teacher" by Meilin Chen, Weijie Chen, Shicai Yang, et al. If you have any questions about this work, please feel free to contact Meilin Chen (merlinis-at-zju.edu.cn) or Weijie Chen (chenweijie5-at-hikvision.com).
pip install -r requirements.txt
Follow INSTALL.md to install Detectron2. We use detectron2==0.5.
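For reference, one possible way to install detectron2==0.5 is from the official pre-built wheels. The cu111/torch1.9 wheel index below is only an assumption about your environment; pick the combination that matches your CUDA and PyTorch versions, as described in INSTALL.md.

```bash
# Example only: the cu111/torch1.9 index is a placeholder for whichever
# CUDA/PyTorch combination your environment actually uses.
python -m pip install detectron2==0.5 \
    -f https://dl.fbaipublicfiles.com/detectron2/wheels/cu111/torch1.9/index.html
```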
Please refer to prepare_data.md for dataset preparation.
We used VGG16 pre-trained on ImageNet for all experiments. You can download it and place it under /path/to/project.
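As a rough sketch (the checkpoint filename below is an assumption; the repo's configs may already point to the expected path), the downloaded backbone can simply be moved into the project directory and referenced from the Detectron2-style configs via the `MODEL.WEIGHTS` key.

```bash
# Sketch only: "vgg16.pth" is a placeholder for whatever the provided
# checkpoint is actually called; adjust both paths to match your setup.
mkdir -p /path/to/project
mv ~/Downloads/vgg16.pth /path/to/project/vgg16.pth
# The configs can then reference it through MODEL.WEIGHTS,
# either in the YAML file or as a command-line override.
```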
Please refer to get_started.md for detailed training and evaluation commands.
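As a hypothetical example (the script name, config path, and flags below follow Detectron2's standard launcher conventions and are assumptions, not the repo's documented commands), a training run might be launched like this; the exact commands are listed in get_started.md.

```bash
# Hypothetical launch command following the usual Detectron2 train_net.py pattern.
# The config file name, flag spelling, and OUTPUT_DIR are placeholders;
# see get_started.md for the commands that actually apply to this repo.
python train_net.py \
    --num-gpus 8 \
    --config-file configs/your_adaptation_task.yaml \
    OUTPUT_DIR output/pt_run1
```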
This code has been further improved, achieving better adaptation performance than the results reported in the paper (roughly +1 to +2 mAP gains across tasks; see the experiment logs for details).
Adaptation Task | Method | Model Weights | mAP50 (source-only ⇒ adapted) | Log |
---|---|---|---|---|
Cityscapes2FoggyCityscapes | PT (ours) | Google Drive | 31 ⇒ 47.1 (+16.1) | Google Drive |
Cityscapes2BDD100k | PT (ours) | Google Drive | 26.9 ⇒ 34.9 (+8.0) | Google Drive |
KITTI2Cityscapes | PT (ours) | Google Drive | 46.4 ⇒ 60.2 (+13.8) | Google Drive |
Sim10k2Cityscapes | PT (ours) | Google Drive | 44.5 ⇒ 55.1 (+10.6) | Google Drive |
If you use Probabilistic Teacher in your research or wish to refer to the results published in the paper, please consider citing our paper:
@inproceedings{chen2022learning,
  title={Learning Domain Adaptive Object Detection with Probabilistic Teacher},
  author={Chen, Meilin and Chen, Weijie and Yang, Shicai and Song, Jie and Wang, Xinchao and Zhang, Lei and Yan, Yunfeng and Qi, Donglian and Zhuang, Yueting and Xie, Di and others},
  booktitle={International Conference on Machine Learning},
  pages={3040--3055},
  year={2022},
  organization={PMLR}
}
This project is released under the Apache 2.0 license. Code adapted from other open-source repositories follows its original license.
This project is built upon Detectron2 and Unbiased Teacher, and we would like to thank the authors for their excellent work.