
Robust Asymmetric Loss for Multi-Label Long-Tailed Learning

Robust Asymmetric Loss for Multi-Label Long-Tailed Learning
Wongi Park, Inhyuk Park, Sungeun Kim, Jongbin Ryu.
CVAMD Workshop at the International Conference on Computer Vision (ICCVW), 2023.

Model overview figure

Environment Setting

  • Conda environment: Ubuntu 18.04, CUDA 10.1 (or 10.2), PyTorch==1.13.0, Torchvision==0.6.0 (Python 3.8), libauc, torchmetrics==0.8.0
# Create Environment
conda create -n ral python=3.8
conda activate ral

# Install pytorch, torchvision, cudatoolkit (use cudatoolkit=10.2 if that matches your driver)
conda install pytorch==1.13.0 torchvision==0.6.0 cudatoolkit=10.1 -c pytorch

# Install the remaining dependencies
pip install libauc==1.3.0 torchmetrics==0.8.0
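
To confirm the environment imports cleanly, a minimal sanity check (the file name check_env.py is just an example, not part of the repository):

# check_env.py -- minimal sanity check for the environment above (not part of the repository)
import torch, torchvision, torchmetrics, libauc

print("PyTorch:", torch.__version__)
print("Torchvision:", torchvision.__version__)
print("Torchmetrics:", torchmetrics.__version__)
print("CUDA available:", torch.cuda.is_available())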

Benchmark Datasets

  • How to get the datasets
  1. MIMIC-CXR 2.0
  2. APTOS 2019 Blindness Detection
  3. ISIC 2018 Challenge
  • Directory structure of our project
- Directory
  - run.sh              : shell script version (train, infer)
  - main.py             : main execution (args, settings)
  - dataset             : augmentation, DataLoader
      - ...
  - train.py            : training, validation
  - predict.py          : inference
      - ...
  - utils               : distributed setup, metrics
      - ...
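
The dataset folder builds a multi-label loader for each benchmark above. As a rough illustration of the expected interface (the class name MultiLabelDataset, the CSV layout, and the column names are assumptions for this sketch, not the repository's actual code):

# Rough multi-label dataset sketch (illustrative only; file names, CSV layout,
# and column names are assumptions, not the repository's actual code)
import pandas as pd
import torch
from PIL import Image
from torch.utils.data import Dataset

class MultiLabelDataset(Dataset):
    def __init__(self, csv_path, img_dir, transform=None):
        # expected CSV: an "image" column plus one 0/1 column per class
        self.df = pd.read_csv(csv_path)
        self.img_dir = img_dir
        self.transform = transform
        self.label_cols = [c for c in self.df.columns if c != "image"]

    def __len__(self):
        return len(self.df)

    def __getitem__(self, idx):
        row = self.df.iloc[idx]
        img = Image.open(f"{self.img_dir}/{row['image']}").convert("RGB")
        if self.transform is not None:
            img = self.transform(img)
        labels = torch.tensor(row[self.label_cols].to_numpy(dtype="float32"))
        return img, labels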

Supporting papers

(1) Focal Loss for Dense Object Detection (Paper / Code)

(2) Asymmetric Loss For Multi-Label Classification (Paper / Code)

(3) Simple and Robust Loss Design for Multi-Label Learning with Missing Labels (Paper / Code)
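
The robust asymmetric loss builds on the asymmetric loss of paper (2), which applies separate focusing parameters to positive and negative terms and shifts (clips) the probabilities of easy negatives. A minimal PyTorch sketch of that underlying asymmetric loss (hyperparameter names and defaults follow the ASL paper; this is not the exact loss implemented in this repository):

# Asymmetric loss sketch after paper (2); illustrative, not this repository's exact loss
import torch
import torch.nn as nn

class AsymmetricLoss(nn.Module):
    def __init__(self, gamma_pos=0.0, gamma_neg=4.0, clip=0.05, eps=1e-8):
        super().__init__()
        self.gamma_pos, self.gamma_neg, self.clip, self.eps = gamma_pos, gamma_neg, clip, eps

    def forward(self, logits, targets):
        # targets: multi-hot tensor of shape (batch, num_classes)
        p = torch.sigmoid(logits)
        # probability shifting for negatives: easy negatives are clipped to zero loss
        p_m = (p - self.clip).clamp(min=0)
        # asymmetric focusing: gamma_pos for positives, gamma_neg for negatives
        loss_pos = targets * ((1 - p) ** self.gamma_pos) * torch.log(p.clamp(min=self.eps))
        loss_neg = (1 - targets) * (p_m ** self.gamma_neg) * torch.log((1 - p_m).clamp(min=self.eps))
        return -(loss_pos + loss_neg).sum(dim=1).mean()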

Training & Inference

Train
torchrun --nproc_per_node=8 main.py --gpu_ids 0,1,2,3,4,5,6,7 --seed 0 --train 1 --model convnext --batchsize 64 --epochs 30
Inference
torchrun --nproc_per_node=8 main.py --gpu_ids 0,1,2,3,4,5,6,7 --seed 0 --img_size 1024 --infer 1 --model convnext --batchsize 20 --epochs 200 --store_name fold --save_model 1

The results will be automatically saved in ./workspace/[model name]/[workspace name].
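
torchrun starts one process per GPU listed in --gpu_ids and exports RANK, LOCAL_RANK, and WORLD_SIZE in the environment; main.py is expected to read these and wrap the model in DistributedDataParallel. A generic sketch of that initialization (standard PyTorch DDP boilerplate; the function name setup_distributed is just an example, not the repository's exact code):

# Generic DDP setup for a torchrun-launched script (illustrative, not the repository's exact code)
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def setup_distributed(model):
    # torchrun supplies RANK, WORLD_SIZE, MASTER_ADDR/PORT, and LOCAL_RANK in the environment
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)          # one process per GPU
    model = model.cuda(local_rank)
    return DDP(model, device_ids=[local_rank]), local_rank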

How to cite

@inproceedings{park2023robust,
    title     = {Robust Asymmetric Loss for Multi-Label Long-Tailed Learning},
    author    = {Park, Wongi and Park, Inhyuk and Kim, Sungeun and Ryu, Jongbin},
    booktitle = {CVAMD Workshop at the International Conference on Computer Vision (ICCVW)},
    url       = {https://arxiv.org/abs/2308.05542},
    year      = {2023},
}
