SimCLR

A PyTorch implementation of SimCLR based on the ICML 2020 paper "A Simple Framework for Contrastive Learning of Visual Representations".

[Network Architecture image from the paper]

Requirements

  • PyTorch
conda install pytorch torchvision cudatoolkit=10.0 -c pytorch
  • thop
pip install thop

Dataset

The CIFAR10 dataset is used in this repo; PyTorch downloads it into the data directory automatically.
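
For reference, the sketch below shows one way torchvision can download CIFAR10 into the data directory and return two augmented views per image, which is what contrastive training consumes. The transform choices and the CIFAR10Pair class here are illustrative assumptions, not the repo's code verbatim.

```python
# A minimal sketch, assuming torchvision's CIFAR10 class and an illustrative
# augmentation pipeline; the repo's exact transforms and class names may differ.
from PIL import Image
from torchvision import datasets, transforms

train_transform = transforms.Compose([
    transforms.RandomResizedCrop(32),
    transforms.RandomHorizontalFlip(p=0.5),
    transforms.RandomApply([transforms.ColorJitter(0.4, 0.4, 0.4, 0.1)], p=0.8),
    transforms.RandomGrayscale(p=0.2),
    transforms.ToTensor(),
])

class CIFAR10Pair(datasets.CIFAR10):
    """Return two independently augmented views of the same image."""
    def __getitem__(self, index):
        img = Image.fromarray(self.data[index])
        return self.transform(img), self.transform(img), self.targets[index]

# download=True fetches CIFAR10 into the data/ directory on first use
train_data = CIFAR10Pair(root='data', train=True, transform=train_transform, download=True)
```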

Usage

Train SimCLR

python main.py --batch_size 1024 --epochs 1000 
optional arguments:
--feature_dim                 Feature dim for latent vector [default value is 128]
--temperature                 Temperature used in softmax [default value is 0.5]
--k                           Top k most similar images used to predict the label [default value is 200]
--batch_size                  Number of images in each mini-batch [default value is 512]
--epochs                      Number of sweeps over the dataset to train [default value is 500]
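
The --temperature flag is the softmax temperature of the NT-Xent contrastive loss computed on the projected features, and --feature_dim is the size of those projections. Below is a minimal sketch of that loss, assuming L2-normalized projections z1 and z2 from the two augmented views; the function name and masking details are illustrative rather than the repo's exact code.

```python
# A minimal NT-Xent sketch; assumes z1 and z2 are L2-normalized projections
# of shape [batch_size, feature_dim] from the two augmented views.
import torch
import torch.nn.functional as F

def nt_xent_loss(z1, z2, temperature=0.5):
    batch_size = z1.size(0)
    z = torch.cat([z1, z2], dim=0)                       # [2B, D]
    sim = torch.mm(z, z.t()) / temperature               # cosine similarities / temperature
    # mask out self-similarity so a view cannot be its own positive
    mask = torch.eye(2 * batch_size, dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(mask, float('-inf'))
    # the positive for view i is the other view of the same image (offset by B)
    targets = torch.cat([torch.arange(batch_size, 2 * batch_size),
                         torch.arange(batch_size)]).to(z.device)
    return F.cross_entropy(sim, targets)
```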

Linear Evaluation

python linear.py --batch_size 1024 --epochs 200 
optional arguments:
--model_path                  The pretrained model path [default value is 'results/128_0.5_200_512_500_model.pth']
--batch_size                  Number of images in each mini-batch [default value is 512]
--epochs                      Number of sweeps over the dataset to train [default value is 100]
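
Conceptually, linear evaluation loads the encoder weights from --model_path, freezes them, and trains only a linear classifier on top of the frozen features. A minimal sketch of that idea follows; the generic encoder argument and the 2048-dimensional ResNet50 feature size are assumptions, not the repo's exact classes.

```python
# A minimal sketch of the linear-evaluation protocol; `encoder` and
# feature_dim=2048 (ResNet50) are assumptions, not the repo's exact code.
import torch
import torch.nn as nn

def build_linear_probe(encoder, feature_dim=2048, num_classes=10):
    # freeze the pretrained encoder so only the linear head is trained
    for p in encoder.parameters():
        p.requires_grad = False
    encoder.eval()
    return nn.Linear(feature_dim, num_classes)

# Usage sketch:
# classifier = build_linear_probe(encoder)
# optimizer = torch.optim.Adam(classifier.parameters(), lr=1e-3)
# logits = classifier(encoder(images))   # encoder outputs [batch, feature_dim]
```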

Results

There are some differences between this implementation and the official implementation; the model (ResNet50) is trained on one NVIDIA Tesla V100 (32 GB) GPU:

  1. No Gaussian blur is used;
  2. The Adam optimizer with learning rate 1e-3 replaces the LARS optimizer (see the sketch below);
  3. No linear learning rate scaling is used;
  4. No linear warmup or cosine learning rate schedule is used.
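
As a concrete illustration of points 2-4, the optimizer can be set up as below; the resnet50 stand-in and the weight decay value are assumptions, and no LR scaling, warmup, or cosine schedule is attached.

```python
# A minimal sketch of the optimizer choice above: plain Adam at lr 1e-3, with
# no LARS, no linear LR scaling, and no warmup/cosine schedule.
import torch
from torchvision.models import resnet50

model = resnet50()  # stand-in for the SimCLR encoder + projection head (assumption)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3, weight_decay=1e-6)  # weight_decay is an assumption
```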
| Evaluation Protocol | Params (M) | FLOPs (G) | Feature Dim | Batch Size | Epoch Num | τ | K | Top1 Acc % | Top5 Acc % | Download |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| KNN | 24.62 | 1.31 | 128 | 512 | 500 | 0.5 | 200 | 89.1 | 99.6 | model (gc5k) |
| Linear | 23.52 | 1.30 | - | 512 | 100 | - | - | 92.0 | 99.8 | model (f7j2) |
