
sota-data-augmentation-and-optimizers

This repository contains some of the latest data augmentation techniques and optimizers for image classification using PyTorch and the CIFAR-10 dataset.

Update (Tuesday, January 14, 2020)

The main update adds EfficientNet-B4, described in EfficientNet, and makes it the main model for the bash scripts. The sample notebooks were modified to use the custom model. A link to the pre-trained EfficientNet-B4 weights, located in checkpoint, was also added.

Data Augmentation Techniques

This repository implements the following data augmentation techniques, with links to the papers and PyTorch code references (some with slight modifications). A minimal CutOut sketch follows the list.

CutOut (Improved Regularization of Convolutional Neural Networks with Cutout, code)

AutoAugment (AutoAugment: Learning Augmentation Policies from Data, code)

RandAugment (RandAugment: Practical automated data augmentation with a reduced search space, code)

AugMix (AugMix: A Simple Data Processing Method to Improve Robustness and Uncertainty, code)

ISDA (Implicit Semantic Data Augmentation for Deep Networks, code)
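For illustration only, here is a minimal sketch of the CutOut idea (zeroing out a random square patch of the image tensor) and how such a transform could be slotted into a standard CIFAR-10 pipeline. This is not the repository's exact implementation; the patch length of 16 is simply a value commonly used for CIFAR-10.

```python
import numpy as np
from torchvision import transforms

class Cutout:
    """Zero out one random square patch of a (C, H, W) image tensor."""
    def __init__(self, length=16):
        self.length = length  # side length of the square patch

    def __call__(self, img):
        _, h, w = img.shape
        # pick a random centre, then clip the patch to the image borders
        y, x = np.random.randint(h), np.random.randint(w)
        y1, y2 = np.clip([y - self.length // 2, y + self.length // 2], 0, h)
        x1, x2 = np.clip([x - self.length // 2, x + self.length // 2], 0, w)
        img[:, y1:y2, x1:x2] = 0.0
        return img

# Example CIFAR-10 training transform; CutOut is applied after ToTensor()
train_transform = transforms.Compose([
    transforms.RandomCrop(32, padding=4),
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
    Cutout(length=16),
])
```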

Architecture (sample notebooks)

A custom convolutional neural network that makes use of depthwise convolutions, squeeze-and-excitation blocks, and the Mish activation function.
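As a rough illustration only (the actual model lives in the repository code and notebooks), a block combining these three ingredients might look like the sketch below; the kernel size, reduction ratio, and layer arrangement are assumptions, not the repository's exact design.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Mish(nn.Module):
    def forward(self, x):
        return x * torch.tanh(F.softplus(x))  # Mish: x * tanh(softplus(x))

class SqueezeExcite(nn.Module):
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.fc1 = nn.Conv2d(channels, channels // reduction, kernel_size=1)
        self.fc2 = nn.Conv2d(channels // reduction, channels, kernel_size=1)

    def forward(self, x):
        s = F.adaptive_avg_pool2d(x, 1)                    # squeeze: global average pool
        s = torch.sigmoid(self.fc2(F.relu(self.fc1(s))))   # excite: per-channel gates
        return x * s

class DepthwiseSEBlock(nn.Module):
    """Depthwise-separable convolution + squeeze-and-excitation + Mish."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.depthwise = nn.Conv2d(in_ch, in_ch, 3, padding=1, groups=in_ch)
        self.pointwise = nn.Conv2d(in_ch, out_ch, 1)
        self.bn = nn.BatchNorm2d(out_ch)
        self.se = SqueezeExcite(out_ch)
        self.act = Mish()

    def forward(self, x):
        x = self.act(self.bn(self.pointwise(self.depthwise(x))))
        return self.se(x)
```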

You may use any model of your choice, but a slight modification is needed to implement ISDA.

Optimizers

AdaMod

DeepMemory (code)

Ranger (RAdam with LookAhead)

AdaLook (AdaMod with LookAhead, my own combination); a minimal sketch of the LookAhead wrapper follows this list.
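Ranger and AdaLook both wrap a base optimizer with LookAhead. As a minimal sketch of that wrapper idea (not the repository's implementation), the slow weights are pulled toward the fast weights of the inner optimizer every k steps:

```python
import torch

class Lookahead:
    """Minimal LookAhead wrapper: slow weights trail the base optimizer's fast weights."""
    def __init__(self, base_optimizer, k=5, alpha=0.5):
        self.base = base_optimizer   # e.g. RAdam -> Ranger, AdaMod -> AdaLook
        self.k, self.alpha = k, alpha
        self.step_count = 0
        # one slow-weight copy per parameter
        self.slow = [
            [p.detach().clone() for p in group["params"]]
            for group in base_optimizer.param_groups
        ]

    def zero_grad(self):
        self.base.zero_grad()

    @torch.no_grad()
    def step(self):
        self.base.step()             # fast (inner) update
        self.step_count += 1
        if self.step_count % self.k == 0:
            for group, slow_params in zip(self.base.param_groups, self.slow):
                for p, slow in zip(group["params"], slow_params):
                    slow += self.alpha * (p - slow)   # interpolate slow weights
                    p.copy_(slow)                     # reset fast weights to slow
```

For example, `Lookahead(torch.optim.Adam(model.parameters(), lr=1e-3))` would give an Adam+LookAhead combination analogous to the pairings above.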

Loss function

I made use of cross-entropy loss with label smoothing, as implemented here.
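The linked implementation is the source of truth; the sketch below only illustrates what label-smoothed cross-entropy computes, with an illustrative smoothing factor of 0.1.

```python
import torch.nn as nn
import torch.nn.functional as F

class LabelSmoothingCrossEntropy(nn.Module):
    """Cross-entropy where `smoothing` probability mass is spread uniformly over all classes."""
    def __init__(self, smoothing=0.1):
        super().__init__()
        self.smoothing = smoothing

    def forward(self, logits, target):
        log_probs = F.log_softmax(logits, dim=-1)
        # standard NLL term for the true class
        nll = -log_probs.gather(dim=-1, index=target.unsqueeze(1)).squeeze(1)
        # uniform term over all classes
        smooth = -log_probs.mean(dim=-1)
        return ((1.0 - self.smoothing) * nll + self.smoothing * smooth).mean()
```

Recent PyTorch releases (1.10+) also expose this directly via `nn.CrossEntropyLoss(label_smoothing=0.1)`.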

Installation

Install Python 3 with Anaconda, then clone the repository and install the requirements:

$ git clone https://github.com/etetteh/sota-data-augmentation-and-optimizers
$ cd sota-data-augmentation-and-optimizers
$ pip install -r requirements.txt

Implementation

Sample Notebook

The sample_notenook.ipynb and isda_sample_notebook.ipynb notebooks contain code to experiment with the various augmentation techniques and optimizers for the non-ISDA and ISDA implementations, respectively. Simply comment or uncomment the appropriate lines.

The train.ipynb contains sample bash scripts.

Scripts

You may run the following in your bash terminal to see all available options:

$ python main.py -h
usage: main.py [-h] [--cutout] [--autoaug] [--randaug] [--augmix] [--adamod]
           [--adalook] [--deepmemory] [--ranger] [--resume] [--path PATH]
           [--epochs EPOCHS] [--num_workers NUM_WORKERS]
           [--batch_size BATCH_SIZE] [--learning_rate LEARNING_RATE]
           [--weight_decay WEIGHT_DECAY] [--print_freq PRINT_FREQ]
           [--seed SEED]

Data Augmentation Techniques on CIFAR10 with PyTorch.

optional arguments:
-h, --help            show this help message and exit
--cutout              Using CutOut data augmentation technique.
--autoaug             Using AutoAugment data augmentation technique.
--randaug             Using RandAugment data augmentation technique.
--augmix              Using AugMix data augmentation technique.
--adamod              Use AdaMod optimizer
--adalook             Use AdaMod+LookAhead optimizer
--deepmemory          Use DeepMemory optimizer
--ranger              Use RAdam+LookAhead optimizer
--resume, -r          resume training from checkpoint.
--path PATH           path to checkpoint. pass augmentation name
--epochs EPOCHS, -e EPOCHS
                    Number of training epochs.
--num_workers NUM_WORKERS
                    Number of CPUs.
--batch_size BATCH_SIZE, -bs BATCH_SIZE
                    input batch size for training.
--learning_rate LEARNING_RATE, -lr LEARNING_RATE
                    learning rate.
--weight_decay WEIGHT_DECAY, -wd WEIGHT_DECAY
                    weight decay.
--print_freq PRINT_FREQ, -pf PRINT_FREQ
                    Number of iterations to print out results
--seed SEED           random seed

Example: to train with the AdaMod optimizer and AugMix augmentation for 100 epochs, run:

$ python main.py --adamod --augmix --epochs 100

To resume training, run:

$ python main.py --resume --adamod --augmix --epochs 100

To train using ISDA with the AdaMod optimizer and AugMix augmentation for 100 epochs, run:

$ python main_isda.py --adamod --augmix --epochs 100

Coming Soon...

Results with my custom model and other models.

To do

CutMix (CutMix: Regularization Strategy to Train Strong Classifiers with Localizable Features, code)

AdvProp (Adversarial Examples Improve Image Recognition)

Memory Efficient Mish
