
DeMansia 2

About

DeMansia 2 introduces Mamba 2 to the realm of computer vision, with performance improvements from bidirectional Mamba 2 and token labeling training.

Installation

We provide a simple setup.sh to install the Conda environment. You need to satisfy the following prerequisites:

  • Linux
  • NVIDIA GPU
  • CUDA 12+ supported GPU driver
  • Miniforge

Then, simply run source ./setup.sh to get started.

Pretrained Models

These models were trained on the ImageNet-1k dataset using a single RTX 4090 during our experiments.

Currently, only DeMansia 2 Tiny is available. We will release more models as opportunities arise and continue to improve current models as our training methods advance.

| Name | Model Dim. | Num. of Layers | Num. of Param. | Input Res. | Top-1 | Top-5 | Batch Size | Download | Training Log |
| DeMansia 2 Tiny | 192 | 24 | 9.5M | 224² | 79.5% | 94.4% | 1024 | link | log |

Training and inference

To set up the ImageNet-1k dataset, download both the training and validation sets. Use this script to extract and organize the dataset. You should also download and extract the token labeling dataset from here.
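Once extracted, each split should contain one subfolder per class. As a quick sanity check, here is a minimal sketch that counts the class folders; it assumes the standard train/val class-subfolder layout that torchvision's ImageFolder expects, and the root path is a placeholder:

import os

root = "path/to/imagenet"  # placeholder: wherever you extracted the dataset

for split in ("train", "val"):
    split_dir = os.path.join(root, split)
    # One directory per class; ImageNet-1k should yield 1000 for each split.
    classes = [d for d in os.listdir(split_dir)
               if os.path.isdir(os.path.join(split_dir, d))]
    print(f"{split}: {len(classes)} classes")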

We provide train.py, which contains all the necessary code to train a DeMansia 2 model and log the training progress. The logged parameters can be modified in model.py.
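For reference, metric logging in a PyTorch Lightning module typically looks like the sketch below; the module, loss, and metric names here are illustrative, not the exact ones used in this repo's model.py:

import pytorch_lightning as pl
import torch.nn.functional as F

class ExampleModule(pl.LightningModule):
    # Illustrative only: shows where PyTorch Lightning metrics are emitted.
    # The metrics actually logged by DeMansia 2 are defined in model.py.
    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = F.cross_entropy(self(x), y)
        self.log("train_loss", loss, on_step=True, on_epoch=True)
        return loss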

The base model's hyperparameters are stored in model_config.py, and you can adjust them as needed. When further training our model, note that all hyperparameters are saved directly in the model file. For more information, refer to PyTorch Lightning's documentation. The same applies to inference, since PyTorch Lightning automatically restores all parameters when loading our model.
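Because the checkpoint carries its own hyperparameters, you can inspect them without instantiating the model. A minimal sketch (the checkpoint path is a placeholder):

import torch

ckpt = torch.load("path_to.ckpt", map_location="cpu")
print(ckpt["hyper_parameters"])  # hyperparameter dict saved by PyTorch Lightning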

Here's a sample code snippet to perform inference with DeMansia 2:

import torch

from model import DeMansia_2

# PyTorch Lightning restores the saved hyperparameters from the checkpoint.
model = DeMansia_2.load_from_checkpoint("path_to.ckpt")
model.eval()

sample = torch.rand(3, 224, 224)  # Channel, Height, Width
sample = sample.unsqueeze(0)      # Batch, Channel, Height, Width
with torch.no_grad():
    pred = model(sample)          # Batch, Num. of classes
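Continuing the snippet above, a typical post-processing step converts the output to probabilities and takes the top-5 classes (this assumes the model returns raw, unnormalized logits):

probs = pred.softmax(dim=-1)                 # Batch, Num. of classes
top5_prob, top5_idx = probs.topk(5, dim=-1)  # top-5 probabilities and class indices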

Credits

Our work builds upon the remarkable achievements of Mamba and LV-ViT.

module/data and module/token_ce.py are modified from the LV-ViT repo.

module/ema is modified from here.

modules/optimizer.py is taken from here.

Check out the original DeMansia here.
