```
pytorch=1.7.1=py3.8_cuda10.1.243_cudnn7.6.3_0
torchvision=0.8.2=py38_cu101
monai-weekly==0.9.dev2152
nibabel==3.2.1
omegaconf==2.1.1
timm==0.4.12
```
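To verify the environment, a quick import check like the following should run cleanly (a minimal sanity script, not part of the repo; only the version pins above matter):

```python
# Quick sanity check (not part of the repo) that the pinned stack imports
# and that the CUDA build of PyTorch sees a GPU.
import torch
import torchvision
import monai
import nibabel
import omegaconf
import timm

print("torch", torch.__version__, "| cuda", torch.version.cuda,
      "| gpu available:", torch.cuda.is_available())
for mod in (torchvision, monai, nibabel, omegaconf, timm):
    print(mod.__name__, mod.__version__)
```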
- Install PyTorch, timm and MONAI.
- Download the BTCV and MSD_BraTS data.
- Install Wandb for logging and visualizations (a one-time login sketch follows below).
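Assuming the standard Wandb Python API, the one-time authentication looks like this:

```python
# One-time Wandb authentication; prompts for an API key if WANDB_API_KEY
# is not already set in the environment.
import wandb

wandb.login()
```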
The run scripts are in the `scripts` directory:
```bash
python main.py \
    configs/mae3d_btcv_1gpu.yaml \
    --mask_ratio=0.125 \
    --run_name='mae3d_sincos_vit_base_btcv_mr125'
```
The default configurations are set in `configs/mae3d_btcv_1gpu.yaml`. You can override any of them by passing arguments with the corresponding key names on the command line, e.g., `mask_ratio`. We use Wandb to monitor the training process and to visualize the masked reconstructions. During training, all outputs, including checkpoints and local Wandb files, are stored under the `output_dir` value set in the configuration.
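Since the configs are OmegaConf YAML files, the override mechanism presumably works along these lines (a minimal sketch; the repo's actual argument handling in `main.py` may differ):

```python
# Minimal sketch of merging YAML defaults with "--key=value" CLI overrides
# via OmegaConf; the repo's actual argument handling in main.py may differ.
import sys
from omegaconf import OmegaConf

cfg = OmegaConf.load("configs/mae3d_btcv_1gpu.yaml")
# Convert "--mask_ratio=0.125" style flags into an OmegaConf dotlist.
dotlist = [arg.lstrip("-") for arg in sys.argv[1:] if "=" in arg]
cfg = OmegaConf.merge(cfg, OmegaConf.from_dotlist(dotlist))

print(cfg.mask_ratio)  # the CLI value overrides the YAML default
```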
The core MAE code is in `lib/models/mae3d.py`.
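For orientation, the defining step of MAE is random masking of patch tokens before the encoder. Below is a repo-independent sketch of that step for 3D patch tokens, following the original MAE recipe; the helper name and shapes are illustrative, and the actual implementation lives in `lib/models/mae3d.py`:

```python
# Illustrative sketch of MAE-style random masking over 3D patch tokens,
# following the original MAE recipe; see lib/models/mae3d.py for the
# repo's actual implementation.
import torch

def random_masking(x: torch.Tensor, mask_ratio: float):
    """x: [batch, num_patches, dim] tokens from a 3D patch embedding."""
    B, N, D = x.shape
    n_keep = int(N * (1 - mask_ratio))

    noise = torch.rand(B, N, device=x.device)     # one score per patch
    ids_shuffle = torch.argsort(noise, dim=1)     # lowest scores are kept
    ids_restore = torch.argsort(ids_shuffle, dim=1)

    ids_keep = ids_shuffle[:, :n_keep]
    x_visible = torch.gather(x, 1, ids_keep.unsqueeze(-1).repeat(1, 1, D))

    # Binary mask over all patches: 0 = visible to the encoder, 1 = masked.
    mask = torch.ones(B, N, device=x.device)
    mask[:, :n_keep] = 0
    mask = torch.gather(mask, 1, ids_restore)
    return x_visible, mask, ids_restore

tokens = torch.randn(2, 512, 768)   # e.g. an 8x8x8 patch grid, ViT-Base width
x_visible, mask, ids_restore = random_masking(tokens, mask_ratio=0.75)
print(x_visible.shape)              # torch.Size([2, 128, 768])
```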
The run scripts are in the `scripts` directory:
```bash
python main.py \
    configs/unetr_btcv_1gpu.yaml \
    --lr=3.44e-2 \
    --batch_size=6 \
    --run_name=unetr3d_vit_base_btcv_lr3.44e-2_mr125_10ke_pretrain_5000e \
    --pretrain=<path/to/your/pretrained/MAE/checkpoint>
```
The core UNETR code is in `lib/models/unetr3d.py`.
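When fine-tuning from a pretrained MAE checkpoint, the encoder weights are transferred and the MAE decoder is discarded. Below is a hedged sketch of that loading step; the checkpoint layout and key prefixes are assumptions, and MONAI's `UNETR` stands in for the repo's own model purely for illustration:

```python
# Hedged sketch of loading pretrained MAE encoder weights into a UNETR.
# Checkpoint layout and key names are assumptions; the repo's own loading
# logic lives in main.py / lib/models/unetr3d.py. MONAI's UNETR is used
# here only as a stand-in for the repo's model.
import torch
from monai.networks.nets import UNETR

model = UNETR(in_channels=1, out_channels=14, img_size=(96, 96, 96))

ckpt = torch.load("path/to/pretrained_mae.pth", map_location="cpu")  # your checkpoint
state = ckpt.get("state_dict", ckpt)

# Drop MAE decoder weights; only encoder weights transfer to UNETR.
encoder_state = {k: v for k, v in state.items() if not k.startswith("decoder")}

missing, unexpected = model.load_state_dict(encoder_state, strict=False)
print(f"loaded with {len(missing)} missing and {len(unexpected)} unexpected keys")
```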