This repository has been archived by the owner on Mar 17, 2021. It is now read-only.
Releases · NifTK/NiftyNet
0.6.0
Added
- isotropic random scaling option
- volume padding with user-specified constant
- subpixel layer for superresolution
- various loss functions for regression (smooth L1 loss, cosine loss, etc.; see the sketch after this list)
- handler for early stopping mechanism
- aggregator with multiple outputs including labels in CSV
- nnUNet, an improved version of UNet3D
- data augmentation with mixup and mixmatch
- documentation contents
- demo for learning rate scheduling
- demo for deep boosted regression
- initial integration of NiftyReg Resampler
- initial integration of CSV reader
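For reference, a smooth L1 (Huber-style) regression loss can be written in a few lines of TensorFlow. This is a minimal generic sketch, not NiftyNet's own implementation; the function name and the `delta` threshold are illustrative:

```python
import tensorflow as tf

def smooth_l1_loss(prediction, target, delta=1.0):
    """Smooth L1 (Huber-style) loss: quadratic for small residuals,
    linear for large ones, so outliers dominate the gradient less."""
    residual = tf.abs(prediction - target)
    quadratic = 0.5 * tf.square(residual)
    linear = delta * (residual - 0.5 * delta)
    per_element = tf.where(residual < delta, quadratic, linear)
    return tf.reduce_mean(per_element)
```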
Fixed
- issue of loading binary values of NIfTI file
- various fixes in CI tests
- prefix name for aggregators
- various improvements in error messages
- issue of batch indices in the conditional random field
- issue of location selection in the weighted sampler
- model zoo: compatibility upgrade
- model zoo: new decathlon hippocampus dataset
Changed
- feature normalisation options: instance norm, group norm, batch norm
- convolution with padding option
- various documentation and docstrings
- by default, length-one dimensions are now removed when saving a 5D volume (see the sketch below)
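The new default for saving 5D volumes amounts to dropping singleton axes before writing the file. A minimal illustration with NumPy and nibabel (shapes and the output file name are made up; NiftyNet's aggregators handle this internally):

```python
import numpy as np
import nibabel as nib

# a 5D NiftyNet-style window: three spatial axes plus two trailing singleton axes
window = np.random.rand(64, 64, 64, 1, 1).astype(np.float32)

# drop length-one dimensions so the file is written as a plain 3D volume
squeezed = np.squeeze(window)  # shape becomes (64, 64, 64)

nib.save(nib.Nifti1Image(squeezed, affine=np.eye(4)), 'output.nii.gz')
```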
0.5.0
Added
- Version controlled model zoo with git-lfs
- Dice + cross-entropy loss function (see the sketch after this list)
- Antialiasing when randomly scaling input images during training
- Support of multiple optimisers and gradients in applications
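A combined Dice + cross-entropy loss is a common recipe for class-imbalanced segmentation. The sketch below shows the general idea in TensorFlow 1.x terms; it is not NiftyNet's exact implementation, and the function name and `smooth` constant are illustrative:

```python
import tensorflow as tf

def dice_plus_xent_loss(logits, one_hot_labels, smooth=1e-5):
    """Soft Dice loss plus cross-entropy over flattened voxels.
    logits and one_hot_labels are both shaped [n_voxels, n_classes]."""
    probs = tf.nn.softmax(logits)
    # soft Dice per class (axis 0 runs over voxels), then averaged over classes
    intersect = tf.reduce_sum(probs * one_hot_labels, axis=0)
    denominator = tf.reduce_sum(probs + one_hot_labels, axis=0)
    dice = tf.reduce_mean((2.0 * intersect + smooth) / (denominator + smooth))
    xent = tf.reduce_mean(
        tf.nn.softmax_cross_entropy_with_logits_v2(
            labels=one_hot_labels, logits=logits))
    return (1.0 - dice) + xent
```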
Fixed
- An issue of rounding image sizes when `pixdim` is specified
- An issue of incorrect Dice when an image patch does not include every class
- Numerous documentation issues
Changed
- Tested with TensorFlow 1.12
0.4.0
Added
- `niftynet.layer`: new layers
  - Tversky loss function for image segmentation
  - Random affine augmentation layer
  - Random bias field augmentation layer
  - Group normalisation layer
  - Squeeze and excitation blocks
- Documentation
- Misc.
  - Subject id from filename with regular expression (see the sketch after this list)
  - Versioning with python-versioneer
  - Tested with TensorFlow 1.10
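Extracting a subject id from a file name with a regular expression can be illustrated with the standard `re` module. The naming scheme and pattern below are hypothetical; in NiftyNet the matching is configured through the data sections of the configuration file:

```python
import re

# hypothetical naming scheme: 'sub-042_T1w.nii.gz' -> subject id '042'
FILENAME_PATTERN = re.compile(r'sub-(?P<subject_id>\d+)_')

def subject_id_from(filename):
    match = FILENAME_PATTERN.search(filename)
    return match.group('subject_id') if match else None

print(subject_id_from('sub-042_T1w.nii.gz'))  # prints: 042
```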
Changed
- `niftynet.engine`: improved core functions
  - IO modules based on `tf.data.Dataset` (breaking changes)
  - Decoupled the engine and event handlers
- Migrated the code repository, model zoo, and niftynet.io source code to github.com/niftk
0.3.0
Added
- Support for 2D image loading, optionally using `skimage`, `pillow`, or `simpleitk`
- Image reader and sampler with `tf.data.Dataset` (see the sketch after this list)
- Class-balanced image window sampler
- Random deformation as data augmentation with SimpleITK
- Segmentation loss with dense labels (multi-channel binary labels)
- Experimental features:
  - learning-based registration
  - image classification
  - model evaluation
  - new engine design with observer pattern
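NiftyNet's readers and samplers are considerably more general, but the core idea of feeding randomly located image windows through `tf.data.Dataset` can be sketched as follows (TF 1.x API; the volume and window sizes are made up):

```python
import numpy as np
import tensorflow as tf

def window_generator(volume, window_size=(32, 32, 32), n_windows=10):
    """Yield randomly located crops from one volume (a uniform window sampler)."""
    max_start = np.array(volume.shape) - np.array(window_size)
    for _ in range(n_windows):
        start = [np.random.randint(0, m + 1) for m in max_start]
        crop = tuple(slice(s, s + w) for s, w in zip(start, window_size))
        yield volume[crop].astype(np.float32)

volume = np.random.rand(96, 96, 96)
dataset = tf.data.Dataset.from_generator(
    lambda: window_generator(volume),
    output_types=tf.float32,
    output_shapes=tf.TensorShape([32, 32, 32]))
dataset = dataset.batch(2).prefetch(1)
windows = dataset.make_one_shot_iterator().get_next()  # TF 1.x consumption
```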
Deprecated
0.2.2
0.2.1
Added
- Support for custom network / application as external modules
- Unified workspace directory via global configuration functionalities
- Model zoo for network / data sharing
- Automatic splitting into training / validation / test sets (see the sketch after this list)
- Validation iterations during training
- Regression application
- 2D / 3D resampler layer
- Versioning functionality for better issue tracking
- Academic paper release: "NiftyNet: a deep-learning platform for medical imaging"
- How-to guides and a new theme for the API and examples documentation
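Automatic set splitting boils down to randomly assigning each subject id to one of three phases and recording the assignment. The sketch below mimics that idea with a simple CSV file; the file name, fractions, and phase labels are illustrative rather than NiftyNet's exact convention:

```python
import csv
import random

def split_subjects(subject_ids, val_fraction=0.1, test_fraction=0.1, seed=0):
    """Randomly assign each subject id to training / validation / inference."""
    ids = list(subject_ids)
    random.Random(seed).shuffle(ids)
    n_val = int(len(ids) * val_fraction)
    n_test = int(len(ids) * test_fraction)
    phases = (['validation'] * n_val + ['inference'] * n_test +
              ['training'] * (len(ids) - n_val - n_test))
    return dict(zip(ids, phases))

assignment = split_subjects(['sub-%03d' % i for i in range(20)])
with open('dataset_split.csv', 'w', newline='') as f:
    csv.writer(f).writerows(sorted(assignment.items()))
```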
0.2.0
Added
- Support for unsupervised learning networks, including GANs and auto-encoders
- An application engine for managing low-level operations required by different types of high-level applications
- NiftyNet is now available on the Python Package Index: `pip install niftynet`
- NiftyNet website up and running: http://niftynet.io
- API reference published online: http://niftynet.rtfd.io/en/dev/py-modindex.html
- NiftyNet source code mirrored on GitHub: https://github.com/NifTK/NiftyNet
- 5 new network implementations:
  - DenseVNet
  - HolisticNet
  - SimpleGAN
  - SimulatorGAN
  - VariationalAutoencoder (VAE)
Fixed
- Bugs (30+ issues resolved)