
does pytorch1.0 support Synchronized BatchNorm? #34

Closed
mmxuan18 opened this issue Oct 26, 2018 · 2 comments
Labels
enhancement New feature or request

@mmxuan18

🚀 Feature

Does PyTorch 1.0 support Synchronized BatchNorm? And does the FrozenBatchNorm in this code serve the same function as Synchronized BN?

Also, why use FrozenBatchNorm? Other third-party Faster R-CNN implementations' ResNet backbones don't have this feature.

@fmassa
Contributor

fmassa commented Oct 26, 2018

Hi,

PyTorch 1.0 currently doesn't support Synchronized Batch Norm, but there are discussions on how to support it; see for example pytorch/pytorch#2584 and pytorch/pytorch#12198.

Because the discussion on how to support Synchronized Batch Norm is still ongoing, we decided to follow the Detectron implementation of freezing batch norm statistics during training so that we don't have issues when training with small batch sizes.
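
For context, freezing batch norm means treating the layer as a fixed affine transform computed from pretrained running statistics, instead of normalizing with per-batch statistics. A minimal sketch of the idea (the repository's actual FrozenBatchNorm2d may differ in details):

```python
import torch
from torch import nn

class FrozenBatchNorm2d(nn.Module):
    """BatchNorm2d with fixed statistics and affine parameters.

    All four tensors are registered as buffers, so they are loaded from a
    pretrained checkpoint and never updated by the optimizer; small batch
    sizes therefore cannot corrupt the normalization statistics.
    """

    def __init__(self, num_features, eps=1e-5):
        super().__init__()
        self.eps = eps
        self.register_buffer("weight", torch.ones(num_features))
        self.register_buffer("bias", torch.zeros(num_features))
        self.register_buffer("running_mean", torch.zeros(num_features))
        self.register_buffer("running_var", torch.ones(num_features))

    def forward(self, x):
        # Fold the frozen stats into a single scale and shift, reshaped to
        # (1, C, 1, 1) so they broadcast over the N, H, W dimensions.
        scale = self.weight * (self.running_var + self.eps).rsqrt()
        shift = self.bias - self.running_mean * scale
        return x * scale.reshape(1, -1, 1, 1) + shift.reshape(1, -1, 1, 1)
```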
A possible solution for now would be to train using GroupNorm, which makes training with small batches possible.
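
For reference, GroupNorm normalizes over groups of channels within each sample, so its statistics don't depend on the batch size at all. A quick illustration (the group count of 32 is just a common choice, not a prescription):

```python
import torch
from torch import nn

# GroupNorm(num_groups, num_channels): statistics are computed per sample,
# per channel group, so a batch of 1 or 2 behaves the same as a large batch.
norm = nn.GroupNorm(num_groups=32, num_channels=256)

x = torch.randn(2, 256, 7, 7)  # tiny batch is fine
y = norm(x)
print(y.shape)  # torch.Size([2, 256, 7, 7])
```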

@soumith
Member

soumith commented Apr 7, 2019

Added now in PyTorch master via pytorch/pytorch#14267.
For documentation, see: https://pytorch.org/docs/master/nn.html#torch.nn.SyncBatchNorm
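
For anyone landing here later, the documented usage is to convert an existing model's BatchNorm layers and then wrap the model in DistributedDataParallel. A short sketch (the model below is a hypothetical placeholder, and the conversion only has an effect once a distributed process group is initialized):

```python
import torch
from torch import nn

# A hypothetical small model; any network containing BatchNorm*d layers works.
model = nn.Sequential(
    nn.Conv2d(3, 64, kernel_size=3, padding=1),
    nn.BatchNorm2d(64),
    nn.ReLU(),
)

# Recursively replaces every BatchNorm*d layer with SyncBatchNorm.
# Statistics are synchronized across processes only after the model is
# wrapped in DistributedDataParallel inside an initialized process group.
model = nn.SyncBatchNorm.convert_sync_batchnorm(model)
```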
