When running inference with torch.amp.autocast(), the forward results show significant numerical differences compared with the pretrained resnext101_32x8d from torchvision, given the same input batch. Sample outputs are below.

Output from the WSL pretrained resnext101_32x8d_wsl shows significant differences:

Output from the torchvision pretrained resnext101_32x8d shows approximately matching numerical values:

Is it because the pretrained resnext101 from torchvision was already trained in mixed precision, or is it something else? Any clarification would be appreciated.

PS: sample pytest code to load the models and run the tests:
farleylai changed the title from "Significant numerical differences with torch.amp.autocast() compared with stock resnext101?" to "Significant numerical differences with torch.amp.autocast() compared with stock pretrained resnext101" on Nov 6, 2020.