CIFAR10 accuracy #12

Open
MrChenFeng opened this issue Jul 17, 2020 · 3 comments

Comments

@MrChenFeng

Hi @Spijkervet,

Have you tried more epochs with your experiment settings, e.g. 500 epochs?
The results in the paper show that even a batch size of 256 can reach about 93% accuracy after 500 epochs of training.
However, my implementation only gets ~89.5% accuracy, even with a batch size of 512.

Best,
Chen
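
For context, the accuracy figures discussed in this thread follow the standard SimCLR evaluation protocol: linear evaluation, i.e. training a linear classifier on top of the frozen pretrained encoder. Below is a minimal sketch of that setup; `encoder` and `feat_dim` are hypothetical placeholders for the pretrained network and its output dimension, not names from this repo.

```python
import torch
import torch.nn as nn

def linear_eval_head(encoder, feat_dim, num_classes=10):
    """Freeze the pretrained encoder and return a trainable linear head.

    Only the head's parameters should be passed to the optimizer, so the
    reported accuracy measures the quality of the frozen representation.
    """
    for p in encoder.parameters():
        p.requires_grad = False
    encoder.eval()
    return nn.Linear(feat_dim, num_classes)

# Hypothetical usage with a ResNet-18 trunk (feat_dim=512):
# head = linear_eval_head(encoder, feat_dim=512)
# optimizer = torch.optim.SGD(head.parameters(), lr=0.1, momentum=0.9)
```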

@gauenk
Contributor

gauenk commented Oct 2, 2020

I also don't get the reported accuracy. Even with the code and pretrained weights from @Spijkervet, I only reach 75% test accuracy on CIFAR10, compared to the 83% reported in this repo. I wonder if the LARS optimizer has something to do with it. I have tried hard to get it working, but I end up with NaN values in my training loss.
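
NaN losses with LARS are commonly traced to the trust-ratio computation (division by a near-zero gradient norm) or to applying the layer-wise adaptation to bias and BatchNorm parameters. Below is a minimal LARS sketch with an epsilon guard; it is an illustrative re-implementation under assumed hyperparameters, not the optimizer used in this repo.

```python
import torch
from torch.optim.optimizer import Optimizer

class LARS(Optimizer):
    """Minimal LARS (You et al., 2017): SGD with momentum plus a
    layer-wise trust ratio ||w|| / ||grad||. The eps guard below is one
    common fix for NaN losses; another is excluding bias/BatchNorm
    parameters from adaptation and weight decay (not shown here)."""

    def __init__(self, params, lr, momentum=0.9, weight_decay=1e-6,
                 trust_coef=1e-3, eps=1e-8):
        defaults = dict(lr=lr, momentum=momentum, weight_decay=weight_decay,
                        trust_coef=trust_coef, eps=eps)
        super().__init__(params, defaults)

    @torch.no_grad()
    def step(self, closure=None):
        loss = None
        if closure is not None:
            with torch.enable_grad():
                loss = closure()
        for group in self.param_groups:
            for p in group["params"]:
                if p.grad is None:
                    continue
                grad = p.grad
                if group["weight_decay"] != 0:
                    grad = grad.add(p, alpha=group["weight_decay"])
                w_norm = torch.norm(p)
                g_norm = torch.norm(grad)
                # Trust ratio, guarded against division by a ~0 gradient
                # norm; fall back to 1 for zero-norm tensors.
                if w_norm > 0 and g_norm > 0:
                    local_lr = group["trust_coef"] * w_norm / (g_norm + group["eps"])
                else:
                    local_lr = 1.0
                buf = self.state[p].setdefault("momentum_buffer",
                                               torch.zeros_like(p))
                buf.mul_(group["momentum"]).add_(grad * local_lr)
                p.add_(buf, alpha=-group["lr"])
        return loss
```

A linear learning-rate warmup over the first ~10 epochs is another common stabilizer when the base learning rate is scaled with batch size, as in the SimCLR paper.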

@LanXiaoPang613

Hi Chen, did you use the code shared by @Spijkervet? Why did I only achieve nearly 65% accuracy on CIFAR-10? It seems that training the representation with a batch size of 128 may decrease the accuracy.

@MrChenFeng
Author

> Hi Chen, did you use the code shared by @Spijkervet? Why did I only achieve nearly 65% accuracy on CIFAR-10? It seems that training the representation with a batch size of 128 may decrease the accuracy.

I haven't tried @Spijkervet's implementation. In my experiments, the number of epochs matters more than the batch size. However, I cannot get a result better than 90% with ResNet-18 in any setting.
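
Since the batch size vs. epoch trade-off comes up repeatedly in this thread, a minimal sketch of the NT-Xent loss at the core of SimCLR may make the connection concrete: the batch size N sets the number of negatives (2N − 2 per positive), which longer training partially compensates for at small N. This is an illustrative re-implementation, not the code from this repo, and the default temperature is an assumption.

```python
import torch
import torch.nn.functional as F

def nt_xent(z1, z2, temperature=0.5):
    """NT-Xent (normalized temperature-scaled cross-entropy) loss.

    z1, z2: [N, D] projections of two augmented views of the same N
    images. Each sample's positive is its counterpart in the other
    view; the remaining 2N - 2 samples in the batch act as negatives.
    """
    n = z1.size(0)
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)  # [2N, D], unit norm
    sim = z @ z.t() / temperature                       # scaled cosine similarities
    sim.fill_diagonal_(float("-inf"))                   # mask self-similarity
    # Index i's positive sits at i + n (first view) or i - n (second view).
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(n)]).to(z.device)
    return F.cross_entropy(sim, targets)
```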
