Implementing SGD in Minima #4

Closed
bhavnicksm opened this issue Feb 7, 2023 · 0 comments

Comments

bhavnicksm (Contributor) commented Feb 7, 2023

Introduction

We need to write our own SGD implementation, modeled on PyTorch's. This issue collects all the resources on how optimizers are written in PyTorch in one place.
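
For reference, this is the update rule documented for `torch.optim.SGD` (ignoring weight decay and dampening), where $\eta$ is the learning rate, $\mu$ the momentum coefficient, $g_t$ the gradient, and $v_t$ the velocity buffer; our implementation should reproduce it:

$$
v_t = \mu\, v_{t-1} + g_t, \qquad
\theta_t = \theta_{t-1} - \eta \cdot \begin{cases} g_t + \mu\, v_t & \text{if nesterov} \\ v_t & \text{otherwise.} \end{cases}
$$

With $\mu = 0$ this reduces to plain SGD, $\theta_t = \theta_{t-1} - \eta\, g_t$.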

We want to create a BaseOptimiser that holds the functionality common to all optimizers and can be inherited by each concrete one. For example, instead of plain momentum we may want Nesterov acceleration, enabled by passing nesterov=True.
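
A minimal sketch of how that hierarchy could look, assuming Minima parameters expose NumPy-like `.data` and `.grad` attributes (those names, and everything here apart from `BaseOptimiser` and `nesterov=True`, are placeholders rather than settled API):

```python
import numpy as np

class BaseOptimiser:
    """Shared machinery; concrete optimizers override step()."""

    def __init__(self, params, lr):
        self.params = list(params)
        self.lr = lr

    def zero_grad(self):
        # Reset gradients before the next backward pass.
        for p in self.params:
            p.grad = None

    def step(self):
        raise NotImplementedError


class SGD(BaseOptimiser):
    def __init__(self, params, lr=0.01, momentum=0.0, nesterov=False):
        super().__init__(params, lr)
        if nesterov and momentum == 0.0:
            raise ValueError("nesterov=True requires momentum > 0")
        self.momentum = momentum
        self.nesterov = nesterov
        self._velocity = {}  # one momentum buffer per parameter

    def step(self):
        for p in self.params:
            g = p.grad
            if self.momentum != 0.0:
                v = self._velocity.setdefault(id(p), np.zeros_like(p.data))
                v *= self.momentum  # v <- mu * v
                v += g              # v <- v + g
                # Nesterov looks ahead along the updated velocity.
                g = g + self.momentum * v if self.nesterov else v
            p.data -= self.lr * g   # theta <- theta - lr * g
```

Keeping the velocity buffers inside the optimizer, keyed by parameter identity, mirrors PyTorch's per-parameter state and keeps BaseOptimiser itself stateless.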

Resources

@apzl mentioned this issue Feb 7, 2023
@apzl closed this as completed Feb 15, 2023