
auto_optim enhancement #2169

Merged: 8 commits into pytorch:master, Aug 23, 2021
Conversation

@Chandan-h-509 (Contributor) commented Aug 20, 2021

Fixes #2168

Description: Add a `**kwargs` argument to `auto_optim` and pass it through to `DistributedOptimizer`.

Check list:

  • New tests are added (if a new feature is added)
  • New doc strings: description and/or example code are in RST format
  • Documentation is updated (if required)
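As a minimal sketch of what this change does (not the actual ignite source): `auto_optim` gains a `**kwargs` parameter that is forwarded to the distributed optimizer wrapper, so options such as Horovod's `backward_passes_per_step` (the knob used for gradient accumulation, per the linked feature request #2168) can be passed through. The `DistributedOptimizer` class below is a stand-in stub for illustration only.

```python
class DistributedOptimizer:
    """Stand-in stub for a Horovod-style distributed optimizer wrapper."""

    def __init__(self, optimizer, backward_passes_per_step=1):
        self.optimizer = optimizer
        self.backward_passes_per_step = backward_passes_per_step


def auto_optim(optimizer, **kwargs):
    # Before this PR: the wrapper was built as DistributedOptimizer(optimizer),
    # with no way to pass options such as backward_passes_per_step.
    # After this PR: extra keyword arguments are forwarded to the wrapper.
    return DistributedOptimizer(optimizer, **kwargs)


# Hypothetical usage: request gradient accumulation over 4 backward passes.
opt = auto_optim(object(), backward_passes_per_step=4)
print(opt.backward_passes_per_step)  # 4
```

In the real API this corresponds to calling `idist.auto_optim(optimizer, backward_passes_per_step=4)` under a Horovod backend; with other backends the extra kwargs simply would not apply.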

@github-actions github-actions bot added the module: distributed Distributed module label Aug 20, 2021
@sdesrozis (Contributor)

@Chandan-h-509 Thank you! It would be nice to have a test of this new feature.

@Chandan-h-509 (Contributor, Author)

@sdesrozis I have added the test case.
Please review.
Thank you!

@sdesrozis (Contributor) commented Aug 22, 2021

@Chandan-h-509 a minor comment to address and a format error to fix. See https://github.com/pytorch/ignite/pull/2169/checks?check_run_id=3389112372

Otherwise, it looks good 👍🏻

@Chandan-h-509 (Contributor, Author)

@sdesrozis I have made the final changes.
Please review.
Thank you.

@sdesrozis (Contributor) left a review comment

@Chandan-h-509 Thank you! LGTM!

@vfdev-5 (Collaborator) commented Aug 22, 2021

cc @sandylaker for visibility

@Chandan-h-509 (Contributor, Author)

@vfdev-5 I have made the new change as requested.
Kindly review.

@vfdev-5 merged commit df6ba1d into pytorch:master on Aug 23, 2021
Labels
module: distributed Distributed module

Successfully merging this pull request may close these issues.

Feature Request: add an additional argument to auto_optim to allow for gradient accumulation
3 participants