
Fix tensor initialization in Optimizers Python wrappers #4975

Closed
abhinavarora opened this issue Oct 20, 2017 · 0 comments · Fixed by #5275
Comment by abhinavarora (Contributor):
All optimizers require one-time initialization of certain tensors, such as the learning rate and accumulators. Currently we initialize them using the fill_constant op, which is not correct. Once the new initialization design (#4852) is implemented, we need to change the initialization in optimizers.py accordingly.
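The pattern the issue describes can be sketched in framework-agnostic Python (NumPy stands in for framework tensors; all class and method names below are hypothetical illustrations, not Paddle's actual API). The point is that the learning-rate tensor and per-parameter accumulators should be created exactly once and then reused, rather than being re-filled by an op on every run:

```python
import numpy as np

class MomentumOptimizer:
    """Toy optimizer illustrating one-time accumulator initialization.

    Hypothetical sketch: the velocity accumulators and the
    learning-rate tensor are created once and reused across steps,
    instead of being refilled each step.
    """

    def __init__(self, learning_rate=0.01, momentum=0.9):
        self.momentum = momentum
        # One-time creation of the learning-rate tensor.
        self.lr = np.full((1,), learning_rate, dtype=np.float32)
        # Per-parameter accumulators, created lazily on first use.
        self._velocity = {}

    def _get_accumulator(self, name, param):
        # Create the accumulator only once; later steps reuse it.
        if name not in self._velocity:
            self._velocity[name] = np.zeros_like(param)
        return self._velocity[name]

    def step(self, params, grads):
        for name, p in params.items():
            v = self._get_accumulator(name, p)
            # Standard momentum update: v <- mu*v - lr*g; p <- p + v
            v *= self.momentum
            v -= self.lr * grads[name]
            p += v
```

In a graph-based framework the same idea applies: initializer ops belong in a startup program that runs once, not in the main per-step program.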
