
Embedding layer needs to support Adam optimizer updates. How? #7019

Closed
typhoonzero opened this issue Dec 26, 2017 · 1 comment
@typhoonzero (Contributor)
No description provided.

@typhoonzero changed the title from "Adam op needs to support selected rows for sparse update" to "Embedding layer needs to support Adam optimizer updates. How?" on Dec 26, 2017
@typhoonzero (Contributor, Author) commented:

Basic scatter-style functors seem to be needed in Paddle for "lookup"-based networks; see: https://github.com/tensorflow/tensorflow/blob/master/tensorflow/docs_src/api_guides/python/state_ops.md#sparse-variable-updates
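To make the idea concrete, here is a minimal NumPy sketch (an illustration, not Paddle's or TensorFlow's actual API) of a scatter-subtract on an embedding table: only the rows that appeared in the batch are touched, and duplicate row ids accumulate correctly.

```python
import numpy as np

def scatter_sub(table, rows, grads, lr=0.1):
    """Subtract lr * grads from the selected rows of the embedding table.

    np.subtract.at performs an unbuffered in-place update, so a row id
    that appears several times in `rows` is updated several times.
    """
    np.subtract.at(table, rows, lr * grads)
    return table

vocab, dim = 5, 3
table = np.ones((vocab, dim))
rows = np.array([0, 2, 2])          # row 2 appears twice in the batch
grads = np.ones((3, dim))           # one gradient slice per lookup
scatter_sub(table, rows, grads)
```

Rows not referenced by the batch (1, 3, 4 here) are left untouched, which is the whole point of a selected-rows representation for large vocabularies.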

The Adam optimizer can also use sparse gradient updates, as in: https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/training/adam.py#L174
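A hedged sketch of what such a sparse Adam step looks like (mirroring the idea behind TensorFlow's `_apply_sparse`, not Paddle's implementation): the first and second moments are stored for the whole table, but only the rows present in the sparse gradient are read, updated, and applied. Row ids are assumed unique here; duplicates would need to be aggregated first.

```python
import numpy as np

def sparse_adam_step(param, m, v, rows, grad, t,
                     lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam step restricted to the rows touched by a sparse gradient.

    `rows` must contain unique ids; `grad` holds one gradient row per id.
    """
    m[rows] = b1 * m[rows] + (1 - b1) * grad          # first moment
    v[rows] = b2 * v[rows] + (1 - b2) * grad * grad   # second moment
    m_hat = m[rows] / (1 - b1 ** t)                   # bias correction
    v_hat = v[rows] / (1 - b2 ** t)
    param[rows] -= lr * m_hat / (np.sqrt(v_hat) + eps)
    return param

vocab, dim = 4, 2
param = np.zeros((vocab, dim))
m = np.zeros_like(param)
v = np.zeros_like(param)
rows = np.array([1, 3])             # only these rows were looked up
grad = np.ones((2, dim))
sparse_adam_step(param, m, v, rows, grad, t=1)
```

Note the design trade-off this implies for an embedding op: the moment tensors for untouched rows go stale relative to a dense Adam step, which is exactly the "lazy" behavior the TensorFlow sparse path accepts in exchange for not scanning the full vocabulary each step.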
