added return_grad for all types of rnnt loss #29
Conversation
Can you also fix this issue (#25 (comment)), thanks!
Force-pushed the branch from b48d47e to 46eb2b0.
I fixed #25 (comment). Summary of the changes:
@durson I have added GitHub Actions to this repo; could you sync the PR with the latest master? Thanks!
Merged the latest master into the fork and the tests are passing.
Thanks! Merging! It would be very nice if you could also make a PR to https://github.com/k2-fsa/k2 (k2/python/k2/rnnt_loss.py).
Grad tensors can be useful for the full (ordinary and pruned) RNN-T losses.
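A minimal sketch of how the `return_grad` flag added in this PR might be used. It assumes a simple RNN-T loss function (here called `k2.rnnt_loss_simple`) that accepts `return_grad=True` and then returns `(loss, (px_grad, py_grad))`; the exact function and argument names in `k2/python/k2/rnnt_loss.py` may differ, and the tensor shapes below are illustrative only.

```python
import torch
import k2  # assumed to expose rnnt_loss_simple with a return_grad flag

B, S, T, C = 2, 5, 10, 30  # batch, max symbol length, frames, vocab size

lm = torch.randn(B, S + 1, C)          # prediction-network ("lm") output
am = torch.randn(B, T, C)              # encoder ("am") output
symbols = torch.randint(1, C, (B, S))  # reference symbol IDs
boundary = torch.tensor([[0, 0, S, T]] * B, dtype=torch.int64)

# With return_grad=True the call is assumed to also return the
# occupation-probability gradients (px_grad, py_grad) alongside the loss.
loss, (px_grad, py_grad) = k2.rnnt_loss_simple(
    lm=lm,
    am=am,
    symbols=symbols,
    termination_symbol=0,
    boundary=boundary,
    return_grad=True,
)

# These grad tensors are the quantities the pruned loss builds on
# (e.g. for computing pruning ranges per symbol).
print(loss, px_grad.shape, py_grad.shape)
```

The design choice here is to return the gradients explicitly rather than forcing a separate backward pass, so callers of both the ordinary and the pruned losses can reuse them directly.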