No grad_fn for "self.loss_G_chamfer2" #22

Open
gerwang opened this issue Sep 6, 2021 · 0 comments

Hello,
Thanks for your great work! I'm very interested in your distance transform loss. However, when I try to test the loss separately, I find that self.loss_G_chamfer2 has no grad_fn:

self.loss_G_chamfer2 = (dt1gt[(fake_B_gray<0)&(fake_B_gray_line1<0)].sum() + dt2gt[(fake_B_gray>=0)&(fake_B_gray_line2>=0)].sum()) / bs * self.opt.lambda_chamfer2
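For reference, here is a minimal standalone sketch of how I tested it, assuming dt1gt/dt2gt are precomputed distance transforms of the ground-truth label (so they don't require grad) and the fake_B_gray* tensors come from the generator; the shapes, batch size, and lambda value below are just placeholders:

```python
import torch

bs = 2
# stand-ins for the generator outputs (require grad)
fake_B_gray = torch.randn(bs, 1, 8, 8, requires_grad=True)
fake_B_gray_line1 = torch.randn(bs, 1, 8, 8, requires_grad=True)
fake_B_gray_line2 = torch.randn(bs, 1, 8, 8, requires_grad=True)
# stand-ins for the ground-truth distance transforms (no grad)
dt1gt = torch.rand(bs, 1, 8, 8)
dt2gt = torch.rand(bs, 1, 8, 8)
lambda_chamfer2 = 1.0

loss_G_chamfer2 = (dt1gt[(fake_B_gray < 0) & (fake_B_gray_line1 < 0)].sum()
                   + dt2gt[(fake_B_gray >= 0) & (fake_B_gray_line2 >= 0)].sum()) / bs * lambda_chamfer2

print(loss_G_chamfer2.grad_fn)        # None
print(loss_G_chamfer2.requires_grad)  # False
```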

Since real_B_gray is the input label and doesn't require grad, I believe self.loss_G_chamfer2 should have no grad_fn. So what is the purpose of calculating it? Or is it redundant?
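As far as I understand, this is expected autograd behaviour: the comparisons produce bool masks that carry no gradient, and indexing a tensor that doesn't require grad gives a result with no grad_fn, so nothing from the generator ends up in the graph. A tiny illustration (the names here are placeholders, not from your code):

```python
import torch

x = torch.randn(4, requires_grad=True)  # stand-in for fake_B_gray
dt_gt = torch.rand(4)                   # stand-in for dt1gt / dt2gt (requires_grad=False)

mask = x < 0          # comparison ops return a bool tensor; no gradient flows through the mask
picked = dt_gt[mask]  # the indexed tensor doesn't require grad, so the result has no grad_fn

print(picked.grad_fn)         # None
print(x.abs()[mask].grad_fn)  # a backward node, since x.abs() requires grad
```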
