Behavior when not specifying the blobs_lr in a layer #100
Comments
I think if blobs_lr is 0, one can also set force_backward (a member of NetParameter) to true to force backpropagation in the training net.
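A sketch of this suggestion in the old-style Caffe prototxt syntax of that era (the net and layer names and the layer type are illustrative, not taken from this thread):

```protobuf
# NetParameter-level flag: force backward computation even through
# layers whose parameters have blobs_lr: 0 (illustrative sketch)
name: "train_net"
force_backward: true
layers {
  name: "conv1"        # hypothetical layer name
  type: CONVOLUTION
  blobs_lr: 0          # weights frozen, but backward still runs
  blobs_lr: 0          # bias frozen as well
}
```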
@tdomhan That is indeed a bad design. Will change the default to 1.
Would that affect the Test phase? Or finetuning? Sergio
My bad, fixing things in afternoon drowsy mode is apparently not the best. Just realized that the current fix has one caveat: when a layer has no trainable parameters, it will always trigger need_backward. We will need to add a check for whether the layer has parameters to train. Reopened, and I will fix later.
Perhaps we should have a
Fixed the bug in #103. @sguada The testing phase will not be affected, since backward won't actually be carried out. Finetuning will not be affected either (if we set blobs_lr to 0, we still skip backpropagation, as intended). I am going to abuse my power a little bit and simply merge that pull request...
Is it intended that the default behavior, when not specifying blobs_lr in a layer, is to not use backpropagation on that layer?
I just spent a couple of hours trying to figure out why my network wasn't working until I realized that this was the cause.
I personally think this is very dangerous behavior, and the default should be to set the learning rate multiplier for each blob to 1. Backpropagation should only be deactivated if someone explicitly sets blobs_lr to 0.
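For reference, a hedged sketch of how blobs_lr was specified per layer in the prototxt syntax of that era (the layer name, type, and multiplier values are illustrative):

```protobuf
layers {
  name: "conv1"        # hypothetical layer name
  type: CONVOLUTION
  blobs_lr: 1          # learning-rate multiplier for the weights
  blobs_lr: 2          # learning-rate multiplier for the bias
  # Omitting blobs_lr entirely is the case discussed here: the
  # default multiplier was 0, silently disabling backpropagation.
}
```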