In your paper, under implementation details, it says you use 900 scans from Twindom and 600 scans from DeepHuman, and that you train for 9 epochs with a batch size of 3. If I'm not mistaken, that works out to 500 steps per epoch, i.e. 4,500 steps in total. You say the learning rate is reduced every 10,000 iterations, but I'm not sure how training ever gets past 4,500 steps?
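For reference, here is the back-of-envelope arithmetic I'm using (the scan counts, batch size, and epoch count are taken from the paper's implementation details; the `//` assumes partial batches are dropped):

```python
# Sanity check of the training schedule described in the paper.
scans = 900 + 600          # Twindom + DeepHuman training scans
batch_size = 3
epochs = 9

steps_per_epoch = scans // batch_size   # 1500 / 3 = 500
total_steps = steps_per_epoch * epochs  # 500 * 9 = 4500

print(steps_per_epoch, total_steps)     # 500 4500
```

So the total of 4,500 iterations never reaches the first 10,000-iteration learning-rate decay boundary, unless I'm misreading something.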
Also, from your code it seems you don't normalize the input image for the hourglass network. Is there a particular reason for that?
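To be concrete about what I mean by normalization: something like the standard per-channel standardization below is what I expected to see before the hourglass forward pass. This is only a sketch of the convention I had in mind; the ImageNet mean/std values are my assumption, not something taken from your repo.

```python
import numpy as np

# Hypothetical preprocessing sketch -- the ImageNet statistics below are
# an assumption on my part, not values used in this repository.
IMAGENET_MEAN = np.array([0.485, 0.456, 0.406], dtype=np.float32)
IMAGENET_STD = np.array([0.229, 0.224, 0.225], dtype=np.float32)

def normalize(img_uint8: np.ndarray) -> np.ndarray:
    """Scale an HxWx3 uint8 image to [0, 1], then standardize per channel."""
    img = img_uint8.astype(np.float32) / 255.0
    return (img - IMAGENET_MEAN) / IMAGENET_STD
```

If the network was simply trained on raw `[0, 255]` (or `[0, 1]`) inputs from the start, that would of course answer the question; I'm mainly asking whether the omission is deliberate.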