
Fix weight sharing #2866

Merged · merged 3 commits on Aug 7, 2015

Commits on Aug 7, 2015

  1. f81ed07
  2. c251b25
  3. Net: add learnable_params_ used by solvers to correctly handle shared params

     - Params now share diffs as well as data (this works because layers accumulate gradients into param diffs rather than overwriting them)
     - Any shared params that specify lr_mult's and decay_mult's are now required to match
     - TestGradientBasedSolver checks that behavior remains correct with shared weights
     jeffdonahue committed Aug 7, 2015 · d5b42bf
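
For context, weights in Caffe are shared by giving the same param name to multiple layers in the net prototxt. The snippet below is a minimal sketch, not part of this PR: the layer, blob, and param names ("fc_a", "fc_shared_w", etc.) are invented for illustration. It shows the constraint this change enforces, namely that every layer referencing a shared param must specify identical lr_mult and decay_mult values.

```
layer {
  name: "fc_a"
  type: "InnerProduct"
  bottom: "data_a"
  top: "fc_a"
  # Shared params are matched by name; lr_mult/decay_mult here must agree
  # with every other layer that uses the same param name.
  param { name: "fc_shared_w" lr_mult: 1 decay_mult: 1 }
  param { name: "fc_shared_b" lr_mult: 2 decay_mult: 0 }
  inner_product_param { num_output: 10 }
}
layer {
  name: "fc_b"
  type: "InnerProduct"
  bottom: "data_b"
  top: "fc_b"
  # Same param names and same multipliers as fc_a, as now required.
  param { name: "fc_shared_w" lr_mult: 1 decay_mult: 1 }
  param { name: "fc_shared_b" lr_mult: 2 decay_mult: 0 }
  inner_product_param { num_output: 10 }
}
```

Per the commit message, the two copies now share the diff blob as well as the data blob, so both layers' backward passes accumulate into the same gradient buffer and the solver applies a single update to the shared param through learnable_params_.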