How can I use a trainable parameter from one custom layer in another custom layer? #941

Answered by avik-pal
AzamatB asked this question in Q&A

Yes, this would work if you are okay with having `importance_weights` in the decoder's parameters. For `Lux.Experimental.share_parameters` to work correctly, you might want to initialize the decoder with its own `importance_weights` and later link them, as you did.
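
For concreteness, here is a minimal sketch of what such layers could look like. The layer internals below are assumptions (the question's actual `ImportanceScaling` and `Decoder` code is not shown in this thread): the decoder is given its own `importance_weights` entry of the same shape purely so the two entries can be linked afterwards. The abstract layer type is `AbstractLuxLayer` in Lux v1; older releases call it `AbstractExplicitLayer`.

```julia
using Lux, Random

# Hypothetical scaling layer carrying a trainable importance_weights vector.
struct ImportanceScaling <: Lux.AbstractLuxLayer
    dim::Int
end

Lux.initialparameters(rng::AbstractRNG, l::ImportanceScaling) =
    (; importance_weights=randn(rng, Float32, l.dim))

(::ImportanceScaling)(x, ps, st) = (x .* ps.importance_weights, st)

# Hypothetical decoder that also holds an importance_weights entry of the
# same shape, so that share_parameters can tie the two entries together.
struct Decoder <: Lux.AbstractLuxLayer
    in_dim::Int
    out_dim::Int
end

Lux.initialparameters(rng::AbstractRNG, l::Decoder) =
    (; weight=randn(rng, Float32, l.out_dim, l.in_dim),
       importance_weights=randn(rng, Float32, l.in_dim))

(::Decoder)(x, ps, st) = (ps.weight * (x .* ps.importance_weights), st)
```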

One pointer to make your debugging easier (if needed): construct the Chain with named layers, as in `Chain(; importance_scaling=ImportanceScaling(10), decoder=Decoder(10, 10))`. The sharing then becomes `Lux.Experimental.share_parameters(ps, (("importance_scaling.importance_weights", "decoder.importance_weights"),))`.
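
Putting it together, a sketch under the same assumed layer definitions as above. Note that `share_parameters` requires the linked entries to have identical structure (here, two length-10 `Float32` vectors), which is why the decoder is initialized with its own `importance_weights` first:

```julia
# Named Chain, so the parameter paths below are stable, readable strings.
model = Chain(; importance_scaling=ImportanceScaling(10), decoder=Decoder(10, 10))

rng = Random.default_rng()
ps, st = Lux.setup(rng, model)

# Link the two entries: after this call both paths hold the same shared
# parameter values, so they stay in sync.
ps = Lux.Experimental.share_parameters(
    ps, (("importance_scaling.importance_weights", "decoder.importance_weights"),))

# Forward pass through both layers with the shared parameters.
x = randn(rng, Float32, 10)
y, st = model(x, ps, st)
```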
