About training code #4

Open
d12306 opened this issue Aug 27, 2019 · 4 comments

d12306 commented Aug 27, 2019

@lukas-schott, thank you for your implementation. Could you also share the training code so that I can evaluate it on other datasets?

Thanks

lukas-schott (Member) commented

As our training code is intertwined with other projects, we do not plan on releasing it.

However, it should be straightforward to train one VAE for each class. All relevant hyperparameters should be in the paper. Otherwise, I am happy to answer any questions regarding the training procedure.
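
For illustration, here is a minimal, untested sketch of what per-class VAE training could look like in PyTorch. This is not the project's training code; the architecture, latent dimension, learning rate, epoch count, and loss normalization below are placeholders, and the actual hyperparameters are the ones reported in the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.utils.data import DataLoader, Subset
from torchvision import datasets, transforms

class VAE(nn.Module):
    """Toy MNIST VAE; architecture and latent size are placeholders."""
    def __init__(self, latent_dim=8):
        super().__init__()
        self.enc = nn.Sequential(nn.Flatten(), nn.Linear(784, 400), nn.ReLU())
        self.fc_mu = nn.Linear(400, latent_dim)
        self.fc_logvar = nn.Linear(400, latent_dim)
        self.dec = nn.Sequential(nn.Linear(latent_dim, 400), nn.ReLU(),
                                 nn.Linear(400, 784), nn.Sigmoid())

    def forward(self, x):
        h = self.enc(x)
        mu, logvar = self.fc_mu(h), self.fc_logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization trick
        return self.dec(z).view(-1, 1, 28, 28), mu, logvar

def neg_elbo(recon, x, mu, logvar):
    # Negative ELBO = reconstruction term + KL divergence to the unit Gaussian prior.
    recon_term = F.binary_cross_entropy(recon, x, reduction='sum')
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon_term + kl

mnist = datasets.MNIST('.', train=True, download=True, transform=transforms.ToTensor())
for cls in range(10):  # one VAE per class
    idx = (mnist.targets == cls).nonzero(as_tuple=True)[0].tolist()
    loader = DataLoader(Subset(mnist, idx), batch_size=128, shuffle=True)
    model = VAE()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)  # placeholder learning rate
    for epoch in range(20):  # placeholder number of epochs
        for x, _ in loader:
            recon, mu, logvar = model(x)
            loss = neg_elbo(recon, x, mu, logvar) / x.size(0)  # average over the batch
            opt.zero_grad()
            loss.backward()
            opt.step()
    torch.save(model.state_dict(), f'vae_class_{cls}.pt')
```

How the loss is normalized here (averaging over the batch only) is just one choice; see the discussion below about dividing by the input dimensions.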

d12306 (Author) commented Aug 29, 2019

Thanks, I will try to train it and see what happens.

JindongGu commented

@lukas-schott Thanks for your awesome project and code! I am trying to train VAEs using the code. The loss function I used is ELBOs() in abs/abs_models/loss_functions.py. However, the trained models fail to generate realistic digits. The loss function ELBOs() returns elbo / (n_ch * nx * ny), which does not look correct to me: there is no reason to divide the ELBO by the input dimensions. I would appreciate your comments.

lukas-schott (Member) commented

Dividing the ELBO by the input dimension in
https://github.com/bethgelab/AnalysisBySynthesis/blob/master/abs_models/loss_functions.py#L37
is equivalent to adjusting the learning rate and can be treated as a hyperparameter.
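
To illustrate that point: with plain SGD (no momentum or adaptive scaling), multiplying the loss by a constant c produces exactly the same parameter update as multiplying the learning rate by c instead. The toy check below (not from the repository) demonstrates this; with adaptive optimizers such as Adam the correspondence is only approximate.

```python
import torch

def one_sgd_step(loss_scale, lr):
    torch.manual_seed(0)                       # same initial weights and data in both runs
    w = torch.nn.Parameter(torch.randn(5))
    x = torch.randn(5)
    opt = torch.optim.SGD([w], lr=lr)
    loss = ((w * x).sum() ** 2) * loss_scale   # arbitrary differentiable toy loss
    loss.backward()
    opt.step()
    return w.detach().clone()

c = 1.0 / (1 * 28 * 28)                        # e.g. dividing by n_ch * nx * ny for MNIST
w_scaled_loss = one_sgd_step(loss_scale=c, lr=0.1)
w_scaled_lr   = one_sgd_step(loss_scale=1.0, lr=0.1 * c)
print(torch.allclose(w_scaled_loss, w_scaled_lr))  # prints True
```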
