Updates to VAE (and ops) to enable variable batch sizes #26
This is a great collection of generative models for TensorFlow, all nicely wrapped in a common class interface. I'd like to use it as a basis for ongoing work I'm migrating to TensorFlow. I'm interested in this code not only for testing MNIST models, but also for generating a series of reusable, shareable reference models trained on several other datasets.
So as a first proposed change, I'd like to decouple batch_size from the model definition and make it a runtime variable by using a placeholder. This allows:
I've done a quick version of this for the VAE model and verified that the model still works (at least on the latest TensorFlow) and that it enables (1) and (2) above. If you are open to the spirit of this change, I'm happy to rework the implementation if you'd like it cleaned up further.
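For illustration, here is a minimal sketch of the idea (not the repo's actual VAE code, and the layer names and sizes are hypothetical): instead of baking a fixed batch_size into the graph, the batch dimension of the input placeholder is declared as `None`, so any batch size can be fed at run time. The sketch uses the TF1-compat API since the change predates eager execution.

```python
import numpy as np
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# Batch dimension is None: resolved at feed time, not graph-build time.
x = tf.placeholder(tf.float32, shape=[None, 784], name="x")

# A single hypothetical encoder layer, just to show shapes propagate.
W = tf.get_variable("W", shape=[784, 32],
                    initializer=tf.glorot_uniform_initializer())
b = tf.get_variable("b", shape=[32], initializer=tf.zeros_initializer())
h = tf.tanh(tf.matmul(x, W) + b)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # The same graph now accepts any batch size at run time.
    for bs in (1, 7, 128):
        out = sess.run(h, feed_dict={x: np.zeros((bs, 784), np.float32)})
        assert out.shape == (bs, 32)
```

The key design point is that ops downstream of the placeholder must not hard-code the batch dimension either (e.g. using `tf.shape(x)[0]` rather than a stored `batch_size` constant wherever the batch size is needed dynamically).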