allow for batched inference with observational SIR model, add test for batched inference #566
The SIR observational model wasn't general enough to guide the user in building models for inference with batched data. It made a gesture in this direction by allowing batched data to be passed in, but if a model of this type is passed to SVI inference, the log prob shapes are wrong. To make them correct, we also need to introduce a plate of the appropriate shape.
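As a rough illustration (not the repository's actual code: the names, the Poisson likelihood, and the tensor layout are assumptions), this is the kind of batched-but-unplated observation model that triggers the problem:

```python
import pyro
import pyro.distributions as dist


def sir_observation_no_plate(solution: dict, data: dict) -> None:
    # Assumed layout: solution["I"] and data["I_obs"] both have shape
    # (num_datasets, num_timepoints), i.e. a leading batch of datasets.
    # The statement broadcasts, so the model runs forward, but nothing tells
    # Pyro that the leading dimension indexes independent datasets, and (as
    # described above) the log prob shapes come out wrong under SVI.
    pyro.sample("I_obs", dist.Poisson(solution["I"]), obs=data["I_obs"])
```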
To illustrate this and to ensure proper functionality, I added `dynamical/test_batched_inference.py`. For illustration, commenting out the lines introducing the plate in that test model (and un-indenting the `pyro.sample` statements) leads to the type of log prob shape error in question. Accordingly, I revised the dynamical systems notebook: the SIR observational model now includes a plate of the appropriate shape over the batched data.
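The notebook's revised model isn't reproduced here; a minimal sketch of the plated pattern, with the same assumed names and likelihood as above, looks like this:

```python
import pyro
import pyro.distributions as dist


def sir_observation_model(solution: dict, data: dict) -> None:
    # Same assumed layout: both tensors have shape (num_datasets, num_timepoints).
    num_datasets = data["I_obs"].shape[0]
    # The plate marks the leading dimension as independent datasets, which is
    # what makes the log prob shapes line up under SVI. (The trailing time
    # dimension could also get its own plate or be folded into the event
    # shape; the dataset plate is the piece this PR is about.)
    with pyro.plate("datasets", num_datasets, dim=-2):
        pyro.sample("I_obs", dist.Poisson(solution["I"]), obs=data["I_obs"])
```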
Otherwise, the changes are small: `plt.show()` was added at a few locations to avoid redundant printing of object names before plotting, and prediction was partially parallelized.
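The parallelization change itself isn't shown here; in Pyro, vectorized posterior prediction is typically enabled with `Predictive(..., parallel=True)`, which may be what "partial prediction parallelization" refers to. A self-contained toy sketch (the model and guide are stand-ins, not the notebook's):

```python
import pyro
import pyro.distributions as dist
from pyro.infer import Predictive
from pyro.infer.autoguide import AutoNormal


def model(obs=None):
    # Toy stand-in for the notebook's fitted model.
    rate = pyro.sample("rate", dist.LogNormal(0.0, 1.0))
    with pyro.plate("time", 10):
        return pyro.sample("counts", dist.Poisson(rate), obs=obs)


guide = AutoNormal(model)
guide()  # lazily initialize the guide's parameters

# parallel=True draws the posterior-predictive samples in one vectorized pass
# over an extra particle plate instead of looping num_samples times in Python;
# it requires the model's shapes to broadcast cleanly over that extra dimension.
predictive = Predictive(model, guide=guide, num_samples=100, parallel=True)
samples = predictive()
print(samples["counts"].shape)  # leading dimension of size num_samples
```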
There seems to be a small shape-related bug in the notebook that leads to a runtime error with parallelization at a few locations; it remains unfixed.
The whole notebook has been re-run.