Add broadcasting semantics to EmpiricalMarginal and TracePosterior #825
Comments
Great idea. (Related: make importance sampling parallel.) This is possible only when the underlying computation is batchable, and it can be hard to tell whether that is the case. Is it possible to dynamically detect a failure of batching and bail out to the serial method?
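The "bail out to the serial method" idea above can be sketched in plain PyTorch. This is a hypothetical illustration, not Pyro's implementation: `run_batched_or_serial` and `scalar_only` are made-up names, and the sketch assumes a failed batched call raises a `RuntimeError`.

```python
import torch

def run_batched_or_serial(fn, inputs):
    """Try one vectorized call over the leading batch dimension;
    fall back to a per-element Python loop if batching fails.
    (Hypothetical helper for illustration only.)"""
    try:
        # Fast path: assume fn broadcasts over dim 0 of `inputs`.
        return fn(inputs)
    except RuntimeError:
        # Batching failed: apply fn serially and stack the results.
        return torch.stack([fn(x) for x in inputs])

# A deliberately non-batchable function: accepts only 0-dim tensors.
def scalar_only(x):
    if x.dim() != 0:
        raise RuntimeError("not batchable")
    return x * 2

batch = torch.tensor([1.0, 2.0, 3.0])
out = run_batched_or_serial(scalar_only, batch)  # falls back to the loop
```

A genuinely batchable function (e.g. `lambda t: t + 1`) would take the fast path and never enter the loop, so the two paths produce the same result only when `fn` is elementwise.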
@eb8680 Does this issue also cover parallelization / batch support for …
What did you have in mind?
@eb8680 Is this still on track for our 0.3 release with PyTorch 1.0?
We don't have an immediate use case, so probably not.
Along the lines of #791, we should allow global independence dimensions in Marginal and TracePosterior.
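The "global independence dimension" idea can be illustrated with a minimal PyTorch sketch, assuming the convention that the leftmost tensor dimension is reserved for independent posterior samples; this is an illustration of the concept, not Pyro's actual EmpiricalMarginal API.

```python
import torch

num_samples = 1000
torch.manual_seed(0)

# Serial style: one draw per Python-level iteration (one "trace" each).
serial = torch.stack([torch.randn(()) for _ in range(num_samples)])

# Batched style: the same kind of draws, vectorized along dim 0,
# which acts as the global independence (samples) dimension.
batched = torch.randn(num_samples)

# An empirical marginal statistic reduces over that dimension,
# leaving the event shape (here: scalar) intact.
mean = batched.mean(dim=0)
```

With broadcasting semantics, downstream computations can treat the samples dimension uniformly and reduce over it once, instead of looping over traces in Python.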