Add broadcasting semantics to EmpiricalMarginal and TracePosterior #825

Open
eb8680 opened this issue Feb 27, 2018 · 7 comments
eb8680 (Member) commented Feb 27, 2018

Along the lines of #791, we should allow global independence dimensions in Marginal and TracePosterior.
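
For illustration only, here is a minimal sketch of what a "global independence dimension" could look like at the model level: an outer `pyro.plate` over particles broadcasts every sample statement, so one vectorized execution yields a batch of independent draws from which an empirical marginal could be built, rather than one trace per sample. The `vectorized_model` wrapper and the `num_particles` name are hypothetical and not part of any proposed API.

```python
import torch
import pyro
import pyro.distributions as dist
from pyro import poutine

def model(data):
    # simple model: one global latent, conditionally independent observations
    loc = pyro.sample("loc", dist.Normal(0.0, 1.0))
    with pyro.plate("data", len(data)):
        pyro.sample("obs", dist.Normal(loc, 1.0), obs=data)

def vectorized_model(data, num_particles=100):
    # hypothetical wrapper: a global plate over particles broadcasts every
    # sample site, so a single trace carries num_particles independent draws
    with pyro.plate("particles", num_particles, dim=-2):
        model(data)

data = torch.randn(10)
trace = poutine.trace(vectorized_model).get_trace(data)
print(trace.nodes["loc"]["value"].shape)  # torch.Size([100, 1])
```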

@eb8680 eb8680 self-assigned this Feb 27, 2018
ngoodman (Collaborator) commented:
Great idea. (Related: make Importance parallel.)

This is possible only when the underlying computation is batchable, and it might sometimes be hard to tell whether that is the case. Is it possible to dynamically detect a failure of batching and bail out to the serial method?
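
One possible shape of that fallback, sketched under the same assumptions and imports as the snippet above (the helper name and the choice of exceptions to catch are mine, not an agreed design): attempt one vectorized run, and if the model turns out not to be batchable, which typically surfaces as a shape or broadcast error, fall back to tracing it serially.

```python
def get_traces(model, *args, num_particles=100):
    """Hypothetical helper: try a single vectorized run, else go serial."""
    try:
        # vectorized path: one trace whose sample values carry a leading
        # particle dimension introduced by the outer plate
        with pyro.plate("particles", num_particles, dim=-2):
            return [poutine.trace(model).get_trace(*args)]
    except (RuntimeError, ValueError):
        # serial path: the model isn't batchable, so trace it once per particle
        return [poutine.trace(model).get_trace(*args)
                for _ in range(num_particles)]
```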

fritzo (Member) commented Apr 22, 2018

Search was removed in #1044. As I understand it, Search would be reimplemented as part of this issue.

fritzo (Member) commented Jul 2, 2018

@eb8680 Does this issue also cover parallelization / batch support for SVI? I'd like to implement batched inference to expose our EM solvers (BP + Newton).

eb8680 (Member, Author) commented Jul 3, 2018

Does this issue also cover parallelization / batch support for SVI?

What did you have in mind?

fritzo (Member) commented Jul 3, 2018

@eb8680 What did you have in mind?

I've written up some details in #1213.

@eb8680 eb8680 changed the title Parallelize computation of Marginal and Posterior Add broadcasting semantics to EmpiricalMarginal and TracePosterior Jul 13, 2018
fritzo (Member) commented Nov 13, 2018

@eb8680 is this still on track for our 0.3 release with PyTorch 1.0?

@eb8680 eb8680 removed this from the 0.3.0 release milestone Nov 14, 2018
eb8680 (Member, Author) commented Nov 14, 2018

is this still on track for our 0.3 release with PyTorch 1.0?

We don't have an immediate use case, so probably not.
