
Add conditional inverse and compose TransformModules #3185

Merged: 10 commits into dev on Mar 4, 2023

Conversation


@eb8680 (Member) commented Feb 23, 2023

This PR adds conditional analogues of ComposeTransformModule and InverseTransform to pyro.distributions.conditional. I've found these helpful in building up learnable conditional TransformedDistributions in Pyro models.
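As a concrete illustration (not taken from the PR itself), here is a minimal usage sketch. It assumes ConditionalComposeTransformModule accepts a list of conditional transforms, exposes their parameters as an nn.Module, and supports .condition(context) like the other conditional transforms in pyro.distributions:

```python
# Minimal sketch, assuming the constructor and .condition() behavior described above.
import torch
import pyro.distributions as dist
import pyro.distributions.transforms as T
from pyro.distributions.conditional import ConditionalComposeTransformModule

input_dim, context_dim = 2, 3

# Compose two learnable conditional transforms into one conditional TransformModule
flow = ConditionalComposeTransformModule([
    T.conditional_affine_coupling(input_dim, context_dim),
    T.conditional_affine_coupling(input_dim, context_dim),
])

base = dist.Normal(torch.zeros(input_dim), torch.ones(input_dim)).to_event(1)
cond_dist = dist.ConditionalTransformedDistribution(base, [flow])

context = torch.randn(context_dim)
x = cond_dist.condition(context).sample()  # sample conditioned on context
params = list(flow.parameters())           # learnable, assuming it is an nn.Module
```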

I also fixed a couple of small bugs in ComposeTransformModule that affected interoperability of Transforms and TransformModules.
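For context, a sketch of the interoperability case those fixes target: composing a stateless torch.distributions Transform with a learnable TransformModule in a single ComposeTransformModule. The specific bugs aren't detailed here, and this assumes the fixed class accepts plain Transforms alongside TransformModules:

```python
import torch
from torch.distributions.transforms import ExpTransform
from pyro.nn import AutoRegressiveNN
from pyro.distributions.transforms import AffineAutoregressive, ComposeTransformModule

# A learnable TransformModule composed with a stateless torch Transform;
# the composition is itself an nn.Module whose parameters can be optimized.
arn = AutoRegressiveNN(2, [16])
composed = ComposeTransformModule([AffineAutoregressive(arn), ExpTransform()])

y = composed(torch.randn(2))  # forward pass through both parts
optimizer = torch.optim.Adam(composed.parameters(), lr=1e-3)
```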

Tested:

  • Unit tests for ConditionalComposeTransformModule and _ConditionalInverseTransformModule
  • Tests for the new transforms also exercise the bugfixes

@eb8680 eb8680 requested a review from fritzo February 23, 2023 01:32
@eb8680 eb8680 requested review from fehiepsi and removed request for fritzo March 2, 2023 17:09
@fehiepsi previously approved these changes Mar 3, 2023

@fehiepsi (Member) left a comment


LGTM pending the cache_size check.


```python
def __hash__(self):
    return super(torch.nn.Module, self).__hash__()

def with_cache(self, cache_size=1):
    if cache_size == 0:
```
@fehiepsi commented on this line:

I guess this is cache_size == self._cache_size?
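Applying that suggestion, the check would presumably follow the convention of torch.distributions.transforms.Transform.with_cache, returning self when the requested cache size already matches. A sketch; the rest of the method body is elided in the excerpt above:

```python
def with_cache(self, cache_size=1):
    # Suggested fix: compare against the current cache size rather than 0,
    # mirroring torch's Transform.with_cache.
    if cache_size == self._cache_size:
        return self
    ...  # otherwise rebuild the transform with the new cache_size (elided above)
```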

@fehiepsi merged commit 9afb089 into dev Mar 4, 2023
@fehiepsi commented Mar 4, 2023

Thanks, Eli!
