
Revert to old default scale constraint in autoguides #2774

Merged: 2 commits merged into dev on Mar 4, 2021

Conversation

fritzo (Member) commented Mar 4, 2021

This reverts the autoguide scale constraints from #2767 to the old exp-transformed versions until @vitkl can perform experiments confirming improved performance. If the experiments succeed, we can simply revert this PR. Note that this PR preserves the ability to easily perform those experiments, via

from pyro.distributions import constraints
from pyro.infer.autoguide import AutoNormal

# old default behavior: positive constraint (exp-transformed scale)
old_autonormal = AutoNormal(model)
old_autonormal.scale_constraint = constraints.positive

# behavior introduced in #2767: softplus-transformed scale
new_autonormal = AutoNormal(model)
new_autonormal.scale_constraint = constraints.softplus_positive

See also pyro-ppl/numpyro#941
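
For context, here is a minimal sketch (not part of this PR) of what the two constraint choices mean for the guide's unconstrained scale parameter, assuming Pyro's constraint registry maps constraints.positive to an exp transform and constraints.softplus_positive to a softplus transform under biject_to:

import torch
from torch.distributions import biject_to
from pyro.distributions import constraints

unconstrained = torch.tensor([-5.0, 0.0, 5.0])

# old default: scale = exp(x), approx [0.0067, 1.0, 148.4]
exp_scale = biject_to(constraints.positive)(unconstrained)

# #2767 behavior: scale = softplus(x) = log(1 + exp(x)), approx [0.0067, 0.693, 5.007]
softplus_scale = biject_to(constraints.softplus_positive)(unconstrained)

The softplus transform grows roughly linearly for large unconstrained values, whereas exp grows exponentially; the PRs above discuss whether that difference actually improves inference.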

fehiepsi (Member) commented Mar 4, 2021

I like this decision. :)

vitkl (Contributor) commented Mar 4, 2021

I like the decision, but did any issues with this come up?

fritzo (Member, Author) commented Mar 4, 2021

did any issues with this come up?

Yes, some inference tests started failing in #2767 and pyro-ppl/numpyro#941. To get those tests to pass in #2767 I tweaked some learning parameters. Those failures are not strong evidence for either constraint, but in the absence of strong evidence we should preserve existing behavior.

fritzo (Member, Author) commented Mar 4, 2021

@fehiepsi could you merge this?

fritzo mentioned this pull request Mar 4, 2021
fehiepsi merged commit 26286aa into dev on Mar 4, 2021
fritzo deleted the revert-softplus branch on September 27, 2021