
Allow 2-parameter lambda_sampling #19

Merged: 3 commits, Feb 23, 2023
Conversation

chinyitan (Contributor)

No description provided.

@sibirrer (Owner) left a comment:

Looks great @chinyitan, thank you very much!
Please add tests that cover your additions in the parameter handling, and in the feature that you introduced.
Also, you can add your name to the contributors ;)

```diff
@@ -45,6 +46,8 @@ def __init__(self, kwargs_likelihood_list, cosmology, kwargs_bounds, sne_likelih
         :param lambda_ifu_distribution: string, distribution function of the lambda_ifu parameter
         :param alpha_lambda_sampling: bool, if True samples a parameter alpha_lambda, which scales lambda_mst linearly
          according to a predefined quantity of the lens
+        :param beta_lambda_sampling: bool, if True samples a parameter beta_lambda, which scales lambda_mst linearly
+         according to a predefined quantity of the lens
```
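The "scales lambda_mst linearly according to a predefined quantity of the lens" wording can be read as a linear correction per lens. A hypothetical sketch of that relation (the function and argument names below are assumptions for illustration, not identifiers from this PR):

```python
def effective_lambda_mst(lambda_mst, alpha_lambda, beta_lambda, q_alpha, q_beta):
    """Hypothetical linear re-scaling of lambda_mst.

    q_alpha and q_beta stand in for the 'predefined quantities' of the lens
    (per-lens observables); alpha_lambda and beta_lambda are the sampled
    linear coefficients.
    """
    return lambda_mst + alpha_lambda * q_alpha + beta_lambda * q_beta
```

With both coefficients at zero, the population value of lambda_mst is recovered unchanged.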
@sibirrer (Owner) commented:
can you describe a bit in more detail how and where this 'pre-defined quantity' gets described?

```diff
@@ -199,7 +203,7 @@ def check_dist(self, kwargs_lens, kwargs_kin, kwargs_source):
         return False

     def draw_lens(self, lambda_mst=1, lambda_mst_sigma=0, kappa_ext=0, kappa_ext_sigma=0, gamma_ppn=1, lambda_ifu=1,
-                  lambda_ifu_sigma=0, alpha_lambda=0):
+                  lambda_ifu_sigma=0, alpha_lambda=0, beta_lambda=0):
```
@sibirrer (Owner) commented:
add documentation for beta_lambda

```diff
@@ -89,6 +92,12 @@ def param_list(self, latex_style=False):
                 list.append(r'$\alpha_{\lambda}$')
             else:
                 list.append('alpha_lambda')
+        if self._beta_lambda_sampling is True:
```
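The branch being added follows the same pattern as the alpha_lambda block above it: append either a LaTeX label or a plain name for each sampled, non-fixed parameter. A standalone sketch of that pattern (a free function rather than the class method; keyword names are assumptions):

```python
def param_list(latex_style=False, beta_lambda_sampling=True, kwargs_fixed=()):
    """Sketch: build the list of sampled parameter labels.

    A parameter contributes a label only if its sampling flag is on and it
    is not held fixed; latex_style switches between display and plain names.
    """
    names = []
    if beta_lambda_sampling and 'beta_lambda' not in kwargs_fixed:
        names.append(r'$\beta_{\lambda}$' if latex_style else 'beta_lambda')
    return names
```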
@sibirrer (Owner) commented:
add test function

```diff
@@ -146,6 +155,12 @@ def args2kwargs(self, args, i=0):
             else:
                 kwargs['alpha_lambda'] = args[i]
             i += 1
+        if self._beta_lambda_sampling is True:
```
@sibirrer (Owner) commented:
add test function

```diff
@@ -182,4 +197,7 @@ def kwargs2args(self, kwargs):
         if self._alpha_lambda_sampling is True:
             if 'alpha_lambda' not in self._kwargs_fixed:
                 args.append(kwargs['alpha_lambda'])
+        if self._beta_lambda_sampling is True:
```
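A test of the kind requested could check that kwargs2args and args2kwargs are inverses once beta_lambda is among the sampled parameters. A minimal standalone sketch (the free functions below only mimic the pattern of the class methods in the diff; parameter names and fixed-parameter handling are assumptions):

```python
# Hypothetical sampled-parameter order, mirroring the diff's branches.
SAMPLED = ('lambda_mst', 'alpha_lambda', 'beta_lambda')

def kwargs2args(kwargs, kwargs_fixed=()):
    """Flatten a kwargs dict into an ordered args list, skipping fixed params."""
    return [kwargs[name] for name in SAMPLED if name not in kwargs_fixed]

def args2kwargs(args, kwargs_fixed=()):
    """Rebuild the kwargs dict from an ordered args list, skipping fixed params."""
    kwargs, i = {}, 0
    for name in SAMPLED:
        if name not in kwargs_fixed:
            kwargs[name] = args[i]
            i += 1
    return kwargs

def test_roundtrip():
    # beta_lambda survives a kwargs -> args -> kwargs round trip
    kwargs = {'lambda_mst': 1.0, 'alpha_lambda': 0.1, 'beta_lambda': -0.2}
    assert args2kwargs(kwargs2args(kwargs)) == kwargs
```

The round-trip property is the key invariant: every sampled, non-fixed parameter written by kwargs2args must be consumed at the same index by args2kwargs.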
@sibirrer (Owner) commented:
add test function

@sibirrer (Owner) left a comment:

Thank you @chinyitan !

@sibirrer sibirrer merged commit 2761c18 into sibirrer:main Feb 23, 2023