Feature spline regularization #1222

Merged
merged 24 commits into develop from feature_spline_regularization on Dec 7, 2023

Changes from all commits (24 commits)
96dc80a
Initial, missing regularization grad
Doresic Jul 13, 2023
67b09bc
Merge branch 'develop' into feature_spline_regularization
Doresic Jul 13, 2023
5a6e435
Merge branch 'develop' into feature_spline_regularization
Doresic Jul 13, 2023
abf0fcc
Complete gradient
Doresic Jul 14, 2023
962875d
Fix par plot, add reg to spline plot
Doresic Jul 28, 2023
537b53b
Some random changes TODO
Doresic Oct 13, 2023
960f23c
Fix par_sim_idx in spline solver
Doresic Nov 29, 2023
729542b
Fix par_sim_index, pass edata_indices
Doresic Nov 29, 2023
947c457
Remove ds_dtheta_term calculation
Doresic Nov 29, 2023
c3661da
Merge branch 'develop' into feature_spline_regularization
Doresic Nov 29, 2023
375a2ab
Remove redundancy
Doresic Nov 29, 2023
f737b07
Small cleanup
Doresic Nov 29, 2023
831d8b2
Remove a mistake
Doresic Nov 29, 2023
f53720d
Add regularization test
Doresic Nov 29, 2023
e0a5772
Quality test fix
Doresic Nov 29, 2023
9b3afa3
Spline tests fix + obj fun fix
Doresic Nov 30, 2023
7dd6188
Notebook update
Doresic Nov 30, 2023
99d0ab7
Daniel review changes
Doresic Nov 30, 2023
b927d59
Merge branch 'develop' into feature_spline_regularization
stephanmg Dec 1, 2023
0cc1ad5
Dilan&Fabian review changes
Doresic Dec 4, 2023
092de9b
Improve test coverage
Doresic Dec 4, 2023
7edc504
Merge branch 'develop' into feature_spline_regularization
Doresic Dec 6, 2023
d394f8f
Merge branch 'develop' into feature_spline_regularization
Doresic Dec 6, 2023
03e6ef0
Merge branch 'develop' into feature_spline_regularization
Doresic Dec 7, 2023
475 changes: 432 additions & 43 deletions doc/example/nonlinear_monotone.ipynb

Large diffs are not rendered by default.

9 changes: 8 additions & 1 deletion pypesto/C.py
@@ -194,7 +194,14 @@ class InnerParameterType(str, Enum):

SPLINE_RATIO = 'spline_ratio'
MIN_DIFF_FACTOR = 'min_diff_factor'
SPLINE_APPROXIMATION_OPTIONS = [SPLINE_RATIO, MIN_DIFF_FACTOR]
REGULARIZE_SPLINE = 'regularize_spline'
REGULARIZATION_FACTOR = 'regularization_factor'
SPLINE_APPROXIMATION_OPTIONS = [
SPLINE_RATIO,
MIN_DIFF_FACTOR,
REGULARIZE_SPLINE,
REGULARIZATION_FACTOR,
]

MIN_SIM_RANGE = 1e-16

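The two new constants simply extend the list of recognized spline-approximation options. As a usage sketch (not taken from this PR: the option keys are the only grounded part, the values and the eventual consumer of the dict are assumptions), the options would end up in a plain dict keyed by these constants:

```python
# Sketch only: the keys come from pypesto/C.py as extended in this PR;
# the values and the object that consumes this dict are assumed.
from pypesto.C import (
    MIN_DIFF_FACTOR,
    REGULARIZATION_FACTOR,
    REGULARIZE_SPLINE,
    SPLINE_RATIO,
)

spline_inner_options = {
    SPLINE_RATIO: 0.5,           # assumed value: ratio of spline knots to data points
    MIN_DIFF_FACTOR: 1e-3,       # assumed value: minimal difference between knot values
    REGULARIZE_SPLINE: True,     # enable the new spline regularization term
    REGULARIZATION_FACTOR: 1.0,  # assumed weight of the regularization term
}
```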
3 changes: 2 additions & 1 deletion pypesto/ensemble/ensemble.py
@@ -853,12 +853,13 @@ def _map_parameters_by_objective(
"""
# create short hands
parameter_ids_objective = predictor.amici_objective.x_names
parameter_ids_ensemble = self.x_names
parameter_ids_ensemble = list(self.x_names)
# map, and fill with `default_value` if not found and `default_value`
# is specified.
mapping = []
for parameter_id_objective in parameter_ids_objective:
if parameter_id_objective in parameter_ids_ensemble:
# Append index of parameter in ensemble.
mapping.append(
parameter_ids_ensemble.index(parameter_id_objective)
)
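The cast to list is what makes the .index() lookup below it safe: a plain Python list (or tuple) has .index(), but self.x_names may arrive as a numpy array, which does not (the diff does not say which type triggered the fix, so treat the array case as an assumption). A minimal illustration:

```python
import numpy as np

x_names = np.array(["k1", "k2", "scaling"])  # assumed: x_names arrives as an ndarray
# x_names.index("k2")                        # AttributeError: ndarray has no .index()
list(x_names).index("k2")                    # -> 1, usable for building the mapping
```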
29 changes: 22 additions & 7 deletions pypesto/hierarchical/inner_calculator_collector.py
@@ -496,18 +496,33 @@ def calculate_quantitative_result(
]
)
# calculate the gradient for the condition
gradient_for_condition = ssigma_i @ (
(
np.full(len(data_i), 1)
- (data_i - sim_i) ** 2 / sigma_i**2
)
/ sigma_i
) - sensitivities_i @ ((data_i - sim_i) / sigma_i**2)
gradient_for_condition = np.nansum(
np.multiply(
ssigma_i,
(
(
np.full(len(data_i), 1)
- (data_i - sim_i) ** 2 / sigma_i**2
)
/ sigma_i
),
),
axis=1,
) + np.nansum(
np.multiply(
sensitivities_i, ((sim_i - data_i) / sigma_i**2)
),
axis=1,
)

# add gradient to correct index of snllh
for par_sim, par_opt in condition_map_sim_var.items():
if not isinstance(par_opt, str):
continue

if par_opt not in par_opt_ids:
continue

par_opt_idx = par_opt_ids.index(par_opt)
par_sim_idx = par_sim_ids.index(par_sim)
par_edata_idx = (
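Both the old and the new code compute the gradient of the per-condition Gaussian negative log-likelihood, sum_k [log sigma_k + (data_k - sim_k)^2 / (2 sigma_k^2)]. The rewrite replaces the matrix products with elementwise products plus np.nansum over the data axis, presumably so that NaN entries from unmeasured data points are skipped rather than propagated into the whole gradient, and flips the sign convention of the second term (+ with (sim - data) instead of - with (data - sim)). A small self-contained consistency check of that formula against a finite difference, using a scalar-parameter toy model (this is not pyPESTO code; all names and values below are made up):

```python
import numpy as np

# Toy check of the per-condition gradient used above:
#   d/dtheta sum_k [ log sigma_k + (data_k - sim_k)^2 / (2 sigma_k^2) ]
#     = sum_k [ ssigma_k * (1 - (data_k - sim_k)^2 / sigma_k^2) / sigma_k
#               + ssim_k  * (sim_k - data_k) / sigma_k^2 ]
rng = np.random.default_rng(0)
data = rng.normal(size=5)
t = np.arange(1.0, 6.0)

def sim(theta):
    return np.sin(theta) * t            # toy model output

def sigma(theta):
    return 0.5 + 0.1 * theta**2 * np.ones_like(t)  # theta-dependent noise

def nllh(theta):
    y, s = sim(theta), sigma(theta)
    return np.sum(np.log(s) + (data - y) ** 2 / (2 * s**2))

theta = 0.7
y, s = sim(theta), sigma(theta)
ssim = np.cos(theta) * t                # dsim/dtheta (analytic, toy model)
ssigma = 0.2 * theta * np.ones_like(t)  # dsigma/dtheta (analytic, toy model)

analytic = np.nansum(ssigma * (1 - (data - y) ** 2 / s**2) / s) + np.nansum(
    ssim * (y - data) / s**2
)
eps = 1e-6
numeric = (nllh(theta + eps) - nllh(theta - eps)) / (2 * eps)
assert np.isclose(analytic, numeric, rtol=1e-5)
```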
1 change: 1 addition & 0 deletions pypesto/hierarchical/spline_approximation/calculator.py
@@ -228,6 +228,7 @@ def __call__(
parameter_mapping=parameter_mapping,
par_opt_ids=x_ids,
par_sim_ids=amici_model.getParameterIds(),
par_edatas_indices=[edata.plist for edata in edatas],
snllh=snllh,
)

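The single added line forwards each condition's edata.plist to the quantitative calculator. In AMICI, plist is the per-condition list of model parameter indices for which sensitivities are computed, so column j of that condition's sensitivity output refers to model parameter plist[j] rather than j; passing the plists presumably lets the calculator (see the par_edata_idx lookup in the hunk above) map gradient entries back to the correct global indices. A purely illustrative sketch of that mapping (ids and indices are made up):

```python
# Illustration only: how a per-condition plist maps local sensitivity
# columns back to global model parameter indices (values are made up).
par_sim_ids = ["k1", "k2", "scaling", "offset"]
par_edatas_indices = [[0, 2], [1, 2, 3]]  # e.g. [list(edata.plist) for edata in edatas]

condition, local_col = 0, 1  # column 1 of condition 0's sensitivity matrix
global_idx = par_edatas_indices[condition][local_col]
print(par_sim_ids[global_idx])  # -> "scaling"
```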