Update acquisition cost implementation #479
```diff
@@ -111,28 +111,28 @@ def minimize(fun,
     return locs[ind_min], vals[ind_min]
 
 
-class CostFunction:
-    """Convenience class for modelling acquisition costs."""
+class AdjustmentFunction:
+    """Convenience class for modelling acquisition function adjustments."""
 
     def __init__(self, function, gradient, scale=1):
-        """Initialise CostFunction.
+        """Initialise AdjustmentFunction.
 
         Parameters
         ----------
         function : callable
-            Function that returns cost function value.
+            Function that returns adjustment function value.
         gradient : callable
-            Function that returns cost function gradient.
+            Function that returns adjustment function gradient.
        scale : float, optional
-            Cost function is multiplied with scale.
+            Adjustment function is multiplied with scale.
 
         """
         self.function = function
         self.gradient = gradient
         self.scale = scale
 
     def evaluate(self, x):
-        """Return cost function value evaluated at x.
+        """Return adjustment function value evaluated at x.
 
         Parameters
         ----------
```
```diff
@@ -148,7 +148,7 @@ def evaluate(self, x):
         return self.scale * self.function(x).reshape(n, 1)
 
     def evaluate_gradient(self, x):
-        """Return cost function gradient evaluated at x.
+        """Return adjustment function gradient evaluated at x.
 
         Parameters
         ----------
```
```diff
@@ -162,3 +162,67 @@ def evaluate_gradient(self, x):
         x = np.atleast_2d(x)
         n, input_dim = x.shape
         return self.scale * self.gradient(x).reshape(n, input_dim)
+
+
+def make_additive_acq(acquisition_class, function):
+    """Make acquisition function adjusted with an additive term.
+
+    Parameters
+    ----------
+    acquisition_class : Type[elfi.methods.bo.acquisition.AcquisitionBase]
+        Acquisition function to be adjusted.
+    function : AdjustmentFunction
+        Function added to the base acquisition function.
+
+    Returns
+    -------
+    AdditiveAcquisition
+
+    """
+    class AdditiveAcquisition(acquisition_class):
+
+        def __init__(self, model, **kwargs):
+            super().__init__(model=model, **kwargs)
+            self._func = function
+
+        def evaluate(self, theta_new, t=None):
+            return super().evaluate(theta_new, t=t) + self._func.evaluate(theta_new)
+
+        def evaluate_gradient(self, theta_new, t=None):
+            t1 = super().evaluate_gradient(theta_new, t=t)
+            t2 = self._func.evaluate_gradient(theta_new)
+            return t1 + t2
+
+    return AdditiveAcquisition
+
+
+def make_multiplicative_acq(acquisition_class, function):
+    """Make acquisition function adjusted with a multiplicative term.
+
+    Parameters
+    ----------
+    acquisition_class : Type[elfi.methods.bo.acquisition.AcquisitionBase]
+        Acquisition function to be adjusted.
+    function : AdjustmentFunction
+        Function that multiplies the base acquisition function.
+
+    Returns
+    -------
+    MultiplicativeAcquisition
+
+    """
+    class MultiplicativeAcquisition(acquisition_class):
+
+        def __init__(self, model, **kwargs):
+            super().__init__(model=model, **kwargs)
+            self._func = function
+
+        def evaluate(self, theta_new, t=None):
+            return super().evaluate(theta_new, t=t) * self._func.evaluate(theta_new)
+
+        def evaluate_gradient(self, theta_new, t=None):
+            t1 = super().evaluate_gradient(theta_new, t=t) * self._func.evaluate(theta_new)
+            t2 = super().evaluate(theta_new, t=t) * self._func.evaluate_gradient(theta_new)
+            return t1 + t2
+
+    return MultiplicativeAcquisition
```
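To show how the new pieces fit together, here is a minimal usage sketch that is not part of the PR. Only `AdjustmentFunction` and `make_additive_acq` come from the diff above; `ToyAcquisition` is a hypothetical stand-in for an ELFI acquisition class (in practice this would be a subclass of `elfi.methods.bo.acquisition.AcquisitionBase`).

```python
import numpy as np

# Hypothetical stand-in for an ELFI acquisition class; it only provides the
# evaluate/evaluate_gradient interface that the maker functions rely on.
class ToyAcquisition:
    def __init__(self, model, **kwargs):
        self.model = model

    def evaluate(self, theta_new, t=None):
        theta_new = np.atleast_2d(theta_new)
        return np.sum(theta_new ** 2, axis=1, keepdims=True)   # shape (n, 1)

    def evaluate_gradient(self, theta_new, t=None):
        return 2 * np.atleast_2d(theta_new)                    # shape (n, input_dim)

# Quadratic acquisition cost and its gradient, wrapped as an AdjustmentFunction.
cost = AdjustmentFunction(function=lambda x: np.sum(x ** 2, axis=1),
                          gradient=lambda x: 2 * x,
                          scale=0.1)

# Build an acquisition class whose value is the base value plus the scaled cost.
CostAwareAcquisition = make_additive_acq(ToyAcquisition, cost)
acq = CostAwareAcquisition(model=None)

theta = np.array([[0.5, -1.0]])
print(acq.evaluate(theta))           # base value + 0.1 * cost, shape (1, 1)
print(acq.evaluate_gradient(theta))  # base gradient + 0.1 * cost gradient, shape (1, 2)
```

The multiplicative variant is used the same way; its `evaluate_gradient` applies the product rule, returning base gradient times adjustment plus base value times adjustment gradient, which is the `t1 + t2` sum in the diff.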
Review thread on `class AdditiveAcquisition(acquisition_class):`

Reviewer: Why put …

Author: if your comment was about the new acquisition class name, i can rename both. if the comment was about the approach in general, the reason i used dynamic inheritance is that our available acquisition classes have varied implementations for the …

Reviewer: Fair enough, it's an implementation that works for this case. We just have to be careful where and when it's being called. I guess it generates a new class every time it's being called? Even with the same …

Author: yes, new class every time. i understand that creates some overhead compared to predetermined classes, but i assumed that's fine since we don't call this method more than once per inference task.

Reviewer: Yes! Just have to make sure that users won't try out silly things with it :)
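A quick illustration of the point discussed above, reusing the hypothetical `ToyAcquisition` and `cost` from the earlier sketch: each call to the maker builds a fresh class object, though every result is still a subclass of the base it was given.

```python
A1 = make_additive_acq(ToyAcquisition, cost)
A2 = make_additive_acq(ToyAcquisition, cost)

assert A1 is not A2                    # a distinct class object per call
assert issubclass(A1, ToyAcquisition)  # but both inherit from the given base
assert issubclass(A2, ToyAcquisition)
```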
Review thread on the overall approach:

Reviewer: Why have the AdjustmentFunction class and then define the maker functions later on that return classes defined within the scope of the maker functions?

Author: the maker functions were needed to make new acquisition classes that inherit from the base acquisition class that is given as input. the functions currently take as input the base acquisition class and an AdjustmentFunction instance, but i could also remove this convenience class and have make_additive_acq and make_multiplicative_acq take as input the callable that we want to use as additive or multiplicative term and the callable that returns its gradient. is that what you meant?
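For reference, a sketch of the alternative the author describes above: a maker function that takes the adjustment callable and its gradient directly, so the `AdjustmentFunction` wrapper is no longer needed. This is not code from the PR; the signature and names are illustrative only.

```python
import numpy as np

def make_additive_acq(acquisition_class, function, gradient, scale=1):
    """Illustrative variant that takes raw callables instead of an AdjustmentFunction."""

    class AdditiveAcquisition(acquisition_class):

        def evaluate(self, theta_new, t=None):
            x = np.atleast_2d(theta_new)
            n = x.shape[0]
            adjustment = scale * function(x).reshape(n, 1)
            return super().evaluate(theta_new, t=t) + adjustment

        def evaluate_gradient(self, theta_new, t=None):
            x = np.atleast_2d(theta_new)
            n, input_dim = x.shape
            adjustment = scale * gradient(x).reshape(n, input_dim)
            return super().evaluate_gradient(theta_new, t=t) + adjustment

    return AdditiveAcquisition
```

The trade-off is that the reshape-and-scale handling then lives inside each maker function rather than in the shared wrapper class.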