Independent ask and tell #263
Something like the following should work:

```python
# First, tell the optimizer about previously obtained (parameters, value) pairs.
for params, value in somehow_before_obtained_results:
    candidate = optimizer.create_candidate.from_call(*params)
    # may fail for some optimizers which do not support tell without ask (but most are OK)
    optimizer.tell(candidate, value)

# Then run the usual ask/tell loop.
for _ in range(optimizer.budget):
    x = optimizer.ask()
    value = square(*x.args, **x.kwargs)
    optimizer.tell(x, value)

recommendation = optimizer.provide_recommendation()
```

I suppose your question mostly concerned the …
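Note that `create_candidate.from_call` belongs to the pre-0.4.0 API; under nevergrad 0.4+ (the version mentioned later in this thread), the equivalent is presumably built from the parametrization. A minimal sketch of that assumption:

```python
# Assumed 0.4+ equivalent: spawn a candidate from the parametrization, then tell it.
# `somehow_before_obtained_results` is the same illustrative iterable as above,
# here holding ((args, kwargs), value) entries.
for (args, kwargs), value in somehow_before_obtained_results:
    candidate = optimizer.parametrization.spawn_child(new_value=(args, kwargs))
    optimizer.tell(candidate, value)
```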
Well, I actually do send them =). Thanks a lot!
All BO-based algorithms in nevergrad use BayesianOptimization under the hood, so you will probably have the same problem if you use them, but most other optimizers should be fine.
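A sketch of that caveat in code; the broad `except` is illustrative, since the thread does not say which exception BO-based optimizers raise for non-asked candidates:

```python
# Guard a non-asked tell for optimizers that may not support it.
try:
    optimizer.tell(candidate, value)
except Exception as err:
    # BO-based optimizers (wrapping the BayesianOptimization package) may end up here.
    print(f"this optimizer seems not to support tell without ask: {err}")
```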
So, I've updated nevergrad to 0.4.0.
Hi,

Does that answer your question?
Thanks a lot, now it's clear. As far as I can guess from the changelog, the point-storage mechanics also changed recently: #571. If that is right, there is one more uncertain part.

Could you please tell me whether I am right about the following statements:

…
Yes, but not all optimizers support tells that were not asked. (I'll provide another option at the end.)

Somehow: some optimizers (like most standard variants of …) …

Indeed, we won't keep track of them. Some non-standard variants of … …

Indeed, they are dropped for most optimizers. However, now that you mention it, …

Another option to consider: set up your parametrization to have the initialization you want. With the new parametrization you can set the … (see the sketch below).

Does that clarify it for you? If you see specific areas of the documentation I could clarify, don't hesitate to point them out. I don't know how to integrate all this information in a clear way :s
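A minimal sketch of that "set the initialization in the parametrization" option, assuming nevergrad 0.4+; the objective, names, bounds, and init values below are illustrative, not from the thread:

```python
import nevergrad as ng

def square(x, y=12.0):
    # Hypothetical objective, just for the sketch.
    return (x - 0.5) ** 2 + (y - 12.0) ** 2

# Encode the desired starting point directly in the parametrization.
parametrization = ng.p.Instrumentation(
    ng.p.Scalar(init=0.3),  # the search starts at x=0.3
    y=ng.p.Scalar(init=10.0).set_bounds(lower=0.0, upper=24.0),  # bounded, explicit init
)
optimizer = ng.optimizers.OnePlusOne(parametrization=parametrization, budget=100)
recommendation = optimizer.minimize(square)
print(recommendation.value)  # (args, kwargs) of the best point found
```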
A lot of useful info, thank you!
I'm doing it in the following manner:

…

Do you advise me to do … after? It throws …
I find your last answer (#263 (comment)) is a good candidate to be transferred in full to https://facebookresearch.github.io/nevergrad/optimization.html#telling-non-asked-points-or-suggesting-points
Parameters are "frozen" as soon as they interact with the optimizer, to avoid side effects. Setting the …

If you have provided init values then you are all fine though. Note that by default, when you set lower and upper bounds, sigma (= the step size) is set to 1/6th of the full range. You may want to update it if you have a better prior (see the sketch below).
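A sketch of that sigma note; the bounds and the overriding value are made-up numbers:

```python
import nevergrad as ng

# With bounds [0, 12], sigma defaults to 1/6th of the full range, i.e. 2.0.
param = ng.p.Scalar(init=6.0).set_bounds(lower=0.0, upper=12.0)

# Override the step size if you have a better prior (illustrative value).
param.set_mutation(sigma=0.5)

optimizer = ng.optimizers.OnePlusOne(parametrization=param, budget=50)
```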
Steps to reproduce

…

Relevant Code

…

What I want could be coded like:
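A sketch of the overall "independent ask and tell" pattern discussed above, assuming nevergrad 0.4+ where `optimizer.suggest` is available (per the documentation section linked earlier); the objective, optimizer choice, and stored results are illustrative:

```python
import numpy as np
import nevergrad as ng

def square(x):
    # Hypothetical objective on a 2-dimensional array.
    return float(np.sum((x - 0.5) ** 2))

optimizer = ng.optimizers.OnePlusOne(parametrization=2, budget=100)

# Previously obtained (point, value) pairs -- illustrative data.
previous_results = [
    (np.array([0.0, 0.0]), 0.5),
    (np.array([1.0, 1.0]), 0.5),
]

# Independent "tell": feed the known points in before the regular loop.
for point, value in previous_results:
    optimizer.suggest(point)  # the next ask() will return this point
    candidate = optimizer.ask()
    optimizer.tell(candidate, value)

# Independent "ask": the regular optimization loop on the remaining budget.
for _ in range(optimizer.budget - len(previous_results)):
    candidate = optimizer.ask()
    optimizer.tell(candidate, square(*candidate.args))

print(optimizer.provide_recommendation().value)
```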