
PiBO #813

Merged: 51 commits into automl:development on Apr 7, 2022

Conversation

hvarfner (Contributor)

PiBO - BO with user Priors over the Optimum

New attempt on PiBO - here are the changes from last time:

Added unit tests for PriorAcquisitionFunction (and the required supporting classes)

  • Tests for _compute_prior with different shapes
  • Tests for the standard computation for both EI (does not require rescaling) and TS (does require rescaling); a sketch of this distinction follows this list
  • Tests for discretizing the prior (relevant for random forests)
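For context on the rescaling point: EI is non-negative by construction, so it can be multiplied by the prior directly, whereas Thompson-sampling values can be negative and have to be shifted into a non-negative range first. A minimal numpy sketch of that idea (the exact rescaling inside PriorAcquisitionFunction may differ):

```python
import numpy as np

def prior_weight(acq_values: np.ndarray, prior_values: np.ndarray, rescale: bool) -> np.ndarray:
    """Multiply acquisition values by prior densities, shifting the acquisition
    values into a non-negative range first if they can be negative."""
    if rescale:
        # Thompson samples can be negative, so shift them to start at zero;
        # EI is non-negative by construction and skips this step.
        acq_values = acq_values - acq_values.min()
    return acq_values * prior_values

ei_values = np.array([0.0, 0.3, 1.2])    # EI values: already non-negative
ts_values = np.array([-2.1, -0.5, 0.7])  # TS values: can be negative
prior = np.array([0.2, 1.5, 0.9])        # prior densities at the same candidates

print(prior_weight(ei_values, prior, rescale=False))
print(prior_weight(ts_values, prior, rescale=True))
```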

Added a unit test for LocalAndSortedPriorRandomSearch

  • Since this class pretty much just appends two different random searches to one another, there is really only one thing to test as far as I can see, but I may be wrong (see the sketch below)
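A rough sketch of the idea being tested, assuming the class simply concatenates candidates from a prior-based random search and a uniform one (the function names here are illustrative, not the actual SMAC API):

```python
import numpy as np

rng = np.random.default_rng(0)

def uniform_random_candidates(n: int) -> np.ndarray:
    # Candidates drawn uniformly from the (normalized) search space.
    return rng.uniform(size=(n, 2))

def prior_random_candidates(n: int) -> np.ndarray:
    # Candidates drawn from the user prior, here a Gaussian centered at 0.8.
    return np.clip(rng.normal(loc=0.8, scale=0.1, size=(n, 2)), 0.0, 1.0)

# The combined search simply appends the two candidate sets, so the main thing
# to check is that candidates from both sources end up in the result.
candidates = np.vstack([prior_random_candidates(5), uniform_random_candidates(5)])
assert candidates.shape == (10, 2)
```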

Removed the need to sample the prior to get the co-domain of the prior

  • Moved to a closed form in ConfigSpace (sketched below)
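For illustration, the closed-form density can now be queried directly from the hyperparameter. A small sketch, assuming ConfigSpace >= 0.5.0 with the pdf / get_max_density additions discussed in this PR:

```python
import numpy as np
from ConfigSpace.hyperparameters import NormalFloatHyperparameter

# A bounded normal prior over a hyperparameter in [0, 1].
hp = NormalFloatHyperparameter("x", mu=0.5, sigma=0.2, lower=0.0, upper=1.0)

# Closed-form density at some candidate values and the prior's maximum density,
# so the co-domain no longer has to be estimated by sampling.
print(hp.pdf(np.array([0.25, 0.5, 0.75])))
print(hp.get_max_density())
```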

Changed the input kwargs in the facade

  • Now called user_prior_kwargs (I never got acquisition_function_kwargs to work, since the underlying acquisition function then receives arguments it does not expect, such as decay_beta); the usage sketch further down shows these kwargs

Confirmed an issue when combining inactive parameters with Normal / Beta hyperparameters

  • Due to what I believe is some old TPE support, Normal parameters are not normalized to the 0-1 range (presumably since they can be used without upper and lower bounds)
  • Inactive parameters are flagged with a -1 value (right?) for Uniform parameters, which does not mesh with the possibly unbounded nature of Normal parameters
  • As such, one could force Normal parameters to be bounded, normalize them, and allow PiBO with inactive parameters, but there may be some other issue regarding this that I fail to see. Simply put, I don't know whether it's OK to forcibly bound and normalize Normal parameters in ConfigSpace.

Added a beautified usage example

  • It may not be the most intuitive, but at least it showcases how to use it (a sketch along those lines follows)
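A rough, hypothetical usage sketch along those lines, assuming the prior is expressed through the Normal/Beta hyperparameter distributions themselves. The user_priors flag and the user_prior_kwargs argument are taken from this thread, so the merged facade API may differ slightly:

```python
from ConfigSpace import ConfigurationSpace
from ConfigSpace.hyperparameters import BetaFloatHyperparameter, NormalFloatHyperparameter
from smac.facade.smac_hpo_facade import SMAC4HPO
from smac.scenario.scenario import Scenario

# The user prior over the optimum is expressed via the hyperparameter distributions.
cs = ConfigurationSpace()
cs.add_hyperparameters([
    NormalFloatHyperparameter("x", mu=0.3, sigma=0.1, lower=0.0, upper=1.0),
    BetaFloatHyperparameter("y", alpha=2.0, beta=5.0, lower=0.0, upper=1.0),
])

def target_function(config):
    # Toy objective whose optimum sits near the prior modes (x=0.3, y=0.2).
    return (config["x"] - 0.3) ** 2 + (config["y"] - 0.2) ** 2

scenario = Scenario({
    "run_obj": "quality",
    "runcount-limit": 32,
    "cs": cs,
    "deterministic": True,
})

smac = SMAC4HPO(
    scenario=scenario,
    tae_runner=target_function,
    user_priors=True,                        # enable PriorAcquisitionFunction (assumed flag)
    user_prior_kwargs={"decay_beta": 10.0},  # forwarded to PriorAcquisitionFunction
)
incumbent = smac.optimize()
```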

And of course, the right branch this time.

Distributions for GPs. Added:
- Method get_configspace in base_epm (needed in PriorAcquisitionFunction)
- Allow for NormalInteger- and NormalFloatHyperparameter in util_funcs check
- Added support for enabling user priors in optimization in the facade
- Added PriorAcquisitionFunction class in acquisition.py and incorporated it into Local Search with priors
- … with random forests and continuous parameters
- … in ConfigSpace (Hyperparameter.get_max_density())
- … search space based on the prior (not only the uniform one)
@benjamc (Contributor)

benjamc commented Mar 24, 2022

We need to wait for the new ConfigSpace release and then see if the tests are running through. After that, we can merge. 🌞

@eddiebergman (Contributor)

I'll release a new ConfigSpace later today :)

@hvarfner (Contributor, Author)

Great, thanks!

@eddiebergman (Contributor)

You can check progress here: I bumped the version notes, and now the wheels are building for Windows/Linux/Mac... Once those are done, I can publish a release :)

@eddiebergman (Contributor)

One more point: considering it's a fairly major change to ConfigSpace, I bumped the version from 0.4.21 to 0.5.0. This means the requirements.txt will need to be updated. I made this change because it will also affect many other libraries that rely on ConfigSpace; it should hopefully prevent anyone pinned to 0.4.x from experiencing sudden issues.

@eddiebergman (Contributor)

eddiebergman commented Mar 24, 2022

@hvarfner, @benjamc It's live, https://pypi.org/project/ConfigSpace/#files :)

Just going to reiterate that the requirements file needs to be changed here.

@eddiebergman (Contributor)

Sorry, more notes: the dependency cycle continues. Autosklearn can't use the new ConfigSpace until there's a new SMAC release. It's not urgent, but I will be nudging this along where possible.

@hvarfner (Contributor, Author)

hvarfner commented Apr 3, 2022

Hi,

If I'm not mistaken, we were waiting for ConfigSpace to be in place, right? Is there more that needs to be done? If so, please let me know what I can do!

@eddiebergman (Contributor)

Hi @hvarfner,

As previously mentioned, the requirements file needs to be updated so that the right version of ConfigSpace gets pulled in. If you look at the test failures, you can see errors such as ConfigSpace having no attribute BetaFloatHyperparameter. Hopefully there'll be no bugs to fix beyond that, but we'll have to see :)

Best,
Eddie

@renesass (Collaborator)

renesass commented Apr 4, 2022

We have another PR going on right now which updates the files based on updated tooling (black, flake8, ...). We would love to have PiBO in this "cleaned" state, although it takes a bit more time. I can't promise it, but most likely we can merge it on Thursday.

Sorry for the late responses - a lot going on recently.

@mlindauer (Contributor)

Hi Rene and Eddie,
ICLR is in a few weeks and, of course, we want to advertise PiBO as part of SMAC (and Hypermapper). So please make sure that this is done before ICLR starts; earlier would be even better, because people have already watched Carl's YouTube video.

```python
    model_instance, RFRImputator
)
acquisition_function_instance = PriorAcquisitionFunction(
    acquisition_function=acquisition_function_instance, **user_prior_kwargs, **acq_def_kwargs
)
```
Collaborator:

Both decay_beta and discretize are not used. Should they be part of "user_prior_kwargs"?

Collaborator:

After this issue is fixed, we can merge it. :)

Contributor (Author):

Great to hear! I'll check it right away.

Contributor (Author):

Addressed the issue now, thanks for spotting it.

@renesass changed the title from "Pibo dev" to "PiBO" on Apr 7, 2022
@renesass merged commit e0e892f into automl:development on Apr 7, 2022
@renesass mentioned this pull request on Apr 7, 2022
renesass added a commit that referenced this pull request Apr 19, 2022
- PiBO: Augment the acquisition function by multiplying it by a pdf given by the user.
  The prior then decays over time, allowing the optimization to carry on as per default.
  (A sketch of this weighting follows the list.)
- The `RunHistory` can now act as a `Mapping`, so you can use the usual dict methods on it,
  i.e. `len(rh)`, `rh.items()`, `rh[key]`. Previously this was usually done by accessing
  `rh.data`, which is still possible.
- Updated the signature of the `ROAR` facade to match its parent class `SMAC4AC`.
  Anyone relying on the output directory without specifying an explicit `run_id` to a `ROAR`
  facade should now expect to see the output directory at `run_0` instead of `run_1`. See #827.
- Updated and integrated flake8, mypy, black, and isort.
- SMAC now uses `automl_sphinx_theme`, so the API documentation is displayed more nicely.
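A minimal sketch of the PiBO weighting described in the first bullet, under the assumption that the acquisition is multiplied by the prior raised to beta divided by the iteration count, so the prior's influence decays as the run progresses (the exact schedule in SMAC may differ):

```python
import numpy as np

def prior_weighted_acquisition(
    acq_values: np.ndarray,       # raw acquisition values, assumed non-negative (e.g. EI)
    prior_densities: np.ndarray,  # user prior pdf evaluated at the same candidates
    decay_beta: float,            # controls how long the prior stays influential
    iteration: int,               # current iteration of the optimization
) -> np.ndarray:
    """Multiply the acquisition by the prior, decaying the prior's influence over time."""
    exponent = decay_beta / max(iteration, 1)
    return acq_values * prior_densities ** exponent

acq = np.array([0.1, 0.4, 0.2])
prior = np.array([1.8, 0.3, 0.9])
print(prior_weighted_acquisition(acq, prior, decay_beta=10.0, iteration=1))    # early: prior dominates
print(prior_weighted_acquisition(acq, prior, decay_beta=10.0, iteration=200))  # late: close to plain acq
```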

Co-authored-by: Carl Hvarfner <58733990+hvarfner@users.noreply.github.com>
Co-authored-by: Difan Deng <deng@dengster.tnt.uni-hannover.de>
Co-authored-by: Carolin Benjamins <benjamins@tnt.uni-hannover.de>
Co-authored-by: Tim Ruhkopf <timruhkopf@gmail.com>
Co-authored-by: eddiebergman <eddiebergmanhs@gmail.com>
github-actions bot pushed a commit that referenced this pull request Apr 19, 2022